1 00:00:01,840 --> 00:00:04,480 Speaker 1: This is A Numbers Game with Ryan Girdusky. Welcome back. 2 00:00:04,559 --> 00:00:06,720 Speaker 1: I appreciate you all being here again this week. I 3 00:00:06,800 --> 00:00:11,080 Speaker 1: know that everyone's busy with work and family and school 4 00:00:11,119 --> 00:00:13,200 Speaker 1: and laundry and all the rest of it, and I 5 00:00:13,280 --> 00:00:15,920 Speaker 1: know that you have very limited time sometimes, so listening 6 00:00:15,960 --> 00:00:18,480 Speaker 1: to this podcast means quite a lot to me. I've 7 00:00:18,520 --> 00:00:21,720 Speaker 1: realized that the amount of hours needed to 8 00:00:21,720 --> 00:00:24,680 Speaker 1: have a clean house, a decent body, and be good at your 9 00:00:24,760 --> 00:00:26,599 Speaker 1: job is like four hundred hours a week. It is 10 00:00:27,080 --> 00:00:32,000 Speaker 1: not fair whatsoever. So I know, I know that 11 00:00:32,320 --> 00:00:34,680 Speaker 1: many of you have not had the opportunity to like 12 00:00:34,720 --> 00:00:36,520 Speaker 1: and subscribe to this podcast and give me a five 13 00:00:36,520 --> 00:00:38,560 Speaker 1: star rating. But there's always time this week, and I 14 00:00:38,640 --> 00:00:41,120 Speaker 1: hope you can find the time. You will all be 15 00:00:41,200 --> 00:00:44,040 Speaker 1: my best friends. That's what I would say growing up. 16 00:00:44,040 --> 00:00:46,080 Speaker 1: All the time people were like, oh, you're my best friend, 17 00:00:46,080 --> 00:00:48,239 Speaker 1: get me a soda. But please give me a like and subscribe, 18 00:00:48,280 --> 00:00:50,320 Speaker 1: like and subscribe, give me a five star review. It 19 00:00:50,360 --> 00:00:54,160 Speaker 1: would mean a lot. Okay, I always start this podcast 20 00:00:54,200 --> 00:00:56,640 Speaker 1: off with a cultural reference, and I'm going to use 21 00:00:56,680 --> 00:00:59,160 Speaker 1: the movie Arthur, which is my favorite comedy of all time.
22 00:00:59,560 --> 00:01:02,400 Speaker 1: Dudley Moore looks at a prostitute he just picked up 23 00:01:02,400 --> 00:01:04,840 Speaker 1: off the street and he says, well, Princess Gloria, tonight 24 00:01:04,959 --> 00:01:07,840 Speaker 1: is New Year's Eve, third time this week. It's not 25 00:01:07,880 --> 00:01:10,640 Speaker 1: the most famous line in the movie, but it evokes 26 00:01:10,680 --> 00:01:13,480 Speaker 1: a general amount of happiness he has in this moment. 27 00:01:13,680 --> 00:01:16,919 Speaker 1: He's drunk. He thinks things are great. Very few 28 00:01:16,959 --> 00:01:19,399 Speaker 1: things in life get me that excited, get me just 29 00:01:19,520 --> 00:01:22,840 Speaker 1: that happy on a normal basis, and for me, because 30 00:01:22,920 --> 00:01:27,080 Speaker 1: I am incredibly nerdy, that is polls, like really good polls, 31 00:01:27,120 --> 00:01:31,640 Speaker 1: polls that are interesting, polls that, you know, tell 32 00:01:31,720 --> 00:01:34,480 Speaker 1: a good story about the national environment. I had the 33 00:01:34,520 --> 00:01:37,400 Speaker 1: opportunity this week. I was reached out to by a 34 00:01:37,480 --> 00:01:41,600 Speaker 1: pollster to ask questions for a national polling survey. Now, 35 00:01:42,000 --> 00:01:43,800 Speaker 1: this is like something that costs a lot of money 36 00:01:43,880 --> 00:01:45,600 Speaker 1: to do, and they were offering to do it, to 37 00:01:45,800 --> 00:01:48,920 Speaker 1: be on the podcast and talk about it to my audience. 38 00:01:48,960 --> 00:01:51,520 Speaker 1: So this is something very exclusive. This is something very special. 39 00:01:52,040 --> 00:01:55,640 Speaker 1: The pollster is Cygnal, which is rated by ActiVote, which kind 40 00:01:55,640 --> 00:01:58,480 Speaker 1: of ranks all the pollsters that come out during an 41 00:01:58,520 --> 00:02:01,200 Speaker 1: even year.
They ranked Cygnal number twenty seven out of 42 00:02:01,240 --> 00:02:03,360 Speaker 1: one hundred and thirty six. That's very good, that's a 43 00:02:03,440 --> 00:02:06,680 Speaker 1: very high ranking. They are a very well accredited Republican firm, 44 00:02:07,000 --> 00:02:08,840 Speaker 1: probably one of the top three if you separate out all 45 00:02:08,840 --> 00:02:10,919 Speaker 1: the pollsters, probably one of the top three or four 46 00:02:11,160 --> 00:02:14,560 Speaker 1: Republican polling firms in the business. They've got a good track record, 47 00:02:14,600 --> 00:02:17,120 Speaker 1: so I solidly believe in them, and I'm excited to 48 00:02:17,120 --> 00:02:19,240 Speaker 1: sit there and talk to them. Our guest is Brent Buchanan, 49 00:02:19,760 --> 00:02:21,880 Speaker 1: founder of Cygnal, and I want to talk to 50 00:02:21,919 --> 00:02:25,480 Speaker 1: the audience first about what questions I chose to ask 51 00:02:25,560 --> 00:02:28,560 Speaker 1: and why I chose to ask them. So first and foremost, 52 00:02:28,639 --> 00:02:31,919 Speaker 1: the first question I asked was: America brings in about 53 00:02:31,960 --> 00:02:35,720 Speaker 1: one million legal immigrants per year. Is that number too high, 54 00:02:36,000 --> 00:02:39,560 Speaker 1: too low, or about right? Then I asked how many 55 00:02:39,639 --> 00:02:42,160 Speaker 1: legal immigrants should America bring in per year, with the 56 00:02:42,200 --> 00:02:45,959 Speaker 1: options being zero, one hundred thousand, two hundred and fifty thousand, 57 00:02:46,120 --> 00:02:50,640 Speaker 1: five hundred thousand, one million, five million, or unlimited. Okay, 58 00:02:50,639 --> 00:02:53,399 Speaker 1: here's why I asked that question.
A lot of pollsters 59 00:02:53,560 --> 00:02:59,519 Speaker 1: talk about immigration in esoteric terms, you know, like, as 60 00:02:59,560 --> 00:03:01,639 Speaker 1: if they don't give the information first, and they talk 61 00:03:01,680 --> 00:03:03,920 Speaker 1: about the idea without giving people the 62 00:03:04,000 --> 00:03:06,720 Speaker 1: hard numbers. It's like when they say, do you support amnesty 63 00:03:06,760 --> 00:03:09,760 Speaker 1: if they all speak English and go to church and pay 64 00:03:09,800 --> 00:03:11,760 Speaker 1: their taxes and have never committed a felony? Well, 65 00:03:11,800 --> 00:03:13,680 Speaker 1: then, like, yeah, a majority always say yes. But that's 66 00:03:13,720 --> 00:03:16,560 Speaker 1: not how the actual process works. I want to 67 00:03:16,600 --> 00:03:19,040 Speaker 1: talk about the actual thing, the actual hard numbers. So 68 00:03:19,600 --> 00:03:24,520 Speaker 1: that's why I did it. There's only been one pollster recently, in 69 00:03:24,560 --> 00:03:26,520 Speaker 1: the last few years, to ask a question like that. 70 00:03:26,560 --> 00:03:28,960 Speaker 1: I believe it was Cato of all places, which is a 71 00:03:29,000 --> 00:03:32,320 Speaker 1: libertarian think tank which dreams of the day that America 72 00:03:32,320 --> 00:03:34,520 Speaker 1: will have no borders and we look like Angola or 73 00:03:34,680 --> 00:03:37,360 Speaker 1: Honduras or something. They're the ones who asked the question, 74 00:03:37,440 --> 00:03:41,240 Speaker 1: and they found in twenty twenty one that nine percent 75 00:03:41,240 --> 00:03:43,960 Speaker 1: of Americans wanted no immigration at all. They wanted a 76 00:03:44,000 --> 00:03:48,080 Speaker 1: complete and utter freeze, forty four percent wanted a ninety percent 77 00:03:48,120 --> 00:03:51,240 Speaker 1: reduction or greater, and sixty one percent a fifty percent 78 00:03:51,280 --> 00:03:54,320 Speaker 1: reduction or greater.
So we'll talk to Cygnal and we'll 79 00:03:54,360 --> 00:03:57,200 Speaker 1: see what it looks like now with Donald Trump in 80 00:03:57,280 --> 00:04:01,280 Speaker 1: charge and the border much more secure. The second question 81 00:04:01,440 --> 00:04:04,320 Speaker 1: I asked was about the future of artificial intelligence. This 82 00:04:04,400 --> 00:04:07,240 Speaker 1: is a question that is really important to our nation 83 00:04:07,280 --> 00:04:09,120 Speaker 1: because the United States is supposed 84 00:04:09,120 --> 00:04:13,040 Speaker 1: to be on the forefront of artificial intelligence, and there's 85 00:04:13,080 --> 00:04:16,560 Speaker 1: almost no pollster asking about how Americans feel. Like, it's 86 00:04:16,600 --> 00:04:21,200 Speaker 1: not asked. And I think that, I think that your 87 00:04:21,240 --> 00:04:24,560 Speaker 1: opinion of it is based either on how you think 88 00:04:24,600 --> 00:04:27,560 Speaker 1: you'll thrive in the environment or on what you think the 89 00:04:27,600 --> 00:04:30,359 Speaker 1: future looks like. Like, does it look like Rosie the 90 00:04:30,400 --> 00:04:32,960 Speaker 1: robot from The Jetsons doing chores? I mean, who wouldn't 91 00:04:32,960 --> 00:04:35,120 Speaker 1: want that? I would. Are we going to get the Terminator, 92 00:04:35,240 --> 00:04:38,120 Speaker 1: and is everyone you know going to be in really 93 00:04:38,200 --> 00:04:41,479 Speaker 1: rough shape on judgment day? I don't know. AI makes 94 00:04:41,600 --> 00:04:44,200 Speaker 1: me nervous, but I don't know how the rest of 95 00:04:44,240 --> 00:04:46,560 Speaker 1: the country feels. And I would love to know if 96 00:04:46,560 --> 00:04:49,040 Speaker 1: it's just me being crazy, or what everyone 97 00:04:49,040 --> 00:04:51,240 Speaker 1: else's opinion is on it. I need a pulse check.
98 00:04:51,360 --> 00:04:54,120 Speaker 1: The last question was also about tech. The second one was 99 00:04:54,120 --> 00:04:57,120 Speaker 1: about AI and tech, and this one is about big tech and 100 00:04:57,120 --> 00:05:01,920 Speaker 1: social media. The question was about social media. Social media 101 00:05:01,960 --> 00:05:05,520 Speaker 1: has almost no regulations on it, people. I want everyone 102 00:05:05,600 --> 00:05:08,760 Speaker 1: to realize that there's almost no regulation of social media. 103 00:05:08,920 --> 00:05:11,640 Speaker 1: Utah became like the first state to say kids 104 00:05:11,720 --> 00:05:13,920 Speaker 1: under eighteen can't access social media, can't make a social 105 00:05:13,920 --> 00:05:16,960 Speaker 1: media account. But that's very 106 00:05:16,960 --> 00:05:19,600 Speaker 1: close to being basically it. I mean, you obviously can't 107 00:05:19,800 --> 00:05:22,200 Speaker 1: sell things illegally on social media. You can't, like, put 108 00:05:22,800 --> 00:05:26,560 Speaker 1: up, you know, child pornography or something like that. That's 109 00:05:26,600 --> 00:05:31,839 Speaker 1: a national regulation. But as far as, like, regulating anything, 110 00:05:32,160 --> 00:05:35,080 Speaker 1: Section two thirty, or suing them, or what they own 111 00:05:35,120 --> 00:05:38,200 Speaker 1: of your data or reselling your data, there's no regulations. 112 00:05:38,400 --> 00:05:41,280 Speaker 1: And on big tech there are very, very few. So 113 00:05:41,400 --> 00:05:44,240 Speaker 1: I asked the question of should big tech be 114 00:05:44,360 --> 00:05:49,240 Speaker 1: more regulated, because in an era of Elon Musk, it 115 00:05:49,320 --> 00:05:52,479 Speaker 1: used to be that Democrats were very pro regulation. You know, 116 00:05:52,520 --> 00:05:54,200 Speaker 1: you have the Kara Swishers of the world.
The very 117 00:05:54,240 --> 00:05:57,159 Speaker 1: liberals were like, yeah, let's, uh, you know, regulate 118 00:05:57,200 --> 00:05:59,120 Speaker 1: big tech. Those are the same people who were very 119 00:05:59,120 --> 00:06:01,960 Speaker 1: excited about big tech, you know, when Steve Jobs was 120 00:06:02,000 --> 00:06:06,239 Speaker 1: alive. And now Republicans, who blamed everything in the twenty 121 00:06:06,240 --> 00:06:09,360 Speaker 1: twenty election on Zuck Bucks, they're very pro big tech. 122 00:06:09,480 --> 00:06:12,120 Speaker 1: So did people change sides? It's a good analysis of 123 00:06:12,160 --> 00:06:15,400 Speaker 1: how people feel on this issue. And the last 124 00:06:15,480 --> 00:06:18,280 Speaker 1: question I asked was about trade. I didn't want to 125 00:06:18,320 --> 00:06:21,000 Speaker 1: use the word tariff, just because, as Trump said, it's 126 00:06:21,000 --> 00:06:23,960 Speaker 1: the most beautiful word in the English language. I had to, 127 00:06:24,000 --> 00:06:27,560 Speaker 1: like, parse through how to approach the idea of tariffs, 128 00:06:27,560 --> 00:06:31,640 Speaker 1: and Trump's push for tariffs, without saying the word tariff. 129 00:06:31,880 --> 00:06:34,600 Speaker 1: So the question I asked was, would you be willing 130 00:06:34,680 --> 00:06:38,599 Speaker 1: to pay higher prices for more American jobs? Would you 131 00:06:38,640 --> 00:06:41,720 Speaker 1: trade prices for jobs? I think this is very important 132 00:06:41,760 --> 00:06:45,320 Speaker 1: because, regardless of what happens ultimately with these tariffs, I 133 00:06:45,360 --> 00:06:47,880 Speaker 1: don't think this is going to go away after the Trump administration. 134 00:06:48,000 --> 00:06:51,880 Speaker 1: Biden was very supportive of tariffs.
China is our 135 00:06:52,080 --> 00:06:56,120 Speaker 1: geopolitical adversary, and we need to be more 136 00:06:56,160 --> 00:06:59,400 Speaker 1: conscious of reindustrializing the United States. So what can 137 00:06:59,480 --> 00:07:03,039 Speaker 1: politicians get? What is their ultimate, how much 138 00:07:03,080 --> 00:07:05,440 Speaker 1: can they push on this without getting 139 00:07:05,960 --> 00:07:09,040 Speaker 1: slapped back by voters? So those are my four questions, 140 00:07:09,200 --> 00:07:11,840 Speaker 1: and I am very excited to hear what the responses 141 00:07:11,880 --> 00:07:14,080 Speaker 1: were and to talk to Brent Buchanan. So coming up 142 00:07:14,120 --> 00:07:17,440 Speaker 1: next is my conversation with pollster and Cygnal founder Brent Buchanan. 143 00:07:17,480 --> 00:07:23,080 Speaker 1: Stay tuned. My guest this week is Brent Buchanan. He 144 00:07:23,200 --> 00:07:25,640 Speaker 1: is the founder of the polling firm Cygnal. Thank you for 145 00:07:25,680 --> 00:07:26,160 Speaker 1: being here, Brent. 146 00:07:26,040 --> 00:07:27,720 Speaker 2: Hey, great to be with you all, Ryan. 147 00:07:27,800 --> 00:07:30,720 Speaker 1: Brent, when did you found Cygnal, and why did 148 00:07:30,720 --> 00:07:32,360 Speaker 1: you start in the polling industry? 149 00:07:32,920 --> 00:07:36,040 Speaker 2: Well, I actually didn't start in the polling industry, oddly enough. 150 00:07:36,120 --> 00:07:38,480 Speaker 2: I started just running campaigns, which is, I think, a 151 00:07:38,520 --> 00:07:40,560 Speaker 2: lot of people's story who are in this world is 152 00:07:40,600 --> 00:07:43,880 Speaker 2: they just volunteered at some point, thought, this is pretty fun.
153 00:07:43,920 --> 00:07:45,960 Speaker 2: Maybe I could make a living with it. And so 154 00:07:46,000 --> 00:07:48,880 Speaker 2: that's where my story started. It actually started in a 155 00:07:48,880 --> 00:07:54,120 Speaker 2: mayoral campaign in Montgomery, Alabama. We got absolutely trounced. They 156 00:07:54,120 --> 00:07:56,760 Speaker 2: don't elect Republicans in Montgomery, and I was too naive 157 00:07:56,800 --> 00:07:58,600 Speaker 2: to know that and just thought we could will our 158 00:07:58,640 --> 00:08:01,680 Speaker 2: way into winning the election, and we did not. And 159 00:08:01,720 --> 00:08:04,640 Speaker 2: then I went on to work for a Greek restaurant owner 160 00:08:05,200 --> 00:08:08,200 Speaker 2: who was running for county commission against a Republican incumbent, 161 00:08:08,440 --> 00:08:11,240 Speaker 2: and he was a Republican too, and we beat the guy, 162 00:08:11,520 --> 00:08:14,640 Speaker 2: and that was my second taste of a campaign. It was 163 00:08:14,680 --> 00:08:17,840 Speaker 2: actually victory. And he paid me in cash and bak 164 00:08:17,880 --> 00:08:19,760 Speaker 2: lava, which was a pretty good arrangement for a 165 00:08:19,760 --> 00:08:21,480 Speaker 2: single dude. 166 00:08:21,560 --> 00:08:23,920 Speaker 1: Can I just tell you for one second, I 167 00:08:23,960 --> 00:08:26,560 Speaker 1: worked for a guy named Stamataslacaccas who ran for the 168 00:08:26,600 --> 00:08:30,200 Speaker 1: State Assembly, and every third day, I think, we went 169 00:08:30,240 --> 00:08:32,960 Speaker 1: to a Greek pastry shop. I mean, I was eating so 170 00:08:33,240 --> 00:08:36,240 Speaker 1: much Greek food for the entire summer. I guess it 171 00:08:36,280 --> 00:08:40,160 Speaker 1: was twenty fourteen, and it was a million Greek pastries. 172 00:08:40,200 --> 00:08:43,160 Speaker 1: So, I feel this.
173 00:08:43,280 --> 00:08:46,440 Speaker 1: I feel this story for sure right now, one hundred percent. 174 00:08:46,920 --> 00:08:48,560 Speaker 1: So that was your first race ever worked on. 175 00:08:49,440 --> 00:08:53,000 Speaker 2: It was and then ended up starting a firm. We 176 00:08:53,000 --> 00:08:55,880 Speaker 2: were kind of a jack of all trades. Basically, if 177 00:08:55,920 --> 00:08:57,280 Speaker 2: you were willing to pay us for it and it 178 00:08:57,320 --> 00:09:00,840 Speaker 2: was legal, we would do it. And realized that you 179 00:09:00,880 --> 00:09:03,320 Speaker 2: can't really take that model outside of your home state. 180 00:09:03,400 --> 00:09:05,480 Speaker 2: This is back in still living in Montgomery at the time, 181 00:09:05,840 --> 00:09:08,079 Speaker 2: and really land it on polling after the twenty ten 182 00:09:08,160 --> 00:09:10,560 Speaker 2: cycle when I saw that it was getting more expensive, 183 00:09:10,600 --> 00:09:13,359 Speaker 2: it was getting harder to do. I loved the strategic 184 00:09:13,400 --> 00:09:15,240 Speaker 2: aspect of what you get to do as a polster, 185 00:09:16,160 --> 00:09:18,080 Speaker 2: and it was more scalable. You know, I could work 186 00:09:18,120 --> 00:09:19,880 Speaker 2: on a lot more races as a polster than I 187 00:09:19,920 --> 00:09:22,880 Speaker 2: could managing or running or being the consultant on a race. 188 00:09:22,920 --> 00:09:26,400 Speaker 1: That is in my mind, there's so many negative connotations 189 00:09:26,480 --> 00:09:29,840 Speaker 1: to polling. You're pulled it firmed it very well in 190 00:09:29,880 --> 00:09:32,520 Speaker 1: the last secle on the twenty twenty four cycle. How 191 00:09:32,720 --> 00:09:36,480 Speaker 1: does one manage a firm well? How do you create 192 00:09:36,480 --> 00:09:40,640 Speaker 1: a formula that works, because very few do well. 
193 00:09:40,840 --> 00:09:43,800 Speaker 2: The first is to separate private from public pollsters, because 194 00:09:43,840 --> 00:09:46,120 Speaker 2: we're not the same. So when you see a Marist 195 00:09:46,120 --> 00:09:49,800 Speaker 2: poll, or, you know, somebody who's only doing polling to 196 00:09:49,880 --> 00:09:52,480 Speaker 2: release it and get attention, which, I mean, we put 197 00:09:52,480 --> 00:09:55,280 Speaker 2: out a few polls, but you're seeing barely the scratching 198 00:09:55,320 --> 00:09:57,800 Speaker 2: of the surface of what we're doing. And so that's 199 00:09:57,800 --> 00:09:59,680 Speaker 2: the first thing, is that we are not a public 200 00:09:59,720 --> 00:10:03,280 Speaker 2: pollster, so we actually have an economic incentive to be right. 201 00:10:04,400 --> 00:10:06,440 Speaker 2: Some of these public pollsters, they can be wrong over 202 00:10:06,480 --> 00:10:08,680 Speaker 2: and over again. They get coverage over and over again, 203 00:10:09,679 --> 00:10:11,720 Speaker 2: and I don't know who really funds them. That always 204 00:10:11,800 --> 00:10:14,080 Speaker 2: makes me ask that question when I see certain numbers 205 00:10:14,120 --> 00:10:16,600 Speaker 2: come out. But those of us who do private polling, 206 00:10:16,600 --> 00:10:19,640 Speaker 2: who are being hired by campaigns and corporations and committees 207 00:10:19,640 --> 00:10:22,920 Speaker 2: and caucuses to help guide them through: what do I say? 208 00:10:22,960 --> 00:10:24,520 Speaker 2: Who do I say it to? How do I say 209 00:10:24,520 --> 00:10:27,959 Speaker 2: it best? So that's a big differentiation between the two.
210 00:10:28,000 --> 00:10:30,560 Speaker 2: The second is that one of the most important pieces 211 00:10:30,559 --> 00:10:35,199 Speaker 2: of polling is the actual getting of responses from voters, and 212 00:10:35,320 --> 00:10:38,400 Speaker 2: in our world, among our competitors, most people 213 00:10:38,480 --> 00:10:41,200 Speaker 2: outsource that function. So they write a script, and then 214 00:10:41,240 --> 00:10:42,880 Speaker 2: they send it off to a call center and a 215 00:10:42,920 --> 00:10:45,400 Speaker 2: texting vendor, and then they wait for those people to 216 00:10:45,440 --> 00:10:48,000 Speaker 2: do the survey collection, and then it comes back and 217 00:10:48,040 --> 00:10:50,200 Speaker 2: they build their reports and give it to the clients. We 218 00:10:50,280 --> 00:10:53,520 Speaker 2: decided over a decade ago that that was such an 219 00:10:53,559 --> 00:10:56,680 Speaker 2: important piece of the process that we needed to control 220 00:10:56,720 --> 00:10:59,800 Speaker 2: every aspect of it. So our firm is larger than, 221 00:11:00,080 --> 00:11:02,400 Speaker 2: I think, any other Republican polling firm by headcount, 222 00:11:02,400 --> 00:11:05,040 Speaker 2: because it takes a lot more effort and people to 223 00:11:05,120 --> 00:11:08,080 Speaker 2: control that part of the process, and that is really 224 00:11:08,120 --> 00:11:09,440 Speaker 2: the magic piece of it all. 225 00:11:09,440 --> 00:11:13,200 Speaker 1: And so, okay, the whole basis 226 00:11:13,200 --> 00:11:16,160 Speaker 1: of this podcast episode is you did a national poll. 227 00:11:16,360 --> 00:11:18,320 Speaker 1: Is it released publicly? It will probably be released publicly 228 00:11:18,320 --> 00:11:21,040 Speaker 1: by the time we come to air, which will be Monday.
229 00:11:22,520 --> 00:11:26,920 Speaker 1: So what are some general themes that you've seen about 230 00:11:26,960 --> 00:11:31,720 Speaker 1: how people feel about President Donald Trump's performance in this poll? 231 00:11:31,760 --> 00:11:34,480 Speaker 2: I think the first is that there's not this massive 232 00:11:34,520 --> 00:11:38,800 Speaker 2: elasticity in his image. If you were watching public polling, 233 00:11:38,920 --> 00:11:42,360 Speaker 2: you would think that Donald Trump started out really good, 234 00:11:42,559 --> 00:11:46,199 Speaker 2: everybody loved him, and now everybody hates him. But that's 235 00:11:46,280 --> 00:11:48,480 Speaker 2: just not what our polling data shows. 236 00:11:48,880 --> 00:11:51,200 Speaker 1: I got a call today from a friend 237 00:11:51,280 --> 00:11:54,000 Speaker 1: who was a Republican, like an elected official, and he's 238 00:11:54,080 --> 00:11:55,679 Speaker 1: very upset with the tariffs, and he said to me, 239 00:11:56,040 --> 00:11:58,520 Speaker 1: Trump's approval is probably thirty two percent now. And I 240 00:11:58,559 --> 00:12:03,480 Speaker 1: was like, nah. Ah, sorry, go ahead, go ahead. But 241 00:12:03,559 --> 00:12:07,360 Speaker 1: you're one hundred percent right, very little elasticity. Sorry. 242 00:12:08,000 --> 00:12:12,080 Speaker 2: Yeah, we've seen a slight degradation in his image from, 243 00:12:12,480 --> 00:12:15,400 Speaker 2: you know, a couple of weeks after his inauguration to now. 244 00:12:15,559 --> 00:12:19,000 Speaker 2: But I think if he'd literally just gotten into office 245 00:12:19,160 --> 00:12:22,240 Speaker 2: and sat in the Oval Office and done nothing, we 246 00:12:22,280 --> 00:12:24,560 Speaker 2: would have seen a degradation in his image, because there's 247 00:12:24,800 --> 00:12:26,920 Speaker 2: a sugar rush.
There's a high that comes off 248 00:12:26,960 --> 00:12:29,400 Speaker 2: of a campaign and of the attention you get from it, 249 00:12:29,440 --> 00:12:31,760 Speaker 2: and then you've got to go govern. And I think, 250 00:12:31,800 --> 00:12:34,800 Speaker 2: considering all that he has done and all the 251 00:12:34,840 --> 00:12:37,679 Speaker 2: topics that he's broached in this time, that he's been 252 00:12:37,679 --> 00:12:40,520 Speaker 2: in office for less than three months at this point, 253 00:12:40,600 --> 00:12:43,319 Speaker 2: the fact that his image is still, you know, 254 00:12:43,440 --> 00:12:46,640 Speaker 2: hovering around forty six percent favorable is way better than 255 00:12:46,679 --> 00:12:49,320 Speaker 2: he ever was, even at the beginning of his first term. 256 00:12:49,360 --> 00:12:52,400 Speaker 2: And that's one thing people forget about polling: 257 00:12:52,400 --> 00:12:54,880 Speaker 2: polling is not just a data point in time. 258 00:12:54,920 --> 00:12:57,439 Speaker 2: A poll is a data point in time, but polling 259 00:12:58,080 --> 00:13:00,560 Speaker 2: is different, and that is watching what happens over time 260 00:13:00,600 --> 00:13:03,960 Speaker 2: and comparing it to things that matter. And so when 261 00:13:04,040 --> 00:13:06,160 Speaker 2: you look at things in terms of his first term 262 00:13:06,160 --> 00:13:09,360 Speaker 2: and his second term, he is doing significantly better now 263 00:13:09,640 --> 00:13:12,560 Speaker 2: in his image than he was at this point in 264 00:13:12,600 --> 00:13:13,360 Speaker 2: twenty seventeen.
265 00:13:13,880 --> 00:13:17,160 Speaker 1: And even though the tariffs aren't popular, certainly not 266 00:13:17,240 --> 00:13:20,880 Speaker 1: by the media's standards, overall, I mean, 267 00:13:20,880 --> 00:13:24,520 Speaker 1: people have a much more positive opinion of him than 268 00:13:24,559 --> 00:13:27,520 Speaker 1: they do of something like the tariffs, for example, right? 269 00:13:28,040 --> 00:13:31,640 Speaker 2: Completely. And a lot of people just trust him: okay, 270 00:13:31,679 --> 00:13:33,600 Speaker 2: he said he was going to do this during the campaign. 271 00:13:33,679 --> 00:13:35,920 Speaker 2: I don't really understand why he's doing it, but I 272 00:13:35,960 --> 00:13:38,240 Speaker 2: trust that he's got a reason for it. 273 00:13:38,200 --> 00:13:40,600 Speaker 1: Right. Okay. So now I want to get to the 274 00:13:40,679 --> 00:13:42,760 Speaker 1: questions that I asked, because this is what I'm so 275 00:13:42,800 --> 00:13:45,760 Speaker 1: excited about, because I worked on campaigns for almost twenty 276 00:13:45,840 --> 00:13:48,880 Speaker 1: years and, just, I'm a nerd, so I get really 277 00:13:48,920 --> 00:13:53,120 Speaker 1: excited by it. So I went over at the top of 278 00:13:53,160 --> 00:13:55,040 Speaker 1: the show the questions that I asked, but I want 279 00:13:55,040 --> 00:13:57,240 Speaker 1: to go to them first here and we'll talk about it. 280 00:13:57,280 --> 00:14:00,480 Speaker 1: So the first question being the question over levels of 281 00:14:00,600 --> 00:14:03,719 Speaker 1: legal immigration in the country. This is something that I 282 00:14:03,760 --> 00:14:07,080 Speaker 1: don't think pollsters ever ask enough about. Gallup does once 283 00:14:07,080 --> 00:14:10,080 Speaker 1: a year or twice a year, but that's about it regularly.
284 00:14:11,679 --> 00:14:13,439 Speaker 1: So the first question was, do you think that the 285 00:14:13,520 --> 00:14:15,960 Speaker 1: levels of legal immigration are too high, too low, or 286 00:14:16,040 --> 00:14:18,679 Speaker 1: just right? Thirty eight percent said they were 287 00:14:18,720 --> 00:14:21,160 Speaker 1: fine with the current levels, thirty seven percent wanted 288 00:14:21,160 --> 00:14:25,000 Speaker 1: them reduced, eleven percent wanted them increased. But when 289 00:14:25,000 --> 00:14:28,360 Speaker 1: you ask the actual number desired, it was a very 290 00:14:28,400 --> 00:14:32,200 Speaker 1: different result. So almost nine percent, eight point five percent, 291 00:14:32,200 --> 00:14:36,280 Speaker 1: of Americans wanted zero immigration, none at all, a total freeze. 292 00:14:36,640 --> 00:14:40,480 Speaker 1: Seventeen percent wanted a ninety percent reduction, about one hundred 293 00:14:40,520 --> 00:14:43,840 Speaker 1: thousand a year, because we currently take in a million in total. 294 00:14:43,920 --> 00:14:46,920 Speaker 1: Forty three percent of Americans wanted a fifty percent or 295 00:14:46,960 --> 00:14:50,800 Speaker 1: greater reduction of legal immigration, which, if you said 296 00:14:50,840 --> 00:14:52,560 Speaker 1: that on the news, you'd be called an extremist. I 297 00:14:52,640 --> 00:14:54,880 Speaker 1: know, I've been called an extremist. But forty three percent 298 00:14:54,920 --> 00:14:57,000 Speaker 1: of the country say that, and about sixty percent of 299 00:14:57,040 --> 00:14:59,640 Speaker 1: the country want somewhere between, like, a fifty to ninety 300 00:14:59,640 --> 00:15:03,480 Speaker 1: percent reduction. So somewhere well below 301 00:15:03,480 --> 00:15:06,040 Speaker 1: our current numbers, but not zero.
I 302 00:15:06,120 --> 00:15:09,520 Speaker 1: know that sounded confusing. So sixty percent will favor some 303 00:15:09,880 --> 00:15:12,680 Speaker 1: sort of reduction of legal immigration. I just want to 304 00:15:12,680 --> 00:15:14,760 Speaker 1: break it down really quickly and we'll talk about it: 305 00:15:14,800 --> 00:15:17,440 Speaker 1: fifty five percent of women, sixty one percent of men, 306 00:15:17,560 --> 00:15:20,840 Speaker 1: seventy one percent of Republicans, fifty seven percent of independents, 307 00:15:21,080 --> 00:15:24,920 Speaker 1: forty seven percent of Democrats, sixty percent of whites, fifty 308 00:15:24,920 --> 00:15:27,760 Speaker 1: seven percent of blacks, fifty three percent of Hispanics, fifty 309 00:15:27,800 --> 00:15:30,040 Speaker 1: eight percent of swing voters, and seventy three percent of 310 00:15:30,080 --> 00:15:35,880 Speaker 1: all Trump supporters. Does that shock you? 311 00:15:34,200 --> 00:15:35,880 Speaker 2: Not in the least. And this is why I really 312 00:15:35,960 --> 00:15:38,520 Speaker 2: love that we did this as two separate questions, because 313 00:15:38,920 --> 00:15:41,360 Speaker 2: I think it susses something out that we see a 314 00:15:41,400 --> 00:15:44,160 Speaker 2: lot of times in polling, which is that it's a lot easier 315 00:15:44,200 --> 00:15:47,120 Speaker 2: to get agreement on a concept, and then when you 316 00:15:47,160 --> 00:15:50,960 Speaker 2: go throw numbers behind it, it's like you fry 317 00:15:51,040 --> 00:15:52,880 Speaker 2: people's brains and they don't know how to answer it, 318 00:15:52,920 --> 00:15:55,080 Speaker 2: per se. Because if you look at these two questions, 319 00:15:55,160 --> 00:15:59,720 Speaker 2: they actually somewhat conflict, where you have thirty eight percent 320 00:15:59,760 --> 00:16:02,440 Speaker 2: of people saying that it is about right at 321 00:16:02,480 --> 00:16:06,920 Speaker 2: a million legal immigrants a year.
But then when you 322 00:16:06,960 --> 00:16:08,600 Speaker 2: go look at the number of people who said a 323 00:16:08,640 --> 00:16:11,680 Speaker 2: million, five million, or unlimited, it's 324 00:16:11,760 --> 00:16:15,760 Speaker 2: less than thirty eight percent. So you even have a 325 00:16:15,800 --> 00:16:18,640 Speaker 2: differential there, where the people who, as you mentioned, 326 00:16:18,640 --> 00:16:21,280 Speaker 2: are more likely to be Democrats or left leaning 327 00:16:21,360 --> 00:16:24,600 Speaker 2: voters are saying, yes, it's the right amount, or we 328 00:16:24,680 --> 00:16:27,840 Speaker 2: need more. But then when you get to the actual numbers, 329 00:16:27,840 --> 00:16:29,800 Speaker 2: even though we just told them the number in the 330 00:16:29,880 --> 00:16:33,360 Speaker 2: last question, like, the question itself said a million legal 331 00:16:33,600 --> 00:16:37,480 Speaker 2: immigrants: is this just right, too little, too much? And then, 332 00:16:37,520 --> 00:16:40,040 Speaker 2: when we break out the numbers in the exact next question, 333 00:16:40,560 --> 00:16:43,280 Speaker 2: you get even fewer people that tell us it's a 334 00:16:43,280 --> 00:16:46,080 Speaker 2: million or more that they're good with. And I think 335 00:16:46,120 --> 00:16:49,080 Speaker 2: this is one of the challenges with communication in general, 336 00:16:49,160 --> 00:16:50,760 Speaker 2: and it's one of the things that Donald Trump is 337 00:16:50,800 --> 00:16:54,160 Speaker 2: really good at. We like to talk, especially 338 00:16:54,160 --> 00:16:56,480 Speaker 2: those of us who enjoy policy, you know, about what 339 00:16:56,520 --> 00:17:00,360 Speaker 2: the number should be. On abortion, how many weeks 340 00:17:00,400 --> 00:17:03,320 Speaker 2: should it be legal at? And voters don't think in 341 00:17:03,400 --> 00:17:06,480 Speaker 2: terms of these hard construct numbers.
They think in terms 342 00:17:06,560 --> 00:17:11,320 Speaker 2: of principles and of concepts, and these two questions that 343 00:17:11,359 --> 00:17:14,480 Speaker 2: you had us ask, I think, play that out perfectly, 344 00:17:14,560 --> 00:17:16,720 Speaker 2: that if we're going to win messaging battles, let's not 345 00:17:16,760 --> 00:17:18,360 Speaker 2: get bogged down in numbers. 346 00:17:19,359 --> 00:17:22,160 Speaker 1: That's so interesting. Now, I think the thing 347 00:17:22,160 --> 00:17:24,600 Speaker 1: I found super interesting was that there were really not 348 00:17:25,320 --> 00:17:31,560 Speaker 1: huge, stark differences between groups aside from partisanship. So Hispanics 349 00:17:31,560 --> 00:17:33,919 Speaker 1: are very close to blacks, Blacks are pretty close to whites, 350 00:17:33,960 --> 00:17:37,640 Speaker 1: men were fairly close to women. There wasn't this huge 351 00:17:37,720 --> 00:17:41,520 Speaker 1: surge in one area as I expected. And even among Democrats, 352 00:17:41,560 --> 00:17:44,440 Speaker 1: forty seven percent of Democrats is a lot. That's not 353 00:17:44,640 --> 00:17:48,480 Speaker 1: what you would assume by hearing the media narrative. And 354 00:17:48,480 --> 00:17:51,399 Speaker 1: then you'd watch clips of, like, you know, 355 00:17:51,440 --> 00:17:54,520 Speaker 1: black voters in Chicago, who are, I mean, ninety nine 356 00:17:54,880 --> 00:17:57,480 Speaker 1: times out of one hundred, Democrats, sitting there and 357 00:17:57,520 --> 00:18:01,080 Speaker 1: demanding that illegal immigrants be deported. 358 00:18:01,119 --> 00:18:03,520 Speaker 1: You see the white people up in Cape Cod, 359 00:18:03,720 --> 00:18:06,399 Speaker 1: you know, waving goodbye to illegal immigrants that Joe Biden 360 00:18:06,440 --> 00:18:09,840 Speaker 1: brought in.
So there's just a constant conflict between what 361 00:18:09,960 --> 00:18:12,320 Speaker 1: is being presented and then these numbers over here, and 362 00:18:12,359 --> 00:18:13,920 Speaker 1: what you kind of see from these news clips, 363 00:18:13,920 --> 00:18:16,080 Speaker 1: and you're like, there is something bigger than this. 364 00:18:17,160 --> 00:18:20,160 Speaker 2: Yeah. And a lot of it plays into educational attainment. 365 00:18:20,200 --> 00:18:21,800 Speaker 2: And this is something I talk about a lot, the 366 00:18:21,840 --> 00:18:25,600 Speaker 2: diploma divide, and that is that thirty years ago, Republicans 367 00:18:25,600 --> 00:18:28,960 Speaker 2: were the party of the highly educated, higher income individuals, 368 00:18:29,480 --> 00:18:32,800 Speaker 2: and those folks are now more likely Democrats, and 369 00:18:33,440 --> 00:18:36,640 Speaker 2: Republicans are the party of the working class. And when 370 00:18:36,640 --> 00:18:40,680 Speaker 2: you do look at this question through the construct of educational attainment, 371 00:18:40,760 --> 00:18:44,080 Speaker 2: you start to see a big divide almost as much 372 00:18:44,080 --> 00:18:47,360 Speaker 2: as you see on the partisanship, because they're so intercorrelated now, 373 00:18:47,480 --> 00:18:51,560 Speaker 2: this diploma divide and partisanship. So if you're a non 374 00:18:51,640 --> 00:18:56,119 Speaker 2: college educated voter, then over a third 375 00:18:56,160 --> 00:18:58,440 Speaker 2: of those voters are saying one hundred thousand or less, 376 00:18:59,119 --> 00:19:01,119 Speaker 2: and there's not a lot of other groups that we 377 00:19:01,200 --> 00:19:03,960 Speaker 2: look at where you would have a third of 378 00:19:04,000 --> 00:19:07,200 Speaker 2: the voters saying one hundred thousand or fewer, down 379 00:19:07,280 --> 00:19:09,880 Speaker 2: to none.
And then when you look 380 00:19:09,920 --> 00:19:13,240 Speaker 2: at college educated voters, that's where you start to get 381 00:19:13,440 --> 00:19:16,720 Speaker 2: more that are saying that it should be somewhere between 382 00:19:16,800 --> 00:19:21,080 Speaker 2: half a million to five million. But there's very few 383 00:19:21,119 --> 00:19:25,320 Speaker 2: who say any number beyond that. And that's, I think, 384 00:19:25,359 --> 00:19:28,040 Speaker 2: the story here: you've got two Americas. And 385 00:19:28,080 --> 00:19:30,680 Speaker 2: the tariff argument, I think, is two Americas also. 386 00:19:30,680 --> 00:19:33,600 Speaker 2: If you're highly educated and highly invested in the stock market, 387 00:19:33,800 --> 00:19:35,520 Speaker 2: then yeah, you're freaking out. Well, you may not be 388 00:19:35,600 --> 00:19:38,040 Speaker 2: freaking out today because it's April ninth; you were freaking 389 00:19:38,080 --> 00:19:42,520 Speaker 2: out yesterday on April eighth, you know, the third massive four 390 00:19:42,560 --> 00:19:46,879 Speaker 2: figure drop in a day. But regular Americans aren't worried 391 00:19:46,880 --> 00:19:49,399 Speaker 2: about that. But they are worried about this immigration topic 392 00:19:49,440 --> 00:19:51,840 Speaker 2: because they see it as supplanting their ability to earn 393 00:19:51,880 --> 00:19:52,320 Speaker 2: an income. 394 00:19:52,600 --> 00:19:54,399 Speaker 1: Well, so before we get to tariffs, 395 00:19:54,480 --> 00:19:56,080 Speaker 1: that's one of the questions. But the thing I wanted 396 00:19:56,119 --> 00:19:59,359 Speaker 1: to ask was, your firm says, quote, for Republican 397 00:19:59,359 --> 00:20:01,800 Speaker 1: members of Congress concerned about being primaried,
A majority 398 00:20:01,840 --> 00:20:06,119 Speaker 1: of Republican voters want to see legal immigration reduced to 399 00:20:06,400 --> 00:20:09,600 Speaker 1: two hundred and fifty thousand or less annually. You also 400 00:20:09,680 --> 00:20:13,680 Speaker 1: wrote, Trump made major inroads in November with non white 401 00:20:13,680 --> 00:20:17,440 Speaker 1: men: seventy three percent of non white men who voted 402 00:20:17,480 --> 00:20:22,080 Speaker 1: for Trump favor immigration reduction. That is something that I 403 00:20:22,200 --> 00:20:24,040 Speaker 1: did not expect, that number for non white men to 404 00:20:24,040 --> 00:20:24,199 Speaker 1: be that. 405 00:20:24,520 --> 00:20:24,880 Speaker 2: Oh. 406 00:20:24,960 --> 00:20:27,200 Speaker 1: I know Republican voters, I know them in my gut, I 407 00:20:27,320 --> 00:20:29,520 Speaker 1: know how to run a campaign to Republican voters. It's 408 00:20:29,600 --> 00:20:32,360 Speaker 1: something that I was just born and bred into, 409 00:20:32,440 --> 00:20:35,240 Speaker 1: you know, and lived my whole life as a Republican voter. 410 00:20:35,520 --> 00:20:40,840 Speaker 1: But the non white number, being non white men who 411 00:20:40,960 --> 00:20:49,840 Speaker 1: voted for Trump, it feels like ideology is surpassing race. 412 00:20:51,280 --> 00:20:53,960 Speaker 1: I mean, maybe college attainment is having some differential effect, 413 00:20:54,320 --> 00:20:59,000 Speaker 1: but forever racial identity superseded partisanship 414 00:20:59,560 --> 00:21:03,560 Speaker 1: or political ideology. Now political ideology supersedes race. 415 00:21:03,920 --> 00:21:05,680 Speaker 1: That's actually, I feel, like a good thing. 416 00:21:07,200 --> 00:21:10,040 Speaker 2: Completely.
And going back to the first race that I 417 00:21:10,080 --> 00:21:13,679 Speaker 2: worked on for a mayoral campaign in Montgomery, Alabama, I 418 00:21:13,760 --> 00:21:16,800 Speaker 2: went knocking doors in the one hundred percent black part of 419 00:21:16,840 --> 00:21:19,440 Speaker 2: town thinking I could earn votes there twenty five years ago. 420 00:21:20,000 --> 00:21:22,960 Speaker 2: And that was so dumb of me, because I guarantee 421 00:21:22,960 --> 00:21:25,919 Speaker 2: our Republican leaning candidate got none of those votes that 422 00:21:25,960 --> 00:21:28,359 Speaker 2: I went and worked so hard for. But if I 423 00:21:28,359 --> 00:21:31,600 Speaker 2: did that exact same thing today, they're willing to have 424 00:21:31,640 --> 00:21:35,880 Speaker 2: a conversation. And it's because Democrats have gone so far 425 00:21:35,960 --> 00:21:40,439 Speaker 2: to the left on all kinds of topics, and because 426 00:21:40,480 --> 00:21:42,920 Speaker 2: you've got these highly educated white voters that are now 427 00:21:43,040 --> 00:21:45,480 Speaker 2: more the base of the Democratic Party than anything else. 428 00:21:46,080 --> 00:21:50,160 Speaker 2: They are so liberal and progressive, and these non white voters, 429 00:21:50,280 --> 00:21:52,680 Speaker 2: there are very few of them that consider themselves liberal 430 00:21:52,720 --> 00:21:55,600 Speaker 2: and progressive. They're much more likely to say they're moderate 431 00:21:55,640 --> 00:21:58,600 Speaker 2: to somewhat conservative.
And so it's kind of like that 432 00:21:58,680 --> 00:22:01,800 Speaker 2: meme that Elon Musk posted before he got involved in politics 433 00:22:01,880 --> 00:22:04,000 Speaker 2: last year or the year before, whenever it was, where 434 00:22:04,280 --> 00:22:07,359 Speaker 2: he shows himself as a stick figure in the same place, 435 00:22:07,480 --> 00:22:09,679 Speaker 2: and the left keeps moving further away from him and 436 00:22:09,720 --> 00:22:12,040 Speaker 2: calling him a radical and calling him a right winger. 437 00:22:12,440 --> 00:22:15,280 Speaker 2: And that's really what's happening with these non white populations: 438 00:22:15,359 --> 00:22:18,560 Speaker 2: they're sitting there saying, we're moderate to somewhat conservative, 439 00:22:18,920 --> 00:22:21,199 Speaker 2: and you used to be moderate to somewhat conservative, but 440 00:22:21,240 --> 00:22:25,480 Speaker 2: now you're insanely leftist and you left us, right? 441 00:22:25,520 --> 00:22:28,320 Speaker 1: And they're economically centrist, and the Republican Party has become 442 00:22:28,359 --> 00:22:32,879 Speaker 1: more economically centrist under Trump. Yeah, I mean I 443 00:22:32,920 --> 00:22:35,919 Speaker 1: can't believe Latinx didn't really work to win over Hispanics. 444 00:22:35,960 --> 00:22:38,680 Speaker 2: I'm shocked by that more than anything else. 445 00:22:39,920 --> 00:22:43,439 Speaker 1: Okay, the second thing that I asked you guys about, 446 00:22:43,600 --> 00:22:47,040 Speaker 1: which I'm very anxious about, is AI. Aside 447 00:22:47,080 --> 00:22:50,800 Speaker 1: from corporations polling on AI, I never 448 00:22:50,880 --> 00:22:54,240 Speaker 1: see a question about AI ever being asked.
So your 449 00:22:54,280 --> 00:22:57,000 Speaker 1: poll found that, when you asked, 450 00:22:57,080 --> 00:22:59,560 Speaker 1: how does the future of AI make 451 00:22:59,600 --> 00:23:02,800 Speaker 1: you feel, thirty three percent said nervous, twenty six percent 452 00:23:02,880 --> 00:23:06,040 Speaker 1: said curious. I guess that means, like, they want to 453 00:23:06,080 --> 00:23:08,960 Speaker 1: see where it goes. Eighteen percent said anxious, and twelve 454 00:23:08,960 --> 00:23:12,760 Speaker 1: percent said excited, eleven percent were unsure. That's a lot 455 00:23:12,800 --> 00:23:17,600 Speaker 1: more negative connotation than I expected. What are your 456 00:23:18,040 --> 00:23:18,600 Speaker 2: fifty one 457 00:23:18,480 --> 00:23:22,960 Speaker 2: percent negative emotions? I mean, two of the emotions were obviously negative. 458 00:23:23,520 --> 00:23:25,720 Speaker 2: One is kind of up in the air. The curiosity 459 00:23:25,840 --> 00:23:28,280 Speaker 2: is more a positive emotion, but I wouldn't say it's 460 00:23:28,359 --> 00:23:31,639 Speaker 2: anywhere near, like, excited from an emotional standpoint. And so 461 00:23:31,800 --> 00:23:35,800 Speaker 2: fifty one percent of voters answered nervous or anxious, which 462 00:23:35,800 --> 00:23:39,080 Speaker 2: are both negative emotions, and one of the most fascinating 463 00:23:39,119 --> 00:23:42,639 Speaker 2: things is that is really driven by a gender gap. 464 00:23:43,200 --> 00:23:46,359 Speaker 2: So women are much more likely to say nervous: thirty 465 00:23:46,400 --> 00:23:49,040 Speaker 2: six percent of women said nervous, only twenty nine percent 466 00:23:49,080 --> 00:23:52,119 Speaker 2: of males did. And on curiosity, thirty one percent of 467 00:23:52,119 --> 00:23:56,200 Speaker 2: males curious, twenty two percent of females curious. So there's 468 00:23:56,320 --> 00:23:59,879 Speaker 2: a real gender gap here.
And then taking that further 469 00:24:00,200 --> 00:24:03,240 Speaker 2: into what we were talking about earlier on the diploma divide, 470 00:24:03,320 --> 00:24:06,320 Speaker 2: there's also a really big diploma divide on this question. 471 00:24:06,400 --> 00:24:10,080 Speaker 2: If you're non college educated, then about 472 00:24:10,119 --> 00:24:12,880 Speaker 2: thirty eight percent said that they were nervous about this. 473 00:24:13,920 --> 00:24:17,359 Speaker 2: And then when you go look at college educated voters, 474 00:24:17,440 --> 00:24:20,360 Speaker 2: especially college educated males, thirty four percent said they were 475 00:24:20,400 --> 00:24:23,080 Speaker 2: curious about this. So I think if we were to 476 00:24:23,119 --> 00:24:26,520 Speaker 2: really go Freudian on this, this has to do with, 477 00:24:26,640 --> 00:24:29,400 Speaker 2: like, where do I fit in when AI comes in, 478 00:24:29,520 --> 00:24:32,560 Speaker 2: and college educated voters feel more prepared for that than 479 00:24:32,600 --> 00:24:36,240 Speaker 2: non college educated voters, which is fascinating because it's not 480 00:24:36,240 --> 00:24:39,399 Speaker 2: going to replace a plumber, right? 481 00:24:39,160 --> 00:24:42,480 Speaker 1: I always say, because it's going after white collar jobs, AI primarily, 482 00:24:43,000 --> 00:24:46,919 Speaker 1: and that's who would be affected. The other population that 483 00:24:47,000 --> 00:24:50,920 Speaker 1: was excited about this was parents. Why do you think 484 00:24:50,960 --> 00:24:51,359 Speaker 1: that is? 485 00:24:52,720 --> 00:24:55,280 Speaker 2: I think part of it is the demographics, and 486 00:24:55,320 --> 00:24:57,199 Speaker 2: that's one thing when you're reading a poll, you have 487 00:24:57,240 --> 00:25:00,360 Speaker 2: to ask yourself, is this causative or correlative?
And 488 00:25:01,000 --> 00:25:04,640 Speaker 2: I think that that is more correlative, that parents 489 00:25:04,840 --> 00:25:06,760 Speaker 2: just happen to be younger. So if you go look 490 00:25:06,800 --> 00:25:11,440 Speaker 2: at youth answering these questions, you're going to see 491 00:25:11,440 --> 00:25:14,080 Speaker 2: numbers about as high as you do in the parents, 492 00:25:14,119 --> 00:25:17,000 Speaker 2: because parents are younger than, you know, some seventy year 493 00:25:17,040 --> 00:25:18,040 Speaker 2: old answering the survey. 494 00:25:18,320 --> 00:25:19,600 Speaker 1: I thought maybe because they don't want to do the 495 00:25:19,880 --> 00:25:21,640 Speaker 1: homework with their kids anymore, and they were like, oh, 496 00:25:21,680 --> 00:25:24,040 Speaker 1: this is great. But I think it also goes back 497 00:25:24,040 --> 00:25:26,760 Speaker 1: to, like, what's your vision of AI. Is it Rosie 498 00:25:26,800 --> 00:25:28,520 Speaker 1: the Robot, who's going to make you dinner every night 499 00:25:28,560 --> 00:25:31,080 Speaker 1: and do the dishes? Or is it the Terminator? And 500 00:25:31,119 --> 00:25:34,639 Speaker 1: that's really, I think, where the mentality divide is. 501 00:25:34,640 --> 00:25:36,840 Speaker 1: I hope this question is asked more because it is 502 00:25:36,920 --> 00:25:41,240 Speaker 1: part of our future and we 503 00:25:41,280 --> 00:25:43,479 Speaker 1: don't talk about it enough. I think 504 00:25:43,520 --> 00:25:47,119 Speaker 1: people are talking about, you know, Jadbanski, that specioism, and 505 00:25:47,119 --> 00:25:48,880 Speaker 1: that we're going to be a leader in it, and kind 506 00:25:48,880 --> 00:25:50,280 Speaker 1: of, I guess, there's no way to avert it, but 507 00:25:50,800 --> 00:25:52,240 Speaker 1: we don't talk enough about what the 508 00:25:52,280 --> 00:25:55,240 Speaker 1: ramifications would be. Speaking of tech.
The other question I 509 00:25:55,280 --> 00:25:58,600 Speaker 1: asked was, because this is something I've also become very 510 00:25:58,680 --> 00:26:03,560 Speaker 1: passionate about in the last year or so, I asked, 511 00:26:04,400 --> 00:26:07,000 Speaker 1: do big tech and social media companies 512 00:26:07,000 --> 00:26:10,160 Speaker 1: need more regulation, because they have virtually none. Now people 513 00:26:10,200 --> 00:26:12,520 Speaker 1: don't sit there and say there's virtually no regulation over 514 00:26:12,560 --> 00:26:16,800 Speaker 1: big tech or over social media. Sixty nine percent 515 00:26:16,840 --> 00:26:19,040 Speaker 1: of women, sixty four percent of men, sixty one percent 516 00:26:19,040 --> 00:26:22,480 Speaker 1: of Republicans, sixty nine percent of independents, seventy one percent 517 00:26:22,520 --> 00:26:25,119 Speaker 1: of Democrats, sixty seven percent of white voters, sixty three 518 00:26:25,160 --> 00:26:27,520 Speaker 1: percent of Black voters, sixty eight percent of Hispanic voters, 519 00:26:27,880 --> 00:26:31,159 Speaker 1: sixty six percent of Hispanics, seventy four percent of Harris voters, 520 00:26:31,160 --> 00:26:33,639 Speaker 1: sixty percent of Trump voters, and sixty seven percent of 521 00:26:33,640 --> 00:26:38,760 Speaker 1: swing voters said yes. It is overwhelmingly on one side. 522 00:26:39,040 --> 00:26:42,640 Speaker 1: The one thing that was, like, glaringly obvious to me, though, 523 00:26:42,720 --> 00:26:46,720 Speaker 1: is that sixty percent for Trump voters versus seventy four 524 00:26:46,760 --> 00:26:50,680 Speaker 1: percent for Harris voters.
Is that the Elon Musk 525 00:26:50,720 --> 00:26:55,760 Speaker 1: effect? Because Republicans were gung ho on regulating big tech, 526 00:26:55,840 --> 00:26:59,879 Speaker 1: breaking up some tech companies, making Twitter a public platform in 527 00:27:00,680 --> 00:27:03,400 Speaker 1: twenty twenty. Like, that was not that long ago. 528 00:27:04,880 --> 00:27:07,119 Speaker 2: I think you've got to rewind the clock even further 529 00:27:07,200 --> 00:27:10,399 Speaker 2: than that. Like, imagine if we'd asked this question about 530 00:27:10,440 --> 00:27:14,960 Speaker 2: almost any emerging industry thirty years ago, and you would 531 00:27:14,960 --> 00:27:17,760 Speaker 2: have seen Democrats saying, regulate the heck out of it, 532 00:27:17,800 --> 00:27:20,520 Speaker 2: and Republicans saying regulation is bad, I don't care who 533 00:27:20,600 --> 00:27:23,760 Speaker 2: it's on. And so the way I read this question, 534 00:27:24,240 --> 00:27:26,359 Speaker 2: the results of this question when it came back, is 535 00:27:26,960 --> 00:27:31,920 Speaker 2: we now have a unique issue where Democrats have always 536 00:27:31,960 --> 00:27:35,119 Speaker 2: been for regulation, you know, if it walks or moves, regulate it, 537 00:27:35,680 --> 00:27:40,000 Speaker 2: and Republicans who are now former Democrats have taken some 538 00:27:40,040 --> 00:27:43,000 Speaker 2: of their ideology on certain things with them when they 539 00:27:43,080 --> 00:27:46,159 Speaker 2: left the Democratic Party. And I think regulation, and the 540 00:27:46,200 --> 00:27:49,800 Speaker 2: fact that big fill in the blank anything is negative, 541 00:27:50,520 --> 00:27:54,440 Speaker 2: is really a carryover from these former Democrats, now Republican 542 00:27:54,520 --> 00:27:58,600 Speaker 2: MAGA voters, on this issue.
And now big tech, and 543 00:27:58,640 --> 00:28:01,679 Speaker 2: really business in general, finds itself on an island because 544 00:28:01,800 --> 00:28:06,240 Speaker 2: Democrats still hate big everything, and Republicans, these newer Republicans, 545 00:28:06,280 --> 00:28:09,520 Speaker 2: have taken with them this, you know, if 546 00:28:09,560 --> 00:28:12,560 Speaker 2: it's big, it's probably bad mindset 547 00:28:12,600 --> 00:28:13,720 Speaker 2: from the Democratic Party. 548 00:28:14,640 --> 00:28:16,600 Speaker 1: I hear you, but I think that, you know, the 549 00:28:16,640 --> 00:28:21,320 Speaker 1: twenty sixteen election was so blamed on Russia buying Facebook ads, 550 00:28:21,760 --> 00:28:24,639 Speaker 1: and the twenty twenty election was so blamed on, 551 00:28:24,840 --> 00:28:29,520 Speaker 1: not Musk, but Zuck Bucks, and twenty twenty 552 00:28:29,560 --> 00:28:32,080 Speaker 1: four had to do with Elon Musk. Like, that's three 553 00:28:32,160 --> 00:28:37,359 Speaker 1: elections in a row where tech played some part in 554 00:28:37,440 --> 00:28:40,440 Speaker 1: the narrative of why one side lost over the other. 555 00:28:40,480 --> 00:28:44,680 Speaker 1: And interestingly, in your survey, college educated voters were more 556 00:28:44,760 --> 00:28:47,240 Speaker 1: likely to support regulations on big tech, which they likely 557 00:28:47,280 --> 00:28:49,720 Speaker 1: either have stock in or work for or could work for, 558 00:28:50,200 --> 00:28:53,520 Speaker 1: than non college educated voters, who were 559 00:28:53,640 --> 00:28:57,280 Speaker 1: more nervous about AI. There's a lot of inverse 560 00:28:57,360 --> 00:29:00,000 Speaker 1: opinions between the two. Do you see where I'm going with this? 561 00:29:00,040 --> 00:29:03,320 Speaker 2: Yes, yeah, I do.
You also got to look at, 562 00:29:03,480 --> 00:29:06,880 Speaker 2: you know, the highest number here is female college educated voters. 563 00:29:07,000 --> 00:29:10,960 Speaker 2: That's also the largest demographic that votes Democrat. So this 564 00:29:11,000 --> 00:29:16,320 Speaker 2: is another causation correlation conversation, where they are just much 565 00:29:16,360 --> 00:29:19,520 Speaker 2: more likely to be Democrats the higher educated they are 566 00:29:20,080 --> 00:29:22,680 Speaker 2: than the lower educated they are. 567 00:29:22,720 --> 00:29:22,960 Speaker 1: And so. 568 00:29:24,360 --> 00:29:26,280 Speaker 2: But I think if we were to ask a question 569 00:29:26,360 --> 00:29:28,280 Speaker 2: on the next survey and have a conversation of, do 570 00:29:28,360 --> 00:29:30,360 Speaker 2: you think big tech and social media is good or 571 00:29:30,400 --> 00:29:32,920 Speaker 2: bad for us, we'd probably see most people, even 572 00:29:32,960 --> 00:29:34,480 Speaker 2: though they use it all the time, they may be 573 00:29:34,640 --> 00:29:37,840 Speaker 2: watching us on a big tech platform right now, would 574 00:29:37,880 --> 00:29:40,440 Speaker 2: also agree that they don't think it's positive for them. 575 00:29:40,600 --> 00:29:43,200 Speaker 1: Oh, I know, one hundred percent, I completely agree with you. 576 00:29:43,240 --> 00:29:45,240 Speaker 1: That's the people who sit there and say McDonald's is terrible 577 00:29:45,240 --> 00:29:49,960 Speaker 1: as they're ordering a Big Mac. The last question I 578 00:29:50,040 --> 00:29:53,840 Speaker 1: was able to ask, 579 00:29:54,160 --> 00:29:55,680 Speaker 1: and it has a lot to do with tariffs, was, would you 580 00:29:55,760 --> 00:30:01,720 Speaker 1: support increased prices for more jobs domestically? And here were 581 00:30:01,760 --> 00:30:04,720 Speaker 1: the findings.
This was more split than the other answers: 582 00:30:04,960 --> 00:30:07,320 Speaker 1: forty seven percent said that they would support higher prices 583 00:30:07,320 --> 00:30:09,560 Speaker 1: for more domestic jobs, forty three percent said that they were 584 00:30:09,560 --> 00:30:13,480 Speaker 1: against it. And there was a huge gender gap. Fifty 585 00:30:13,480 --> 00:30:16,680 Speaker 1: eight percent of men, seventy three percent of Republicans, fifty 586 00:30:16,720 --> 00:30:19,000 Speaker 1: one percent of white voters, fifty percent of parents, and 587 00:30:19,080 --> 00:30:21,720 Speaker 1: forty percent of swing voters were for it, sorry, 588 00:30:21,760 --> 00:30:26,000 Speaker 1: for higher prices for more jobs. Only twenty 589 00:30:26,040 --> 00:30:28,800 Speaker 1: one percent of Kamala Harris supporters were for it. Only 590 00:30:29,000 --> 00:30:31,480 Speaker 1: thirty two percent of Black voters were for it, and 591 00:30:31,560 --> 00:30:34,000 Speaker 1: women were very against it. This has to be 592 00:30:34,000 --> 00:30:36,600 Speaker 1: a partisan thing related to Trump, right? Or is it 593 00:30:36,960 --> 00:30:38,080 Speaker 1: an ideology thing?
594 00:30:40,120 --> 00:30:42,360 Speaker 2: That was one where I asked myself that exact same 595 00:30:42,440 --> 00:30:45,320 Speaker 2: question before this interview, in reading the results, and I 596 00:30:45,400 --> 00:30:48,080 Speaker 2: could not come up with an answer for it, because 597 00:30:48,200 --> 00:30:52,120 Speaker 2: you would think that Republicans, who are saying jobs and 598 00:30:52,200 --> 00:30:55,320 Speaker 2: economy and inflation are their top issue, are going to 599 00:30:55,840 --> 00:30:58,160 Speaker 2: not answer another question saying, I'm willing to 600 00:30:58,320 --> 00:31:02,600 Speaker 2: pay inflated prices for some other benefit, that I will 601 00:31:02,640 --> 00:31:08,920 Speaker 2: trade off higher costs for something else. But these independents, man, 602 00:31:08,920 --> 00:31:11,440 Speaker 2: they split right down the middle. Forty five percent support this, 603 00:31:11,600 --> 00:31:14,720 Speaker 2: forty six percent oppose this. They just kind of prove 604 00:31:14,800 --> 00:31:17,440 Speaker 2: their name right there, that they're independent, because on the 605 00:31:17,440 --> 00:31:22,200 Speaker 2: partisan edges, you've got a complete inversion of support and oppose, 606 00:31:22,320 --> 00:31:25,000 Speaker 2: whether you're a Republican or a Democrat.
And I would 607 00:31:25,040 --> 00:31:27,680 Speaker 2: have thought, you know, every time we do a poll, 608 00:31:28,040 --> 00:31:30,400 Speaker 2: I like to read through the script and, in my mind, 609 00:31:31,040 --> 00:31:33,000 Speaker 2: guess what I think the responses are going to be, 610 00:31:33,160 --> 00:31:35,239 Speaker 2: and if they don't come back that way, that's the 611 00:31:35,240 --> 00:31:38,200 Speaker 2: first place I go dig into the information to figure out why, 612 00:31:38,800 --> 00:31:40,840 Speaker 2: and I still have not come up with an answer 613 00:31:40,880 --> 00:31:43,080 Speaker 2: as to why seventy three percent of Republicans say that 614 00:31:43,120 --> 00:31:45,720 Speaker 2: they're willing to pay higher prices for goods and services 615 00:31:45,720 --> 00:31:48,400 Speaker 2: to bring back jobs. But when you think about it 616 00:31:48,400 --> 00:31:51,520 Speaker 2: in the context of what is Donald Trump doing with 617 00:31:51,640 --> 00:31:55,280 Speaker 2: tariffs and how is he messaging tariffs, that aligns so 618 00:31:55,480 --> 00:32:00,000 Speaker 2: much with this survey response, where 619 00:32:00,120 --> 00:32:03,320 Speaker 2: Republicans are willing to pay higher prices if it 620 00:32:03,320 --> 00:32:05,760 Speaker 2: means more American jobs, and Democrats do not want to 621 00:32:05,760 --> 00:32:07,880 Speaker 2: pay higher prices if it means American jobs. I think 622 00:32:07,920 --> 00:32:10,880 Speaker 2: the big takeaway is Democrats hate American jobs. That's where I'm gonna... 623 00:32:10,760 --> 00:32:16,719 Speaker 1: That is a great way of pushing 624 00:32:16,720 --> 00:32:19,240 Speaker 1: it as a slogan. Uh, do you think that it's also 625 00:32:19,320 --> 00:32:23,120 Speaker 1: the sense of the country too, like, as far as 626 00:32:24,200 --> 00:32:27,560 Speaker 1: feelings towards the country go?
Because I read through your 627 00:32:27,600 --> 00:32:31,040 Speaker 1: other questions that you had that I didn't ask, 628 00:32:31,080 --> 00:32:32,880 Speaker 1: and I won't go into all those, but one that was 629 00:32:32,960 --> 00:32:35,880 Speaker 1: interesting was on the economy, where either a 630 00:32:35,960 --> 00:32:38,880 Speaker 1: majority or plurality of Democrats believed we were currently in 631 00:32:38,880 --> 00:32:41,800 Speaker 1: a recession, which, like, we're not. I mean, you could 632 00:32:41,800 --> 00:32:44,000 Speaker 1: sit there and say the market's down, the market's up, whatever, 633 00:32:44,160 --> 00:32:46,160 Speaker 1: or, like, you know, Trump's tariffs, but we're not 634 00:32:46,200 --> 00:32:50,880 Speaker 1: in a recession. So is this just what they believe 635 00:32:50,880 --> 00:32:52,960 Speaker 1: because they're hoping for it to come 636 00:32:52,960 --> 00:32:55,040 Speaker 1: true, or because they read a lot of narratives 637 00:32:55,200 --> 00:32:58,920 Speaker 1: on MSNBC and CNN? Do you 638 00:32:58,920 --> 00:33:01,080 Speaker 1: have any thoughts on that? 639 00:33:01,320 --> 00:33:04,719 Speaker 2: Well, magically, their response was that fifty six percent said it 640 00:33:04,880 --> 00:33:07,360 Speaker 2: happened within the last three months, which, hmm, I wonder 641 00:33:07,360 --> 00:33:09,000 Speaker 2: what happened in the last three months. Maybe we've got 642 00:33:09,040 --> 00:33:12,800 Speaker 2: a new president. Yeah. And so it is, it's a 643 00:33:13,040 --> 00:33:17,200 Speaker 2: gut response and reaction to Trump. It has nothing to 644 00:33:17,240 --> 00:33:19,680 Speaker 2: do with the fact that they actually believe we're in 645 00:33:19,680 --> 00:33:22,400 Speaker 2: a recession. They just don't like the fact that Donald 646 00:33:22,400 --> 00:33:23,560 Speaker 2: Trump's in charge.
647 00:33:24,280 --> 00:33:27,120 Speaker 1: What's the one thing before we wrap up, what's the 648 00:33:27,120 --> 00:33:31,040 Speaker 1: one thing about polling that Americans should know that they 649 00:33:31,200 --> 00:33:33,360 Speaker 1: don't What's the one thing they could sit there and say, 650 00:33:33,520 --> 00:33:36,480 Speaker 1: because I mean, I love crossed abs, but everyone says 651 00:33:36,480 --> 00:33:38,760 Speaker 1: don't read them, and I'm like, I'm able to parse 652 00:33:38,800 --> 00:33:42,000 Speaker 1: them a little better than the average person. I always 653 00:33:42,000 --> 00:33:44,160 Speaker 1: say there was a snapshot on time, they're not, you know, 654 00:33:44,200 --> 00:33:46,840 Speaker 1: the prediction of the entire future. But what's something that 655 00:33:46,880 --> 00:33:48,440 Speaker 1: people could take. We are in a nation that is 656 00:33:48,480 --> 00:33:51,320 Speaker 1: obsessive polling, despite we were sess at polling and then 657 00:33:51,320 --> 00:33:53,640 Speaker 1: we love dismantling polling when it doesn't say what we like. 658 00:33:54,000 --> 00:33:56,960 Speaker 1: What is what is people should know about the polling 659 00:33:57,000 --> 00:33:57,920 Speaker 1: industry and polls in. 660 00:33:57,880 --> 00:34:01,960 Speaker 2: General, Well, what's the saying lies, damn lies and statistics? 661 00:34:02,480 --> 00:34:05,160 Speaker 2: And I think that's where most people end up landing 662 00:34:05,200 --> 00:34:10,399 Speaker 2: on polling a poll some of them are just flat 663 00:34:10,400 --> 00:34:15,719 Speaker 2: out wrong. But what I would say is, yeah, become. 664 00:34:15,480 --> 00:34:18,960 Speaker 1: As said that senior citizen women were voting seventy percent 665 00:34:19,000 --> 00:34:21,799 Speaker 1: for Kamala Harris White evangelical senior system women. I was like, 666 00:34:22,320 --> 00:34:25,120 Speaker 1: this poll is not correct. I'm sorry, go. 
667 00:34:25,080 --> 00:34:27,680 Speaker 2: ahead. Yeah, no, no, you're spot on. And so I 668 00:34:27,719 --> 00:34:30,200 Speaker 2: think it's, one, become a better consumer of polling. So 669 00:34:30,320 --> 00:34:32,239 Speaker 2: if you're going to pay attention to polls, learn how 670 00:34:32,280 --> 00:34:35,400 Speaker 2: to read them, because the top line is a very 671 00:34:35,440 --> 00:34:37,960 Speaker 2: small portion of the story. The thing that I always 672 00:34:38,000 --> 00:34:41,080 Speaker 2: recommend to folks is start at the bottom of the 673 00:34:41,120 --> 00:34:42,759 Speaker 2: poll and work your way up. And what I mean 674 00:34:42,800 --> 00:34:45,520 Speaker 2: by that is, look at the demographics. If you look 675 00:34:45,520 --> 00:34:48,120 Speaker 2: at the demographics and they don't pass the smell test, 676 00:34:48,239 --> 00:34:50,400 Speaker 2: don't even look at the rest of the poll. And 677 00:34:50,440 --> 00:34:53,920 Speaker 2: so if you start thinking about polling in terms 678 00:34:54,000 --> 00:34:56,360 Speaker 2: of, is this a good poll to start with, I 679 00:34:56,360 --> 00:35:00,520 Speaker 2: think you'll understand polling better. The second is, pay attention 680 00:35:00,680 --> 00:35:04,439 Speaker 2: to trends over time, not just single data points, because even 681 00:35:04,480 --> 00:35:07,319 Speaker 2: the best pollsters... there's a reason that in polling you'll 682 00:35:07,360 --> 00:35:09,760 Speaker 2: see a margin of error, which means whatever number 683 00:35:09,760 --> 00:35:12,520 Speaker 2: you're looking at, it can be that many points higher or lower. 684 00:35:12,880 --> 00:35:15,040 Speaker 2: And that's another piece of reading polls. People say, well, 685 00:35:15,080 --> 00:35:17,200 Speaker 2: the polls got it wrong. They never get it wrong 686 00:35:17,239 --> 00:35:18,719 Speaker 2: if you look at it within the margin of error.
687 00:35:18,760 --> 00:35:20,560 Speaker 2: So if the poll says Donald Trump's going to get 688 00:35:20,600 --> 00:35:23,359 Speaker 2: forty five percent, and it's a three percent margin of error, 689 00:35:23,400 --> 00:35:25,719 Speaker 2: if he gets forty eight percent, that poll actually 690 00:35:25,920 --> 00:35:29,920 Speaker 2: was statistically correct. The second is that there's always a 691 00:35:30,000 --> 00:35:34,360 Speaker 2: ninety five percent confidence interval on public opinion polling, and 692 00:35:34,400 --> 00:35:36,920 Speaker 2: that means that if you ran that poll one hundred times, 693 00:35:37,000 --> 00:35:38,640 Speaker 2: ninety five percent of the time you're going to get 694 00:35:38,640 --> 00:35:41,200 Speaker 2: the same results within the margin of error. It also 695 00:35:41,280 --> 00:35:43,439 Speaker 2: means that five percent of the time you are going 696 00:35:43,480 --> 00:35:46,359 Speaker 2: to get numbers outside of the margin of error, and 697 00:35:46,440 --> 00:35:48,799 Speaker 2: even the best pollster cannot one hundred percent of the 698 00:35:48,840 --> 00:35:52,400 Speaker 2: time replicate everything. You know, we see this sometimes in 699 00:35:52,440 --> 00:35:56,240 Speaker 2: our polling, where we'll collect a flawless sample that looks exactly 700 00:35:56,280 --> 00:35:59,560 Speaker 2: like the demographics and population that we're intending to survey, 701 00:36:00,080 --> 00:36:01,440 Speaker 2: and then we look at a number and we're like, 702 00:36:01,480 --> 00:36:04,040 Speaker 2: that just doesn't seem right, and we cannot explain it 703 00:36:04,120 --> 00:36:08,600 Speaker 2: statistically or scientifically. That is likely a five percent 704 00:36:09,480 --> 00:36:12,200 Speaker 2: instance right there, right? The other ninety five percent of 705 00:36:12,200 --> 00:36:14,960 Speaker 2: the time it's going to be right.
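The margin-of-error arithmetic described above, where forty five percent with a three-point margin covers anything up to roughly forty eight, can be sketched in a few lines. This is a minimal illustration assuming a simple random sample; the function name and the 1,000-respondent sample size are ours for illustration, not figures from the poll being discussed. The 1.96 z-score corresponds to the 95 percent confidence level mentioned.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of the confidence interval for a proportion p
    estimated from a simple random sample of size n.
    z=1.96 corresponds to the 95% confidence level."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 respondents showing a candidate at 45%:
moe = margin_of_error(0.45, 1000)
print(f"margin of error: +/-{moe * 100:.1f} points")  # roughly +/-3.1 points

# So the 95% interval runs from about 41.9% to 48.1%,
# which is why a 48% result is still "within the margin."
low, high = 0.45 - moe, 0.45 + moe
print(f"95% interval: {low * 100:.1f}% to {high * 100:.1f}%")
```

Note that the margin shrinks only with the square root of the sample size, which is one reason a single larger poll is rarely as informative as the trend across several polls.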
And so, you know, 706 00:36:15,040 --> 00:36:17,600 Speaker 2: don't look at all of polling as this horrible 707 00:36:17,640 --> 00:36:19,600 Speaker 2: industry that gets stuff wrong all the time. And then 708 00:36:19,600 --> 00:36:22,680 Speaker 2: the last thing I would recommend is, if you get polled, 709 00:36:22,760 --> 00:36:25,000 Speaker 2: take the freaking poll. Like, that is how you help 710 00:36:25,080 --> 00:36:25,719 Speaker 2: polling get better. 711 00:36:26,000 --> 00:36:28,920 Speaker 1: People say this to me all the time, like, I got a 712 00:36:28,920 --> 00:36:30,239 Speaker 1: call for a poll, I hung up the phone. I 713 00:36:30,239 --> 00:36:33,160 Speaker 1: go, why? I 714 00:36:33,160 --> 00:36:34,680 Speaker 1: have a friend who does the same thing. I'm like, 715 00:36:34,960 --> 00:36:37,120 Speaker 1: what are you doing? I would have run 716 00:36:37,160 --> 00:36:40,680 Speaker 1: over someone to answer a poll, but I'm built differently. 717 00:36:40,719 --> 00:36:42,520 Speaker 1: And that's one hundred percent true with the numbers. You 718 00:36:42,560 --> 00:36:44,239 Speaker 1: know, when the polls were coming out in 719 00:36:44,280 --> 00:36:47,320 Speaker 1: the twenty twenty four election, there were like six 720 00:36:47,480 --> 00:36:50,240 Speaker 1: public polls that had Trump at like nineteen, twenty percent 721 00:36:50,280 --> 00:36:53,000 Speaker 1: of the black vote.
The first like three times, I was like, ah, whatever, 722 00:36:53,080 --> 00:36:55,160 Speaker 1: this is kind of off. And then like the fourth time, 723 00:36:55,160 --> 00:36:56,680 Speaker 1: I was like, maybe this is right. And then by the 724 00:36:56,719 --> 00:36:58,719 Speaker 1: sixth, I'm like, okay, this is probably right. And when 725 00:36:58,719 --> 00:37:00,399 Speaker 1: there was, I think it was, the ABC Ipsos 726 00:37:00,480 --> 00:37:03,279 Speaker 1: poll, which was black voters only, and they were 727 00:37:03,280 --> 00:37:06,000 Speaker 1: coming out with the same numbers, I'm like, this is actually 728 00:37:06,080 --> 00:37:09,719 Speaker 1: probably real. Like, at this point it's probably real when 729 00:37:09,760 --> 00:37:12,480 Speaker 1: it's happened six times across different firms and they're all 730 00:37:12,480 --> 00:37:15,520 Speaker 1: saying the same thing. And there are institutions that are 731 00:37:15,560 --> 00:37:17,879 Speaker 1: solely looking at, you know, this one or that one. 732 00:37:17,880 --> 00:37:20,680 Speaker 1: But anyway, where can people learn more about Signal's polls? 733 00:37:20,920 --> 00:37:22,640 Speaker 1: If they want to contact you or hire you, 734 00:37:22,760 --> 00:37:24,120 Speaker 1: where should they reach out? 735 00:37:25,000 --> 00:37:28,160 Speaker 2: So our website is cygn dot al, so it's our 736 00:37:28,160 --> 00:37:31,160 Speaker 2: company name, which is spelled differently than most people spell 737 00:37:31,200 --> 00:37:35,080 Speaker 2: the word signal, and then also on social at cygn al. 738 00:37:35,719 --> 00:37:37,480 Speaker 1: Brent, thank you for being on this podcast. I really 739 00:37:37,480 --> 00:37:38,760 Speaker 1: appreciate it. It was a great conversation. 740 00:37:39,080 --> 00:37:40,200 Speaker 2: Well, thanks for the opportunity, Ryan. 741 00:37:40,280 --> 00:37:43,280 Speaker 1: You're listening to It's a Numbers Game with Ryan Grodowsky.
742 00:37:43,400 --> 00:37:49,080 Speaker 1: We'll be right back. Our question this week is from the 743 00:37:49,160 --> 00:37:51,680 Speaker 1: Ask Me Anything segment of the podcast, which, by the way, 744 00:37:51,719 --> 00:37:55,720 Speaker 1: please join. Ask me anything about politics or social media 745 00:37:55,800 --> 00:37:59,080 Speaker 1: or culture or the economy. If I don't know the answer, 746 00:37:59,080 --> 00:38:00,719 Speaker 1: I will look up the answer and I'll give you 747 00:38:00,760 --> 00:38:05,520 Speaker 1: the best analysis possible. Email Ryan at Numbers Game podcast 748 00:38:05,600 --> 00:38:09,680 Speaker 1: dot com. That's Ryan at Numbers, plural, Numbers Game 749 00:38:09,760 --> 00:38:13,239 Speaker 1: podcast dot com. This question was actually texted to me repeatedly 750 00:38:13,400 --> 00:38:15,640 Speaker 1: by my aunt Carol, so I have to answer it, 751 00:38:15,680 --> 00:38:17,480 Speaker 1: because I think she's going to blow a gasket if 752 00:38:17,520 --> 00:38:20,160 Speaker 1: I don't. She's listening right now, so thank you, Carol. 753 00:38:20,800 --> 00:38:24,160 Speaker 1: Her question was, and she lives 754 00:38:24,200 --> 00:38:27,600 Speaker 1: in New Jersey, why have property taxes increased to fund 755 00:38:27,600 --> 00:38:32,160 Speaker 1: our schools while test scores have declined?
That is a 756 00:38:33,239 --> 00:38:35,919 Speaker 1: great question. Since getting involved in the seventeen seventy six Project Pack 757 00:38:35,960 --> 00:38:38,760 Speaker 1: and then the Foundation, and dealing so much with education, 758 00:38:39,080 --> 00:38:42,120 Speaker 1: it's hard to even put into words how frustrated 759 00:38:42,160 --> 00:38:46,080 Speaker 1: people feel, especially people who maybe have adult children 760 00:38:46,320 --> 00:38:49,680 Speaker 1: or are younger adults themselves, who remember education being much 761 00:38:49,719 --> 00:38:53,760 Speaker 1: better twenty years ago or fifteen years ago. And there's 762 00:38:53,800 --> 00:38:57,080 Speaker 1: not a simple answer for it. There's not a simple 763 00:38:57,120 --> 00:39:00,000 Speaker 1: answer why test scores have declined. Part of the reason 764 00:39:00,440 --> 00:39:03,399 Speaker 1: is that the bulk of money going to schools is 765 00:39:03,440 --> 00:39:05,800 Speaker 1: not going to things that make people smarter. There's a 766 00:39:05,840 --> 00:39:09,919 Speaker 1: lot of money spent on administrators and vice principals and on outside 767 00:39:10,520 --> 00:39:13,840 Speaker 1: tech companies being involved, you know, with the idea 768 00:39:13,880 --> 00:39:18,480 Speaker 1: of it improving things. And the teachers' unions, which are the largest 769 00:39:18,560 --> 00:39:21,040 Speaker 1: organizations having anything to do with schools, do not advocate 770 00:39:21,080 --> 00:39:24,200 Speaker 1: on behalf of children. They advocate on behalf of adults, the 771 00:39:24,239 --> 00:39:27,160 Speaker 1: people who work in the buildings. And a lot of 772 00:39:27,280 --> 00:39:30,880 Speaker 1: administrators have used schools as 773 00:39:31,200 --> 00:39:34,240 Speaker 1: guinea pigs.
When I started the seventeen 774 00:39:34,239 --> 00:39:36,319 Speaker 1: seventy six Project Pack, there were a lot of people saying, 775 00:39:36,320 --> 00:39:40,480 Speaker 1: why are you politicizing education? I, to quote Billy Joel, 776 00:39:40,480 --> 00:39:44,440 Speaker 1: didn't start the fire. It has been politicized for decades, 777 00:39:44,520 --> 00:39:46,719 Speaker 1: going back to when George W. Bush was trying to 778 00:39:46,719 --> 00:39:49,960 Speaker 1: put phonics in classrooms and liberals opposed that, to the Pledge 779 00:39:49,960 --> 00:39:52,920 Speaker 1: of Allegiance, to school prayer, I mean everything. There's a 780 00:39:52,960 --> 00:39:54,400 Speaker 1: lot of politics in it, and there's a lot of 781 00:39:54,480 --> 00:39:57,280 Speaker 1: nonsense in it, and I think that's hurt kids too. 782 00:39:57,680 --> 00:39:59,759 Speaker 1: But I think that the funding, how all this new 783 00:39:59,800 --> 00:40:02,120 Speaker 1: money has gone into schools, has not always been positive. 784 00:40:02,160 --> 00:40:03,480 Speaker 1: I don't think it's always been for the benefit 785 00:40:03,520 --> 00:40:05,839 Speaker 1: of the kids. I think that the main drivers are 786 00:40:06,040 --> 00:40:08,640 Speaker 1: lack of parental involvement, especially as people have to work 787 00:40:08,680 --> 00:40:11,359 Speaker 1: more just to survive. I get that. But I think 788 00:40:11,360 --> 00:40:13,040 Speaker 1: that also a big part of it is the way 789 00:40:13,080 --> 00:40:17,920 Speaker 1: that schools are structured for their funding. If 790 00:40:17,960 --> 00:40:19,960 Speaker 1: a school said, okay, we're not 791 00:40:20,280 --> 00:40:23,960 Speaker 1: going to pass kids forward if they're failing, 792 00:40:23,960 --> 00:40:25,640 Speaker 1: we're going to put in a hard stop, 793 00:40:26,000 --> 00:40:29,120 Speaker 1: and a school in Chicago or a school in Baltimore 794 00:40:29,600 --> 00:40:32,759 Speaker 1: failed ninety nine percent of all students from one year 795 00:40:32,800 --> 00:40:35,960 Speaker 1: to the next, that principal would probably be fired. That 796 00:40:36,040 --> 00:40:38,920 Speaker 1: school would probably be shut down. The incentive is to 797 00:40:38,960 --> 00:40:42,400 Speaker 1: look at graduation rates or college admissions, which is for 798 00:40:42,520 --> 00:40:46,000 Speaker 1: some places not that hard to do, not to look 799 00:40:46,120 --> 00:40:49,200 Speaker 1: at, hey, can they read? Can they do math? Can 800 00:40:49,200 --> 00:40:51,120 Speaker 1: they do science? If we got to a place in 801 00:40:51,160 --> 00:40:53,160 Speaker 1: our country where ninety percent of the country could do 802 00:40:53,200 --> 00:40:55,160 Speaker 1: eighth grade math, reading, and science, we would be 803 00:40:55,200 --> 00:40:57,520 Speaker 1: the smartest nation in the world. We're already one of the 804 00:40:57,560 --> 00:40:59,000 Speaker 1: smartest nations in the whole world, but we would be 805 00:40:59,440 --> 00:41:01,640 Speaker 1: arms and legs above everybody else in a way that 806 00:41:01,680 --> 00:41:04,719 Speaker 1: we're not now. So I think that that is a big, 807 00:41:04,880 --> 00:41:07,040 Speaker 1: big part of it. The incentive is to pass 808 00:41:07,120 --> 00:41:09,520 Speaker 1: kids forward even if they're failing, because they don't want 809 00:41:09,520 --> 00:41:12,440 Speaker 1: to show what failing grades actually look like. That's a 810 00:41:12,480 --> 00:41:14,960 Speaker 1: big part of it. And then I think the DEI structure.
811 00:41:15,040 --> 00:41:20,160 Speaker 1: I think the influence of unions and the influence 812 00:41:20,320 --> 00:41:24,879 Speaker 1: of politics is the way it's been going from the left, 813 00:41:24,880 --> 00:41:26,879 Speaker 1: because from the right it's always been school choice, 814 00:41:26,880 --> 00:41:29,000 Speaker 1: school choice, school choice. Maybe I'll do an episode on school 815 00:41:29,080 --> 00:41:31,759 Speaker 1: choice soon. And it's not been about the eighty five 816 00:41:31,800 --> 00:41:33,640 Speaker 1: percent of kids who go to public school. I think 817 00:41:33,719 --> 00:41:35,960 Speaker 1: that's really the main 818 00:41:36,440 --> 00:41:38,640 Speaker 1: part of it. But passing failing kids forward is a huge, 819 00:41:39,400 --> 00:41:42,239 Speaker 1: huge part of the problem, and that has a lot to do with 820 00:41:42,239 --> 00:41:46,440 Speaker 1: the financial structure of schools. Anyway, thank you 821 00:41:46,520 --> 00:41:48,800 Speaker 1: so, so much. I appreciate you listening to the episode. 822 00:41:49,000 --> 00:41:51,400 Speaker 1: Education is not about money, it's about how it's applied. 823 00:41:52,000 --> 00:41:53,840 Speaker 1: So that's what you should learn. That's what we 824 00:41:53,880 --> 00:41:55,920 Speaker 1: should take away from all the stats we've seen in 825 00:41:55,920 --> 00:41:58,759 Speaker 1: the last few years. Anyway, please like and subscribe on 826 00:41:58,800 --> 00:42:01,719 Speaker 1: the iHeartRadio app or wherever you get your podcasts. I 827 00:42:01,760 --> 00:42:03,479 Speaker 1: will see you guys later this week.