Speaker 1: Welcome back to A Numbers Game with Ryan Girdusky. Thank you guys for being here. We have an amazing show for you guys today, a big guest coming up, and a lot of data to break down, so let's really get into it. On Monday's episode, I talked a lot about affordability. It's the buzzword right now, especially since Mamdani won the New York City mayoral election. I received a lot of feedback about young people, especially how they feel about the economy, the worries about their ability to live and buy a home, and the worry that AI is going to wreak havoc on their opportunities going forward. And I have data on it, and I think it's a really good thing to home in on and think about right now. And I want to use a pop culture reference to get at a real misunderstanding of how young people are doing financially in this moment.

Let's go all the way back to January two thousand five. Right, the world was shaken, and it wasn't about the upcoming midterms or about the Iraq War. No, it was an announcement that came out that America's sweethearts, Brad Pitt and Jennifer Aniston, were breaking up because he had cheated on her with Angelina Jolie. Then up into the American zeitgeist entered what became known afterwards as Sad Jennifer. Right, it has been two decades since Jennifer Aniston broke up with her then husband Brad Pitt, but the idea that she's sad lingers in the air to this day. It follows her: Jennifer Aniston is sad because her marriage didn't work and she didn't get to have children. Whether it's true or not true, I don't know Jennifer Aniston, but it doesn't make a difference, because the narrative prevails. Jennifer Aniston is sad and she'll never really find happiness.

That same prevailing narrative was created just a few years later, around the global financial crisis, about millennials: the poor millennial. Like Sad Jennifer, poor millennial. The words just kind of always fit together. Millennials entered the workforce at a really bad time; there's kind of no way to shake it. They'll never be able to get ahead, that's what the pervasive idea was after it happened. That they're always going to spend their money on, like, avocado toast, that they're just going to live in debt and in their mom's basement, and it's just gonna linger with them forever. That prevailing narrative somehow carried over into Generation Z, the generation afterwards, who were just children when the global financial crisis happened, who didn't have to look for a job when unemployment was heading into the double digits. I don't know how they got lumped into it, but they've been lumped into it.

And while I think it's true that millennials like myself definitely got a rough hand when they were joining the economy and becoming adults, millennial men actually earn the most money of any generation, adjusted for inflation, at this point in their lives, in their late thirties. Millennial women lag behind men, but they still make more money on average than most other generations' women. Right, not Gen X, they're the one exception, but Baby Boomer women for sure, though there were fewer of them working; I'll put that caveat in there. But anyways, besides just that point that millennial men make all this money, there have been massive rebounds in general in wealth creation since the COVID pandemic. According to the Federal Reserve, Generation Z is set to outpace millennials and actually make more money than they did. This is even true when you compare across different economic distributions, right? So poor millennials make more money at this point in life than poor baby boomers did. Wealthy millennials make more money than wealthy baby boomers did at this point in their life, on average. Now, I'm not trying to paint an incredibly rosy picture. I'm not saying we haven't had high inflation. We definitely have. We've had some rough years, right. But things aren't as bad for millennials and Gen Z as some people in the media suggest. We're not in the same situation that young people were in, let's say, one hundred years ago, growing up during the Great Depression.

The one key area of wealth generation that hasn't kept up for millennials and Gen Z, though, is housing. Data from the Federal Reserve Bank of Saint Louis shows the national median sales price of a house in the US reached four hundred and ten thousand dollars in Q2 of twenty twenty-five. Additionally, Zillow, the website I'm drawn to at four o'clock in the morning searching for houses that I can't afford, that website put out that home values have increased by forty-five percent since twenty twenty, which is obviously unsustainable for people trying to buy their first house. Zoning laws are often blamed for this, right? You can't build in a number of places like San Francisco, which is one hundred percent true. But a big driver is also population growth from mass immigration. We are increasing the population at an unnaturally fast rate. If you just had births versus deaths, our population would be increasing, but bringing in more than one point one million legal immigrants per year is making the situation much harder. It's not just a supply issue when it comes to housing, it's also a demand issue. The demand grows as the population grows, and it grows primarily through immigration, not to mention corporations and foreigners buying up housing.

So young people, long and short of it, have made big advances in the last twenty years. And when I say young people, I mean people under forty or around forty. For millennials: we are middle-aged, guys. Sorry to break it to you; it's hard for me to deal with sometimes too. I've actually been saying I'm in middle age for a while now, so that way, when I'm real thick in the middle-agedness, it won't affect me, because I've been saying it since I was something like thirty-three.
Anyway, we have made big gains as a generation since the Great Recession, which is good. There are growing concerns, though, that those gains are going to slide because of AI. We're gonna be in an era soon where AI, you know, may take millions or tens of millions of jobs and drive unemployment much higher. Big corporations are investing heavily into AI. Research shows that spending on AI and generative AI initiatives is probably the one thing keeping the US out of a recession. These companies spent three hundred and seventy-five billion dollars on AI research in twenty twenty-five, and it's expected to hit half a trillion dollars in twenty twenty-six.

Here's how Dan Ives, managing director at Wedbush Securities, put it, quote: "It's the start of trillions being spent in this build-out for a fourth industrial revolution. Big tech right now is doing the equivalent of building up Vegas in the nineteen fifties when there was just sand, or Dubai thirty years ago. That's what's happening with the AI infrastructure build-out. From chips to the data centers to the grid, you're really building out the future economy for consumers and enterprises," he added.

Well, that's great for GDP overall, right? But is it great for workers? Is it great for employees? Is it great for the person trying to make a life for themselves and either build, you know, just enough capital to buy a house and a car and sustain a family, or, you know, for people trying to build generational wealth, something that a lot of Americans aspire to? Proponents of AI say that whenever new technology comes around, some old jobs are wiped out, but there are new opportunities that we have yet to foresee, things we can't possibly imagine.

A study from Stanford University researchers back in August concluded there's been a substantial decline in employment for early-career workers in occupations most exposed to AI, such as software development and customer service, but overall job growth has continued. Entry-level employment has declined in applications of AI that automate work, but those effects are kind of muted in their augmented form: when people learn how to work with AI, the job losses aren't as severe as for people who are not trained in AI at all. Studies from MIT and the RAND Corporation found that most of the investments in AI that companies have made have shown very little to no returns, and that companies are slow to integrate AI into their businesses. So maybe the sky-is-falling fears aren't that well founded, because corporations are being very slow to implement this and it hasn't worked out; ninety-five percent of this investment hasn't worked out. And there is the fear of an AI economic bubble. Will that set things back a few more years?

While AI is slower than some people had expected, it's clearly going in one direction, though, right? It's like the internet. The internet had a bubble, but the internet continued, and AI is going to affect the way we work and our economy. But it's also going to affect our politics, which is where my brain always goes. What happens when there's permanent unemployment of, let's say, ten to twenty percent of the workforce? What happens when millions of people, especially young people, cannot find work and thus can't afford to have a life? They can't afford a family, they can't afford a house, they can't afford a car, they can't afford that avocado toast that millennials have been eating for thirty years, according to some baby boomers. Their politics will shift to become more progressive or more radical in some kind of direction, right?

That's not to say that money can't be created because of this AI revolution. It just may be too concentrated among the very wealthy, and those left behind who can't get jobs will be angered by it. The Trump administration is really taking an accelerationist view on AI: right, go all in, we have to beat China in this race against China. And a lot of voters, including in private polling that I've seen, are asking, really, why does it matter? I don't understand. There's more fear about AI than I think there is optimism, whether that is well founded or unfounded, good or bad or whatnot. I'm just telling you the data as it is. And I think that progressives are starting to look at the temperature in the room and say, we need to have an answer to that call, to the AI revolution, to what's happening to jobs, and I think that they're really already testing the waters on it.

With me on today's episode is Democratic Congressman Ro Khanna, and he has a progressive message on AI. This conversation, I hope, will bring insight into how Democrats, especially left-wing Democrats, progressive Democrats, will approach this policy going forward, what it could mean for our politics in general, and who is really answering people's anxiety. That's coming up next.

Congressman Ro Khanna has represented California's seventeenth district since twenty seventeen. It's known as the Trillion Dollar District, as it's home to so many tech companies. Congressman, thank you for being here.

Speaker 3: Well, thank you, I appreciate it.

Speaker 1: Well, first, I want to ask you about how you think AI is currently affecting the economy, especially when it comes to wages and job growth, because there's a big difference between GDP as a whole and jobs and wages specifically.

Speaker 3: Well, I think it's had a bifurcated effect. On the one hand, you've had obviously a data center build-out, a huge capex expenditure.
That's created some jobs for electricians, for unions, for people who are building these centers, and there's also been huge wealth generation for those who have invested in AI, who are developing AI, and who are in the capital class. On the other hand, it is already starting to have an automation impact, making it harder for entry-level jobs, particularly for kids graduating from school, making it harder for people who are being displaced, and it is creating anxiety for many people about whether they'll have good-paying jobs in the future.

Speaker 1: Yeah, you've tweeted something called a New Deal for AI, and what you said was this: you call for an AI jobs New Deal, a future workforce administration, hires for young graduates and displaced workers in care jobs, government services, and national security, laws to protect drivers from automation, revoking immunity for algorithmic content, and emphasizing humanities in schools. Can you go into a little detail about what that actually means and how it would be executed?

Speaker 3: Sure. If you're a college graduate and you're not getting a job, then the federal government would say, okay, we're either going to help create an apprenticeship and subsidize it at a private company, or we're going to help you in your local community do some care job, whether it's healthcare, childcare, eldercare, or, in your local community, work on some project for that community, whether it's an infrastructure project or whether it's a project of delivering services. Or you could come into the federal government to help make government more efficient, either in cybersecurity or digital services, or on some national security projects. And this way you're going to get experience, and then you can go back out into the private sector, because the private sector may not be hiring for those entry-level jobs.

It can also be for people right out of high school and trade schools, though I think the trades ironically will be less automated and there may be less of a need for it there. The unemployment rate is actually lower for them initially than for those out of college. And I also would say that it should include people who are being displaced by automation. And the people who should be paying for that are the companies that are automating; they should have some kind of a tax on that so that they can help pay for this workforce administration, in many ways. We should do this in a way that gets it right, as opposed to globalization, where we basically allowed wealth to accumulate and hollowed out communities and deindustrialized and promised training programs that never came.

Speaker 1: So what does revoking algorithmic immunity mean?

Speaker 3: So right now, if you are Facebook or X and you amplify things that are libelous or slander or provoke violence, you don't have any liability under Section 230. You should have liability. What would that mean? It would mean that these companies may move more to a subscription model than an algorithmic model, more to presenting content chronologically, and I just think it would be less of an outrage-driven democracy.

Speaker 1: Now, do you have any estimate? It sounds very promising, but do you know what an effort like that would cost, by any chance?

Speaker 3: It's hard to say, because we don't know what the displacement effect of AI will be. So it would certainly be in the billions of dollars a year, but it depends on how much the displacement is. And it needs to be coupled, in my view, with a comprehensive economic development strategy, what I call a Marshall Plan for America: build a thousand new trade schools, have a National Industrial Development Bank that's actually funding new factories, or providing financing for a factory not to close if there was a market failure in the lack of financing, making sure that we are investing in care jobs in communities, making sure that we're investing in AI tech institutes so people can actually develop AI literacy for these new jobs. So I think what I'm calling for with the future workforce administration is sort of defensive: if you're not finding a job or if you're unemployed, what do you do? But we also need an offensive strategy, which is, how do we have an economic development strategy for job creation across this country so that we mitigate the displacement of people?

Speaker 1: Over your career, you've been a very vocal supporter of people's personal rights and personal protections. You've been a civil libertarian in many facets, especially when the War on Terror was really ramping up. Are you afraid of what's going on with people's civil liberties and AI, about people's data being tracked and sold?

Speaker 3: I am, I'm very concerned about it. The first rule should be you own your data, and we should have a data dividend in this country. It wouldn't be life-changing, but you would probably get about five hundred bucks a year just for the exploitation or the use of your data by these companies. And five hundred bucks is a fair amount for folks. I mean, we're paying grocery bills; I mean, I'll take it. So there should be a data dividend. There also should be a right that you have to your own data, so that you don't have a federal government building a social media profile on you. Can you imagine how scary that is? There shouldn't be companies that are able to feed you information by building profiles on you. So I think we need, and I called for this in twenty seventeen, an Internet Bill of Rights, and we still haven't had it passed. You know, one of the problems is, just bluntly, we've had three eighty-year-old presidents in a row, and we need leadership in this country that understands technology and understands how to respond to that technology.

Speaker 1: So whenever you have any criticisms of tech, one of the bigger things that people say is, we need to win this race against China, we need to beat China, right? Any criticism of our sector means you can't have any restrictions whatsoever. What do you make of this race against China when it comes to AI? And would restrictions actually help the Chinese Communist Party, help their ability to beat us in AI, whatever that really means?

Speaker 3: Right. Well, sometimes the people who make this argument have not even been to China. I just was in China and met with Premier Li Qiang, met with the Foreign Minister Wang Yi, with the mayors of Beijing and Shanghai, and had extensive conversations. A lot of people saying this don't know that there's twenty percent youth unemployment in China. Twenty percent, because they have overcapacity in their factories and they have undercapacity in software development. They don't know about the lack of economic diversification in China. They don't know how much local governments are suffering from debt in China. The concern on the AI race is, we should have talent here. I'm concerned that we're banning foreign students from coming here and pursuing their education. But we can develop as long as we are open to smart immigration policy, like we were when we allowed Einstein to come, and so many of the German scientists to come, and Russian scientists to come. If we continue to do that, I am fully confident that our model is better able to lead, and we can do that while having sensible regulations so that you aren't compromising people's jobs, creating massive inequality, or violating people's civil liberties.

Speaker 1: Well, you mentioned immigration. You've tweeted about corporate layoffs when it comes to AI, and that's a big part of your New Deal for AI, I think, that corporations have laid off tens of thousands of American workers. Does it give you any pause that they've also brought in H-1B workers at the same time they're laying off Americans?

Speaker 3: Yes, I think the H-1B program should be reformed. I've said this since the day I got into Congress. It's been abused by a lot of companies that are paying under-market wages for jobs that aren't really high skilled, and there's no doubt that it needs to be reformed. Now, I don't agree that just putting on a hundred-thousand-dollar fee is the reform. What I would say is, you do need people who have real skills. The best thing, by the way, is to give them a green card; don't have them in under-wage positions like H-1B, where they're locked into an employer. But make sure that you're getting market wages and make sure they're really high-skilled jobs, not just jobs to undercut American workers. But if the question is, has the H-1B visa program led to American layoffs, you'd have to be in a cocoon to say anything but yes, it has.

Speaker 1: One last question on AI and how it's affecting things. I think you mentioned Section 230. There haven't been any real big reforms. There have been a few on child protections, but not massive reforms around the internet since Section 230 was created, when the internet was in its infancy. There are a lot of lawsuits going on around OpenAI and ChatGPT about how these programs have promoted, you know, people ending their own lives, and basically said go do it when people have been very depressed. Could companies be culpable for their programs being anti-human? Do you know what I'm saying?

Speaker 3: Well, I think they could be culpable, with liability, for someone committing suicide, or someone going into depression, or someone having suicidal thoughts. And I do think that there should be liability for causing mental health harm or causing harm to children.

Speaker 1: Has the Democratic Party embraced what you're saying? Because I think that AI is clearly the future, and it's lacking in the overall national conversation that we're having. We're getting lost in a lot of things that aren't the big picture, and you're one of the few politicians really talking about this in a very forward-looking way. Have other Democratic politicians really jumped onto your messaging?

Speaker 3: I think people are still figuring it out. Look, I believe that we need political leaders who understand AI, who understand technology, who understand what the future wealth generation is going to be, what the future jobs are going to be, and who have a concrete vision of making sure that AI is for all of us and not just for them. If we just continue down the current path, what you will have is tech billionaires, tech trillionaires, and everyone else will sort of be left to fortune's whims, and I don't believe in that. I believe that we need, as a democratic society, to direct the development of AI so that it increases jobs in communities, increases wealth in communities, and doesn't degrade our democracy. The problem is we have a lot of people who don't understand the technology, who don't have the confidence to go back and forth, as I recently did in a Twitter thread with some of my tech friends who didn't like what I said, and to stick up for their views. And I think this country needs to elect people who understand technology, understand AI, and have the moral courage to stand up to some of the corporate leaders and say, no, we need you to make sure that this is in America's interest, in communities' interest. Their job is to maximize wealth for the shareholders; I'm not impugning them. But politicians' job is to look out for the interests of the country.

Speaker 1: And now, you represent a district, I think your district's worth twelve or thirteen trillion dollars, which is, like, insane.

Speaker 3: That's what I say. It was up to eighteen, I don't know, maybe it was over that, with Apple, Google, Nvidia, Broadcom, and Tesla.

Speaker 1: So what do those CEOs say to you when you bring this up? Do they say, you know, Representative Khanna, you're going to destroy our industry? Or do they understand that reforms have to be made?

Speaker 3: They understand that reforms have to be made. They may not agree with all of mine, but I say to them, you can't have an island of prosperity and a sea of despair. America can't survive half prosperous and half in decline. I say, call it the anti-revolution tax, you know, when I bring up some of these policies. But the reality is, I am not a Luddite. I'm a technology optimist. I believe AI will help us with medicine. I believe AI will help us in terms of education. AI should be, like Steve Jobs said about computers, a bicycle for the mind.
It should be about enhancing human capability, 437 00:22:17,720 --> 00:22:22,560 Speaker 3: not about excessive elimination of jobs and excessive accumulation of 438 00:22:22,600 --> 00:22:26,000 Speaker 3: profits in the hands of a few. And yeah, that's 439 00:22:26,040 --> 00:22:27,280 Speaker 3: the balance we need to strike. 440 00:22:27,480 --> 00:22:29,400 Speaker 1: And my last question to you is do you have 441 00:22:29,680 --> 00:22:32,560 Speaker 1: a fear then that if let's say we do have this. 442 00:22:32,720 --> 00:22:34,520 Speaker 1: I mean, we already have a massive consentration of wealth, 443 00:22:34,560 --> 00:22:39,240 Speaker 1: but it is exacerbated one hundredfold one thousandfold and a 444 00:22:39,280 --> 00:22:41,520 Speaker 1: massive job losses where you have a permanent underclass of 445 00:22:41,520 --> 00:22:43,520 Speaker 1: ten to twenty percent of the population who can't get 446 00:22:43,560 --> 00:22:45,800 Speaker 1: fine work because their work has all been automated and 447 00:22:45,800 --> 00:22:48,919 Speaker 1: they weren't trained for jobs. Do you fear of a 448 00:22:49,040 --> 00:22:52,800 Speaker 1: real technological backlash by the voters to sit there and say, no, 449 00:22:52,920 --> 00:22:56,439 Speaker 1: we need something pretty strong against tech companies and in 450 00:22:56,480 --> 00:22:57,919 Speaker 1: the most excessive ways. 451 00:22:58,240 --> 00:23:00,960 Speaker 3: Yeah, I fear not only a backlash technology. I fair 452 00:23:01,040 --> 00:23:05,080 Speaker 3: backlash potential against multi racial democracy. I fear backlash against 453 00:23:05,080 --> 00:23:08,000 Speaker 3: America being a leader in the world, A fair backlash 454 00:23:08,040 --> 00:23:10,440 Speaker 3: against any engagement. 455 00:23:09,920 --> 00:23:10,440 Speaker 2: With the world. 456 00:23:10,560 --> 00:23:13,760 Speaker 3: I fear that you would have an ugly nationalism that 457 00:23:13,760 --> 00:23:16,840 Speaker 3: could emerge. I call for an economic patriotism, that's an 458 00:23:16,840 --> 00:23:20,080 Speaker 3: inclusive patriotism that has a role for immigrants and has 459 00:23:20,080 --> 00:23:22,720 Speaker 3: a role for innovation. But I think you would have 460 00:23:22,760 --> 00:23:25,560 Speaker 3: a very ugly movement. I mean Andy Grove, who was 461 00:23:25,600 --> 00:23:29,560 Speaker 3: an Intel CEO and a great founders and escaped from 462 00:23:29,560 --> 00:23:32,879 Speaker 3: both Nazi Germany and a Soviet Communism, said well, the 463 00:23:32,920 --> 00:23:34,000 Speaker 3: biggest danger is. 464 00:23:34,080 --> 00:23:36,640 Speaker 2: Long unemployment lines. You haven't seen them. 465 00:23:36,680 --> 00:23:39,199 Speaker 3: I have, and so I think that's why he was 466 00:23:39,200 --> 00:23:42,840 Speaker 3: so forwardlooking in America, needing an economic development policy outside 467 00:23:43,119 --> 00:23:45,400 Speaker 3: Silicon Valley. I have one question for you, Ryan, because 468 00:23:45,400 --> 00:23:46,879 Speaker 3: I see some of your Twitter posts and things in 469 00:23:46,920 --> 00:23:47,880 Speaker 3: your always engaged. 470 00:23:48,119 --> 00:23:51,320 Speaker 2: I don't even know your politics. Are you more left right? 471 00:23:51,440 --> 00:23:53,479 Speaker 1: So I'm a pretty right wing person. 472 00:23:53,560 --> 00:23:54,040 Speaker 2: I just. 
473 00:23:55,960 --> 00:23:58,639 Speaker 1: My belief is this is that I came from a 474 00:23:58,640 --> 00:24:01,199 Speaker 1: working class family. I underst and working class people, and 475 00:24:01,240 --> 00:24:03,959 Speaker 1: I always want what's best for them. That doesn't have 476 00:24:04,080 --> 00:24:06,919 Speaker 1: clear ideological lines. So you have to sit there and 477 00:24:06,960 --> 00:24:09,680 Speaker 1: be willing, and if they are upset, they will move 478 00:24:09,720 --> 00:24:13,080 Speaker 1: to policies that are extreme, and I don't. And I 479 00:24:13,119 --> 00:24:16,480 Speaker 1: believe everyone has the everyone with the ability should have 480 00:24:16,600 --> 00:24:20,080 Speaker 1: the right to at least pursue an American dream that's, 481 00:24:20,119 --> 00:24:23,600 Speaker 1: you know, one of not just comfortable living, but respectable livings. 482 00:24:24,359 --> 00:24:25,240 Speaker 2: So when I say a. 483 00:24:25,400 --> 00:24:28,120 Speaker 3: New deal hypothetically as I define it, and you can 484 00:24:28,119 --> 00:24:30,800 Speaker 3: be blind, that doesn't scare you. No. 485 00:24:30,920 --> 00:24:33,280 Speaker 1: I mean, you say a new deal because it's easy 486 00:24:33,320 --> 00:24:36,959 Speaker 1: branding FDR new deal. Everyone understands it, right, I get that. 487 00:24:37,000 --> 00:24:38,120 Speaker 1: So I work in politics. 488 00:24:38,160 --> 00:24:38,680 Speaker 2: I understand. 489 00:24:38,680 --> 00:24:41,919 Speaker 1: I'm like, Okay, that's clear when you talk about redistributive wealth. 490 00:24:42,359 --> 00:24:45,880 Speaker 1: It's not something I am naturally in opposition to because 491 00:24:45,920 --> 00:24:48,080 Speaker 1: I believe we should tax data. I don't know why 492 00:24:48,200 --> 00:24:51,080 Speaker 1: data is in tax considering it's clearly more valuable than 493 00:24:51,119 --> 00:24:54,320 Speaker 1: oil and gold is everyone's data. What I and I'm 494 00:24:54,320 --> 00:24:56,800 Speaker 1: not opposed to, and I do think that the tech 495 00:24:56,840 --> 00:25:00,720 Speaker 1: companies have gone very long without any regular at all. 496 00:25:00,920 --> 00:25:04,440 Speaker 1: What I get a little anxiety written about is how 497 00:25:04,520 --> 00:25:08,879 Speaker 1: do you in a country that is in continuation of 498 00:25:09,040 --> 00:25:13,040 Speaker 1: maximizing wealth for very few while importing one point one 499 00:25:13,040 --> 00:25:16,199 Speaker 1: million legal immgreds per year who are overwhelmingly part of 500 00:25:16,240 --> 00:25:18,960 Speaker 1: the population that would be underserved by this tech revolution. 501 00:25:19,359 --> 00:25:22,840 Speaker 1: How do you keep that prosperity going. That's my big 502 00:25:22,920 --> 00:25:26,840 Speaker 1: fear is that we're importing future poverty unless we're all 503 00:25:26,840 --> 00:25:31,320 Speaker 1: bringing in brilliant tech workers and that has a process 504 00:25:31,359 --> 00:25:34,840 Speaker 1: of running out. There will not be unless you're unless 505 00:25:34,840 --> 00:25:37,080 Speaker 1: you're going to exhaust that or people or capital belief. 506 00:25:37,320 --> 00:25:40,640 Speaker 1: But as far as nearly capitalistic, no, I'm not opposed 507 00:25:40,640 --> 00:25:42,320 Speaker 1: to taxes. I lived in New York City my almost 508 00:25:42,359 --> 00:25:44,960 Speaker 1: my entire life. I'm not opposed to taxes. 
I am 509 00:25:45,200 --> 00:25:47,040 Speaker 1: opposed to story the immigration issue. 510 00:25:47,040 --> 00:25:48,800 Speaker 3: Well what did you make a President Trump's point with 511 00:25:48,880 --> 00:25:51,800 Speaker 3: Laura Ingram and then where he said, look, you need 512 00:25:52,359 --> 00:25:54,320 Speaker 3: you know, for z has. 513 00:25:54,200 --> 00:25:56,320 Speaker 2: A shortage of five thousand workers. Yes, we need a 514 00:25:56,400 --> 00:25:58,480 Speaker 2: market workers. We need to be investigating market workers. I 515 00:25:58,520 --> 00:26:01,359 Speaker 2: disagreed with President Trump that American workers aren't talented. I 516 00:26:01,400 --> 00:26:02,800 Speaker 2: think they're incredibly talented. 517 00:26:02,920 --> 00:26:06,000 Speaker 3: But he said we also need immigrants to help. But 518 00:26:06,080 --> 00:26:08,000 Speaker 3: I guess were you more on Laura ingram side than 519 00:26:08,000 --> 00:26:08,600 Speaker 3: his side on that? 520 00:26:08,760 --> 00:26:11,679 Speaker 1: I think that President Trump is very easily influenced them 521 00:26:11,680 --> 00:26:13,359 Speaker 1: as by the last person who spoke to him, And 522 00:26:13,400 --> 00:26:15,280 Speaker 1: I want to know. I want to know who the 523 00:26:15,359 --> 00:26:17,840 Speaker 1: last person who spoke to him was and if it 524 00:26:17,920 --> 00:26:20,280 Speaker 1: was our commerce secretary who's been giving him tons of 525 00:26:20,320 --> 00:26:24,160 Speaker 1: bad advice. I believe that there are listen, there's an 526 00:26:24,200 --> 00:26:26,920 Speaker 1: issue where people I think have the assumption of what 527 00:26:27,000 --> 00:26:28,680 Speaker 1: they what they have a right to. And I also 528 00:26:28,720 --> 00:26:31,359 Speaker 1: think there's a big education problem. Right we have millions 529 00:26:31,400 --> 00:26:35,440 Speaker 1: of people who are not equipped for jobs that I agree, 530 00:26:35,800 --> 00:26:38,640 Speaker 1: and the graduation rates are a complete smoke screen because 531 00:26:38,640 --> 00:26:41,600 Speaker 1: they're graduating schools with no ability to read or write 532 00:26:41,600 --> 00:26:45,440 Speaker 1: and it's just graduation numbers are there to promote ways 533 00:26:45,600 --> 00:26:48,840 Speaker 1: of getting funding for the schools. You know, Mississippi, Louisiana's 534 00:26:48,880 --> 00:26:51,280 Speaker 1: done amazing jobs of jumping out that. But we need 535 00:26:51,320 --> 00:26:54,360 Speaker 1: to really look at education and how to sit there 536 00:26:54,400 --> 00:26:57,560 Speaker 1: and get these get school great would you before? 537 00:26:58,000 --> 00:26:58,600 Speaker 2: And absolutely? 538 00:26:58,600 --> 00:27:00,560 Speaker 1: And I think that the German model which worked for 539 00:27:00,560 --> 00:27:02,119 Speaker 1: a long time, it's not really working right now, but 540 00:27:02,160 --> 00:27:04,080 Speaker 1: it did work for a long time of getting these 541 00:27:04,080 --> 00:27:07,640 Speaker 1: employers into the high schools and satying this person's brilliant, 542 00:27:07,840 --> 00:27:11,000 Speaker 1: just skips college and let's have a method where you 543 00:27:11,040 --> 00:27:13,919 Speaker 1: work for a corporation again for twenty years or thirty years. 
544 00:27:14,119 --> 00:27:18,040 Speaker 1: If Ford were to find a kid from Mississippi who's 545 00:27:18,119 --> 00:27:22,120 Speaker 1: not brilliant at I don't know, some I don't know writing, 546 00:27:22,480 --> 00:27:24,399 Speaker 1: can't write a novel, but he can fix the car, 547 00:27:24,400 --> 00:27:26,360 Speaker 1: why shouldn't he pick him up at seventeen years old. 548 00:27:26,600 --> 00:27:28,920 Speaker 1: I do think that those things should be working better 549 00:27:29,000 --> 00:27:30,840 Speaker 1: than they are. And I think that a lot of 550 00:27:30,880 --> 00:27:34,080 Speaker 1: the backlash the Republicans both in the last election are 551 00:27:34,200 --> 00:27:38,400 Speaker 1: mostly around the economy, affordability, and fear of the economy. 552 00:27:38,480 --> 00:27:39,879 Speaker 1: I think those are the number one drivers. I think 553 00:27:39,920 --> 00:27:42,199 Speaker 1: people thought that Trump had a magic wand and we 554 00:27:42,200 --> 00:27:44,159 Speaker 1: can go to twenty nineteen again. But I don't think 555 00:27:44,200 --> 00:27:45,960 Speaker 1: we're ever going to be back to twenty nineteen. So 556 00:27:46,119 --> 00:27:48,879 Speaker 1: I'm not anti AI on its face. I'm not anti 557 00:27:49,040 --> 00:27:52,480 Speaker 1: tax on tech comp corporations on its face. But I 558 00:27:52,480 --> 00:27:55,639 Speaker 1: think we really need to have intelligent conversations over policies 559 00:27:55,680 --> 00:27:59,919 Speaker 1: and less conversations over personalities than flamethrowing to get Internet clicks. 560 00:28:00,000 --> 00:28:01,560 Speaker 1: I find that that's why I try to I hope 561 00:28:01,600 --> 00:28:03,800 Speaker 1: you like this podcast interview. I hope I was bringing 562 00:28:04,240 --> 00:28:04,960 Speaker 1: very very awful. 563 00:28:05,040 --> 00:28:05,200 Speaker 3: Yeah. 564 00:28:05,440 --> 00:28:06,879 Speaker 2: In fact, I to the point that I didn't know 565 00:28:06,920 --> 00:28:07,520 Speaker 2: your politics. 566 00:28:07,760 --> 00:28:10,320 Speaker 1: I appreciate that very much, and I've liked a lot 567 00:28:10,359 --> 00:28:12,360 Speaker 1: of what you've done on civil liberties for a long time, 568 00:28:12,400 --> 00:28:14,240 Speaker 1: so I've always thought of you as a very thoughtful 569 00:28:14,480 --> 00:28:15,960 Speaker 1: leader in the Democrat Party. 570 00:28:16,280 --> 00:28:19,080 Speaker 3: I appreciate it well. To be continued. I look forward 571 00:28:19,080 --> 00:28:21,359 Speaker 3: to coming back, and I wish we could have more 572 00:28:21,400 --> 00:28:24,080 Speaker 3: of this. I think we desperately need more of this 573 00:28:24,320 --> 00:28:26,960 Speaker 3: in the country. And while you were kind of I'm 574 00:28:26,960 --> 00:28:28,840 Speaker 3: not to ask me or restrained, and I'm not to 575 00:28:28,880 --> 00:28:30,480 Speaker 3: ask me a single question about e Fstein, which is 576 00:28:30,640 --> 00:28:31,440 Speaker 3: what I've been probably. 577 00:28:31,200 --> 00:28:33,360 Speaker 1: About you know, you know what, there's only so much. 578 00:28:33,440 --> 00:28:36,280 Speaker 3: I mean, the email, I will say, the one thing 579 00:28:36,320 --> 00:28:38,520 Speaker 3: about it is that it showed sort of the value 580 00:28:38,520 --> 00:28:42,960 Speaker 3: of stability and conversation with people who you may not 581 00:28:43,200 --> 00:28:45,360 Speaker 3: you may think of as caricatures. 
And I just think 582 00:28:45,360 --> 00:28:48,760 Speaker 3: that that kind of dialogue and seeing the humanity of 583 00:28:48,800 --> 00:28:51,360 Speaker 3: people and seeing where there may be common ground and where 584 00:28:51,360 --> 00:28:54,200 Speaker 3: there's disagreement. We've got to get back to that in 585 00:28:54,200 --> 00:28:57,080 Speaker 3: our politics. Not out of Pollyannaishness, but if you 586 00:28:57,120 --> 00:28:59,920 Speaker 3: listened to, like, the Nixon-Kennedy debates and the substance 587 00:29:00,120 --> 00:29:02,880 Speaker 3: behind them, we've lost a lot of just the 588 00:29:03,000 --> 00:29:06,240 Speaker 3: robustness of an exchange of ideas. I mean, it's hurting 589 00:29:06,240 --> 00:29:07,600 Speaker 3: our democracy. 590 00:29:07,160 --> 00:29:10,920 Speaker 1: Right, even quote unquote radicals back then, like, 591 00:29:11,560 --> 00:29:13,160 Speaker 1: you know, they talk a lot about Pat Buchanan, who 592 00:29:13,200 --> 00:29:15,120 Speaker 1: I was lucky enough to know throughout my life, and 593 00:29:15,120 --> 00:29:16,680 Speaker 1: he was very smart and very thoughtful about a lot of 594 00:29:16,680 --> 00:29:18,680 Speaker 1: the things he talked about and his policy ideas. And I think 595 00:29:18,680 --> 00:29:20,480 Speaker 1: that on this platform that I have, I try to 596 00:29:20,640 --> 00:29:23,520 Speaker 1: just raise the conversation so it's not all, tell me 597 00:29:23,560 --> 00:29:26,440 Speaker 1: who you hate the most. I mean, that's fun, but 598 00:29:27,040 --> 00:29:29,680 Speaker 1: I left high school and I'm not trying 599 00:29:29,680 --> 00:29:32,640 Speaker 1: to return to it. So, Congressman, thank you for 600 00:29:32,640 --> 00:29:38,440 Speaker 1: coming on this podcast. I genuinely appreciate it. Now it's time 601 00:29:38,480 --> 00:29:40,080 Speaker 1: for the Ask Me Anything segment. If you want to be 602 00:29:40,200 --> 00:29:42,400 Speaker 1: part of the Ask Me Anything segment, email me, Ryan at 603 00:29:42,520 --> 00:29:46,360 Speaker 1: NumbersGamePodcast dot com. That's Ryan at Numbers, plural, Numbers 604 00:29:46,360 --> 00:29:49,400 Speaker 1: GamePodcast dot com. Love your questions, guys. This one comes 605 00:29:49,480 --> 00:29:51,880 Speaker 1: from, and you emailed me last week and I haven't 606 00:29:51,920 --> 00:29:54,480 Speaker 1: learned how to pronounce your name yet, Pardol, I 607 00:29:54,480 --> 00:29:57,160 Speaker 1: believe it is, from Florida. Thank you for emailing me. Again, 608 00:29:57,280 --> 00:29:59,680 Speaker 1: thank you for listening. She writes, Hey, Ryan, huge fan. 609 00:29:59,720 --> 00:30:01,800 Speaker 1: I'm an immigrant and currently a permanent resident, so I've 610 00:30:01,800 --> 00:30:05,520 Speaker 1: been around legal immigrants, which gives me the right to talk about this. I 611 00:30:05,640 --> 00:30:09,320 Speaker 1: am from the twenty sixteen batch of students here, and anecdotally 612 00:30:09,360 --> 00:30:11,240 Speaker 1: I can speak for my group. Not a single one of 613 00:30:11,320 --> 00:30:13,360 Speaker 1: the people in my batch is right wing or going 614 00:30:13,400 --> 00:30:15,840 Speaker 1: to vote for Republicans in their lifetime.
There are probably 615 00:30:15,840 --> 00:30:18,520 Speaker 1: a couple who are moderate and agree with social 616 00:30:18,600 --> 00:30:21,720 Speaker 1: conservative stances here and there, but none of them would 617 00:30:21,720 --> 00:30:25,080 Speaker 1: ever actually vote for quote unquote racist Republicans. Would you 618 00:30:25,080 --> 00:30:28,080 Speaker 1: say economics would change it? Immigration policy would change it? 619 00:30:28,200 --> 00:30:31,760 Speaker 1: Or would you say nothing will make them vote right, 620 00:30:31,840 --> 00:30:34,640 Speaker 1: because everyone cares about how to get your wife's 621 00:30:34,680 --> 00:30:37,160 Speaker 1: parents here, how to get easy money, and nobody gives 622 00:30:37,160 --> 00:30:39,840 Speaker 1: a damn about their host country and improving it? Given this, 623 00:30:40,040 --> 00:30:45,600 Speaker 1: why even support any immigration, legal or not? I 624 00:30:45,720 --> 00:30:48,680 Speaker 1: could go on about how legal visas are abused and gamed, 625 00:30:48,720 --> 00:30:51,000 Speaker 1: but that would be your whole show. Thank you. Pardol, 626 00:30:51,240 --> 00:30:53,440 Speaker 1: thank you, if that's how you pronounce your name, Pardol. 627 00:30:53,520 --> 00:30:56,360 Speaker 1: Thank you for listening to this show. Amazing question. So 628 00:30:57,360 --> 00:31:01,400 Speaker 1: would changing immigration policy change whether people are willing to vote 629 00:31:01,400 --> 00:31:03,440 Speaker 1: for Republicans? The answer is no, and it's been proven 630 00:31:03,520 --> 00:31:08,080 Speaker 1: no because we've run this experiment. Ronald Reagan 631 00:31:08,240 --> 00:31:10,920 Speaker 1: won everything except for the immigrant vote. 632 00:31:11,120 --> 00:31:11,320 Speaker 2: Right. 633 00:31:11,360 --> 00:31:13,720 Speaker 1: He lost the immigrant vote and the black vote, but he lost 634 00:31:13,720 --> 00:31:16,800 Speaker 1: the immigrant vote pretty substantially back in nineteen eighty four, when 635 00:31:16,840 --> 00:31:20,160 Speaker 1: everyone voted for Republicans. In nineteen eighty eight, his 636 00:31:20,400 --> 00:31:22,920 Speaker 1: vice president, George H. W. Bush, was running for president, 637 00:31:22,960 --> 00:31:27,400 Speaker 1: and remember, H.W. was super liberal on immigration. He invented many 638 00:31:27,600 --> 00:31:30,520 Speaker 1: of the forms of immigration that we have today that 639 00:31:30,560 --> 00:31:33,640 Speaker 1: are plaguing our country, all the different visa categories. 640 00:31:34,000 --> 00:31:38,040 Speaker 1: And Reagan had the amnesty between eighty four and eighty eight. 641 00:31:38,800 --> 00:31:42,520 Speaker 1: H.W. Bush received a smaller share of the immigrant vote 642 00:31:42,640 --> 00:31:45,600 Speaker 1: than Ronald Reagan did. Then in nineteen ninety two, we 643 00:31:45,640 --> 00:31:49,480 Speaker 1: had Bob Dole, sorry, nineteen ninety two was H.W.'s reelection. 644 00:31:49,640 --> 00:31:53,720 Speaker 1: In H.W.'s reelection, after he had done several more smaller 645 00:31:53,760 --> 00:31:57,040 Speaker 1: amnesties and created brand new visa categories to bring in 646 00:31:57,360 --> 00:32:00,880 Speaker 1: millions more immigrants during his presidency, he got an 647 00:32:00,880 --> 00:32:03,920 Speaker 1: even smaller share of the immigrant vote. Then there was 648 00:32:03,960 --> 00:32:08,040 Speaker 1: Bob Dole.
Bob Dole was, I mean, not super left 649 00:32:08,080 --> 00:32:10,760 Speaker 1: wing on immigration, but much more left wing on immigration than 650 00:32:10,800 --> 00:32:14,120 Speaker 1: Pat Buchanan, his chief Republican rival. And he got 651 00:32:14,160 --> 00:32:16,440 Speaker 1: a smaller share of the immigrant vote than George H. W. 652 00:32:16,520 --> 00:32:19,720 Speaker 1: Bush did. George W. Bush got, I think he 653 00:32:19,760 --> 00:32:21,400 Speaker 1: got around thirty nine percent. There are some 654 00:32:21,520 --> 00:32:23,239 Speaker 1: numbers that say forty four, but I think they've been, 655 00:32:23,480 --> 00:32:25,680 Speaker 1: you know, taken down. It's really thirty nine percent, 656 00:32:25,880 --> 00:32:28,960 Speaker 1: thirty nine percent of the immigrant vote, which was considered 657 00:32:29,000 --> 00:32:32,120 Speaker 1: the high watermark until Donald Trump in twenty twenty four. 658 00:32:32,160 --> 00:32:35,400 Speaker 1: And Trump did not run left wing on immigration. He 659 00:32:35,480 --> 00:32:38,200 Speaker 1: said mass deportations; there were signs at his rallies. It 660 00:32:38,360 --> 00:32:42,160 Speaker 1: wasn't a hidden agenda point, and people didn't care. I 661 00:32:42,200 --> 00:32:44,640 Speaker 1: think in part because of the economy, they were willing 662 00:32:44,640 --> 00:32:47,280 Speaker 1: to give Trump, you know, a chance. And I think 663 00:32:47,280 --> 00:32:49,960 Speaker 1: that a lot of immigrants, legal immigrants, felt the 664 00:32:50,000 --> 00:32:56,360 Speaker 1: border was unfair. Fairness is an American trait. People 665 00:32:56,400 --> 00:32:58,240 Speaker 1: talk about rugged individualism and all the rest of it, 666 00:32:58,280 --> 00:33:01,240 Speaker 1: limited government, but fairness is a component of being an American. 667 00:33:01,280 --> 00:33:04,320 Speaker 1: They don't like the idea that they have to work 668 00:33:04,360 --> 00:33:06,960 Speaker 1: for something and then foreigners get to sit there and 669 00:33:07,040 --> 00:33:10,320 Speaker 1: abuse the system and take advantage. It's a really integral 670 00:33:10,360 --> 00:33:12,720 Speaker 1: part of who we are as a people. So I 671 00:33:12,720 --> 00:33:15,040 Speaker 1: think that that is definitely something to chew 672 00:33:15,120 --> 00:33:18,200 Speaker 1: on, right? Looser immigration laws and supporting amnesties will not make 673 00:33:18,240 --> 00:33:20,600 Speaker 1: immigrants want to vote for you. And I think that 674 00:33:20,600 --> 00:33:23,239 Speaker 1: there's also an issue of where immigrants come from and what 675 00:33:23,320 --> 00:33:26,720 Speaker 1: their beliefs are. Right, immigrants have different opinions 676 00:33:26,760 --> 00:33:29,920 Speaker 1: over Christianity. A lot of countries in Eastern Asia, 677 00:33:30,240 --> 00:33:33,120 Speaker 1: at least, don't have high opinions 678 00:33:33,120 --> 00:33:36,600 Speaker 1: of Christianity. And on gun ownership, on free speech, on welfare, 679 00:33:36,600 --> 00:33:38,840 Speaker 1: on the size of government, some of these immigrant views 680 00:33:38,840 --> 00:33:41,920 Speaker 1: are incompatible if we want to continue a country that 681 00:33:42,000 --> 00:33:44,080 Speaker 1: has been going on for over two hundred and fifty 682 00:33:44,120 --> 00:33:45,440 Speaker 1: years, or two hundred and fifty years as. 683 00:33:45,400 --> 00:33:45,920 Speaker 2: Of next year.
684 00:33:46,600 --> 00:33:49,920 Speaker 1: That's the thing to rationalize and say, okay, when we 685 00:33:49,960 --> 00:33:52,200 Speaker 1: talk about immigrants, it's not like a bag of oranges. 686 00:33:52,200 --> 00:33:55,160 Speaker 1: We don't have to bring in every single person that's 687 00:33:55,320 --> 00:33:58,240 Speaker 1: just categorized as an immigrant. Do they share our values? 688 00:33:58,280 --> 00:34:01,040 Speaker 1: Do they provide benefits to our economy? Are they going 689 00:34:01,080 --> 00:34:03,600 Speaker 1: to commit crimes? Are they going to belong 690 00:34:03,640 --> 00:34:06,320 Speaker 1: to gangs? Are they going to assimilate? And assimilation is 691 00:34:06,400 --> 00:34:09,680 Speaker 1: really an interesting category as well, because a lot of 692 00:34:09,719 --> 00:34:13,759 Speaker 1: immigrants and their children who do assimilate kind of assimilate 693 00:34:13,840 --> 00:34:16,560 Speaker 1: to shitlib culture. I'm gonna guess, because 694 00:34:16,560 --> 00:34:20,040 Speaker 1: your name is Pardol, you're Indian or Southeast Asian of 695 00:34:20,080 --> 00:34:23,680 Speaker 1: some type. There was a show on Netflix a couple 696 00:34:23,719 --> 00:34:25,680 Speaker 1: of years ago called Never Have I Ever, which I 697 00:34:25,760 --> 00:34:27,799 Speaker 1: really liked. It was a very funny show and it 698 00:34:27,840 --> 00:34:31,000 Speaker 1: was about an Indian girl, and she was very clearly 699 00:34:31,040 --> 00:34:32,400 Speaker 1: assimilated into American culture. 700 00:34:32,440 --> 00:34:32,560 Speaker 2: Right. 701 00:34:32,600 --> 00:34:36,120 Speaker 1: She ate meat, she wanted a white boyfriend. 702 00:34:37,120 --> 00:34:39,719 Speaker 1: But she was a leftist. She talked about 703 00:34:39,760 --> 00:34:44,360 Speaker 1: Ruth Bader Ginsburg and idealizing liberal values and abortion. 704 00:34:44,920 --> 00:34:46,680 Speaker 1: And obviously it was just a TV show. 705 00:34:46,680 --> 00:34:48,160 Speaker 1: It wasn't a reality show. It 706 00:34:48,200 --> 00:34:49,480 Speaker 1: was written, it was a narrative, it was 707 00:34:49,480 --> 00:34:52,239 Speaker 1: a scripted show. But I think that that's true for 708 00:34:52,320 --> 00:34:56,080 Speaker 1: a lot of second generation immigrants, especially immigrants, I 709 00:34:56,120 --> 00:35:00,480 Speaker 1: find, from Asia, who do assimilate rather quickly to American culture, 710 00:35:00,520 --> 00:35:04,480 Speaker 1: but assimilate to the leftist American culture, the predominant culture 711 00:35:04,480 --> 00:35:07,400 Speaker 1: on college campuses, and I think that's something really to 712 00:35:07,640 --> 00:35:10,560 Speaker 1: chew on. So should we take in immigrants? Yeah, there are 713 00:35:10,640 --> 00:35:13,239 Speaker 1: immigrants in the world worthy of taking in who are going 714 00:35:13,280 --> 00:35:16,360 Speaker 1: to be great Americans. But one point one million legal 715 00:35:16,360 --> 00:35:18,759 Speaker 1: immigrants per year, plus any of their family members that 716 00:35:18,800 --> 00:35:23,200 Speaker 1: they can bring over, and endless birthright citizenship, and illegal aliens, 717 00:35:23,719 --> 00:35:26,359 Speaker 1: we probably should clamp down on that. But Pardol, I'm 718 00:35:26,400 --> 00:35:28,640 Speaker 1: happy you're here and I'm happy you're a listener. So 719 00:35:28,680 --> 00:35:30,600 Speaker 1: thank you so much for your question.
I really appreciate it. 720 00:35:30,840 --> 00:35:33,680 Speaker 1: If you like this podcast, please press like and subscribe 721 00:35:33,680 --> 00:35:36,600 Speaker 1: on the iHeartRadio app, Apple Podcasts, wherever you get your podcasts, 722 00:35:36,640 --> 00:35:40,040 Speaker 1: and on YouTube now can't forget about YouTube. Thank you guys. 723 00:35:40,120 --> 00:35:40,560 Speaker 2: So much. 724 00:35:40,600 --> 00:35:42,400 Speaker 1: I will see you guys on Monday,