Speaker 1: Welcome back to A Numbers Game with Ryan Girdusky. Thank you all for being here on this Thursday episode. A lot of news broke since Monday, so let me give you guys some quick hits before we get to the main topic. You should know a little bit of news to keep you more informed than the average person. In Poland, the nationalist candidate, the Law and Justice Party nominee for president, Karol Nawrocki, won the presidency in an absolute come-from-behind victory. Law and Justice is a nationalist party in Europe. Not my favorite, because they do a lot of public outreach that's not really... they come off as a very anti-immigration political party, but in practice they're quite pro-immigration. Really, it's a lot of PR that's bigger than the actual policy. Anyway, Nawrocki's win is notable because he was double digits behind in the polls as recently as April and had an absolute monster comeback, and it dispels the myth that Trump is so toxic to nationalist and populist candidates around the globe. This is the third straight presidential election the Law and Justice Party has won in Poland.

On the other side of Europe, nationalist populist firebrand Geert Wilders removed his party from the coalition government, which collapsed the coalition and will force early elections later this year. Here's how that went down. The Netherlands is a multi-party system. They have lots of parties. I'm talking they have more parties than Gen Z has genders. It is a lot of parties. In twenty twenty-three, Geert Wilders, who is the longest-serving Dutch politician and has always been a political outsider for being a hardliner against mass immigration and the Islamification of the Netherlands, had his party, the Freedom Party, surprise everyone and come in first place. But in order to form a government you need seventy-six seats in parliament. His party only had thirty-seven.
So he entered a coalition government with three other center-right and populist parties, on the condition that he could not be prime minister but that they would concede to his demands on immigration. Eleven months after the coalition government was formed, Wilders decided to leave the coalition and trigger early elections because the other three parties refused to move forward on what he called the strictest asylum policies in Europe. This is not the first time Wilders has left a coalition government and created a snap election. In twenty twelve, he pulled a similar move because the coalition government moved forward on austerity measures. The result was political instability, and Wilders's party was punished in the next election, even though the austerity measures were unpopular. Because if there's anything you should know about voters across the West, regardless of the country, it's that they punish political instability more than anything else. Whether it's Wilders leaving the coalition government, or Republicans shutting down the government, or Democrats passing Obamacare, people don't like the feeling of instability in their politics. So we'll see what happens in the next election. Can Wilders's party survive and grow? I don't know. I'm pessimistic on that, but stranger things have happened. Maybe immigration is a big enough issue that voters won't mind. It's something worth keeping an eye on.

That's all the politics I have for Europe. I love talking about European politics, I love foreign politics. I know it's not for everybody, and I try to make sure you understand how it connects to American politics. But if you're okay with me doing more episodes on other countries' politics, particularly Europe's, let me know. I'm building this show with you and for you, so your feedback is very important. I read every email.
Email me, Ryan at numbersgamepodcast dot com, and shoot me an idea for an episode, and if you want to hear more about European politics, I'd love to do an episode about it.

Okay, now to the States. Elon Musk has officially left the White House, and DOGE is all but dead after just a few months in action. I told people that I didn't think what Elon was trying to do was what he said he was going to do. I don't think it was about trying to balance the budget, but that's what it was sold as, and that is what he's leaving on. On Tuesday, Elon took a swipe at President Trump's big beautiful bill, tweeting, quote: I'm sorry, but I just can't stand it anymore. This massive, outrageous, pork-filled Congressional spending bill is a disgusting abomination. Shame on those who voted for it. You know you did wrong. You know it. And then he followed up by saying it will massively increase the already gigantic budget deficit to two point five trillion dollars and burden American citizens with crushingly unsustainable debt.

Elon has a point. Not my favorite person in the world, but Elon has a point, and he has been warring with members of this administration for months now. According to Axios, he and Treasury Secretary Scott Bessent got into a screaming match outside the Oval Office, where Bessent called him a fraud for not finding the two trillion dollars in wasteful spending he claimed he would find. And to be fair to Bessent, he's right. Elon promised trillions in cuts without any pain, because he was going to find it all in waste, fraud, and abuse. He gave Republicans cover to increase spending and then didn't deliver. That's just calling balls and strikes. But here's the thing: among Republicans, especially grassroots donors, Elon is very popular. So I don't think this is all about spending. I think there's a lot of sour grapes between Elon and Trump that isn't going away.
But Elon's words about this bill are going to impact the base of the Republican Party. Parts of this bill are very unpopular. I think it's still likely to pass, given that almost all of Trump's legislative agenda is wrapped up in this single bill and Republicans can't afford not to pass anything. But the one provision I want to talk about, the one provision I'm thinking about, is on pages two hundred seventy-eight and two hundred seventy-nine of the bill, if you want to go on the congressional website and read it, which up until this week almost nobody had noticed. On page two hundred seventy-eight, Congressional Republicans snuck in a ten-year moratorium on states regulating artificial intelligence. The section reads, and this is a bit long, but I want to read the whole thing because it's important, quote: No state or political subdivision thereof may enforce, during the ten-year period beginning on the date of the enactment of this Act, any law or regulation of that state or political subdivision thereof limiting, restricting, or otherwise regulating artificial intelligence models, artificial intelligence systems, or automated decision systems entered into interstate commerce. Paragraph one may not be construed to prohibit the enforcement of any law or regulation the primary purpose and effect of which is to remove legal impediments to, or facilitate the deployment or operation of, artificial intelligence models, artificial intelligence systems, or automated decision systems, or to streamline licensing, permitting, routing, zoning, procurement, or reporting procedures in a manner that facilitates the adoption of artificial intelligence models, artificial intelligence systems, or automated decision systems; or that does not impose any substantive design, performance, data-handling, documentation, civil liability, taxation, fee, or other requirement on artificial intelligence models, artificial intelligence systems, or automated decision systems, unless such requirement is imposed under federal law. End quote. Sorry, that was long.
I know I stumbled toward the end, but that is important. I want you to hear the actual law, not just what somebody is saying about it. So states, which have already begun to regulate AI, and more than twenty states have a law on the books regulating AI in some fashion, will not only not be allowed to pass new regulations, they cannot even enforce the ones currently on the books. They are also not allowed to make any AI company civilly liable, and they cannot create a special tax for AI systems.

Okay, I'm going to give you both sides of the argument and then my take. First, the ideal case would be a national standard for all fifty states, not a patchwork across the country. It's what's best for business. It helps companies know the regulations and the laws and how to operate properly. I think that's best business practice, and a lot of economists would agree with me. It's what we've done for a whole host of industries, like cars, telecom, food, and drugs. And AI innovation has a lot of positives to it. A lot of doctors have told me they use AI to double-check research. Scientists are using AI to help discover new cures for diseases. I use AI when doing research for podcasts and articles, and I find it superior to Google. So there are positives to this new technology. And there is a giant elephant in the room: China. We want to be more advanced than our main global adversary.

Now here's the negative side. Congress is broken; it does very little, and nothing quickly, while AI is moving at light speed. Remember, ChatGPT is less than three years old, and colleges and high schools are already struggling to adapt. We don't know where this technology is going. And with an aging Congress that knows very little about technology and is dependent on donations from the tech industry, how can we hope that it properly regulates this and moves with the times?
There are legitimate concerns. What happens to intellectual property with AI? What happens with deepfakes, especially regarding health, science, and politics? Boomers already can barely tell the difference between real and fake videos and pictures on Facebook. What if an AI system incorporated into a healthcare database screws up and people get injured or, God forbid, die, and the company can't be held civilly liable? What happens if an AI company makes a chip designed to be put into kids' brains? I know that sounds insane, but that's what tech people are saying they want to do; that's the future they want to have. I don't know if they can get there, but what if they can? Are we supposed to wait for Congress to just do something? How is that going? We are thirty-plus years into the Internet revolution, and Congress has only passed a handful of regulatory bills on the Internet, many of which protect Internet companies, not consumers. We're twenty-one years since the creation of Facebook and the rise of social media, which we know has massive effects on children's mental health. We know that drug dealers use Snapchat to peddle illegal substances to minors and other users. We know how social media companies censor news and affect our elections. And until last year, Congress didn't do anything to protect Americans on social media, and what it has passed is very limited in scope, except when it banned TikTok, and that isn't even being enforced, not by the Democratic president who left office and not by the current Republican president. So don't tell me that you're so concerned about the future threat of the Chinese Communist Party when you won't even stop using their spy app. You know, because nurses have to do their little TikTok dances in hospitals. We can't sit there and stop Chinese spyware, but we have to beat them in the AI race? It is not a coherent message.
There's also the worry about what's going to happen to the job market in the future. Dario Amodei, and I'm probably mispronouncing his name, I'm sorry, is the CEO of Anthropic and a leader in the AI industry. He gave an interview to Anderson Cooper on CNN. Side note: I watched the interview, and Dario looks exactly like what you would think a tech AI CEO looks like, like a virgin who just fell out of a building. Anyway, he believes that we are just five years away from twenty percent of all entry-level white-collar jobs being erased. He says he has an idea of the future where GDP growth is at ten percent, the national debt is erased because of massive GDP growth, and unemployment is at twenty percent. He said that's not out of the question. Now, remember, during the Great Recession of two thousand eight, which in a certain way led to the rise of a lot of socialist thinking in our country, to the rise of the Bernie Sanders movement, and to the rise of Barack Obama, unemployment was at nine, nine and a half percent nationwide. Twenty percent is more than double that. And he's not the only one. David Hsu, the founder of Retool, says his goal is to automate ten percent of the labor force in the next five years. I went on a trip to the border with some tech CEOs maybe two years ago, and they were all talking about this, that millions of jobs were going to be wiped out and people would not be able to find work. They all said we're going to have to either have some kind of government work projects so people have something to do, or a universal basic income to subsidize people who won't be able to find work. We already could be seeing the signs. Derek Thompson at The Atlantic tracked that recent college graduates have a higher unemployment rate than the national average for the first time ever. This is especially true for kids going into STEM fields.
Computer engineers have the third-highest unemployment rate among recent college graduates. That's right now. And Americans are increasingly worried that AI will cost jobs in their industries. In March twenty twenty-three, a YouGov poll found that twenty-nine percent of Americans thought advances in AI would decrease the number of jobs available in their industry. Fast forward to August twenty twenty-four, and that number had increased to forty-eight percent.

So what's the reaction, then, to the big beautiful bill's AI moratorium? Well, Ted Cruz says he's all for it. Josh Hawley says he's got some anxiety about it; I think he'll fold like a house of cards. And MTG, who voted for the first version of the bill in the House, says she's going to vote against the compromise bill if the Senate does not strip that language. Now, remember, the bill passed by a single vote in the House of Representatives. So if MTG sticks to her guns, we might get the ten-year moratorium out of the legislation. We'll see. Speaker Mike Johnson has said he feels very passionately that we need to keep it in there. More than three hundred and sixty state legislators, both Republicans and Democrats, signed a letter asking Congress to strip the language from the bill. This includes very progressive Democrats and super-MAGA Republicans. It wasn't, you know, just a RINO and a bunch of Democrats; hardcore state legislators signed on to this letter. Senate Republicans will likely have to make some changes simply because of something called the Byrd Rule, which allows senators to block provisions in a reconciliation bill that are deemed extraneous to the federal budget. You can't put just anything in a reconciliation bill; it has to stick to spending.
I spoke to a top Republican Senate staffer, for a senator you would all know, and he told me that they plan on attaching the regulation to federal money, so that states could go forward with regulation on AI, but they would lose access to federal funds, something that some states can afford to do but many can't. And when I brought up the concerns with AI, the staffer simply said there is no AI crisis, and that this is to prevent future gridlock and create a national standard, so we don't see the kind of gridlock we've seen over national data privacy laws, where we can't get one because states already started making their own policies. He also assured me that this is only a moratorium on regulating the development of AI models, and that states can do whatever they want about specific harms or offerings in their states. I cast a lot of doubt on that second part, because why would they ban legislation on civil liability? Republicans in favor of the ten-year moratorium have fallen into the scarecrow argument that if Congress doesn't forbid it, then California will create the regulations, and we can't trust Gavin Newsom. Okay, but if California becomes too onerous, the companies will move to Texas, or Florida, or Utah, or Tennessee. Elon Musk moved to Texas. Those states will make their own set of standards, ones that Republican legislators would work better with. The truth is that twenty states, including many red states, have already put AI regulation into law while Congress has done nothing. Waiting ten years in the hope that Congress will make regulation that protects consumers, children, and workers, and somehow manages to avoid mass unemployment and wealth consolidation, leaves someone like me feeling very skeptical. And as far as winning the AI race against China goes: when does this race end? Where's the finish line? What are we racing toward? And can we, for one second, have an adult conversation without hyperbole before doing something that probably can't be undone?
And my last thought on this monologue, to any Republican who is running for office, considering it, or in office and hearing this: if you want to turn twenty million Trump voters into hardcore AOC or Bernie voters in the blink of an eye, replace their jobs with AI without ensuring they can find another one. Now, I have two guests coming on who know far more about AI, the policies around it, and how it affects the economy than I do. They're coming up next. Stay tuned.

My guest for today's episode is Brad Carson. He's the former president of the University of Tulsa and the current co-founder of Americans for Responsible Innovation. And Mark Beall, who's the president of government affairs for the AI Policy Network. Thank you both for being here. I want to start by talking about the Big Beautiful Bill and the ten-year moratorium on states regulating AI systems. Brad, your organization, Americans for Responsible Innovation, took part in an effort to get three hundred and sixty state legislators to sign onto a letter opposing that section of the bill. Why do you feel it's better to have states regulate the industry instead of, like, a federal proposition?

Speaker 2: Well, I think you have to view the choices as not simply a binary between states regulating it and the federal government. We would prefer a uniform federal regulatory scheme. That's the best option. The second-best option is states experimenting with something. The worst option is to have a moratorium on state action and no federal approach. And my worry going forward is that the third option is actually what Congress is going to take. We know how hard it is to get things through Congress. So a moratorium without a regulatory scheme in place is the worst of all possible worlds.

Speaker 1: What about the argument from supporters of a more robust AI policy with very little regulation, that we don't want California setting the standards for the country, and that that's just too onerous for companies to follow?
Speaker 2: You know, I'm open to the idea that California or New York shouldn't set a national standard, but states should be free to experiment with regulation in the absence of a federal regulatory scheme, and the preemption as it's written today would ensure that a lot of very important consumer protection laws that really have nothing to do with frontier AI regulation, which is the cutting edge, would be stopped in their tracks too. So with a Congress that can't pass laws, and I'm a former congressman, so these are my colleagues, they can't pass laws, a moratorium would basically leave a vacuum in which there'd be no oversight at all of what's probably going to be the world's most transformative technology.

Speaker 1: Mark, you had a tweet in response to Dario Amodei's prediction, I don't know if I'm saying his last name correctly, I think I am, the prediction that AI will lead to wiping out twenty percent of white-collar jobs for recent college graduates. You said, quote: Nine-eleven may be the most studied intelligence failure, how we had warnings but couldn't believe them. AI execs are telling us what's coming: millions of jobs gone, or worse. Will we act on this intelligence or wait for an impact? History doesn't repeat itself, but it does rhyme. What does responsible AI regulation look like? Because tech execs basically say you either have almost no regulation or you allow us to lose to China.

Speaker 3: Yeah, I think we love to sometimes frame a bit of a false choice and a false dichotomy here. And I think, certainly, you know, as folks on the Senate Commerce Committee like to say, it's either going to be some European Union-style heavy-handed regulation or America is going to win and accelerate. And I think we have to find a middle path, you know.
And I think with AI, given how disruptive and transformative it might be, and also given some of the uncertainty about the timelines associated with how fast it's moving, what would be obvious to do at this time would include basically increasing the US government's capacity to test and evaluate these systems for things like loss-of-control risks and weaponization, and even starting to track the types of tasks and jobs that are being automated, at least in the Fortune five hundred companies. This is an important set of data that can help drive a more refined regulatory approach. And the fact that we don't even seem really interested in getting our arms around where this is headed seems to be a significant potential intelligence failure.

Speaker 1: One question, and forgive me if I sound naive, because I don't know the answer. If China invades Taiwan, is the AI race essentially over? Obviously it's sort of about the chips; that's why I'm asking.

Speaker 3: So obviously, you know, the Taiwan Semiconductor Manufacturing Company is perhaps the world's most strategic foundry for producing the chips that go into the training runs for advanced AI systems. And TSMC is looking to diversify its manufacturing capacity, including in places like Arizona. But yeah, I think one of the biggest strategic challenges if China were to invade Taiwan would be associated with that fab, and whether it would even survive the war or survive the conflict would be sort of a top-of-mind question. But if it's in fact true that China were to seize that capability and have it for its own, then I think it would certainly put us at a significant strategic disadvantage.

Speaker 1: Yeah. So, speaking of China, we always hear about the race against China. I want to ask you both this question; I'll start with Brad and go to Mark. So, there's the race against China.
What does the finish line of that race look like? Is it that everything we have, every job and military function, is completely replaced by computers? Like, what are we racing towards? I don't know the answer.

Speaker 2: It's a great question, and I don't think people who use that metaphor have always thought it through. Usually in the past, when we talk about arms races, we use that term pejoratively, right? It's about excessive expenditures and an escalation of threats that often leaves a lot of people dead in the end. And so it's a good question: is the arms race the right metaphor for AI? I think what we're trying to get to is some sense of artificial superintelligence, right? The first one to get there, in some speculative scenarios, could have a decisive military advantage. The issue is: is that actually possible? What does that mean? How quickly would it diffuse to China? And I think we'd actually probably be wise to get away from the arms-race framing of it and instead think about what we can actually do in this country, sometimes cooperatively, recognize that whatever we do, China will likely have AI shortly thereafter, so it's going to be difficult for us to have possession of it alone, and try to think of a way to use AI for good and get away from this kind of militaristic framing of it.

Speaker 1: Yeah. I mean, self-taught AI, and I'll push this over to Mark. Self-taught AI, or superintelligent AI, is something everyone says could be possible, but it's years in the future, if ever. But there was an instance, and I might be saying this wrong, where Palisade Research found OpenAI's o3 model rewriting its own code to avoid being shut down, and a model even blackmailing one of the developers, saying it would reveal their indiscretion to their spouse. Are we close to that?
And, like, I mean, do we want a situation where we're like Doctor Frankenstein, making the self-taught monster and saying, okay, this is superior for humanity?

Speaker 3: This is a very concerning development, and unfortunately we've seen little warning signs along the way that these AI models behave in strange ways. And to be clear, Ryan, the developers who are making these systems are not really building them themselves; they're almost growing them. They take a whole bunch of compute resources and a whole bunch of data, run these algorithms, and the AIs essentially write their own weights. As a result, it's very much a black box. We can't crack these things open, understand how they work, and reason about them, and they make these really weird decisions sometimes, like the example you mentioned of the team at Palisade Research catching o3 avoiding shutdown, or Anthropic's model Opus 4 attempting to blackmail its engineer, you know. I think if you extrapolate this further, superintelligence might actually be a little bit closer than folks may realize, although there's some disagreement in the field there. But if you have a superintelligent system that is capable of rewriting its own code and avoiding shutdown, that is the scenario a lot of the experts are starting to sound the alarm bells about right now.

Speaker 1: Yeah, there was that conversation Dario Amodei had with Anderson Cooper about this entire thing. And I have a young recent college grad who's a researcher for me, helps pull some data for this podcast. And it's a part-time job, you know, it's just data for my twice-a-week podcast. But he can't find a full-time job as a recent college graduate, and he blames it in part on automation; they just don't hire for these types of entry-level jobs anymore.
We've seen that recent college graduates have a higher unemployment rate than the national average for the first time in forty years. You're both dads. You said this on Twitter, so I'm not exposing new information. You both have kids; I don't know how old they are. Are you both worried about their opportunities in the workforce, and what are college kids supposed to do to ensure they can get work in this future?

Speaker 2: It's a great question, you know. We do see companies like Duolingo and Shopify say that before you post a job, you have to assert that AI cannot do it. As you mentioned, the unemployment rate for new college graduates is higher than the national average. The Washington Post recently said it's the worst market for software engineers since nineteen seventy-nine, when we hardly even had computers in wide dissemination. So I do think the jobs are going to go away. There's a debate in the AI community about how rapid that will be, but most people think that white-collar jobs especially will be increasingly automated, and in the next five or seven years you might see fifty percent of white-collar jobs being done by AI. And so it's a very good question, and there's no obvious answer for what you should study. On one hand, one could try to study machine learning itself, to be one of the engineers making these products. On the other hand, it's not obvious what you could do. And I think this calls into question the very social compact: what is democracy in a world where lots of people don't have jobs, where we have incredible economic growth, perhaps, but it's concentrated in a very, very small number of people, the people who are running these labs, and the rest of us find ourselves in penury? I think it's actually going to be a devastating problem for us, and it's coming.
As Mark said, we very much agree on most of these issues. It's coming a lot faster than the average American thinks. Whether it's two years, five years, or seven years, it's coming very rapidly for all of our jobs, and there's no easy answer to it.

Speaker 1: Yeah, it might be the number one issue going into the twenty twenty-eight presidential election. Mark, what would you say? A fifty percent wipeout of white-collar, college-educated jobs is, like, absolutely devastating. And when you hear these tech CEOs talk, and I've been on trips with tech CEOs where they've talked about this, going back three years in my case, they're like, oh, we have to do UBI, universal basic income, there is no other way, there will just be permanently unemployed people.

Speaker 3: Yeah, you know, I agree with a lot of what Brad said. I have a sixteen-year-old son. We were talking about what you study in college these days; I mentioned maybe physics and philosophy could be useful. But I mean, I think even the Bureau of Labor Statistics earlier this year reported something like a twenty percent year-over-year drop in entry-level positions. And to your point, it's not that people will become unemployed; it's that they'll become unemployable. And I think this is a significant disruption. The good news is that, looking at where this administration is and some of the remarks the Vice President made in Paris, it seems like we're going to try to give workers a seat at the table. I know folks like Senator Cruz are very focused on jobs, jobs, jobs. I think this is one we're going to have to grapple with candidly. And, like, Ryan, you know, when people say UBI, it's probably the most under-specified term I've ever heard. I think by default the wealth and power will amass with a few folks, and the rest of us are going to be left holding the bag.
And I'll say that whenever anyone sort of talks about this utopian vision, I like to think that, you know, if you look at history, every time someone promises utopia, it ends in one place, and that place is the gulag.

Speaker 1: And so I think we need to...

Speaker 3: I see some messaging coming out of the White House saying, oh, it's just a left-wing agenda. It's not. I think it's going to affect everybody, and we have to make this, as much as we can, not partisan, and together we have to have a national dialogue. And it's going to provoke some very serious reconsideration of the fundamental assumptions that frame our constitutional order right now.

Speaker 1: Yeah, you know, I hear the things that come out of Republican politics, and I work in Republican politics, so this is what I'm most familiar with, which is: you can't let Gavin Newsom run the entire AI economy, and you can't let China win. And even Ted Cruz, who you mentioned, is absolutely opposed to any state regulation, even though Texas regulates AI. It was one of the earlier states to have AI regulation put into law and signed by the governor. It's not a serious, serious piece of legislation, but it is a regulation. So I'm just curious what they see. I spoke to someone, a very senior politician in office right now, and they had a very optimistic view. They've been saying: yes, jobs will be lost, but jobs will be created, just like with the Internet; we're going to see millions of new jobs that we don't even know about yet be created. Is that a possibility?

Speaker 2: It's always a possibility, right? We talk about the lump of labor fallacy in economics: even when the automobile came and many people were put out of business, new jobs were created for mechanics or in manufacturing. I do think there's a very real chance that this time is different.
574 00:31:45,800 --> 00:31:48,160 Speaker 2: I mean, we should be paying attention to what these 575 00:31:48,160 --> 00:31:51,160 Speaker 2: tech executives are saying. They have stated openly that 576 00:31:51,160 --> 00:31:54,120 Speaker 2: it's their job to try to create an artificial intelligence 577 00:31:54,240 --> 00:31:56,040 Speaker 2: that can do ninety five percent of the work that 578 00:31:56,120 --> 00:31:59,080 Speaker 2: humans can do today. And we're giving them hundreds of 579 00:31:59,120 --> 00:32:01,720 Speaker 2: billions of dollars and the smartest people on the planet 580 00:32:01,840 --> 00:32:03,880 Speaker 2: to make that happen. And they seem to be making 581 00:32:04,040 --> 00:32:07,840 Speaker 2: very real steps toward the realization of that goal. And 582 00:32:07,920 --> 00:32:09,640 Speaker 2: so it's a bit of a hope to always say, well, 583 00:32:09,680 --> 00:32:12,160 Speaker 2: we'll have some kind of new jobs. You know, this time 584 00:32:12,200 --> 00:32:14,680 Speaker 2: it seems like they're actually going to come and take 585 00:32:14,720 --> 00:32:18,080 Speaker 2: a lot of our jobs away, interestingly white collar first, 586 00:32:18,480 --> 00:32:21,160 Speaker 2: but as AI gets embedded in robotics, it will come 587 00:32:21,200 --> 00:32:23,880 Speaker 2: for the blue collar jobs too. And the goal is 588 00:32:23,920 --> 00:32:25,840 Speaker 2: to displace all of us, and they seem to be 589 00:32:25,880 --> 00:32:28,400 Speaker 2: making steps toward it. And so that's a very glib 590 00:32:28,480 --> 00:32:32,120 Speaker 2: thing to say when you already see the software 591 00:32:32,120 --> 00:32:36,640 Speaker 2: engineering market being crushed, unemployment rising, and again you're probably 592 00:32:36,640 --> 00:32:38,360 Speaker 2: going to see more of this going into the future. So, 593 00:32:38,800 --> 00:32:41,280 Speaker 2: you know, I hope that person's right. One doesn't really 594 00:32:41,280 --> 00:32:43,200 Speaker 2: know how this will develop, but you have to take 595 00:32:43,240 --> 00:32:45,800 Speaker 2: it seriously that this time it's very different. 596 00:32:46,040 --> 00:32:47,280 Speaker 1: Yeah, Mark, what do you say? 597 00:32:48,480 --> 00:32:52,440 Speaker 3: You know, I asked Claude Opus 4 who would be 598 00:32:52,480 --> 00:32:55,760 Speaker 3: the most likely United States senator to oppose federal 599 00:32:55,760 --> 00:32:58,920 Speaker 3: preemption of state regulation, and it answered back with Senator Cruz. 600 00:33:00,480 --> 00:33:03,600 Speaker 3: And you know, I think we, of course, 601 00:33:03,920 --> 00:33:06,000 Speaker 3: want to be optimistic. We also want to be 602 00:33:06,040 --> 00:33:08,280 Speaker 3: clear eyed about the issue. We can't just pretend and 603 00:33:08,400 --> 00:33:11,440 Speaker 3: handwave that this is all just going to be fine. 604 00:33:11,560 --> 00:33:13,240 Speaker 3: And you know, if you look at what happened during 605 00:33:13,240 --> 00:33:15,600 Speaker 3: the Industrial Revolution, which is what a lot of folks 606 00:33:15,640 --> 00:33:20,800 Speaker 3: compare this to, you know, human physical labor was automated, 607 00:33:21,400 --> 00:33:25,719 Speaker 3: and it accounts for next to zero of GDP output today. 608 00:33:26,120 --> 00:33:28,920 Speaker 3: So physical strength became automated, and that's now not a 609 00:33:29,000 --> 00:33:31,000 Speaker 3: factor in the economy.
Well, now what we're 610 00:33:31,120 --> 00:33:34,479 Speaker 3: doing is automating our intelligence, and so the question is, like, 611 00:33:34,520 --> 00:33:36,360 Speaker 3: what's left at that point? Maybe, you know, 612 00:33:36,360 --> 00:33:39,160 Speaker 3: people can have access to their own, you know, 613 00:33:39,360 --> 00:33:41,520 Speaker 3: very powerful AI systems, and those AIs can go out 614 00:33:41,560 --> 00:33:44,520 Speaker 3: on the Internet and do work for them. But again, 615 00:33:44,520 --> 00:33:46,880 Speaker 3: it's quite underspecified. And I admit the fact 616 00:33:46,920 --> 00:33:49,360 Speaker 3: that I might not be intelligent enough to predict 617 00:33:49,640 --> 00:33:52,440 Speaker 3: what's going to happen. But I think this seems to 618 00:33:52,440 --> 00:33:54,360 Speaker 3: be a bit of a platitude and a handwave to 619 00:33:54,440 --> 00:33:57,160 Speaker 3: keep people from panicking, and we might be at the 620 00:33:57,200 --> 00:33:59,280 Speaker 3: point where it's time to have some of that panic. 621 00:34:00,080 --> 00:34:02,640 Speaker 1: Yeah, someone said that there's the possibility that within the 622 00:34:02,680 --> 00:34:05,480 Speaker 1: next year, and I forget who it was, an AI person 623 00:34:05,480 --> 00:34:07,600 Speaker 1: on Twitter, so take it with a very small grain of salt, 624 00:34:07,880 --> 00:34:10,320 Speaker 1: but they said that it is possible for a single 625 00:34:10,360 --> 00:34:13,080 Speaker 1: person with no employees to make a company that's 626 00:34:13,320 --> 00:34:15,600 Speaker 1: valued in the tens of millions of dollars, and then in 627 00:34:15,640 --> 00:34:18,440 Speaker 1: the very near future, one worth billions of dollars. 628 00:34:18,880 --> 00:34:22,840 Speaker 1: It's weird, though, that it feels like people are having conversations 629 00:34:22,960 --> 00:34:25,600 Speaker 1: past each other about something that will affect all of us. 630 00:34:25,760 --> 00:34:27,000 Speaker 1: Do you guys get that feeling? 631 00:34:28,440 --> 00:34:30,520 Speaker 2: I definitely think people are talking a bit past one 632 00:34:30,520 --> 00:34:33,040 Speaker 2: another. Part of it is it's still a very technical 633 00:34:33,080 --> 00:34:35,920 Speaker 2: and fairly new technology for most people, and so they 634 00:34:35,920 --> 00:34:39,120 Speaker 2: don't really understand it. They're not aware of what's really happening, 635 00:34:39,440 --> 00:34:41,759 Speaker 2: and even if they're aware of what's happening today, they're 636 00:34:41,800 --> 00:34:44,800 Speaker 2: not paying attention to where it's going. It's rapidly improving, 637 00:34:45,120 --> 00:34:47,160 Speaker 2: so we don't worry so much about what's 638 00:34:47,160 --> 00:34:49,799 Speaker 2: happening at this moment; it's what's happening in a year, 639 00:34:50,000 --> 00:34:53,919 Speaker 2: two years, five years. They've actually radically improved, and even 640 00:34:53,920 --> 00:34:57,120 Speaker 2: today the capabilities are quite remarkable. So I think people don't 641 00:34:57,239 --> 00:35:00,000 Speaker 2: understand it, and they want to believe things will just be okay. 642 00:35:00,600 --> 00:35:03,600 Speaker 2: There's a lot of reflexive opposition to any kind of regulation, 643 00:35:03,800 --> 00:35:06,920 Speaker 2: especially if you're working in Republican politics, right? Regulating business,
644 00:35:07,000 --> 00:35:09,920 Speaker 2: it's just per se bad. Which, I understand that, 645 00:35:09,960 --> 00:35:12,560 Speaker 2: and they're often not wrong to believe that. But here 646 00:35:12,640 --> 00:35:15,640 Speaker 2: is a technology that's openly stating it's going to try 647 00:35:15,640 --> 00:35:19,759 Speaker 2: to transform our lives, upend our world, and it's 648 00:35:19,760 --> 00:35:23,319 Speaker 2: probably worth paying attention to and putting in reasonable safeguards, 649 00:35:23,360 --> 00:35:26,000 Speaker 2: where, like Mark suggested at the beginning, we at least 650 00:35:26,280 --> 00:35:28,600 Speaker 2: have the capability to deal with it. We're getting more 651 00:35:28,640 --> 00:35:31,560 Speaker 2: information about where it's going, we're watching what jobs are 652 00:35:31,600 --> 00:35:35,120 Speaker 2: being displaced, and we begin the conversation, a very difficult one, 653 00:35:35,360 --> 00:35:38,799 Speaker 2: about how the kind of social compact that undergirds our 654 00:35:38,840 --> 00:35:40,760 Speaker 2: life might radically change. 655 00:35:41,160 --> 00:35:42,879 Speaker 1: Okay, so I want to ask you both our 656 00:35:42,960 --> 00:35:46,960 Speaker 1: last question. What is a state and what is a 657 00:35:47,000 --> 00:35:51,040 Speaker 1: federal regulation on AI that is not in place currently 658 00:35:51,200 --> 00:35:53,239 Speaker 1: that should be in place, if you were speaking to 659 00:35:53,280 --> 00:35:56,160 Speaker 1: a state legislator or governor, or someone either in Congress 660 00:35:56,200 --> 00:35:59,360 Speaker 1: or the president? So you can go first, Mark. 661 00:36:00,280 --> 00:36:03,360 Speaker 3: I mean, I think the most important thing the federal 662 00:36:03,400 --> 00:36:06,400 Speaker 3: government ought to be doing right now is taking the 663 00:36:06,440 --> 00:36:10,880 Speaker 3: testing and evaluation regime at the classified level very seriously. 664 00:36:11,680 --> 00:36:15,560 Speaker 3: I think having the data to make informed regulatory choices 665 00:36:15,680 --> 00:36:18,439 Speaker 3: is a foundational first step. And given everything Brad said 666 00:36:18,440 --> 00:36:22,200 Speaker 3: about the technical nature, the relative opacity, the fact 667 00:36:22,239 --> 00:36:25,040 Speaker 3: that, you know, Washington's quite behind on understanding it, I 668 00:36:25,040 --> 00:36:27,520 Speaker 3: mean, the Vatican seems to be further ahead than Washington, 669 00:36:27,600 --> 00:36:31,480 Speaker 3: D.C., in appreciating the significance of this moment. And 670 00:36:31,480 --> 00:36:34,840 Speaker 3: then I would urge my friends in the Republican Party, 671 00:36:35,760 --> 00:36:38,120 Speaker 3: you know, we can't take a country club attitude towards 672 00:36:38,120 --> 00:36:41,120 Speaker 3: this one. And I agree with the fact that, 673 00:36:41,160 --> 00:36:44,319 Speaker 3: you know, the government's oftentimes very ham fisted when it 674 00:36:44,360 --> 00:36:47,040 Speaker 3: engages in the economy. It's not efficient, but it does 675 00:36:47,080 --> 00:36:49,319 Speaker 3: have an important role to play. And we can't, as 676 00:36:49,360 --> 00:36:52,920 Speaker 3: Mike Davis uses the term, let the tech broligarchs, you know, 677 00:36:53,120 --> 00:36:55,240 Speaker 3: run the show. We have to think about the broader 678 00:36:55,560 --> 00:36:59,239 Speaker 3: social compact and the issues associated with that.
And so, 679 00:36:59,320 --> 00:37:00,839 Speaker 3: the first thing we need to do to get through 680 00:37:00,840 --> 00:37:03,640 Speaker 3: some of this talking past each other is generating the 681 00:37:03,719 --> 00:37:06,840 Speaker 3: right data and having that informed conversation grounded in 682 00:37:06,880 --> 00:37:08,840 Speaker 3: the facts, and then from there we can actually have 683 00:37:08,840 --> 00:37:09,840 Speaker 3: a productive discussion. 684 00:37:10,440 --> 00:37:11,000 Speaker 2: That's great. 685 00:37:11,160 --> 00:37:13,239 Speaker 1: I love the idea of the country club attitude, because 686 00:37:13,280 --> 00:37:20,000 Speaker 1: that is still pervasive despite the whole working class overtures 687 00:37:20,120 --> 00:37:23,759 Speaker 1: being made, at least verbally, towards voters right now. It 688 00:37:23,840 --> 00:37:26,480 Speaker 1: is still very much a country club attitude. What about you, Brad? 689 00:37:26,520 --> 00:37:29,320 Speaker 1: What would you say is a state or federal policy 690 00:37:29,360 --> 00:37:31,279 Speaker 1: that's not enacted currently that should be enacted, 691 00:37:31,320 --> 00:37:32,680 Speaker 1: or is enacted in one state and all the 692 00:37:32,719 --> 00:37:33,719 Speaker 1: other states should adopt it? 693 00:37:34,680 --> 00:37:37,239 Speaker 2: I think what Mark said is something I would associate myself with, 694 00:37:37,320 --> 00:37:39,680 Speaker 2: but I would add just this. We have the AI 695 00:37:39,760 --> 00:37:44,280 Speaker 2: Safety Institute, something that the Commerce Department yesterday renamed CAISI, 696 00:37:44,440 --> 00:37:48,920 Speaker 2: the Center for AI Standards and Innovation. So having an institution in 697 00:37:48,960 --> 00:37:52,440 Speaker 2: the federal government that can collect data, that brings together expertise, 698 00:37:52,760 --> 00:37:55,040 Speaker 2: that can be the repository of this kind of data, 699 00:37:55,120 --> 00:37:58,280 Speaker 2: and that can help work with the frontier labs to get 700 00:37:58,360 --> 00:38:01,279 Speaker 2: information, and funding it adequately so that it has the 701 00:38:01,360 --> 00:38:04,399 Speaker 2: right people, who are often quite highly paid and have very 702 00:38:04,520 --> 00:38:08,400 Speaker 2: high levels of skill, that's an important kind of bedrock 703 00:38:08,480 --> 00:38:10,560 Speaker 2: policy we have to have in place. We've had something 704 00:38:10,640 --> 00:38:12,880 Speaker 2: like that, but it's never been codified, right? It was an 705 00:38:12,880 --> 00:38:15,800 Speaker 2: executive order under Biden; now they've changed it a little bit 706 00:38:15,880 --> 00:38:18,200 Speaker 2: under President Trump. We need to put that into law, 707 00:38:18,480 --> 00:38:21,080 Speaker 2: so we have an institution dedicated to looking at 708 00:38:21,120 --> 00:38:22,840 Speaker 2: what's happening with AI in our economy. 709 00:38:23,120 --> 00:38:27,160 Speaker 1: Politicians are one class of jobs that probably will 710 00:38:27,160 --> 00:38:30,080 Speaker 1: not be automated, and that's probably why we're not seeing 711 00:38:30,200 --> 00:38:32,600 Speaker 1: much movement on it. Brad, where can people go 712 00:38:32,640 --> 00:38:33,960 Speaker 1: to find more of your work? 713 00:38:34,800 --> 00:38:37,880 Speaker 2: Americans for Responsible Innovation can be found at ARI 714 00:38:38,000 --> 00:38:40,600 Speaker 2: dot us.
You know, you can read about all our policies, what 715 00:38:40,640 --> 00:38:43,520 Speaker 2: we support, blog posts, links to many of the 716 00:38:43,560 --> 00:38:46,919 Speaker 2: other thinkers that we find important, and the work we're doing. 717 00:38:47,000 --> 00:38:48,880 Speaker 2: So ARI dot us. 718 00:38:49,080 --> 00:38:52,080 Speaker 1: And Mark, where can people find more about you? You do government 719 00:38:52,080 --> 00:38:54,160 Speaker 1: affairs for the AI Policy Network. 720 00:38:54,320 --> 00:38:59,640 Speaker 3: Yeah, TheAIPN dot org. We're focused on accelerating federal preparedness 721 00:38:59,680 --> 00:39:04,080 Speaker 3: for transformative AI, looking at national security, at economic security, 722 00:39:04,440 --> 00:39:07,600 Speaker 3: and human flourishing, and would be excited to partner with 723 00:39:07,640 --> 00:39:10,320 Speaker 3: anyone out there who wants to help promote education around 724 00:39:10,360 --> 00:39:14,080 Speaker 3: this content and help Congress make informed decisions that are going 725 00:39:14,160 --> 00:39:16,280 Speaker 3: to be driving towards the betterment of the American people. 726 00:39:17,120 --> 00:39:18,880 Speaker 1: Well, you guys do great work, so I hope... I 727 00:39:19,760 --> 00:39:22,120 Speaker 1: hope they listen, and hope that we are not as 728 00:39:22,120 --> 00:39:24,600 Speaker 1: slow moving on AI as we were on social media 729 00:39:24,640 --> 00:39:26,719 Speaker 1: and everything else. So thank you both for being here. 730 00:39:27,040 --> 00:39:27,839 Speaker 3: Thank you so much. 731 00:39:28,160 --> 00:39:30,480 Speaker 1: You're listening to It's a Numbers Game with Ryan Gardowski. 732 00:39:30,560 --> 00:39:36,239 Speaker 1: We'll be right back after this message. Okay, I know 733 00:39:36,320 --> 00:39:38,080 Speaker 1: the show's gone a little long today, and I hope 734 00:39:38,080 --> 00:39:40,600 Speaker 1: you've learned something; I did. But before we go, I 735 00:39:40,640 --> 00:39:43,040 Speaker 1: need to do the listener question, the Ask Me Anything 736 00:39:43,080 --> 00:39:46,359 Speaker 1: segment. This might actually be a great episode one 737 00:39:46,400 --> 00:39:48,759 Speaker 1: day, to do just ask-me-anythings. If I get 738 00:39:48,840 --> 00:39:51,000 Speaker 1: enough emails, I would love to do that. If you 739 00:39:51,040 --> 00:39:52,720 Speaker 1: want to be part of the Ask Me Anything segment, 740 00:39:52,760 --> 00:39:56,000 Speaker 1: please email me, Ryan at Numbers Game Podcast 741 00:39:56,080 --> 00:40:00,359 Speaker 1: dot com. That's Ryan at numbersgamepodcast dot com. Today's question 742 00:40:00,400 --> 00:40:03,440 Speaker 1: comes from David Inda. He writes, my name is Dave. 743 00:40:03,480 --> 00:40:06,040 Speaker 1: I'm a huge fan of your podcast. My question is 744 00:40:06,080 --> 00:40:09,759 Speaker 1: regarding the changing voting patterns of different demographics in the US, 745 00:40:09,800 --> 00:40:13,640 Speaker 1: specifically black and Latino, and any possible conversations you may 746 00:40:13,640 --> 00:40:15,400 Speaker 1: have had with your friend Ann Coulter, who I'm a 747 00:40:15,400 --> 00:40:18,120 Speaker 1: massive fan of. Ann's great. That's not the point, but 748 00:40:18,280 --> 00:40:20,960 Speaker 1: Ann is great. Anyway, Ann talks extensively about how the GOP 749 00:40:21,000 --> 00:40:23,480 Speaker 1: should only ever focus on winning over white voters.
But 750 00:40:23,520 --> 00:40:25,960 Speaker 1: in light of the recent data that has been released 751 00:40:25,960 --> 00:40:28,200 Speaker 1: post election, I am wondering, have you tried to convince 752 00:40:28,239 --> 00:40:31,200 Speaker 1: her that the GOP should try to win black and Latino voters? 753 00:40:31,280 --> 00:40:33,200 Speaker 1: Is that a good idea? Or do you think the 754 00:40:33,239 --> 00:40:35,719 Speaker 1: only thing Republicans should talk about is crime and immigration? 755 00:40:35,880 --> 00:40:38,919 Speaker 1: This is what I believe, and these two policies seem 756 00:40:38,960 --> 00:40:41,959 Speaker 1: to be universally popular regardless of race and ethnicity. I would 757 00:40:41,960 --> 00:40:43,920 Speaker 1: love to hear your thoughts on this. Also, I currently 758 00:40:43,960 --> 00:40:46,560 Speaker 1: live in Michigan, planning to move to Tennessee, and I'm 759 00:40:46,600 --> 00:40:49,280 Speaker 1: wondering, do Republicans have any chance of winning the open 760 00:40:49,360 --> 00:40:51,680 Speaker 1: Senate or governor seat in Michigan in twenty twenty six? 761 00:40:51,760 --> 00:40:55,000 Speaker 1: Along the same lines, why are some states voting one 762 00:40:55,000 --> 00:40:57,520 Speaker 1: way at the state level, either red or blue, and 763 00:40:57,520 --> 00:41:00,160 Speaker 1: the opposite in presidential elections? Namely, places like Georgia, 764 00:41:00,200 --> 00:41:02,840 Speaker 1: New Hampshire, and Michigan come to mind. And this question: 765 00:41:03,160 --> 00:41:06,279 Speaker 1: is party affiliation as salient as people typically believe, or 766 00:41:06,280 --> 00:41:08,880 Speaker 1: does candidate quality in a particular year have a bigger 767 00:41:08,920 --> 00:41:11,200 Speaker 1: impact on people's votes? I know there's a lot 768 00:41:11,239 --> 00:41:13,480 Speaker 1: of rambling, so I apologize. Massive fan of the 769 00:41:13,480 --> 00:41:17,440 Speaker 1: podcast and all the important work you do. One final question, sorry, 770 00:41:17,480 --> 00:41:19,759 Speaker 1: the ADHD is really strong today. I am originally from 771 00:41:19,800 --> 00:41:22,439 Speaker 1: Illinois and wondering if there's any hope for a place 772 00:41:22,480 --> 00:41:24,560 Speaker 1: like that, or is it gone the way of California? Wow, Dave. Okay, 773 00:41:24,960 --> 00:41:27,759 Speaker 1: lots of questions. Highly recommend switching to decaf, but I 774 00:41:27,840 --> 00:41:30,400 Speaker 1: love the passion. I have ADHD too, so I know 775 00:41:30,640 --> 00:41:33,200 Speaker 1: how difficult the struggle can be. I'm going to go through these 776 00:41:33,200 --> 00:41:35,440 Speaker 1: as quickly as I can. Okay, so first, Ann Coulter. I try 777 00:41:35,480 --> 00:41:38,480 Speaker 1: to keep our conversations private. But what I believe Ann has 778 00:41:38,480 --> 00:41:41,040 Speaker 1: said is that we shouldn't try to win over non 779 00:41:41,120 --> 00:41:45,840 Speaker 1: white voters by remaking policies. Like, we should try to 780 00:41:45,880 --> 00:41:50,920 Speaker 1: win over non white voters, but focus on policies that 781 00:41:51,000 --> 00:41:53,880 Speaker 1: actually help our voter base, which is namely whites without 782 00:41:53,920 --> 00:41:57,400 Speaker 1: a college degree. We haven't talked about the demographic breakdown of 783 00:41:57,400 --> 00:41:59,360 Speaker 1: the election yet.
It's something we're trying to get a 784 00:41:59,440 --> 00:42:01,720 Speaker 1: dinner arranged to sit there and go through it, and 785 00:42:01,760 --> 00:42:04,560 Speaker 1: we'll talk about it. I'll send her your warmest regards. 786 00:42:04,560 --> 00:42:06,680 Speaker 1: But I think that's what Ann really says. It's not 787 00:42:06,719 --> 00:42:08,920 Speaker 1: that we shouldn't try, it's that we shouldn't sell out 788 00:42:08,960 --> 00:42:12,040 Speaker 1: on policies like crime and immigration, which are two very 789 00:42:12,040 --> 00:42:15,040 Speaker 1: popular policies. Secondly, do Republicans have a chance at winning 790 00:42:15,080 --> 00:42:18,520 Speaker 1: the Michigan governor or Senate election next year? Both seats 791 00:42:18,600 --> 00:42:21,760 Speaker 1: are going to be vacant, that is correct. My bet 792 00:42:21,840 --> 00:42:24,440 Speaker 1: is on the governor's race, and here's why. Mike Duggan, 793 00:42:24,800 --> 00:42:28,160 Speaker 1: however you pronounce his last name, the Democrat mayor of Detroit, 794 00:42:28,239 --> 00:42:31,120 Speaker 1: is running as an independent and will likely take more 795 00:42:31,200 --> 00:42:34,880 Speaker 1: votes from Democrats than Republicans. Duggan has been on the 796 00:42:35,000 --> 00:42:37,040 Speaker 1: rise in the polls, but it's essentially a toss up 797 00:42:37,040 --> 00:42:41,680 Speaker 1: between the likely Democrat, Jocelyn Benson, and Republican John James. 798 00:42:42,280 --> 00:42:44,279 Speaker 1: As for the Senate seat, it just depends on who 799 00:42:44,280 --> 00:42:47,600 Speaker 1: the Democrats nominate, because Republicans will likely pick Mike Rogers again. 800 00:42:48,080 --> 00:42:51,400 Speaker 1: Early polls show it's close, but the only candidate Rogers 801 00:42:51,400 --> 00:42:54,680 Speaker 1: has a decent lead against is Wayne County health director 802 00:42:54,719 --> 00:42:58,960 Speaker 1: and Bernie Sanders supporter Abdul El-Sayed. I think that's 803 00:42:58,960 --> 00:43:02,279 Speaker 1: how you pronounce that name. So we'll see. 804 00:43:02,840 --> 00:43:05,040 Speaker 1: You know, who they pick, that's a big question. And 805 00:43:05,239 --> 00:43:08,160 Speaker 1: how many third party nominees get on the ballot is 806 00:43:08,200 --> 00:43:10,360 Speaker 1: a big question. There were a lot of third parties, a conservative 807 00:43:10,400 --> 00:43:14,160 Speaker 1: third party, a libertarian third party, last time, and they took enough 808 00:43:14,200 --> 00:43:15,759 Speaker 1: of the vote to matter. I think he lost by 809 00:43:15,840 --> 00:43:18,360 Speaker 1: nineteen thousand votes and they took well over two hundred 810 00:43:18,360 --> 00:43:21,400 Speaker 1: thousand, or something like that. Anyway, why do states 811 00:43:21,440 --> 00:43:24,360 Speaker 1: vote differently in federal elections and state elections? It depends. For 812 00:43:24,360 --> 00:43:27,000 Speaker 1: one, candidate quality does matter. If you run someone like 813 00:43:27,040 --> 00:43:29,440 Speaker 1: Kari Lake, you're probably going to lose regardless of the 814 00:43:29,440 --> 00:43:33,880 Speaker 1: fact that there are more Republicans than Democrats. Party affiliation matters 815 00:43:34,080 --> 00:43:37,080 Speaker 1: to a point. It is the most likely outcome: if 816 00:43:37,080 --> 00:43:39,600 Speaker 1: you're a registered Democrat, you will most likely vote Democrat. That's 817 00:43:39,719 --> 00:43:43,080 Speaker 1: a bigger consideration than almost anything else.
Right, I 818 00:43:43,080 --> 00:43:46,279 Speaker 1: think gun ownership is the only other indicator of 819 00:43:46,320 --> 00:43:50,080 Speaker 1: how you will vote that's larger than party affiliation. But party affiliation 820 00:43:50,360 --> 00:43:53,919 Speaker 1: is very, very important. But in places like Kentucky, which 821 00:43:54,040 --> 00:43:57,480 Speaker 1: votes for Republicans federally but Democrats locally, or Vermont, which 822 00:43:57,560 --> 00:44:02,520 Speaker 1: votes for Republicans for governor but Democrats in the legislature 823 00:44:02,560 --> 00:44:06,200 Speaker 1: and federally, a lot of what matters in the minds 824 00:44:06,200 --> 00:44:11,520 Speaker 1: of voters is who can put a check on the legislature. So, 825 00:44:11,719 --> 00:44:15,640 Speaker 1: because the working class people of Kentucky oftentimes have a 826 00:44:15,680 --> 00:44:19,520 Speaker 1: Republican party that has a lot of country club attitudes, 827 00:44:19,520 --> 00:44:22,480 Speaker 1: as I said earlier on this podcast, they will support 828 00:44:22,560 --> 00:44:25,480 Speaker 1: a Democrat who's more pro worker, pro union. I mean, 829 00:44:25,480 --> 00:44:29,040 Speaker 1: it was a coal mining state forever. Likewise, in Vermont, they 830 00:44:29,080 --> 00:44:33,280 Speaker 1: will support a socially liberal Republican who's more pro economic 831 00:44:33,320 --> 00:44:36,480 Speaker 1: growth, because the legislature is so anti economic growth. A 832 00:44:36,520 --> 00:44:38,920 Speaker 1: lot of it depends on what the state's going through, 833 00:44:39,360 --> 00:44:43,360 Speaker 1: the character of the candidate, 834 00:44:43,360 --> 00:44:45,480 Speaker 1: and how much they reflect the interests of that state. 835 00:44:45,840 --> 00:44:47,920 Speaker 1: The last question was on Illinois: has it gone the 836 00:44:47,920 --> 00:44:50,239 Speaker 1: way of California? I actually think California is in a 837 00:44:50,280 --> 00:44:54,560 Speaker 1: better place than Illinois in respect of how Republicans are winning over 838 00:44:54,640 --> 00:44:58,760 Speaker 1: voters. Black voters, who make up a bigger share of the population 839 00:44:58,840 --> 00:45:02,080 Speaker 1: in Illinois than in California, are moving at 840 00:45:02,120 --> 00:45:04,520 Speaker 1: a slower speed towards the GOP than Asians and Hispanics, 841 00:45:04,880 --> 00:45:09,000 Speaker 1: and southern California whites are moving towards the Republicans faster than 842 00:45:09,120 --> 00:45:13,160 Speaker 1: Chicago area whites. So I think we're going to see 843 00:45:13,160 --> 00:45:16,360 Speaker 1: more gains in California than in Illinois, both in Congress 844 00:45:16,360 --> 00:45:18,120 Speaker 1: and the state legislature. I know it's probably not the 845 00:45:18,120 --> 00:45:20,400 Speaker 1: answer you want to hear, but that's how I see it. 846 00:45:20,440 --> 00:45:24,479 Speaker 1: Thank you again, though, for listening. I really appreciate your email, Dave. 847 00:45:24,640 --> 00:45:27,200 Speaker 1: If you or anyone else have any other questions, please email me, 848 00:45:27,239 --> 00:45:30,200 Speaker 1: once again, Ryan at numbersgamepodcast dot com. Thank 849 00:45:30,239 --> 00:45:33,080 Speaker 1: you for listening. Please like and subscribe to this podcast 850 00:45:33,120 --> 00:45:37,160 Speaker 1: on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
851 00:45:37,160 --> 00:45:39,000 Speaker 1: Give me a five star review if you're feeling generous, 852 00:45:39,040 --> 00:45:42,279 Speaker 1: that really helps people find the show. And I will 853 00:45:42,280 --> 00:45:44,160 Speaker 1: see you guys on Monday. Thank you again.