Speaker 1: This episode of the Chuck ToddCast is brought to you by Incogni. You know, one of the things we talk a lot about on this podcast is trust, who deserves it and who doesn't. And lately I've been thinking about how much of our personal information is just floating around online. I'm talking about phone numbers, home addresses, emails, even information about your family, all sitting on websites you've probably never heard of. And the scary part: that information can be used by scammers and identity thieves to piece together enough details to impersonate you, target you, or worse. That's where Incogni comes in. Incogni works behind the scenes to track down and remove your personal data from hundreds of websites, things like online directories, people search sites, and even commercial databases that collect and sell your data without your consent. The process is automated, and because data can pop back up again, Incogni keeps monitoring and removing it for you. And one feature I really like is custom removals. If you find something about yourself online that you want eliminated, whether it's a listing, an old record, or something you just don't want publicly available, you can send Incogni the link and a dedicated privacy expert handles the removal for you. It's a great way to reduce your digital footprint and protect yourself from scams and identity theft. Right now, you can get sixty percent off an annual Incogni plan. Just go to incogni dot com slash chucktoddcast and use the code CHUCKTODDCAST. Use the code, you get a discount. Again, that's incogni dot com slash chucktoddcast, and use the code CHUCKTODDCAST to get sixty percent off and start taking control of your personal data today. Trust me, I've been working on this constantly for years myself, and here's an opportunity for you to take matters into your own hands as well.

Speaker 2: Well.
Speaker 1: Everybody is trying to figure out things when it comes to artificial intelligence, right? I think as consumers we're trying to figure out, how is it impacting our lives? How is this going to impact our future of work? You've got older kids, you're wondering, are you even remotely giving them the education they need to survive in this new AI-driven world? Governments are trying to figure out, you know, how do we help this great technology come around, but do so with some guardrails, considering how poorly the same tech companies that are trying to bring us AI brought us social media. And there certainly seems to be disputes about that. And what's been interesting is how negative the country is becoming on AI, artificial intelligence. And I think there's a lot of fear out there. I think the fear is understandable. I think the technology is remarkable. I think, like many people, every day we use it more. Sometimes we don't even fully realize how much we're using it, but we use it more. But the fear factor has definitely gone up. And in some ways that fear is driven by the people that are bringing us these AI companies, because they're the ones that tell us, well, there's this small, small chance that whatever we're developing is going to consume us all, and you're sitting there going, so why are we doing this?

So, look, the next ten years probably are going to be defined politically by this battle of how do we integrate AI into society, and how do we do it on our terms rather than on AI's terms. So my guest today is a former member of Congress, former Democrat Brad Carson from the state of Oklahoma, president of the University of Tulsa, and he's president of Americans for Responsible Innovation. So the idea is that he's trying to work on creating some better conditions, perhaps, and I say perhaps because that's what this conversation is about, of a regulatory environment that could earn the trust of the public.
Because, as I point out, in some very recent NBC News polling, and this has been pretty consistent so far, most Americans see more negative coming from AI than positive. Now, we're unique around the world. Most other Western democracies are a bit more pro-AI than we are. And I think, like I said, it's the experience of social media in this country, in particular the experience of what globalization did, and the fact that that's still a living memory for many of us. So, in many ways, Americans have earned their skepticism on this. Brad Carson, welcome to the podcast.

Speaker 2: Thanks, Chuck, great to see you again.

Speaker 1: So let me start with that. I want to get into some overall politics, whether there's such a thing as an Oklahoma Democrat anymore, and we can discuss that, because even then you were an outlier when you were in Congress. But start with this: tell me about your organization, and be transparent with me. Who's funding it? Who's, you know... Because I'll tell you, we're in the middle, there's so much money flowing into politics right now coming from a handful of people trying to influence the regulation of AI. So tell me about the organization, and tell me who's buying it.

Speaker 2: So Americans for Responsible Innovation is now about two years old. We're a bipartisan group. I'm a Democrat, I'm the president of it. We have several Republicans who work for us, and we take great pride in working across the aisle, because we believe AI should not be a partisan issue, and that it's one that both parties have a huge stake in managing well. We're funded by family foundations or charities. We take no money from the tech sector at all. So our biggest funders are groups like Coefficient Giving, Omidyar, the Packard Foundation, and a number of family offices. So we're independent of the tech sector entirely, and we're often critical of them. But we believe that AI can be powerful, transformative, and so we don't want to kill the golden goose either.
Speaker 1: All right, you say you're entirely independent from the tech sector, and then you also mentioned the Packard family. I mean, that's part of Hewlett-Packard. And I assume you have some family offices that made money in the tech sector.

Speaker 2: Sure, some people who give to us actually even work for the labs today. They might work for OpenAI or Anthropic. We don't take any kind of corporate contributions like you might see at other groups in Washington, DC.

Speaker 1: Look, the reason I ask that: we're in the middle of primary season, and you literally have super PACs that OpenAI is supporting versus a super PAC that Anthropic is supporting, because they're actually not always on the same page when it comes to the regulation issue. Do you guys have that tension inside your organization?

Speaker 2: We do have a bit of a tension about that, although you should know that that super PAC network, of which Anthropic is a contributor, I actually run that as well. That's kind of something I do in addition to my work at ARI. That's a group we founded that kind of spun out of ARI, and we raised a lot of money for it, and then we convinced Anthropic to contribute to it, because, to their credit, Anthropic, like the other donors, believes that we should have reasonable guardrails around that. So I'm very involved in the politics of it, very involved in the Illinois races today as we tape this, but also involved at ARI in kind of the advocacy work on the Hill. So both of those kinds of things.

Speaker 1: Do you... why should we... yeah, let me put it this way. The amount of money... you brought up the Illinois races. By the time people hear this, we will have already had these primaries. But it's funny you bring up the Illinois races. I just can't believe the amount of money from outside groups that's involved in the Senate race. Now, I understand it. The primary is the Senate race, right?
The winner of the Democratic primary has like a ninety-five percent chance of being the United States senator, barring some unforeseen event, scandal, etc. So the game is the primary. But you ran for office. Can you imagine having to run for office now, in this era of super PACs, where you may have a theory of the case of what kind of campaign you'd run, but the outside group that supposedly wants you to win has more money, and if they say, you know, we don't like your messaging, we're going to do another messaging? You no longer have control of your own campaign. Can you imagine what life would have been like for you as a candidate under these conditions?

Speaker 2: Well, I think it's one of the greatest changes in American politics that most people don't appreciate. The hard dollars a campaign raises, they're almost insignificant. You look at the Illinois primaries: candidates there are often raising one hundred thousand, two hundred thousand dollars. Super PACs are then spending four or five million dollars against you. We're involved in a race in North Carolina's fourth district, in the Research Triangle. Each candidate spent about one hundred thousand dollars. The super PACs spent a couple of million on each side. So what you see happening in American politics is, in some ways, you wouldn't even have to raise money as a candidate. You could just sit back in your house and let the super PACs fight it out. Because whatever you raise, and as hard as it is to raise when you're running for office, how exhausting that is, it's going to be matched probably ten X or more by the super PACs. So really extraordinary.

Speaker 1: Should this... it doesn't sound like you think this is a good system. Obviously you're working within the system as it is. I always joke I cover politics as it is, not as I wish it were, and you're participating in the system as it is, perhaps not as you wish it were.
But my word, I mean, this is... if the candidates aren't in charge of their messaging or their campaign, then what's the point?

Speaker 2: That's a great question. American democracy is really at stake here in the super PAC era. But you're going to really see it manifested this cycle and the next one. You know, crypto pioneered this strategy. AIPAC very involved with it too. Both of them have a huge...

Speaker 1: You're not alone.

Speaker 2: I don't want it...

Speaker 1: You're right, I'm picking on you because you're here.

Speaker 2: No, well, four or five of us are playing big. Our super PAC is at seventy-five million dollars. I'd be happy to stand down if everyone else would as well, because it's really destroying American democracy, changing the way elections are had. So it's a terrible system that, through the Supreme Court, both the Supreme Court and FEC interpretations of that Supreme Court opinion, we permitted to get completely out of hand.

Speaker 1: Look, when you served in Congress, do you think back... do you have any regrets, collectively? Not you, the individual member of Congress, but collectively, because the entire era had a hands-off approach on the Internet, very hands-off on regulating. You know, I mean, I remember at one point John McCain advocating the Internet should be tax-free forever, no sales tax. And I remember thinking, you know, at some point that's not going to work, and it didn't.

Speaker 2: Right.

Speaker 1: But we had such a light touch on the Internet, and it worked until it didn't.

Speaker 2: Right.

Speaker 1: Social media, to me, is the greatest failed experiment on the human brain that is still ongoing. And I don't know why we haven't stopped the experiment yet, right?

Speaker 2: Looking back...
Speaker 1: And the reason I'm asking it that way: how do we avoid the same mistakes with AI? And what's something you think Congress and government should have done differently in the late nineties and early aughts with the Internet?

Speaker 2: Because I would actually say we did more than just have a light touch. We effectively immunized all of the Internet companies from any liability. This is the famous Section two thirty of the Communications Decency Act. And so it wasn't that we just said we're not going to regulate you. We said we will immunize you from any claims that you defamed people, that you hurt them in any way, that you intentionally inflicted emotional distress, that you're liable for products liability, all of those things. We actually told the companies, you're free from having to worry about that. That was in nineteen ninety-six, at a time when only seven percent of the people even had Internet. Google and Facebook were just a gleam in their founders' eyes. So it was a terrible error for us, and we celebrate the thirtieth anniversary of that this year. Well, we want to make sure with AI we don't do the same thing. So one of the big fights in Washington, DC right now is about the preemption of state laws. So a lot of the venture capitalists, OpenAI has been in this camp, want to preempt all state regulation of AI. In many ways, that's just two thirty reprised for the AI age: basically say there'll be no law, we're going to basically hold these companies unaccountable. And I think there's a way to have your cake and eat it too, by saying we need some light-touch regulation, they should be held accountable for their harms, which would allow the courts to develop common law over the years that is finely calibrated to the actual risks that the AI systems present, rather than giving them a blanket immunity. That's what we did with social media.
And the stories that come out of social media are obviously horrifying. And when those families sue Facebook, TikTok, or someone, they have no recourse at all. That's a terrible system that obviously harms the families, gives them no redress, but also doesn't make the companies internalize the many costs they're imposing on society.

Speaker 1: Well, and I get... I mean, many of my listeners know I have this pet peeve about Section two thirty specifically. I don't understand how it has any function anymore, for this reason. Not that, you know, yes, government should have withdrawn it a while ago. But forget that. The second Facebook or Instagram or Google created an algorithm to determine what I see and what gets fed to me quicker, they became a publisher. They are an editor. They made the decision what goes above the fold, Brad, and what goes below the fold, in the old-fashioned newsstand sense. That is a decision. And you're a publisher at that point. So I don't understand how the courts have interpreted Section two thirty as even being functional anymore for any of these entities.

Speaker 2: It's a great question, and the use of algorithmic curation is one of the kind of emerging questions, I think. You know, we've already talked about how the Supreme Court bollixed up campaign finance and brought us the super PAC era. Their interpretation of First Amendment law over the last few years is equally troublesome, because they now hold things like the use of an algorithm as being no different than, you know, when you were back as the editor of Hotline, back when I first met you a long time ago. You were making editorial decisions every day, right? That's covered by the First Amendment. You can be for a candidate, you can be against a candidate, you can write for or against. The Supreme Court now holds those kinds of algorithms are equivalent to that. That makes no sense to me.
But as long as that jurisprudence exists, right, it's very hard to argue that algorithms are outside of Section two thirty's immunity. So we have a problem with how the Supreme Court's interpreting the First Amendment as it comes to these tech platforms as well.

Speaker 1: No, look, that's a fair pushback on my pushback, which is this First Amendment interpretation of an algorithm. But the point is that the initial premise of Section two thirty was, well, they're not publishers, they're just a platform, that they somehow don't have any influence on their platform. And while that might still be true of Reddit, you know, in the way that Reddit works, that is not true of X. That is not true of Instagram. Yes, you know, they don't create the content, but they go out of their way to make sure I see certain content based on my profile, which they have decided to suck up with data. So that's where my, you know... I don't understand how they can then say, no, no, no, we're just in charge of a platform, you can't hold us accountable for it.

Speaker 2: I mean, that shows you the danger of a two thirty passed at a time when no one even understood what the Internet...

Speaker 1: We didn't know what they were passing.

Speaker 2: So you pass these vague terms that are in the law, courts are forced to interpret them, and, right, courts go off on this or that tangent. And they have a lot of money now, the social media companies do, so they fight back in Congress with millions of dollars. They all have super PACs of their own, and so it's very hard to change. And so one of the great lessons of two thirty for AI is, don't pass some kind of meta law before you even understand what's going to happen in five years, because then you can't change it.
You know, it's very hard in politics to get these things reversed.

Speaker 1: Well, this is why I love federalism. We have fifty states. The divide on whether states should be allowed to regulate AI is, I think, an interesting fight, because it's not a left-right divide. This really is an industry-versus-everybody-else fight, right? You have Ron DeSantis, who's marching forward on trying to get AI regulation in the state of Florida, while Donald Trump is still trying to give immunity to these companies. I imagine, if you're taking Anthropic money, you're on the side of letting the states give this a shot first.

Speaker 2: Yeah, we actually have. We've always advocated for that. It's been brought twice before Congress, and we led the coalition at ARI to defeat that, again, Republicans and Democrats. Republican AGs, Sarah Huckabee Sanders was rallying Republican governors, DeSantis. Steve Bannon was on the hustings, really active in this. So it does transcend party label in some way, and really, the only people actually for this policy are like three people: there's a16z, Andreessen Horowitz, the VC firm that Marc Andreessen and Ben Horowitz run, and the OpenAI guys. That's really who's for it. Anthropic's not for it. Again, in the industry, they're very divided. Anthropic says, well, we'd prefer to have a comprehensive federal standard, that would be wonderful, but in the absence of that, you have to let the states do their work. So there are a lot of divisions, and it's actually just a small few people. Unfortunately, they have billions of dollars and the ear of the White House as we speak, because of the tech right's influence there, so they can get their way against, really, people in both parties.

Speaker 1: Having good life insurance is incredibly important. I know from personal experience. I was sixteen when my father passed away. We didn't have any money.
He didn't leave us in the best shape. My mother, a single mother, now a widow; myself, sixteen, trying to figure out how am I going to pay for college. And lo and behold, my dad had one life insurance policy that we found. It wasn't a lot, but it was important at the time, and I was able to go to college. Little did he know how important that would be in that moment. Well, guess what, that's why I am here to tell you about Ethos Life. They can provide you with peace of mind knowing your family is protected even if the worst comes to pass. Ethos is an online platform that makes getting life insurance fast and easy, all designed to protect your family's future in minutes, not months. There's no complicated process, and it's one hundred percent online. There's no medical exam required; you just answer a few health questions online. You can get a quote in as little as ten minutes, and you can get same-day coverage without ever leaving your home. You can get up to three million dollars in coverage, and some policies start as low as two dollars a day, billed monthly. As of March twenty twenty-five, Business Insider named Ethos the number one no-medical-exam instant life insurance provider. So protect your family with life insurance from Ethos. Get your free quote at ethos dot com slash chuck. Again, that's ethos dot com slash chuck. Application times may vary, and the rates themselves may vary as well, but trust me, life insurance is something you should really think about, especially if you've got a growing family.

So, I mean, when you're getting involved in these campaigns, how much are you spending money on the issue of AI itself, and how much are you just involved in helping somebody who you think will be on your side in the battle over regulating it?

Speaker 2: So it's a bit of both.
It's people we know are going to be on our side, and then we are spending some money on trying to socialize AI, make people more aware of it, because right now many people have a lot of concerns, but it's not top of mind for them in voting.

Speaker 1: Are you surprised at how negative people are about this?

Speaker 2: You know, I'm not, really, because I can see it even here in Oklahoma, where I still live. You know, people who are very Trump-oriented; it's a very conservative state, this is seventy-thirty Trump. You know, I don't know many liberal Democrats here, and many people are fired up. We have a huge data center rebellion from people who are deeply religious, conservative, across-the-board Trump voters, and they don't want data centers, which is one manifestation of this fight. And you know, we actually poll a lot. You know, the only thing more unpopular than Benjamin Netanyahu in a Democratic primary electorate are AI companies. It's kind of extraordinary, actually. And I think, you know, what you said at the outset is actually really telling. In every other country in the world, AI is much more popular. In China, something like eighty-five percent of the people are very positive about AI. So it's not the technology itself that's unpopular. It's how it's interacting with our somewhat dysfunctional political system. That's what makes it so unpopular here. I think that does speak to the need to really have a better social contract with people. You know, more and more people sense that the political system can be hijacked by a couple of people that, you know, will work against us. That's, I think, what makes it so unpopular here, because we are an outlier in that respect, and the explanation for it is our unique kind of socio-political dynamic.

Speaker 1: Well, I hear you there on the other two points. I think there's no doubt it's the fear of our politics and whether they can manage the moment. And I certainly have those concerns.
But, you know, I just think trust in the tech companies is so low, because, I don't think... you know, social media is one of these phenomena that has really turned into the smoking of the twenty-first century. We all know it's not good for us. We all know it's unhealthy. We're desperate to make sure our kids don't, you know. We're like, let's get it out of schools. You want to talk about something bipartisan? Right? No matter red state or blue: get the screens out of the schools. Right? And yet we can't quit. We all can't quit.

Speaker 2: Right.

Speaker 1: It powers the economy, it helps to fuel this economy in some ways. We know this is bad and we don't know how to stop it. And I think that's the fear, that's my theory: oh my God, if we do AI, then we'll never be able to unwind this.

Speaker 2: That's exactly right. I mean, we know now from the various leaks, for example, from Meta, about things they knew, that they tolerated, that they were almost willing to encourage, behavior that is criminal and shocking in lots of ways, right, because it promoted engagement. You know, we know about the AI companies, especially, we've seen it with ChatGPT, because it's the most consumer-facing of all of the AI products out there, more than Anthropic's or Google's products, right, how people can misuse them and become addicted to them. It can lead people to self-harm. I mean, because you covered DC so much, there's a question: take the Kids Online Safety Act, which is a response to the social media problems. It passes the House and the Senate overwhelmingly. It gets bottled up in leadership. Why? Because they're on the phone with, like, Meta, you know, that's got millions of dollars to give to the super PACs, the majority and minority leadership PACs, these kinds of things. And so that's the dysfunction.
It's like, people know it, and their politicians, look, kind of the rank and file, they do support changes, but it gets bottled up, because, again, a couple of phone calls from people, you know, really kind of can stranglehold it. That's a unique aspect of American politics. That's again about the money, and it's one of the terrible aspects of American politics.

Speaker 1: Well, this is, I mean, look, you keep coming back to something that I don't think is unfair to come back to, which is, as you know, here we have a democracy that desperately needs updating. I'm not going to call it broken. I'm going to call it in disrepair, and it needs modernizing. You know, in some cases, it's just simply catching up to reality, particularly when it comes to how money in politics works and things like that. But we're not going to improve the democracy in time to get AI right. So, well, we've got to deal with the politics that we have, not the politics we wish we had. What's the safer way, within this system, that we can be rolling out AI to help people feel better about this?

Speaker 2: I think the first thing is we should let the states, which are more agile, less captured by big-money interests, have the ability to take a stab at this and to experiment with it. That's the first thing. I think the second thing is we should minimally demand that these frontier labs, these are the five big companies that have the very well-known models, this is X, Google, Meta, Anthropic, OpenAI, right, they need to be more transparent about what they're doing. They need to be reporting to government authorities: can this create novel pathogens? Can it promote offensive cybersecurity risk? Can it help...

Speaker 1: By the way, what is the government entity that ought to be overseeing this?

Speaker 2: I would like to see something at the Department of Commerce or the Department of Energy.
They're already attuned to nuclear weapons or a lot of export controls, so that's where they should do it. They should be reporting up there. And I think, you know, if you have a model that, say, could be proven to develop novel pathogens, and we already know that most of these models are better than PhD virologists, so they're incredibly good at this kind of thing, and if it could develop a bioweapon, you should actually be prohibited from releasing that. Today you're not prohibited in any way from releasing that, and you don't even have to test for it. And so the idea is, there are certain things that you should be testing for, you know, and that you shouldn't be able to release, and you should have safeguards. So if it could develop child pornography, for example, we should know ahead of time, and you shouldn't be...

Speaker 1: Look at what Grok has pulled off, right? Elon Musk decides to sexualize Grok, and usership, what, triples? Yes, exactly. I don't want to... we can make a comment about our own society about that. But humans are going to human, okay? So that is what it is. Humans are going to human. You know, I always say our problems in this country are incentive structures, right? Good people can do bad things with bad incentives. Bad people will do good things with good incentives. So what Grok basically did is say, hey, let's imagine I have the worst incentives imaginable. Could I triple the amount of users I have?

Speaker 2: Well, the answer is yes, exactly. They're nudifying images. Of course, there's the whole MechaHitler episode, you know, where it took on the persona of Adolf Hitler. It kind of blows your mind, actually. And these products, like, take Grok, the one we were just talking about: it's now embedded increasingly in the Defense Department. It's replacing Anthropic and Claude's product there in many ways. And this is the least reliable, the least safeguarded of all of them.
And so, you know, as someone who spent a lot of time working at the Department of Defense under the Obama administration, you're thinking, like, I wouldn't want to rely on this. But, well, again, because humans will be human, we often do rely on computer outputs and think, like, oh, that must be right, the computer said it, and we don't actually interrogate it enough. So it's actually a very parallel situation we're finding ourselves in.

Speaker 1: You know, I was speaking with, he wanted to keep this off the record, but it was a retired general who was knee-deep in COVID, so he was helping in that world. He's been in the private sector since. But he remarked to me, he said, I do not understand. He said, you know, the government made sure, you know, they were in charge of the scientists making the nuclear weapons. The government created the Internet, then it took the Internet public. We are letting the private sector do this without much government supervision at all. And I say this: we know how people feel about government these days, and you may not trust who's in charge of the government, depending on your political persuasion. But in previous instances, you know, collective concern by government proved probably a good thing with nuclear weapons, and proved a good thing with the initial creation of the Internet as a defense thing. This general was mortified that the government was not, essentially, holding the hands of these companies.

Speaker 2: I think he should be mortified. It's really the first kind of transformative military technology that's being developed by the private sector, private companies. We don't even see their research. We have no idea what's happening in the skunk works of these labs, no visibility into it. It's really quite extraordinary. And now we see with Claude, in this whole dispute with Pete Hegseth's Department of War, how reliant they are on AI.
The Iraq war has shown some examples of Claude's use there. And so I think, yes, he should be mortified, because what's happening in San Francisco, within a six-by-six-square-mile area there, it's transforming the world. It's going to transform military power, and we really have no oversight or even insight into what's going on.

Speaker 1: Do you ever say to yourself, are we living in a Bond movie? Right? I mean, this is the plot of one of those B-minus Bond movies, not even a really good one.

Speaker 2: Some people, I think, are waking up to it. And that's why people are scared about this. That's why the unpopularity is there. People are like, you know... I see a lot of polling data, and people do recognize it might be able to do some amazing things, you know, cure diseases, new materials...

Speaker 1: Social media. Look, I'm old enough to remember the Arab Spring. We thought, wow, we have democratized the world. Social media is going to get rid of authoritarianism. Oops. You know, it's funny you bring up... we all talk about the health benefits. There's a story out there about the guy in Australia who developed the cancer treatment for his dog, right? And as a dog lover, you know, I'm like, yeah, I could see myself doing that for Cali, my puppy. That's the good part. But if somebody without scientific training can develop this amazing treatment to shrink a tumor on their dog, what could they do to actually give you a tumor, and to actually create cancers, if they chose to? It's the same thing with the Arab Spring. That's what we learned with the Arab Spring, right? What was a positive the minute you could break through, until the authoritarians decided, oh, we'll control the flow of information again.

Speaker 2: Yeah, I think that's one of the lessons of social media: there's good and bad to both these things.
599 00:31:44,200 --> 00:31:47,920 Speaker 2: They're not unalloyed one way or the other. And you 600 00:31:48,000 --> 00:31:50,520 Speaker 2: have to have a regulatory scheme that's flexible enough, which 601 00:31:50,560 --> 00:31:52,440 Speaker 2: isn't easy to achieve, but it must be allowed to 602 00:31:52,440 --> 00:31:55,040 Speaker 2: be flexible. Social media, we froze it, you know, in 603 00:31:55,120 --> 00:31:57,640 Speaker 2: ice in nineteen ninety six with section two thirty, and it 604 00:31:57,680 --> 00:31:57,960 Speaker 2: was over. 605 00:31:58,040 --> 00:31:58,280 Speaker 1: Yep. 606 00:31:58,480 --> 00:32:01,920 Speaker 2: It's basically been unregulated since, and immunized from any harm. And 607 00:32:01,960 --> 00:32:03,480 Speaker 2: if you try to get after it, they always say, 608 00:32:03,560 --> 00:32:06,200 Speaker 2: it's section two thirty, we're immunized from this. You know, 609 00:32:06,280 --> 00:32:08,640 Speaker 2: you can't allow that to happen for major technologies, and 610 00:32:08,680 --> 00:32:10,440 Speaker 2: that is the danger of AI. That's really the big 611 00:32:10,440 --> 00:32:12,240 Speaker 2: fight in AI now. It's like, do you want to 612 00:32:12,320 --> 00:32:14,800 Speaker 2: repeat section two thirty? I think it's fair to say 613 00:32:14,800 --> 00:32:18,440 Speaker 2: that that tech-right crowd, David Sacks, A sixteen 614 00:32:18,520 --> 00:32:21,160 Speaker 2: Z, people like that, OpenAI, that really is what 615 00:32:21,160 --> 00:32:23,240 Speaker 2: they want to do. And the question is, like, have we 616 00:32:23,320 --> 00:32:25,400 Speaker 2: learned our lesson, you know? And it's not to 617 00:32:25,400 --> 00:32:27,920 Speaker 2: say AI should be prohibited or that it can't do 618 00:32:28,000 --> 00:32:30,320 Speaker 2: many good things. It can, but you do have to 619 00:32:30,360 --> 00:32:33,640 Speaker 2: have a certain agility which the law permits, and to 620 00:32:33,680 --> 00:32:36,080 Speaker 2: freeze that like we did with social media is just 621 00:32:36,120 --> 00:32:39,800 Speaker 2: a disastrous mistake that, you know, has changed society irrevocably. 622 00:32:40,720 --> 00:32:43,880 Speaker 1: What are some good state regulations being developed that you'd 623 00:32:43,920 --> 00:32:47,120 Speaker 1: like to see? Let's see how that sticks. Maybe that 624 00:32:47,240 --> 00:32:49,880 Speaker 1: will work. I mean, you know, this is the laboratory 625 00:32:49,920 --> 00:32:52,760 Speaker 1: of, well, what we like to call the fifty states, right, 626 00:32:53,120 --> 00:32:57,960 Speaker 1: the laboratories of democracy. What are a few attempts you'd 627 00:32:58,000 --> 00:33:01,080 Speaker 1: like to see, to see if, okay, maybe this 628 00:33:01,120 --> 00:33:03,280 Speaker 1: will be a good national regulation? But let's try it. 629 00:33:03,600 --> 00:33:06,240 Speaker 1: Let's try it in Indiana or Oklahoma or... Yeah. 630 00:33:06,280 --> 00:33:08,160 Speaker 2: First, I'll give you, like, three or four of those. 631 00:33:08,200 --> 00:33:10,520 Speaker 2: For one thing, Ron DeSantis is proposing an AI Bill 632 00:33:10,520 --> 00:33:12,960 Speaker 2: of Rights in Florida. It gives, you know, your right 633 00:33:13,040 --> 00:33:17,040 Speaker 2: to own your data and have transparency into AI decision making. 634 00:33:17,240 --> 00:33:20,480 Speaker 2: I like laws like that.
California and New York, for 635 00:33:20,520 --> 00:33:25,240 Speaker 2: the frontier labs, have required transparency and reporting requirements, and have 636 00:33:25,320 --> 00:33:28,200 Speaker 2: added to those requirements. That's not enough, but that's a 637 00:33:29,040 --> 00:33:32,120 Speaker 2: good step in the right direction. Maybe the third bucket 638 00:33:32,160 --> 00:33:34,520 Speaker 2: is, and it's a whole different type of AI than, like, 639 00:33:34,720 --> 00:33:38,280 Speaker 2: Grok or ChatGPT, it's very pervasive, it's how AI 640 00:33:38,360 --> 00:33:42,280 Speaker 2: is being used in our day-to-day lives. For example, surveillance, 641 00:33:42,560 --> 00:33:46,480 Speaker 2: you know, facial recognition, algorithms deciding whether we're going to 642 00:33:46,600 --> 00:33:49,720 Speaker 2: hire you or not, algorithms determining whether your health claim 643 00:33:49,800 --> 00:33:51,440 Speaker 2: is going to be approved. 644 00:33:51,160 --> 00:33:53,120 Speaker 1: Let me give you the one that just rips 645 00:33:53,280 --> 00:34:00,400 Speaker 1: me up: surveillance pricing. I think this is unconscionable, and yet it's legal, 646 00:34:00,440 --> 00:34:03,520 Speaker 1: and it's like, you want to really, 647 00:34:03,760 --> 00:34:05,320 Speaker 1: you know... It's like, when the middle 648 00:34:05,320 --> 00:34:07,880 Speaker 1: class and upper middle class decide to pick up a pitchfork, 649 00:34:07,920 --> 00:34:11,040 Speaker 1: that's when politicians are in trouble. Surveillance pricing is one of those. 650 00:34:10,880 --> 00:34:12,759 Speaker 2: Well, this shows you the power of state law. 651 00:34:12,800 --> 00:34:14,160 Speaker 2: So, you know, you may have seen this, it kind 652 00:34:14,160 --> 00:34:16,120 Speaker 2: of broke in the news last week, where, I'm a 653 00:34:16,239 --> 00:34:20,520 Speaker 2: Washington Post subscriber online, and to renew, right, it quoted 654 00:34:20,560 --> 00:34:22,480 Speaker 2: me a price, and at the bottom it had a 655 00:34:22,520 --> 00:34:25,440 Speaker 2: little note that this was based on an algorithmic determination, 656 00:34:25,600 --> 00:34:28,319 Speaker 2: right, of my use patterns they knew about. The thing is, 657 00:34:28,320 --> 00:34:30,239 Speaker 2: they knew my credit history. They probably know how much 658 00:34:30,280 --> 00:34:32,560 Speaker 2: money I make, so they probably, you know, tailored it, like, hey, 659 00:34:32,560 --> 00:34:33,480 Speaker 2: he's a super user. 660 00:34:33,800 --> 00:34:38,520 Speaker 1: Hey, do you remember when you clicked, when you accepted, 661 00:34:38,600 --> 00:34:40,719 Speaker 1: terms and conditions that allowed the Washington Post to do 662 00:34:40,800 --> 00:34:41,640 Speaker 1: a credit check on you? 663 00:34:42,400 --> 00:34:43,920 Speaker 2: I don't think I did that, but like 664 00:34:44,000 --> 00:34:46,160 Speaker 2: all terms of service, I'm sure we did. That's right. 665 00:34:46,280 --> 00:34:48,879 Speaker 1: I hate those goddamn things, because I'm sure they'll say, oh, 666 00:34:48,920 --> 00:34:52,440 Speaker 1: you accepted terms and conditions. You mean, oh, because 667 00:34:52,440 --> 00:34:54,160 Speaker 1: you put a gun to my head to accept terms 668 00:34:54,160 --> 00:34:57,920 Speaker 1: and conditions every time you update the app. But they 669 00:34:57,960 --> 00:35:00,400 Speaker 1: looked at your credit report. Think about that, Brad. 670 00:35:00,280 --> 00:35:03,800 Speaker 2: Among other things.
Right, they have all the consumer 671 00:35:03,880 --> 00:35:05,160 Speaker 2: data you could want on me. They know what 672 00:35:05,200 --> 00:35:06,799 Speaker 2: else I subscribe to, they know I'm a New 673 00:35:06,880 --> 00:35:09,440 Speaker 2: York Times subscriber, and The Atlantic, and things. But the 674 00:35:09,520 --> 00:35:11,520 Speaker 2: only reason we even got notified about that is 675 00:35:11,520 --> 00:35:13,640 Speaker 2: because New York has a law that says if you 676 00:35:13,719 --> 00:35:16,440 Speaker 2: engage in algorithmic pricing, you have to put a little 677 00:35:16,480 --> 00:35:18,120 Speaker 2: note on it. And they obviously have a lot of 678 00:35:18,160 --> 00:35:20,720 Speaker 2: New York readers. And it's only because of that state 679 00:35:20,800 --> 00:35:24,239 Speaker 2: law that I, in Oklahoma, got notified of that. So 680 00:35:24,280 --> 00:35:26,360 Speaker 2: these are the kind of things, like, that's not frontier-level 681 00:35:26,440 --> 00:35:29,839 Speaker 2: AI, they're not using ChatGPT, but that's AI, and 682 00:35:29,920 --> 00:35:33,080 Speaker 2: the average American should know about that, and we should, 683 00:35:33,120 --> 00:35:36,280 Speaker 2: like, have a debate. And that goes into the broader 684 00:35:36,320 --> 00:35:39,120 Speaker 2: question, like surveillance. You know, that Anthropic Department of War deal 685 00:35:39,160 --> 00:35:41,680 Speaker 2: got into the things that we call legal in this 686 00:35:41,800 --> 00:35:45,120 Speaker 2: country that you and I might, over cocktails, call surveillance, 687 00:35:45,480 --> 00:35:48,200 Speaker 2: or ordinary people would: like, that's surveillance. The government doesn't call 688 00:35:48,239 --> 00:35:50,879 Speaker 2: it surveillance. I mean, it's shocking the amount of data 689 00:35:50,920 --> 00:35:54,400 Speaker 2: they have. LLMs can create a whole dossier about you, 690 00:35:54,400 --> 00:35:58,239 Speaker 2: you know, that wasn't possible before LLMs existed to 691 00:35:58,320 --> 00:36:01,560 Speaker 2: really assimilate all this data. And so those kind 692 00:36:01,600 --> 00:36:03,680 Speaker 2: of things, like, I like that New York law, things 693 00:36:03,719 --> 00:36:06,600 Speaker 2: like that, where at least, even if we 694 00:36:06,600 --> 00:36:09,200 Speaker 2: don't prohibit it, you need to be notified. Like, if 695 00:36:09,200 --> 00:36:11,600 Speaker 2: I'm doing a job interview, and this happens in places, 696 00:36:11,840 --> 00:36:14,520 Speaker 2: you often now will, like, stare into the screen and 697 00:36:14,560 --> 00:36:16,840 Speaker 2: the AI will say, like, is this person smart or 698 00:36:16,880 --> 00:36:20,200 Speaker 2: not, because of my gestures or maybe my glasses. You know, 699 00:36:20,239 --> 00:36:23,520 Speaker 2: it's like, I deserve to know if AI is evaluating 700 00:36:23,560 --> 00:36:25,840 Speaker 2: me in some way. Those kinds of laws I'd like 701 00:36:25,880 --> 00:36:31,240 Speaker 2: to see. 702 00:36:30,360 --> 00:36:34,480 Speaker 1: What is your biggest concern in 703 00:36:34,520 --> 00:36:39,440 Speaker 1: this fight? That you don't have 704 00:36:39,600 --> 00:36:41,600 Speaker 1: enough of the big tech titans on your side? 705 00:36:42,760 --> 00:36:45,200 Speaker 2: I think that's part of it.
I think, you know, 706 00:36:45,239 --> 00:36:48,239 Speaker 2: there's a political economy aspect of this, right, 707 00:36:48,239 --> 00:36:50,120 Speaker 2: where you have to fight. And we know that good 708 00:36:50,160 --> 00:36:53,280 Speaker 2: ideas don't just prevail because they're sound, and even popular 709 00:36:53,320 --> 00:36:57,040 Speaker 2: ones; kids' online safety shows that big money can prevail 710 00:36:57,160 --> 00:36:59,920 Speaker 2: in American politics. So I think that's the biggest concern, 711 00:37:00,320 --> 00:37:02,800 Speaker 2: you know, if people are left out. My job leading a 712 00:37:02,840 --> 00:37:05,239 Speaker 2: super PAC is quite easy. People often ask, well, what 713 00:37:05,280 --> 00:37:07,400 Speaker 2: do I have to believe to earn your support? And 714 00:37:07,440 --> 00:37:09,040 Speaker 2: I say, all you have to do is believe in 715 00:37:09,400 --> 00:37:12,440 Speaker 2: having a democratic debate about AI. If that happens, I 716 00:37:12,480 --> 00:37:14,719 Speaker 2: think most people will probably come to my side. What 717 00:37:14,800 --> 00:37:16,840 Speaker 2: I don't want to do is foreclose it, you know, 718 00:37:16,880 --> 00:37:19,239 Speaker 2: like we've had with crypto policy in this country, where 719 00:37:19,280 --> 00:37:22,480 Speaker 2: today there's no real debate on crypto policy because the super PAC 720 00:37:22,920 --> 00:37:25,840 Speaker 2: has now made it impossible for people to be against 721 00:37:25,840 --> 00:37:28,160 Speaker 2: that issue. We don't want that for AI, right? We 722 00:37:28,200 --> 00:37:30,160 Speaker 2: want it to be free, where people can actually have, 723 00:37:30,560 --> 00:37:33,440 Speaker 2: you know, an argument about what we should do, and 724 00:37:33,480 --> 00:37:36,120 Speaker 2: if so, I have confidence in the American people to 725 00:37:36,120 --> 00:37:37,160 Speaker 2: come to the right conclusion. 726 00:37:41,680 --> 00:37:43,600 Speaker 1: This episode of the Chuck Podcast is brought to you 727 00:37:43,680 --> 00:37:47,160 Speaker 1: by Quince. A thoughtfully built wardrobe comes down to pieces 728 00:37:47,200 --> 00:37:51,279 Speaker 1: that mix well and last. That's where Quince shines. Premium fabrics, 729 00:37:51,520 --> 00:37:54,880 Speaker 1: considered design, and everyday essentials that feel effortless to wear 730 00:37:55,400 --> 00:37:58,480 Speaker 1: and dependable even as the seasons change. Quince has the 731 00:37:58,560 --> 00:38:02,840 Speaker 1: everyday essentials I love with quality that lasts: lightweight cashmere 732 00:38:02,880 --> 00:38:05,960 Speaker 1: sweaters, not too expensive either, by the way, short sleeve 733 00:38:06,040 --> 00:38:10,840 Speaker 1: Mongolian cashmere polos, linen bottoms, and shorts. Quince works directly 734 00:38:10,880 --> 00:38:13,760 Speaker 1: with top factories and cuts out the middlemen. That's why 735 00:38:13,840 --> 00:38:16,640 Speaker 1: it's more affordable. You're not paying for the brand markup 736 00:38:16,760 --> 00:38:19,760 Speaker 1: or any fancy retail stores, but you're still getting quality. 737 00:38:19,840 --> 00:38:22,680 Speaker 1: Everything is built to hold up to regular wear and 738 00:38:22,760 --> 00:38:25,960 Speaker 1: still look good. Quite often I'm wearing something that I've 739 00:38:25,960 --> 00:38:30,360 Speaker 1: gotten from Quince. It is incredibly comfortable. I've become a 740 00:38:30,360 --> 00:38:32,520 Speaker 1: middle-aged man who loves his quarter zips.
They have 741 00:38:32,600 --> 00:38:35,879 Speaker 1: plenty of those. I have been very pleased, and it's 742 00:38:35,920 --> 00:38:37,760 Speaker 1: just, you know, when I like a piece of clothing, 743 00:38:38,160 --> 00:38:41,799 Speaker 1: I probably wear it too much, and usually you can 744 00:38:41,840 --> 00:38:44,200 Speaker 1: tell after about two or three months. That's not the case 745 00:38:44,239 --> 00:38:47,680 Speaker 1: so far with Quince; that's been impressive. So right now, 746 00:38:47,680 --> 00:38:49,920 Speaker 1: go to quince dot com slash chuck for free shipping 747 00:38:49,960 --> 00:38:52,000 Speaker 1: and three hundred and sixty five day returns. It's a 748 00:38:52,040 --> 00:38:54,480 Speaker 1: full year to build your wardrobe and you'll love it. 749 00:38:54,520 --> 00:38:57,320 Speaker 1: Now available in Canada too. Don't keep settling for clothes 750 00:38:57,360 --> 00:39:02,480 Speaker 1: that don't last. Go to q u i n c e dot com 751 00:39:02,520 --> 00:39:05,640 Speaker 1: slash chuck for free shipping, three hundred and sixty five 752 00:39:05,719 --> 00:39:14,320 Speaker 1: day returns. Quince dot com slash chuck. It doesn't feel 753 00:39:14,360 --> 00:39:16,680 Speaker 1: like we're having enough of a debate. It doesn't feel 754 00:39:16,719 --> 00:39:19,640 Speaker 1: like we've had enough of a tutorial, you know, as we would 755 00:39:19,680 --> 00:39:23,440 Speaker 1: in a healthier democratic system. Because our system, you know, 756 00:39:24,560 --> 00:39:27,040 Speaker 1: has some version of a cold. Some 757 00:39:27,120 --> 00:39:29,239 Speaker 1: of you may debate how bad it is, is it the flu, 758 00:39:29,480 --> 00:39:33,040 Speaker 1: how bad is it, but we're definitely not healthy. 759 00:39:35,880 --> 00:39:39,600 Speaker 1: Who would be leading this conversation about AI? I mean, 760 00:39:39,680 --> 00:39:43,359 Speaker 1: do we need an old-time fireside chat? Do we 761 00:39:43,400 --> 00:39:48,120 Speaker 1: need a General Eisenhower to sort of, you know, explain things? 762 00:39:49,280 --> 00:39:52,200 Speaker 1: You know, what would a healthier, 763 00:39:52,440 --> 00:39:55,799 Speaker 1: small-d democratic system be doing right now with the 764 00:39:55,840 --> 00:39:59,879 Speaker 1: development of AI, bringing the public along and making 765 00:39:59,880 --> 00:40:02,080 Speaker 1: them feel a little more secure about it? 766 00:40:02,760 --> 00:40:04,839 Speaker 2: I think two groups can do this. One is if 767 00:40:04,880 --> 00:40:07,719 Speaker 2: the company leaders themselves were more open about it. I 768 00:40:07,760 --> 00:40:09,440 Speaker 2: think Anthropic has actually done a good job of that. 769 00:40:09,440 --> 00:40:12,520 Speaker 2: They've been pretty forward-leaning, they've expressed concerns, and that's made 770 00:40:12,520 --> 00:40:15,040 Speaker 2: people aware. If you had all the lab leaders come 771 00:40:15,080 --> 00:40:18,160 Speaker 2: together and talk about these things more openly, not simply, 772 00:40:18,200 --> 00:40:21,040 Speaker 2: like, arguing their own book, like what's best for 773 00:40:21,080 --> 00:40:25,040 Speaker 2: the narrow shareholder interest here, but, like, the broader social context, 774 00:40:25,080 --> 00:40:27,600 Speaker 2: that would help.
I think my real hope is in 775 00:40:27,640 --> 00:40:30,680 Speaker 2: the twenty twenty eight presidential race. You need someone at 776 00:40:30,680 --> 00:40:33,800 Speaker 2: that level of platform, that level of free media covering 777 00:40:33,840 --> 00:40:37,160 Speaker 2: them, to say, look, America, look what's happening here, you know, 778 00:40:37,200 --> 00:40:39,480 Speaker 2: and, like, that kind of raises the 779 00:40:39,480 --> 00:40:42,040 Speaker 2: consciousness of the whole country. Ron DeSantis has done a 780 00:40:42,080 --> 00:40:43,560 Speaker 2: bit of that in Florida. He's been a great leader 781 00:40:43,600 --> 00:40:45,719 Speaker 2: on this issue. But if you had a president 782 00:40:45,719 --> 00:40:47,720 Speaker 2: who said, look, you know, technology is actually a defining 783 00:40:47,760 --> 00:40:51,280 Speaker 2: issue of our day, it's redefining party allegiances, where Steve 784 00:40:51,320 --> 00:40:54,400 Speaker 2: Bannon and Elizabeth Warren are, like, eye to eye on 785 00:40:54,480 --> 00:40:56,960 Speaker 2: these kinds of questions, you know, and you should be 786 00:40:56,960 --> 00:40:59,359 Speaker 2: paying attention, and here are some solutions for it as well, 787 00:40:59,440 --> 00:41:02,160 Speaker 2: not just crying about it, that works. 788 00:41:02,719 --> 00:41:04,840 Speaker 1: You know, it strikes me that the safest place to 789 00:41:04,880 --> 00:41:08,160 Speaker 1: be politically is if you're pro-consumer. You know, what 790 00:41:08,200 --> 00:41:13,040 Speaker 1: you're describing is essentially what Governor DeSantis is proposing. And 791 00:41:13,120 --> 00:41:15,600 Speaker 1: in some ways it is very much a small-c 792 00:41:15,800 --> 00:41:19,360 Speaker 1: conservative approach, which is, hey, we've got to have, 793 00:41:19,440 --> 00:41:23,600 Speaker 1: essentially, a bill of rights, right, which is really 794 00:41:23,680 --> 00:41:27,560 Speaker 1: consumer protection. You know, we can have larger arguments about, 795 00:41:27,640 --> 00:41:29,520 Speaker 1: you know, should this be allowed and should that 796 00:41:29,560 --> 00:41:32,440 Speaker 1: be allowed, right? And that's where sometimes you lose people 797 00:41:32,440 --> 00:41:35,880 Speaker 1: on the right: well, I don't want government too involved 798 00:41:35,960 --> 00:41:39,480 Speaker 1: with all the decision making. But let's have some basic 799 00:41:39,560 --> 00:41:44,080 Speaker 1: protections for the consumer, right? And it feels like, if 800 00:41:44,120 --> 00:41:49,520 Speaker 1: the north star is consumer protection, hard stop. 801 00:41:50,560 --> 00:41:52,359 Speaker 2: I think it's a winning political message. 802 00:41:52,000 --> 00:41:54,280 Speaker 1: That you probably can get to the right place 803 00:41:54,400 --> 00:41:57,200 Speaker 1: without feeling that you're too blue or too red. 804 00:41:57,880 --> 00:41:59,360 Speaker 2: I think it is actually a great message, and I 805 00:41:59,360 --> 00:42:01,920 Speaker 2: think, though, you do need someone of stature to inject 806 00:42:01,920 --> 00:42:04,160 Speaker 2: that into the debate, to let people know, like, AI 807 00:42:04,320 --> 00:42:06,200 Speaker 2: is changing your life. You may not even be able 808 00:42:06,239 --> 00:42:08,120 Speaker 2: to see all the things it's doing, right? You don't 809 00:42:08,120 --> 00:42:11,240 Speaker 2: even know it. Sometimes it's invisible to you.
But consumer 810 00:42:11,280 --> 00:42:14,080 Speaker 2: protection claims are very powerful, and could make it kind 811 00:42:14,120 --> 00:42:15,919 Speaker 2: of a major issue. That's what I hope twenty twenty 812 00:42:15,960 --> 00:42:17,719 Speaker 2: eight brings. And I actually think it's going to be 813 00:42:17,760 --> 00:42:20,000 Speaker 2: a major issue in twenty twenty eight. So I'm pretty 814 00:42:20,000 --> 00:42:22,799 Speaker 2: optimistic that it's going to be a huge question. And 815 00:42:22,880 --> 00:42:25,439 Speaker 2: once it's become part of the broader discussion, a few 816 00:42:25,440 --> 00:42:27,840 Speaker 2: people can't control it anymore. Right? It can't be just, 817 00:42:27,880 --> 00:42:30,799 Speaker 2: like, two guys in Silicon Valley plotting against you. Once 818 00:42:30,800 --> 00:42:35,239 Speaker 2: it's in South Carolina, New Hampshire, it's a thing, you 819 00:42:35,160 --> 00:42:37,479 Speaker 1: know. It's funny, you know, the Internet never did become 820 00:42:37,520 --> 00:42:42,439 Speaker 1: a centerpiece conversation in a presidential election, not ninety six, 821 00:42:42,520 --> 00:42:45,840 Speaker 1: not two thousand. You know, I remember McCain tried with 822 00:42:45,960 --> 00:42:50,640 Speaker 1: the tax-free Internet, and thought there 823 00:42:50,640 --> 00:42:54,839 Speaker 1: would be somewhat of a centralizing... You know, that 824 00:42:55,280 --> 00:42:58,640 Speaker 1: you'd have been dismissed as a candidate if you didn't 825 00:42:58,680 --> 00:43:02,200 Speaker 1: have a policy talking about how to integrate the Internet 826 00:43:02,280 --> 00:43:06,200 Speaker 1: into everyday commerce, et cetera. I'm with you. I 827 00:43:06,200 --> 00:43:09,480 Speaker 1: think twenty eight is going to be the year where 828 00:43:09,520 --> 00:43:12,359 Speaker 1: the focus is on fear of AI job displacement. I'm 829 00:43:12,360 --> 00:43:14,560 Speaker 1: not saying AI job displacement is going to be taking place, 830 00:43:15,200 --> 00:43:17,319 Speaker 1: but the fear of it, I think, is going to 831 00:43:17,320 --> 00:43:20,080 Speaker 1: be at a more heightened state come twenty twenty eight. 832 00:43:20,120 --> 00:43:24,160 Speaker 1: We've seen already a few things, like the Jack Dorsey 833 00:43:24,200 --> 00:43:27,240 Speaker 1: company, where he just laid off forty percent of his folks. 834 00:43:27,320 --> 00:43:31,200 Speaker 1: Whether that was really AI driven, you know... right away, 835 00:43:32,200 --> 00:43:35,879 Speaker 1: you're like, oh, that's interesting. Let's see if two 836 00:43:35,920 --> 00:43:37,960 Speaker 1: or three more companies do this, then it's a trend, right? 837 00:43:40,000 --> 00:43:43,160 Speaker 1: But I do sense fear of AI job displacement is 838 00:43:43,760 --> 00:43:51,160 Speaker 1: probably going to be a central concern; you're 839 00:43:51,200 --> 00:43:52,879 Speaker 1: certainly seeing it in labor unions right now. 840 00:43:53,400 --> 00:43:57,160 Speaker 2: I think also rising electricity prices and perhaps water usage. 841 00:43:57,320 --> 00:44:01,359 Speaker 2: Some of those concerns are more serious than others; people overestimate 842 00:44:01,680 --> 00:44:04,920 Speaker 2: the water use thing. But you've seen Spanberger run on 843 00:44:04,960 --> 00:44:07,040 Speaker 2: all those kinds of issues, Loudoun County flip a 844 00:44:07,080 --> 00:44:10,320 Speaker 2: seat from red to blue.
A couple Georgia Public 845 00:44:10,320 --> 00:44:10,800 Speaker 2: Service Commission races. 846 00:44:10,880 --> 00:44:13,560 Speaker 1: Brad, I mean, not to get, like, very specific here, because 847 00:44:13,680 --> 00:44:17,759 Speaker 1: my electric bill's gone up through the roof and 848 00:44:17,800 --> 00:44:20,840 Speaker 1: I'm in Arlington County, which is not far from Loudoun. 849 00:44:21,480 --> 00:44:25,240 Speaker 1: But the reason we're paying a steeper price is because 850 00:44:25,280 --> 00:44:29,640 Speaker 1: of how electricity is transmitted along the grid into 851 00:44:29,719 --> 00:44:32,640 Speaker 1: the mid-Atlantic, and it turns out that, you know, 852 00:44:33,360 --> 00:44:37,320 Speaker 1: like, you know, it's gonna hurt us more than it 853 00:44:37,320 --> 00:44:41,640 Speaker 1: will hurt another state, because we get our electricity 854 00:44:41,719 --> 00:44:45,880 Speaker 1: slightly differently than Maryland gets theirs, than Pennsylvania gets theirs, 855 00:44:46,239 --> 00:44:50,600 Speaker 1: Virginia gets theirs. You know, what it's done for me 856 00:44:50,680 --> 00:44:53,320 Speaker 1: is that it's highlighted, hey, our grid is a mess. 857 00:44:53,680 --> 00:44:58,240 Speaker 1: We probably need to restructure the grid in a way 858 00:44:58,480 --> 00:45:04,880 Speaker 1: to minimize consumer pain and maximize stability. 859 00:45:05,080 --> 00:45:07,000 Speaker 2: You know, I think that's one of the underappreciated aspects 860 00:45:07,000 --> 00:45:10,000 Speaker 2: of what AI has done, is there's this incredible energy 861 00:45:10,000 --> 00:45:11,280 Speaker 2: build-out across the country. 862 00:45:11,440 --> 00:45:14,320 Speaker 1: Yeah, no, but it's exposed some problems too. 863 00:45:14,400 --> 00:45:16,560 Speaker 2: But it's actually, I think, probably going to lead, more than 864 00:45:16,560 --> 00:45:19,239 Speaker 2: anything else has, to some solutions to those. Right, you're right. 865 00:45:19,280 --> 00:45:22,000 Speaker 2: The grid has to have probably a trillion dollars invested 866 00:45:22,040 --> 00:45:25,080 Speaker 2: into it, right? We need more interconnection agreements. The regulation 867 00:45:25,280 --> 00:45:26,239 Speaker 2: is pretty cumbersome. 868 00:45:26,239 --> 00:45:27,680 Speaker 1: I know who should pay for it, so we don't have 869 00:45:27,719 --> 00:45:29,719 Speaker 1: to pay for it: our friends that want to build 870 00:45:29,719 --> 00:45:32,200 Speaker 1: these data centers. This should be the price. Nothing wrong 871 00:45:32,239 --> 00:45:32,640 Speaker 1: with that. 872 00:45:32,880 --> 00:45:35,759 Speaker 2: And that's right, right. And this is actually 873 00:45:35,760 --> 00:45:37,480 Speaker 2: why I think it's going to be a good issue. Actually, 874 00:45:37,520 --> 00:45:39,440 Speaker 2: you know, you asked me, like, Oklahoma Democrats and, like, 875 00:45:39,520 --> 00:45:42,799 Speaker 2: Southern Democrats, like, this is actually a classic wedge issue 876 00:45:42,800 --> 00:45:45,160 Speaker 2: if you're a Democrat in these states, because you're saying something 877 00:45:45,200 --> 00:45:47,040 Speaker 2: like, you know what, I'm actually for lowering your 878 00:45:47,040 --> 00:45:49,680 Speaker 2: electricity bill, I want these data centers to 879 00:45:49,719 --> 00:45:52,880 Speaker 2: pay their way. Sometimes, like, pro-business Republicans have a 880 00:45:52,880 --> 00:45:55,719 Speaker 2: hard time getting themselves to that position.
And so you've 881 00:45:55,760 --> 00:45:57,640 Speaker 2: seen in a couple places, right, it's a really good 882 00:45:57,680 --> 00:46:00,719 Speaker 2: issue to kind of, like, get these conservatives who care 883 00:46:00,719 --> 00:46:04,279 Speaker 2: about affordability and water, environment, energy use, I mean, who 884 00:46:04,280 --> 00:46:06,960 Speaker 2: are skeptical of AI and Silicon Valley as well. Kind 885 00:46:06,960 --> 00:46:09,479 Speaker 2: of inherently, all that comes into kind of a world 886 00:46:09,520 --> 00:46:11,080 Speaker 2: where I think it's going to make for a lot 887 00:46:11,080 --> 00:46:12,000 Speaker 2: of strange politics. 888 00:46:12,080 --> 00:46:15,960 Speaker 1: Actually, let me end with a little 889 00:46:16,000 --> 00:46:18,560 Speaker 1: bit of just your observations of what's going on in, 890 00:46:19,960 --> 00:46:22,799 Speaker 1: particularly in your, I don't know, is it your former 891 00:46:22,800 --> 00:46:25,320 Speaker 1: political party or your current one? Do you consider 892 00:46:25,320 --> 00:46:26,200 Speaker 1: yourself a Democrat? 893 00:46:26,320 --> 00:46:27,160 Speaker 2: Oh, yes, of course. 894 00:46:27,440 --> 00:46:32,640 Speaker 1: Okay, what's the difference between an Oklahoma Democrat and a 895 00:46:32,719 --> 00:46:34,080 Speaker 1: national Democrat these days, or is there one? 896 00:46:34,200 --> 00:46:37,520 Speaker 2: I'm about the last Democrat in 897 00:46:37,560 --> 00:46:40,200 Speaker 2: the state of Oklahoma today, so I'm kind of, like, 898 00:46:40,760 --> 00:46:45,160 Speaker 2: you know, the last of the Mohicans here. I'd say, 899 00:46:45,160 --> 00:46:48,319 Speaker 2: I think, you know, my view has always been that 900 00:46:48,400 --> 00:46:50,759 Speaker 2: the party got captured by the values of the professional 901 00:46:50,800 --> 00:46:55,040 Speaker 2: managerial class, you know, whose concerns are almost post-material, 902 00:46:55,480 --> 00:46:57,759 Speaker 2: right, things like... If you come, like I do, 903 00:46:57,880 --> 00:47:00,480 Speaker 2: from eastern Oklahoma, small towns, that's what I represented in 904 00:47:00,480 --> 00:47:03,960 Speaker 2: Congress, where I was from, their concerns are very material: 905 00:47:04,280 --> 00:47:08,680 Speaker 2: health care, jobs, roads, schools. We have meth and opiate 906 00:47:08,760 --> 00:47:12,360 Speaker 2: addiction that are endemic. You know, these concerns about, like, 907 00:47:12,840 --> 00:47:16,400 Speaker 2: kind of esoteric social questions, if you're, like, living, 908 00:47:16,480 --> 00:47:18,880 Speaker 2: you know, in Loudoun County, one of the richest 909 00:47:18,880 --> 00:47:20,719 Speaker 2: counties in the country, there are not that many people 910 00:47:20,760 --> 00:47:24,000 Speaker 2: who, like, worry about these things with the same intensity. 911 00:47:24,520 --> 00:47:27,680 Speaker 1: And to me, I like to say, you 912 00:47:27,719 --> 00:47:29,279 Speaker 1: have the luxury to worry about those things. 913 00:47:29,360 --> 00:47:31,560 Speaker 2: Yeah, exactly. And so, things that, you know, even if 914 00:47:31,560 --> 00:47:33,560 Speaker 2: they're in the right about it, they make them more 915 00:47:33,640 --> 00:47:36,520 Speaker 2: salient than they should be, you know.
It's like, you know, 916 00:47:36,560 --> 00:47:38,000 Speaker 2: in my mind, like, I think the whole question that 917 00:47:38,000 --> 00:47:40,839 Speaker 2: the parties are debating about trans issues, you know, you could 918 00:47:40,840 --> 00:47:44,000 Speaker 2: be for or against it, these are different positions. It's 919 00:47:44,080 --> 00:47:47,000 Speaker 2: really not that salient to the average voter. And so, 920 00:47:47,040 --> 00:47:48,680 Speaker 2: you know, we should focus on things that are, like, 921 00:47:49,120 --> 00:47:52,600 Speaker 2: driving their kind of household decision making, and Democrats 922 00:47:52,640 --> 00:47:55,280 Speaker 2: need to be better about that. So that's the problem 923 00:47:55,280 --> 00:47:57,840 Speaker 2: with the coastal party. But I still think Democrats have 924 00:47:57,880 --> 00:48:01,200 Speaker 2: the best message for people who have very material concerns 925 00:48:01,239 --> 00:48:04,439 Speaker 2: about healthcare and public schools, things like that. So I'm 926 00:48:04,440 --> 00:48:06,960 Speaker 2: working to do my best to reclaim that tradition. 927 00:48:07,920 --> 00:48:10,520 Speaker 1: I have a specific question about Oklahoma. You know, one 928 00:48:10,560 --> 00:48:12,239 Speaker 1: of the things I like to remind people is that 929 00:48:12,280 --> 00:48:16,720 Speaker 1: even in one-party states, a strong 930 00:48:16,760 --> 00:48:21,600 Speaker 1: political opposition develops, and sometimes it's within 931 00:48:21,640 --> 00:48:24,080 Speaker 1: the party, right? We've seen that in Texas; there are 932 00:48:24,080 --> 00:48:27,520 Speaker 1: basically two Republican parties, and they're clashing right now in 933 00:48:27,520 --> 00:48:29,960 Speaker 1: that Senate race. But we had seen this, right? The 934 00:48:30,160 --> 00:48:32,680 Speaker 1: one part of the Republican Party wanted to impeach Paxton, 935 00:48:33,200 --> 00:48:36,360 Speaker 1: another part is trying to defend him. But, you know, 936 00:48:36,440 --> 00:48:39,399 Speaker 1: the weak Democratic Party, and it's perhaps getting a little 937 00:48:39,400 --> 00:48:42,480 Speaker 1: stronger down there today, but at the time... you know, 938 00:48:42,800 --> 00:48:45,680 Speaker 1: there's a reason the phrase exists: politics abhors a vacuum. 939 00:48:45,760 --> 00:48:49,440 Speaker 1: What I've noticed about Oklahoma is, as the Democratic 940 00:48:49,480 --> 00:48:54,080 Speaker 1: Party has fallen away as the chief opposition to the 941 00:48:54,440 --> 00:49:00,080 Speaker 1: majority governance in the state, the 942 00:49:00,160 --> 00:49:06,480 Speaker 1: tribes have become the check, the oppositional check. And sometimes 943 00:49:06,520 --> 00:49:10,640 Speaker 1: it'll be by the tribes funding a Republican primary challenger 944 00:49:10,680 --> 00:49:14,120 Speaker 1: to a certain sitting governor or sitting attorney general. Maybe 945 00:49:14,120 --> 00:49:16,440 Speaker 1: it will be finding a former Republican to run as 946 00:49:16,480 --> 00:49:17,640 Speaker 1: a Democrat for governor. 947 00:49:18,840 --> 00:49:18,920 Speaker 2: Or. 948 00:49:18,960 --> 00:49:23,960 Speaker 1: It's, you know, Oklahoma politics. It's a two-party 949 00:49:24,000 --> 00:49:27,200 Speaker 1: state, but it's not Democrats and Republicans; it's Republicans, 950 00:49:27,360 --> 00:49:30,640 Speaker 1: and the next most powerful interest group is the tribes.
951 00:49:31,840 --> 00:49:34,080 Speaker 2: That's probably a fair assessment of it. The tribes are 952 00:49:34,160 --> 00:49:37,600 Speaker 2: historically very Democratic. Unsurprisingly, over the last twenty years they have 953 00:49:37,680 --> 00:49:40,040 Speaker 2: become very wealthy and have a lot more political interests 954 00:49:40,040 --> 00:49:43,239 Speaker 2: at stake, and they've spread that money around. Because Republicans 955 00:49:43,280 --> 00:49:46,759 Speaker 2: are in the ascendancy, really, and have, like, basically every office 956 00:49:47,239 --> 00:49:49,359 Speaker 2: at every level, right, they've played in a lot more 957 00:49:49,360 --> 00:49:51,839 Speaker 2: Republican politics to find people who are supportive of them. 958 00:49:52,160 --> 00:49:55,200 Speaker 2: They've been really at war with Kevin Stitt, our governor. 959 00:49:55,200 --> 00:49:58,600 Speaker 1: They have, and that's basically been over good 960 00:49:58,600 --> 00:50:01,239 Speaker 1: old-fashioned economics, right? How much does the state get 961 00:50:01,280 --> 00:50:02,760 Speaker 1: in the gambling revenue? 962 00:50:02,719 --> 00:50:05,560 Speaker 2: And also jurisdiction, you know, of things. But now they 963 00:50:05,560 --> 00:50:07,400 Speaker 2: have people running for governor, which is all about the 964 00:50:07,440 --> 00:50:11,080 Speaker 2: Republican primary. Again, there's no serious Democratic candidate. Some 965 00:50:11,120 --> 00:50:13,360 Speaker 2: of them are very strongly aligned with the tribes, some 966 00:50:13,560 --> 00:50:16,560 Speaker 2: less so, and so the tribal issues are actually driving 967 00:50:16,560 --> 00:50:18,120 Speaker 2: a lot of it. They fund a lot of things, 968 00:50:18,160 --> 00:50:20,839 Speaker 2: independent expenditures, and so I think it probably is fair 969 00:50:20,880 --> 00:50:23,960 Speaker 2: to say that after kind of state government, we have 970 00:50:24,000 --> 00:50:26,840 Speaker 2: a kind of rump oil and gas industry that still 971 00:50:26,840 --> 00:50:29,080 Speaker 2: has some power, but probably the tribes are the most 972 00:50:29,200 --> 00:50:32,280 Speaker 2: organized kind of figures. They keep a check on government. 973 00:50:32,840 --> 00:50:36,279 Speaker 1: Yeah. Look, this is what you're seeing a 974 00:50:36,320 --> 00:50:37,640 Speaker 1: little bit in the state of Florida and what you're 975 00:50:37,640 --> 00:50:39,560 Speaker 1: seeing a little bit in the state of California, where 976 00:50:39,600 --> 00:50:44,080 Speaker 1: the tribes are becoming sort of the bulwark against 977 00:50:44,640 --> 00:50:48,160 Speaker 1: the majority, you know, the majority party running amok, almost 978 00:50:48,239 --> 00:50:51,320 Speaker 1: running unopposed. And I think we all know our democracy 979 00:50:51,360 --> 00:50:54,160 Speaker 1: doesn't work without a little bit of friction, that's for sure, 980 00:50:54,400 --> 00:50:57,000 Speaker 1: and in that sense it develops a friction. You 981 00:50:57,000 --> 00:50:59,440 Speaker 1: gave some advice about, like, how you 982 00:50:59,600 --> 00:51:04,279 Speaker 1: view the party in general.
Is there anybody out 983 00:51:04,280 --> 00:51:08,160 Speaker 1: there running in twenty eight that you're like, boy, they're 984 00:51:08,200 --> 00:51:12,320 Speaker 1: at least talking about issues differently, and it's about time? 985 00:51:12,480 --> 00:51:15,040 Speaker 1: Or, you know, who's just standing out to you, 986 00:51:15,200 --> 00:51:17,280 Speaker 1: sort of like, well, I hope to 987 00:51:17,360 --> 00:51:19,040 Speaker 1: hear more from them after twenty six? 988 00:51:19,600 --> 00:51:21,719 Speaker 2: You know, I hear Jon Ossoff on the stump, I'm 989 00:51:21,760 --> 00:51:24,480 Speaker 2: always very impressed by him and how he's thinking of things, 990 00:51:24,920 --> 00:51:27,480 Speaker 2: and I hear from Buttigieg that kind of similar rhetoric. 991 00:51:27,680 --> 00:51:30,520 Speaker 2: I think kind of that analysis I gave, perhaps offered 992 00:51:30,719 --> 00:51:34,200 Speaker 2: less stridently than I did, is one people recognize, you know, 993 00:51:34,280 --> 00:51:35,600 Speaker 2: that we need to get back to more of a 994 00:51:35,640 --> 00:51:39,480 Speaker 2: class-based politics than we've had in the past. 995 00:51:39,719 --> 00:51:42,359 Speaker 1: Well, that's what Trump did. He ran a good old-fashioned, 996 00:51:42,360 --> 00:51:46,240 Speaker 1: you know, class warfare campaign, which 997 00:51:46,480 --> 00:51:49,120 Speaker 1: Bob Shrum used to help Democrats run back in the 998 00:51:49,239 --> 00:51:52,360 Speaker 1: days of Gore and Kerry, and those campaigns 999 00:51:52,880 --> 00:51:54,080 Speaker 1: were quite competitive. 1000 00:51:54,560 --> 00:51:56,520 Speaker 2: Yeah. I actually can envision, this is one of my 1001 00:51:56,560 --> 00:51:58,879 Speaker 2: goals in life, is to have a Democrat go on Steve 1002 00:51:58,920 --> 00:52:02,000 Speaker 2: Bannon's War Room and say to the MAGA base that listens 1003 00:52:02,040 --> 00:52:05,839 Speaker 2: to him, like, the truth is Trump has betrayed the MAGA base, right, 1004 00:52:05,880 --> 00:52:08,759 Speaker 2: all the things you cared about, about the swamp and 1005 00:52:08,760 --> 00:52:08,960 Speaker 2: all this. 1006 00:52:09,080 --> 00:52:11,319 Speaker 1: That's my wife's theory. She's like, you should say 1007 00:52:11,320 --> 00:52:14,359 Speaker 1: he lied to you. Yeah, don't tell them they were dumb; 1008 00:52:14,960 --> 00:52:16,879 Speaker 1: go the other way around. I mean, this has been 1009 00:52:16,880 --> 00:52:19,520 Speaker 1: the problem with the left. They keep attacking the voter 1010 00:52:19,840 --> 00:52:24,320 Speaker 1: for supporting Trump, rather than understanding why the voter wants 1011 00:52:24,400 --> 00:52:25,279 Speaker 1: Trump in the first place. 1012 00:52:25,680 --> 00:52:27,640 Speaker 2: So, if you want to make America great again, and 1013 00:52:27,880 --> 00:52:30,359 Speaker 2: what a great thing that is, you should vote Democratic, right, 1014 00:52:30,400 --> 00:52:31,960 Speaker 2: because we are the ones who are actually going to 1015 00:52:32,000 --> 00:52:34,360 Speaker 2: deliver this message to you. I actually think that's the 1016 00:52:34,480 --> 00:52:37,760 Speaker 2: jiu-jitsu. That said, that's probably too hard for people. 1017 00:52:37,560 --> 00:52:41,279 Speaker 1: Well, it's why, it's a reminder, I always say, 1018 00:52:42,040 --> 00:52:44,719 Speaker 1: today's MAGA voter was a Clinton voter in the nineties.
1019 00:52:44,400 --> 00:52:46,760 Speaker 2: One hundred percent, those were all Democratic voters. 1020 00:52:47,400 --> 00:52:49,640 Speaker 1: I think they were all Bill Clinton voters. I wonder, 1021 00:52:50,040 --> 00:52:52,160 Speaker 1: don't you agree, they were Reagan voters and they were 1022 00:52:52,160 --> 00:52:52,920 Speaker 1: Bill Clinton voters? 1023 00:52:52,960 --> 00:52:55,080 Speaker 2: I bet they were. All of them were registered Democrats 1024 00:52:55,120 --> 00:52:57,760 Speaker 2: back in the day, and they were your constituents. Yeah, exactly, 1025 00:52:58,000 --> 00:53:00,480 Speaker 2: and my district, that I won, is now seventy five 1026 00:53:00,560 --> 00:53:04,400 Speaker 2: twenty five Trump. And so it shows you, like, these people, 1027 00:53:04,800 --> 00:53:07,080 Speaker 2: they have certain interests, and Trump spoke to them in 1028 00:53:07,120 --> 00:53:09,319 Speaker 2: a brilliant way. And we shouldn't take that away from, 1029 00:53:09,320 --> 00:53:12,000 Speaker 2: like, the genius of his campaigning. We should try to, 1030 00:53:12,040 --> 00:53:14,200 Speaker 2: like, re-channel that in some way. 1031 00:53:15,040 --> 00:53:17,719 Speaker 1: President of Tulsa University: did you love it or hate it? 1032 00:53:18,440 --> 00:53:20,759 Speaker 2: I loved it there, actually. I left in June to do full-time 1033 00:53:20,760 --> 00:53:23,160 Speaker 2: AI work. I loved it. The University of Tulsa is a 1034 00:53:23,160 --> 00:53:23,560 Speaker 2: great place. 1035 00:53:23,600 --> 00:53:26,319 Speaker 1: Which was worse? Is it as bad politically? I mean, 1036 00:53:26,400 --> 00:53:28,440 Speaker 1: you know, that's the old joke, right? The politics are 1037 00:53:28,440 --> 00:53:30,520 Speaker 1: worse on a college campus because the stakes are so low. 1038 00:53:31,000 --> 00:53:33,680 Speaker 2: It was far harder politics than actually being in 1039 00:53:33,680 --> 00:53:34,880 Speaker 2: elective office ever was. 1040 00:53:36,200 --> 00:53:39,680 Speaker 1: Well, I always had a soft spot, being a 1041 00:53:39,719 --> 00:53:42,279 Speaker 1: Miami Hurricane boy. I always had a soft spot for 1042 00:53:42,320 --> 00:53:42,960 Speaker 1: the Golden 1043 00:53:42,719 --> 00:53:45,520 Speaker 2: Hurricane. Exactly, we're the two Hurricanes in the world, right, 1044 00:53:45,560 --> 00:53:48,480 Speaker 2: of course. Oh, an OU grad myself, I have to hate Miami. 1045 00:53:48,760 --> 00:53:50,120 Speaker 1: I know you don't, but that's all right. 1046 00:53:50,160 --> 00:53:50,560 Speaker 2: We had to. 1047 00:53:50,719 --> 00:53:52,960 Speaker 1: If it wasn't for Oklahoma, Miami wouldn't have gotten into 1048 00:53:53,000 --> 00:53:53,520 Speaker 1: the top tier. 1049 00:53:54,040 --> 00:53:54,479 Speaker 2: That's true. 1050 00:53:54,520 --> 00:53:59,040 Speaker 1: Beating Oklahoma gave Miami an identity, so I'm 1051 00:53:59,160 --> 00:54:02,960 Speaker 1: very thankful for the old Big Eight, our coaches. 1052 00:54:02,719 --> 00:54:05,520 Speaker 2: Jimmy Johnson, right, from Oklahoma. You're taking our best people. 1053 00:54:05,680 --> 00:54:07,920 Speaker 1: That's right. Well, and then we sent you one back 1054 00:54:07,960 --> 00:54:11,200 Speaker 1: with Schnellenberger. That didn't go so hot, but that's 1055 00:54:11,239 --> 00:54:13,880 Speaker 1: neither here nor there. Hey, Brad Carson, this was great. 1056 00:54:14,840 --> 00:54:18,719 Speaker 1: I appreciate you coming on.
And, uh, look, this 1057 00:54:18,800 --> 00:54:21,879 Speaker 1: is, I see why you're engaged in this. I mean, 1058 00:54:21,920 --> 00:54:27,719 Speaker 1: this really is, in some ways... we 1059 00:54:27,800 --> 00:54:29,160 Speaker 1: have got to get this 1060 00:54:29,320 --> 00:54:33,279 Speaker 1: right, because getting it wrong could be very, 1061 00:54:33,880 --> 00:54:34,920 Speaker 1: very damaging. 1062 00:54:35,360 --> 00:54:37,960 Speaker 2: It is the most important fight going on right now. 1063 00:54:38,120 --> 00:54:41,560 Speaker 1: That's for sure. All right. Be well, let's 1064 00:54:41,560 --> 00:54:43,040 Speaker 1: stay in touch, Chuck,