1 00:00:04,160 --> 00:00:07,520 Speaker 1: On this episode of Newtsworld, we are continuing our series 2 00:00:07,560 --> 00:00:10,840 Speaker 1: on artificial intelligence, and on this episode we're going to 3 00:00:10,840 --> 00:00:14,360 Speaker 1: be focusing on public policy. What should lawmakers know about 4 00:00:14,400 --> 00:00:17,200 Speaker 1: AI, and what should be done to moderate or legislate it? 5 00:00:17,760 --> 00:00:20,800 Speaker 1: Here to have this discussion, I'm really pleased to welcome 6 00:00:20,800 --> 00:00:25,320 Speaker 1: my guest, Congressman Jay Obernolte, who represents California's twenty third 7 00:00:25,400 --> 00:00:29,160 Speaker 1: congressional district. He holds a master's degree from UCLA in 8 00:00:29,320 --> 00:00:32,440 Speaker 1: artificial intelligence. As one of the leading voices in the 9 00:00:32,479 --> 00:00:45,720 Speaker 1: House on proposed legislation, Jay, welcome and thank you for 10 00:00:45,800 --> 00:00:46,920 Speaker 1: joining me on Newtsworld. 11 00:00:47,400 --> 00:00:48,919 Speaker 2: Well, thank you, Newt, it's great to see you. 12 00:00:49,520 --> 00:00:51,680 Speaker 1: Frankly, it's a pleasant surprise to realize that in a 13 00:00:52,280 --> 00:00:55,040 Speaker 1: rapidly evolving field like this, we have a member who 14 00:00:55,040 --> 00:00:57,560 Speaker 1: has the level of knowledge you do. First of all, 15 00:00:57,760 --> 00:01:00,840 Speaker 1: earning a bachelor's degree in engineering and applied science from 16 00:01:00,880 --> 00:01:04,240 Speaker 1: Caltech in itself is pretty impressive, and then a master's 17 00:01:04,240 --> 00:01:08,559 Speaker 1: degree in artificial intelligence from UCLA. You started a video 18 00:01:08,680 --> 00:01:12,920 Speaker 1: game development company called FarSight Studios. Then you were elected to the 19 00:01:12,959 --> 00:01:15,760 Speaker 1: Big Bear City Council, where you served as mayor, then the 20 00:01:15,880 --> 00:01:19,120 Speaker 1: California State Legislature, and now you're a member of Congress. 21 00:01:19,520 --> 00:01:22,800 Speaker 1: Did you ever think that you'd bring together sort of 22 00:01:22,800 --> 00:01:26,200 Speaker 1: the world of artificial intelligence and political capability? 23 00:01:26,680 --> 00:01:29,200 Speaker 2: Oh, I never thought that this would be my future 24 00:01:29,240 --> 00:01:33,080 Speaker 2: at all. In fact, if you had even suggested to 25 00:01:33,120 --> 00:01:35,640 Speaker 2: me ten years ago that I would be sitting in Congress, 26 00:01:35,680 --> 00:01:37,720 Speaker 2: I would say you were crazy. I just got into 27 00:01:37,720 --> 00:01:40,679 Speaker 2: public service through a desire to serve my community, and 28 00:01:40,720 --> 00:01:42,920 Speaker 2: one thing led to another. But I'll tell you, it's 29 00:01:42,959 --> 00:01:46,360 Speaker 2: been simultaneously the hardest job I've ever done, but also 30 00:01:46,400 --> 00:01:47,200 Speaker 2: the most rewarding. 31 00:01:47,840 --> 00:01:51,800 Speaker 1: Speaker McCarthy has been working to increase the awareness of 32 00:01:51,960 --> 00:01:55,320 Speaker 1: artificial intelligence, and I think two weeks ago had a 33 00:01:55,360 --> 00:02:00,200 Speaker 1: bipartisan briefing from MIT, just trying to bring people up 34 00:02:00,200 --> 00:02:03,880 Speaker 1: to speed. What's your view of how really important artificial 35 00:02:03,880 --> 00:02:06,880 Speaker 1: intelligence is going to be as we look at the future?
36 00:02:07,240 --> 00:02:09,720 Speaker 2: I think it's going to be critically important. It's going 37 00:02:09,800 --> 00:02:14,320 Speaker 2: to be an inflection point in the development of human civilization, 38 00:02:14,880 --> 00:02:18,840 Speaker 2: every bit as important as the invention of the printing press, 39 00:02:19,240 --> 00:02:22,880 Speaker 2: or the industrial revolution or the invention of the Internet. 40 00:02:22,960 --> 00:02:25,800 Speaker 2: It's going to be critical. And I think that in general, 41 00:02:26,360 --> 00:02:28,400 Speaker 2: although I think that there's a lot of hesitation and 42 00:02:28,440 --> 00:02:30,200 Speaker 2: a lot of fear, and there's no denying that it's 43 00:02:30,200 --> 00:02:32,720 Speaker 2: going to be disruptive, I think in general, it's going 44 00:02:32,760 --> 00:02:35,240 Speaker 2: to be overwhelmingly positive. We're going to look back on 45 00:02:35,280 --> 00:02:38,760 Speaker 2: the advent of AI as something that was empowering to humans, 46 00:02:38,880 --> 00:02:40,119 Speaker 2: not something that replaced them. 47 00:02:40,480 --> 00:02:44,240 Speaker 1: Can you just put it in simple, everyday terms: what is 48 00:02:44,400 --> 00:02:48,560 Speaker 1: artificial intelligence, and what is its core capability that makes 49 00:02:48,560 --> 00:02:50,600 Speaker 1: it so potentially impactful? 50 00:02:51,200 --> 00:02:54,760 Speaker 2: Let me use terms that are not technical and more approachable, 51 00:02:54,800 --> 00:02:59,080 Speaker 2: because there are technical definitions of what constitutes AI and 52 00:02:59,080 --> 00:03:04,040 Speaker 2: what differentiates it from traditional computer software. But rather than 53 00:03:04,040 --> 00:03:08,160 Speaker 2: going there, let's just broadly define artificial intelligence as something 54 00:03:08,560 --> 00:03:12,320 Speaker 2: that allows a computer to do a task that before 55 00:03:12,400 --> 00:03:15,280 Speaker 2: now could only be done by a human, and that makes 56 00:03:15,320 --> 00:03:19,480 Speaker 2: the computer look and seem very human-like. So I 57 00:03:19,520 --> 00:03:22,880 Speaker 2: think that that's a definition that people will find more 58 00:03:22,919 --> 00:03:25,560 Speaker 2: approachable and that everyone can relate to. So, if you 59 00:03:25,760 --> 00:03:28,440 Speaker 2: talk about what AI is going to be good at, 60 00:03:28,560 --> 00:03:32,880 Speaker 2: it's going to automate a lot of tasks that right 61 00:03:32,919 --> 00:03:37,920 Speaker 2: now require a huge input of human capability and manpower, 62 00:03:38,160 --> 00:03:40,680 Speaker 2: and it's going to do it in ways that empower 63 00:03:40,760 --> 00:03:44,720 Speaker 2: people and allow them to spend that labor on higher 64 00:03:44,800 --> 00:03:47,480 Speaker 2: order uses. I'll give you an example, something that is 65 00:03:47,520 --> 00:03:50,440 Speaker 2: already happening now. This is a short term benefit of AI. 66 00:03:50,800 --> 00:03:54,160 Speaker 2: AI is starting to be used for the automated reading 67 00:03:54,320 --> 00:03:58,360 Speaker 2: of medical procedures like CT scans. And as it turns out, 68 00:03:58,480 --> 00:04:02,920 Speaker 2: artificial intelligence can read a CT scan in seconds and can 69 00:04:02,960 --> 00:04:06,480 Speaker 2: detect tumors that are just forming, that are so small 70 00:04:06,520 --> 00:04:09,840 Speaker 2: that a human radiologist would not even see them forming yet.
71 00:04:10,440 --> 00:04:13,520 Speaker 2: So from a patient standpoint, you get your CT scan 72 00:04:13,640 --> 00:04:17,320 Speaker 2: read more quickly, less expensively, and more accurately, and you 73 00:04:17,400 --> 00:04:21,640 Speaker 2: have this added capability of detecting cancer long before any 74 00:04:21,640 --> 00:04:24,799 Speaker 2: symptoms show up, which, as I'm sure you know, greatly 75 00:04:24,839 --> 00:04:28,320 Speaker 2: improves the outlook for treating that cancer if you've caught 76 00:04:28,320 --> 00:04:32,440 Speaker 2: it early. So that's a huge win for human civilization. 77 00:04:32,520 --> 00:04:35,240 Speaker 2: And that's just one tiny facet of what AI is 78 00:04:35,279 --> 00:04:37,800 Speaker 2: going to do, and that's today, that's not ten years 79 00:04:37,800 --> 00:04:38,600 Speaker 2: from now. 80 00:04:38,640 --> 00:04:41,880 Speaker 1: Now, how much of that can be done at long distance, 81 00:04:41,920 --> 00:04:45,720 Speaker 1: so that, for example, rural areas could actually access world 82 00:04:45,800 --> 00:04:47,480 Speaker 1: class analytics? 83 00:04:48,040 --> 00:04:50,200 Speaker 2: As you point out, one of its benefits is that 84 00:04:50,680 --> 00:04:55,240 Speaker 2: at a rural hospital, for example, in my hometown of Big 85 00:04:55,279 --> 00:04:58,480 Speaker 2: Bear Lake, which is a little mountain ski town, whereas 86 00:04:58,520 --> 00:05:00,000 Speaker 2: you would have had to send that off and maybe 87 00:05:00,120 --> 00:05:02,760 Speaker 2: wait twenty four hours for a human radiologist to 88 00:05:02,839 --> 00:05:05,839 Speaker 2: read it somewhere else because there's not a radiologist on staff, 89 00:05:06,080 --> 00:05:08,080 Speaker 2: this could be done in minutes and you'll have the 90 00:05:08,080 --> 00:05:08,880 Speaker 2: result right now. 91 00:05:09,240 --> 00:05:10,960 Speaker 1: I mean, the other area that it seems to me 92 00:05:11,480 --> 00:05:14,720 Speaker 1: is emerging very rapidly is in defense. I've been noticing, 93 00:05:14,720 --> 00:05:18,719 Speaker 1: for example, that the Ukrainians are now using drones and 94 00:05:18,760 --> 00:05:23,560 Speaker 1: other assets in ways that are virtually revolutionary, creating a 95 00:05:23,600 --> 00:05:27,520 Speaker 1: sort of pinpointable battlefield, which requires a huge amount 96 00:05:27,520 --> 00:05:31,080 Speaker 1: of analytic capability. And the Israelis have now developed their 97 00:05:31,480 --> 00:05:36,000 Speaker 1: Iron Dome, which analyzes in real time ballistic missile flights 98 00:05:36,400 --> 00:05:40,120 Speaker 1: and determines which ones matter, ignores the others, and just 99 00:05:40,200 --> 00:05:43,720 Speaker 1: focuses all the defense efforts directly against the greatest threat. 100 00:05:43,920 --> 00:05:47,120 Speaker 1: I mean, in that sense, isn't the speed and the 101 00:05:47,160 --> 00:05:50,800 Speaker 1: accuracy of artificial intelligence going to lead to virtually a 102 00:05:50,839 --> 00:05:52,719 Speaker 1: revolution in the art of war? 103 00:05:53,160 --> 00:05:57,040 Speaker 2: I think you're right, and I also think that it 104 00:05:57,120 --> 00:06:02,320 Speaker 2: might lead to deterrence in the same way that the 105 00:06:02,360 --> 00:06:05,000 Speaker 2: development of nuclear weapons led to deterrence.
You know, 106 00:06:05,040 --> 00:06:08,320 Speaker 2: when we live in a world where the thought of 107 00:06:08,680 --> 00:06:12,800 Speaker 2: war is so terrible that it's something that everyone tries 108 00:06:12,839 --> 00:06:16,520 Speaker 2: to avoid at all costs, it incentivizes everyone to come 109 00:06:16,600 --> 00:06:19,320 Speaker 2: up with solutions other than war, and as we get 110 00:06:19,320 --> 00:06:22,799 Speaker 2: better and better at doing things like delivering drone-based weaponry, 111 00:06:23,080 --> 00:06:25,040 Speaker 2: I think that that will be even more true. 112 00:06:25,520 --> 00:06:28,480 Speaker 1: What do you think is the appropriate role of Congress 113 00:06:29,200 --> 00:06:31,080 Speaker 1: in dealing with this, and what is it that members 114 00:06:31,120 --> 00:06:35,120 Speaker 1: of Congress should know about the emergence of artificial intelligence? 115 00:06:36,000 --> 00:06:38,960 Speaker 2: Well, I'm a firm believer that the role of government 116 00:06:39,279 --> 00:06:45,400 Speaker 2: is constitutionally limited, and especially in a society that values 117 00:06:45,560 --> 00:06:48,520 Speaker 2: free markets the way that ours does, I think that 118 00:06:48,560 --> 00:06:52,039 Speaker 2: you have to be very deliberate when you insert government 119 00:06:52,160 --> 00:06:56,120 Speaker 2: into regulation of an industry. So, to me, we have 120 00:06:56,200 --> 00:06:59,640 Speaker 2: to ask ourselves, is this an industry that needs to 121 00:06:59,640 --> 00:07:01,640 Speaker 2: be regulated, and if so, what should the role 122 00:07:01,680 --> 00:07:05,359 Speaker 2: of the federal government be in creating that regulation? I 123 00:07:05,360 --> 00:07:08,680 Speaker 2: think it's pretty clear that in addition to the potential 124 00:07:08,760 --> 00:07:13,240 Speaker 2: upsides of AI, there are also substantial risks and substantial 125 00:07:13,280 --> 00:07:17,520 Speaker 2: consumer harms that potentially need to be mitigated, and that, 126 00:07:17,600 --> 00:07:20,720 Speaker 2: to me, is a good reason for government regulation. You know, 127 00:07:20,720 --> 00:07:22,840 Speaker 2: if you've got a list of potential harms and you 128 00:07:22,840 --> 00:07:25,440 Speaker 2: want to enact consumer protection, then you need to create 129 00:07:25,440 --> 00:07:29,920 Speaker 2: a regulatory framework to address those issues. But I also 130 00:07:30,000 --> 00:07:32,720 Speaker 2: think it's very important that we not go too far. 131 00:07:33,160 --> 00:07:35,240 Speaker 2: If you look at what the European Union has been 132 00:07:35,240 --> 00:07:39,560 Speaker 2: doing recently in creating a huge new government bureaucracy to 133 00:07:39,720 --> 00:07:43,480 Speaker 2: essentially control who's allowed to develop and use AI and 134 00:07:43,520 --> 00:07:45,640 Speaker 2: who is not, I think that that would be a 135 00:07:45,720 --> 00:07:49,240 Speaker 2: dangerous path for us to take, particularly because it empowers 136 00:07:49,920 --> 00:07:53,640 Speaker 2: bureaucracy over the elected representatives of the people, and I 137 00:07:53,640 --> 00:07:57,680 Speaker 2: think that's an abdication of our responsibility as legislators if 138 00:07:57,680 --> 00:08:00,760 Speaker 2: we empower the executive branch agencies and aren't clear about 139 00:08:00,960 --> 00:08:02,960 Speaker 2: what we're trying to regulate and how we want to do it.
140 00:08:02,960 --> 00:08:09,520 Speaker 1: Shouldn't the regulatory approach focus on outcomes rather than processes? 141 00:08:09,560 --> 00:08:13,160 Speaker 1: That is, if I'm using AI to cheat you, or 142 00:08:13,200 --> 00:08:16,680 Speaker 1: to harass you, or to do something, then I'm responsible, 143 00:08:16,760 --> 00:08:20,800 Speaker 1: not the AI system. We don't hold cars accountable for accidents, 144 00:08:20,800 --> 00:08:24,480 Speaker 1: for example, we hold drivers accountable, and we haven't tried 145 00:08:24,480 --> 00:08:27,480 Speaker 1: to regulate that nobody should be allowed to drive because, 146 00:08:27,640 --> 00:08:29,600 Speaker 1: for all you know, you might be in a wreck; we'd all 147 00:08:29,600 --> 00:08:32,000 Speaker 1: be back on horses. So to what extent can you 148 00:08:32,040 --> 00:08:37,320 Speaker 1: focus on the negative outcomes, making them illegal, rather 149 00:08:37,360 --> 00:08:39,079 Speaker 1: than focusing on the process? 150 00:08:39,679 --> 00:08:42,120 Speaker 2: I think it's critical to do that, and if you 151 00:08:42,200 --> 00:08:45,880 Speaker 2: look at some of the debate around AI that has 152 00:08:46,120 --> 00:08:49,800 Speaker 2: occurred so far, I think that you realize how important 153 00:08:49,800 --> 00:08:53,640 Speaker 2: that is. So some of the negative outcomes of AI 154 00:08:53,840 --> 00:08:57,560 Speaker 2: so far have been widely publicized. One of them is that, 155 00:08:58,000 --> 00:09:01,280 Speaker 2: as it turns out, the early generation of AI-empowered 156 00:09:01,640 --> 00:09:06,520 Speaker 2: facial recognition did not recognize African American faces nearly as 157 00:09:06,559 --> 00:09:09,320 Speaker 2: well as Caucasian faces, and that was a factor of 158 00:09:09,600 --> 00:09:12,360 Speaker 2: the way that that AI algorithm was trained. It was unintentional, 159 00:09:12,360 --> 00:09:15,240 Speaker 2: but it was verifiable. And then lately there has been 160 00:09:15,280 --> 00:09:17,960 Speaker 2: a lot of controversy over the use of AI to 161 00:09:18,120 --> 00:09:22,520 Speaker 2: automatically screen resumes before a human has to look at them, 162 00:09:22,800 --> 00:09:25,320 Speaker 2: and it's been proven in the early generation of that 163 00:09:25,440 --> 00:09:30,520 Speaker 2: AI that you can detect some bias against certain socioeconomic 164 00:09:30,559 --> 00:09:33,440 Speaker 2: classes of people. But as you say, if you focus 165 00:09:33,440 --> 00:09:36,200 Speaker 2: on the outcome and not the tool, then regulation becomes 166 00:09:36,280 --> 00:09:39,640 Speaker 2: very easy because we already have laws that prohibit discrimination 167 00:09:40,160 --> 00:09:42,960 Speaker 2: on the basis of things like skin color, or ethnic 168 00:09:43,000 --> 00:09:47,079 Speaker 2: background or socioeconomic background. So it doesn't matter whether or 169 00:09:47,120 --> 00:09:50,360 Speaker 2: not you use AI to do that. If it's done intentionally, 170 00:09:50,400 --> 00:09:52,720 Speaker 2: it's already against the law, so we don't need a 171 00:09:52,760 --> 00:10:03,400 Speaker 2: new regulatory framework to guard against those harms. 172 00:10:06,160 --> 00:10:08,880 Speaker 1: Hi, this is Newt. In my new book, March to the Majority, 173 00:10:08,920 --> 00:10:12,400 Speaker 1: The Real Story of the Republican Revolution, I offer strategies 174 00:10:12,440 --> 00:10:16,120 Speaker 1: and insights for everyday citizens and for seasoned politicians.
It's 175 00:10:16,160 --> 00:10:19,240 Speaker 1: both a guide for political success and for winning back 176 00:10:19,280 --> 00:10:22,520 Speaker 1: the Majority in twenty twenty four. March to the Majority 177 00:10:22,520 --> 00:10:26,560 Speaker 1: outlines the sixteen year campaign to write the Contract with America. 178 00:10:27,000 --> 00:10:30,400 Speaker 1: It explains how we elected the first Republican House majority in 179 00:10:30,600 --> 00:10:34,040 Speaker 1: forty years and how we worked with President Bill Clinton 180 00:10:34,280 --> 00:10:39,520 Speaker 1: to pass major reforms, including four consecutive balanced budgets. March 181 00:10:39,559 --> 00:10:42,600 Speaker 1: to the Majority tells the behind the scenes story of 182 00:10:42,640 --> 00:10:45,240 Speaker 1: how we got it done. Here's a special offer for 183 00:10:45,320 --> 00:10:49,160 Speaker 1: my podcast listeners. You can pre-order March to the Majority 184 00:10:49,280 --> 00:10:52,760 Speaker 1: right now at Gingrich three sixty dot com slash book and 185 00:10:52,800 --> 00:10:56,160 Speaker 1: it'll be shipped directly to you on June sixth. Don't 186 00:10:56,200 --> 00:10:58,560 Speaker 1: miss out on a special offer to pre-order my 187 00:10:58,640 --> 00:11:02,439 Speaker 1: new book today. Go to Gingrich three sixty dot com slash book and 188 00:11:02,600 --> 00:11:06,200 Speaker 1: order your copy now. Order it today at Gingrich three sixty 189 00:11:06,240 --> 00:11:17,560 Speaker 1: dot com slash book. To what extent do you think 190 00:11:17,600 --> 00:11:22,319 Speaker 1: that AI will weaken jobs, and to what extent will it 191 00:11:22,320 --> 00:11:24,679 Speaker 1: actually create a whole new generation of jobs? 192 00:11:25,000 --> 00:11:27,560 Speaker 2: Well, I think it's going to be very economically disruptive. 193 00:11:28,280 --> 00:11:31,840 Speaker 2: And if you look throughout human history at the different 194 00:11:31,880 --> 00:11:36,280 Speaker 2: technological revolutions that have occurred, they have all been disruptive. 195 00:11:36,600 --> 00:11:38,840 Speaker 2: In fact, we use a word in the English language 196 00:11:38,840 --> 00:11:42,360 Speaker 2: called Luddite to describe someone who's resistant to the advancement 197 00:11:42,360 --> 00:11:46,280 Speaker 2: of technology, and that comes from pre-Industrial Revolution England, 198 00:11:46,760 --> 00:11:50,600 Speaker 2: where a group of textile workers burned the new automated 199 00:11:50,640 --> 00:11:54,080 Speaker 2: textile looms in protest of the fact that they were 200 00:11:54,120 --> 00:11:56,520 Speaker 2: going to take their jobs away. Those were the Luddites, 201 00:11:56,520 --> 00:11:58,960 Speaker 2: and you know, we still talk about them because of 202 00:11:59,040 --> 00:12:01,680 Speaker 2: the action that they took. Looking back on that, though, 203 00:12:01,679 --> 00:12:05,280 Speaker 2: no one can deny that those textile mills were a 204 00:12:05,320 --> 00:12:07,319 Speaker 2: good thing for humanity. I mean, it was the first 205 00:12:07,320 --> 00:12:11,320 Speaker 2: time that high quality cloth garments were available to middle 206 00:12:11,320 --> 00:12:14,320 Speaker 2: income people, the first time ever. So that's great for 207 00:12:14,440 --> 00:12:17,400 Speaker 2: most of humanity, not so great for textile workers. And 208 00:12:17,400 --> 00:12:19,880 Speaker 2: you're going to see exactly the same thing happen with AI.
209 00:12:20,200 --> 00:12:23,360 Speaker 2: It's going to be overwhelmingly a good thing because it's 210 00:12:23,400 --> 00:12:26,600 Speaker 2: going to free up human labor for higher order uses, 211 00:12:26,760 --> 00:12:29,120 Speaker 2: but it's going to be disruptive to the people who 212 00:12:29,240 --> 00:12:32,199 Speaker 2: work in the industries that are displaced. 213 00:12:32,600 --> 00:12:36,040 Speaker 1: Does that become a challenge for us in helping people 214 00:12:36,120 --> 00:12:40,760 Speaker 1: make the transition, or is it part of a natural process? 215 00:12:41,200 --> 00:12:43,520 Speaker 2: I think both. But I think government will have a 216 00:12:43,600 --> 00:12:47,080 Speaker 2: role to play not only in retraining people who have 217 00:12:47,160 --> 00:12:51,840 Speaker 2: seen this disruption happen, but also in informing young people 218 00:12:51,960 --> 00:12:56,079 Speaker 2: as to what a productive life in twenty-second century 219 00:12:56,120 --> 00:12:59,360 Speaker 2: America looks like. In particular, I think we're going to 220 00:12:59,400 --> 00:13:02,640 Speaker 2: have to completely re-examine our approach to education. We 221 00:13:02,760 --> 00:13:05,680 Speaker 2: have been stuck in this one-hundred-year-old model 222 00:13:05,720 --> 00:13:08,440 Speaker 2: where we assume that people at the beginning of their 223 00:13:08,480 --> 00:13:12,640 Speaker 2: lives go to school, acquire an education, get a degree, 224 00:13:12,679 --> 00:13:15,360 Speaker 2: and then have a career that relies on that education 225 00:13:15,520 --> 00:13:17,920 Speaker 2: that can productively make a living for them for fifty, 226 00:13:18,000 --> 00:13:21,360 Speaker 2: sixty, seventy years. I think that that is definitely going 227 00:13:21,400 --> 00:13:23,360 Speaker 2: to have to be a thing of the past because 228 00:13:23,400 --> 00:13:26,480 Speaker 2: the pace of technology is changing so quickly. I think 229 00:13:26,520 --> 00:13:29,120 Speaker 2: that we're going to have to embrace a future where 230 00:13:29,160 --> 00:13:32,680 Speaker 2: everyone has to be a proponent of lifelong education, where 231 00:13:32,760 --> 00:13:36,000 Speaker 2: instead of being educated at the beginning of our lives, 232 00:13:36,040 --> 00:13:39,200 Speaker 2: we make a commitment to remaining educated our entire lives, 233 00:13:39,520 --> 00:13:41,400 Speaker 2: and AI can actually help us with that. That's one 234 00:13:41,440 --> 00:13:43,800 Speaker 2: of the things AI is very good at, is creating 235 00:13:44,000 --> 00:13:47,000 Speaker 2: a personalized approach to the access of information, which is 236 00:13:47,000 --> 00:13:50,040 Speaker 2: what education is. So I actually think AI will be 237 00:13:50,120 --> 00:13:53,040 Speaker 2: disruptive in that way, but AI will be essentially 238 00:13:53,120 --> 00:13:53,839 Speaker 2: very beneficial. 239 00:13:54,320 --> 00:13:57,040 Speaker 1: I spend a fair amount of time on how bad, 240 00:13:57,440 --> 00:14:00,480 Speaker 1: for example, some of the big inner city schools are. 241 00:14:00,840 --> 00:14:03,480 Speaker 1: But in a sense you have me sort of opening 242 00:14:03,480 --> 00:14:06,840 Speaker 1: my mind to the concept that maybe one of the 243 00:14:06,880 --> 00:14:10,480 Speaker 1: requirements of the next generation is to develop the ability 244 00:14:10,520 --> 00:14:14,400 Speaker 1: for personalized, constant learning.
So people might well have 245 00:14:14,480 --> 00:14:17,160 Speaker 1: gone to a terrible school; there are twenty three schools 246 00:14:17,160 --> 00:14:20,240 Speaker 1: in Baltimore City, for example, in which not a single 247 00:14:20,280 --> 00:14:23,000 Speaker 1: person can do math. That doesn't mean that the rest 248 00:14:23,000 --> 00:14:25,120 Speaker 1: of their life they can't do math. It may mean 249 00:14:25,160 --> 00:14:26,680 Speaker 1: that they have to learn it at twenty or twenty 250 00:14:26,760 --> 00:14:29,800 Speaker 1: one or twenty two, and that AI might well become 251 00:14:30,200 --> 00:14:33,040 Speaker 1: sort of a personal coach in a way that would 252 00:14:33,040 --> 00:14:34,600 Speaker 1: have been impossible thirty years ago. 253 00:14:35,600 --> 00:14:39,480 Speaker 2: I think that's absolutely right. But key to that whole 254 00:14:39,800 --> 00:14:43,600 Speaker 2: situation is a desire to learn math, and I think 255 00:14:43,680 --> 00:14:46,160 Speaker 2: that's what AI is going to empower. We're going to 256 00:14:46,200 --> 00:14:48,600 Speaker 2: in the future need to teach young people that they 257 00:14:48,640 --> 00:14:51,120 Speaker 2: can learn, and teach young people how to learn, and 258 00:14:51,160 --> 00:14:53,400 Speaker 2: then for the rest of their lives they'll be empowered 259 00:14:53,400 --> 00:14:56,240 Speaker 2: to learn whatever it is they want to learn. And 260 00:14:56,320 --> 00:14:58,640 Speaker 2: I actually think that's a really optimistic future, you know, 261 00:14:58,680 --> 00:15:01,360 Speaker 2: full of citizens who are in places that they are 262 00:15:01,440 --> 00:15:03,320 Speaker 2: because they choose to be and because they want to be. 263 00:15:03,800 --> 00:15:08,240 Speaker 1: Some people have suggested to me that the great danger 264 00:15:08,240 --> 00:15:12,160 Speaker 1: from artificial intelligence isn't scenes like the Terminator, you know, 265 00:15:12,160 --> 00:15:15,480 Speaker 1: where the artificial intelligence systems have risen in rebellion, but 266 00:15:15,600 --> 00:15:20,400 Speaker 1: in fact the power that artificial intelligence gives a totalitarian 267 00:15:20,480 --> 00:15:23,880 Speaker 1: system like China to be able to literally track everybody 268 00:15:23,880 --> 00:15:27,000 Speaker 1: in real time at a level of detail that we 269 00:15:27,040 --> 00:15:30,920 Speaker 1: would have thought impossible. To what extent is that a threat 270 00:15:30,960 --> 00:15:34,359 Speaker 1: to freedom from the sheer power that the state acquires 271 00:15:34,720 --> 00:15:37,040 Speaker 1: if it's not controlled and carefully regulated? 272 00:15:37,640 --> 00:15:39,640 Speaker 2: No, I think it's a very serious concern, and I 273 00:15:39,720 --> 00:15:42,280 Speaker 2: think we need to take it seriously. One of the 274 00:15:42,320 --> 00:15:45,600 Speaker 2: things that I often talk about in the context of 275 00:15:45,640 --> 00:15:50,479 Speaker 2: the regulation of AI is that, in addition to putting guardrails 276 00:15:50,560 --> 00:15:53,480 Speaker 2: around industry, which is what most people think of when 277 00:15:53,520 --> 00:15:56,280 Speaker 2: they talk about regulation, we also need to put guard 278 00:15:56,360 --> 00:16:00,520 Speaker 2: rails around government because of the point that you raised.
279 00:16:00,920 --> 00:16:02,800 Speaker 2: You look at the way that AI is already being 280 00:16:02,880 --> 00:16:06,400 Speaker 2: used in countries like China to enact what is essentially 281 00:16:06,440 --> 00:16:11,440 Speaker 2: the world's largest surveillance state, where personal information about people 282 00:16:11,560 --> 00:16:15,760 Speaker 2: is aggregated and put together in very dystopian ways to 283 00:16:15,880 --> 00:16:19,440 Speaker 2: create things like government loyalty predictions and scores, and to 284 00:16:19,640 --> 00:16:23,560 Speaker 2: condition people's access to resources on their loyalty to the government. 285 00:16:23,800 --> 00:16:26,920 Speaker 2: That's something that we would never want to have happen here, 286 00:16:27,400 --> 00:16:30,200 Speaker 2: but we can take steps in the short term to 287 00:16:30,360 --> 00:16:32,560 Speaker 2: prevent that. And that's one of the things that I'm 288 00:16:32,840 --> 00:16:35,520 Speaker 2: a firm believer in: that in the short term, 289 00:16:35,560 --> 00:16:38,000 Speaker 2: the best way of guarding against the malicious uses of 290 00:16:38,000 --> 00:16:42,480 Speaker 2: AI is to create a federal regulatory framework on the 291 00:16:42,480 --> 00:16:46,280 Speaker 2: accumulation of personal digital data. That's something that we have 292 00:16:46,360 --> 00:16:50,440 Speaker 2: relegated to the states before now, and I think that 293 00:16:50,920 --> 00:16:54,240 Speaker 2: it's very dangerous to allow that to continue. I think 294 00:16:54,240 --> 00:16:56,760 Speaker 2: that we need to keep track of what people are 295 00:16:56,800 --> 00:17:01,040 Speaker 2: allowed to collect about other people and store, because AI 296 00:17:01,200 --> 00:17:06,840 Speaker 2: is really good at taking this aggregated personal digital data 297 00:17:06,920 --> 00:17:10,639 Speaker 2: and putting it together to create a behavioral model that 298 00:17:10,640 --> 00:17:13,760 Speaker 2: can make eerily accurate predictions of what people will do 299 00:17:13,800 --> 00:17:16,680 Speaker 2: in the future, and then use that information to influence 300 00:17:16,720 --> 00:17:19,480 Speaker 2: that behavior. We have companies that are already using AI 301 00:17:19,560 --> 00:17:22,439 Speaker 2: that way. That's something that I don't think is beneficial. 302 00:17:22,680 --> 00:17:25,679 Speaker 2: It could be used in ways like the influence of 303 00:17:25,680 --> 00:17:27,760 Speaker 2: public opinion, you know, ways that are very corrosive to 304 00:17:27,800 --> 00:17:28,440 Speaker 2: a democracy. 305 00:17:28,800 --> 00:17:31,159 Speaker 1: Can you talk a little bit about TikTok and the 306 00:17:31,200 --> 00:17:38,800 Speaker 1: whole concern that TikTok actually combines sort of mass public 307 00:17:38,880 --> 00:17:44,280 Speaker 1: access with the potential for analysis and manipulation through artificial 308 00:17:44,320 --> 00:17:46,919 Speaker 1: intelligence systems. I mean, to what extent is that a 309 00:17:46,960 --> 00:17:47,720 Speaker 1: real danger? 310 00:17:48,320 --> 00:17:50,280 Speaker 2: I think it is a real danger, which is not 311 00:17:50,359 --> 00:17:53,160 Speaker 2: to say that I'm a proponent of banning TikTok. That's 312 00:17:53,200 --> 00:17:55,679 Speaker 2: a debate we've been having in Congress and we're going 313 00:17:55,720 --> 00:17:57,960 Speaker 2: to continue to have.
But as you know, we had 314 00:17:57,960 --> 00:18:00,840 Speaker 2: a hearing recently where we had the president of TikTok 315 00:18:00,880 --> 00:18:04,080 Speaker 2: testify and we asked him questions about the data that 316 00:18:04,080 --> 00:18:06,280 Speaker 2: TikTok is collecting and what they're doing with it. But 317 00:18:06,359 --> 00:18:10,760 Speaker 2: the danger there is in allowing a foreign adversary to 318 00:18:10,800 --> 00:18:14,600 Speaker 2: have access to the private digital data of millions of Americans. 319 00:18:14,960 --> 00:18:18,239 Speaker 2: Does that give potential foreign adversaries the power to do 320 00:18:18,320 --> 00:18:22,879 Speaker 2: things like interfere in our elections, influence public opinion, spread 321 00:18:22,880 --> 00:18:27,240 Speaker 2: disinformation? And the answer to that question is very clearly yes, 322 00:18:27,720 --> 00:18:30,520 Speaker 2: which is why we need to pay more attention to 323 00:18:31,080 --> 00:18:33,280 Speaker 2: the data that people are allowed to collect and what 324 00:18:33,280 --> 00:18:34,280 Speaker 2: they're allowed to do with it. 325 00:18:34,680 --> 00:18:37,920 Speaker 1: Should we try to find a way to have TikTok 326 00:18:38,040 --> 00:18:41,680 Speaker 1: sold, in effect, to an American company, or some kind 327 00:18:41,720 --> 00:18:44,439 Speaker 1: of barrier? And is there any way to stop that 328 00:18:44,560 --> 00:18:45,760 Speaker 1: data from going to China? 329 00:18:46,320 --> 00:18:50,240 Speaker 2: I asked some very detailed technical questions about exactly that 330 00:18:50,760 --> 00:18:53,280 Speaker 2: in our hearing with the president of TikTok a few 331 00:18:53,280 --> 00:18:58,080 Speaker 2: weeks ago, and I emerged from that conversation with the 332 00:18:58,160 --> 00:19:00,719 Speaker 2: thought that no, it is not possible to prevent that 333 00:19:00,800 --> 00:19:03,520 Speaker 2: from occurring. So instead, I think what we need to 334 00:19:03,560 --> 00:19:08,760 Speaker 2: focus on is enacting privacy legislation that prevents information like 335 00:19:08,800 --> 00:19:11,240 Speaker 2: that from being collected in the first place, and that 336 00:19:11,320 --> 00:19:14,000 Speaker 2: will solve the problem not just for TikTok, but for 337 00:19:14,119 --> 00:19:18,440 Speaker 2: any malicious player that is trying to accumulate this data 338 00:19:18,520 --> 00:19:20,080 Speaker 2: and use it in ways that are not good for 339 00:19:20,119 --> 00:19:20,760 Speaker 2: our society. 340 00:19:21,160 --> 00:19:23,560 Speaker 1: So does that become sort of the equivalent of HIPAA 341 00:19:23,560 --> 00:19:24,879 Speaker 1: for all personal data? 342 00:19:25,720 --> 00:19:28,240 Speaker 2: Yes, exactly. And what we're talking about is a framework 343 00:19:28,280 --> 00:19:32,320 Speaker 2: that actually encapsulates HIPAA. So we want to create one 344 00:19:32,840 --> 00:19:36,320 Speaker 2: level digital playing field for everyone with a set of 345 00:19:36,359 --> 00:19:44,080 Speaker 2: rules that everyone knows and understands, including for healthcare data. 346 00:19:49,880 --> 00:19:52,600 Speaker 1: Hi, this is Newt. In my new book, March to the Majority, 347 00:19:52,640 --> 00:19:56,120 Speaker 1: The Real Story of the Republican Revolution,
I offer strategies 348 00:19:56,160 --> 00:20:00,240 Speaker 1: and insights for everyday citizens and for seasoned politicians, a 349 00:20:00,280 --> 00:20:03,600 Speaker 1: guide for political success and for winning back the Majority 350 00:20:03,600 --> 00:20:06,800 Speaker 1: in twenty twenty four. March to the Majority outlines the 351 00:20:06,880 --> 00:20:11,240 Speaker 1: sixteen year campaign to write the Contract with America. It explains 352 00:20:11,240 --> 00:20:15,080 Speaker 1: how we elected the first Republican House majority in forty years, 353 00:20:15,440 --> 00:20:18,440 Speaker 1: and how we worked with President Bill Clinton to pass 354 00:20:18,640 --> 00:20:23,479 Speaker 1: major reforms, including four consecutive balanced budgets. March to the 355 00:20:23,480 --> 00:20:26,679 Speaker 1: Majority tells the behind the scenes story of how we 356 00:20:26,760 --> 00:20:30,160 Speaker 1: got it done. Here's a special offer for my podcast listeners. 357 00:20:30,520 --> 00:20:33,840 Speaker 1: You can pre-order March to the Majority right now at 358 00:20:33,880 --> 00:20:37,080 Speaker 1: Gingrich three sixty dot com slash book and it'll be shipped 359 00:20:37,119 --> 00:20:40,320 Speaker 1: directly to you on June sixth. Don't miss out on 360 00:20:40,400 --> 00:20:43,159 Speaker 1: the special offer to pre-order my new book today. 361 00:20:43,520 --> 00:20:46,520 Speaker 1: Go to Gingrich three sixty dot com slash book and order 362 00:20:46,560 --> 00:20:50,160 Speaker 1: your copy now. Order it today at Gingrich three sixty dot 363 00:20:50,200 --> 00:21:00,440 Speaker 1: com slash book. What are the countries that are the most 364 00:21:00,480 --> 00:21:04,240 Speaker 1: advanced in developing artificial intelligence capabilities? 365 00:21:04,520 --> 00:21:07,639 Speaker 2: Well, undeniably, the United States is in the lead right now. 366 00:21:07,960 --> 00:21:13,040 Speaker 2: China is right behind us. China passed the United States 367 00:21:13,119 --> 00:21:16,399 Speaker 2: two years ago in the number of PhDs in computer science 368 00:21:16,520 --> 00:21:19,119 Speaker 2: it was graduating every year, and this year it will 369 00:21:19,160 --> 00:21:23,160 Speaker 2: graduate double the number of PhDs in computer science. So 370 00:21:23,400 --> 00:21:26,880 Speaker 2: this is an industry that China has recognized as representative 371 00:21:26,880 --> 00:21:28,719 Speaker 2: of the future, and this is something that they're investing 372 00:21:28,720 --> 00:21:29,199 Speaker 2: heavily in. 373 00:21:29,600 --> 00:21:31,800 Speaker 1: I get the impression from you and others that the 374 00:21:31,840 --> 00:21:35,480 Speaker 1: Europeans in a sense are sort of artificially taking themselves 375 00:21:35,520 --> 00:21:38,680 Speaker 1: out of the game, because their regulatory structure is such 376 00:21:39,280 --> 00:21:41,240 Speaker 1: that it's unlikely that they're going to be the most 377 00:21:41,240 --> 00:21:45,480 Speaker 1: innovative country, or system, since they're a collection of countries. Does that 378 00:21:45,560 --> 00:21:49,480 Speaker 1: basically limit the primary competition to the US and China, 379 00:21:49,560 --> 00:21:52,680 Speaker 1: or are there other players who potentially have this kind 380 00:21:52,720 --> 00:21:53,480 Speaker 1: of capability? 381 00:21:54,200 --> 00:21:56,479 Speaker 2: I think there are some very smart people in Europe, 382 00:21:56,480 --> 00:21:59,480 Speaker 2: and they certainly continue to develop AI.
But I'll tell 383 00:21:59,520 --> 00:22:03,600 Speaker 2: you in general, the forces that are arrayed against the 384 00:22:03,640 --> 00:22:06,520 Speaker 2: deployment of AI, even in ways that are beneficial to 385 00:22:06,600 --> 00:22:10,119 Speaker 2: human society, are the ones that fear disruption. You know, 386 00:22:10,240 --> 00:22:13,600 Speaker 2: we were talking a few minutes ago about how AI 387 00:22:13,920 --> 00:22:17,679 Speaker 2: enables the automated reading of CT scans. That's great for 388 00:22:17,720 --> 00:22:20,960 Speaker 2: everybody except the radiologists. So what would be dangerous is 389 00:22:21,000 --> 00:22:23,840 Speaker 2: to allow the radiologists to get together and say you're 390 00:22:23,840 --> 00:22:26,680 Speaker 2: not allowed to develop AI because we want to keep 391 00:22:26,680 --> 00:22:29,240 Speaker 2: our jobs. That's what the Luddites were doing. You could 392 00:22:29,320 --> 00:22:32,160 Speaker 2: understand their point of view, but as a government, it's 393 00:22:32,280 --> 00:22:35,080 Speaker 2: very dangerous to give in to that kind of thinking because 394 00:22:35,080 --> 00:22:39,080 Speaker 2: it denies the millions and millions of people that aren't 395 00:22:39,160 --> 00:22:42,480 Speaker 2: radiologists the benefits of that new technology. So I think 396 00:22:42,520 --> 00:22:46,359 Speaker 2: that Europe is particularly susceptible to that kind of subversion 397 00:22:46,520 --> 00:22:48,480 Speaker 2: just because of the political system that they have. 398 00:22:49,080 --> 00:22:52,840 Speaker 1: Just a few weeks ago, you introduced the Artificial Intelligence 399 00:22:52,840 --> 00:22:55,959 Speaker 1: for National Security Act. What are you trying to accomplish 400 00:22:55,960 --> 00:22:57,600 Speaker 1: with that, and why do you think it's important? 401 00:22:58,560 --> 00:23:01,639 Speaker 2: Well, what we're trying to enable is the ability for 402 00:23:01,680 --> 00:23:06,359 Speaker 2: the Department of Defense to use advanced AI tools in 403 00:23:06,760 --> 00:23:11,840 Speaker 2: defending against things like cyber attacks from foreign adversaries. As 404 00:23:11,880 --> 00:23:16,080 Speaker 2: it turns out, AI is often the best defense against 405 00:23:16,119 --> 00:23:19,200 Speaker 2: the use of AI to try and disrupt the systems 406 00:23:19,200 --> 00:23:21,719 Speaker 2: of an adversary. So we want to make sure that 407 00:23:21,840 --> 00:23:26,040 Speaker 2: the DoD has the procurement authority to acquire these tools 408 00:23:26,080 --> 00:23:28,399 Speaker 2: as they become available, and that's what the purpose of 409 00:23:28,400 --> 00:23:29,080 Speaker 2: that act is. 410 00:23:29,680 --> 00:23:31,399 Speaker 1: And I just came back from a week in 411 00:23:32,119 --> 00:23:37,240 Speaker 1: Seoul, Korea. We had several briefings that indicated that North 412 00:23:37,320 --> 00:23:41,960 Speaker 1: Korea gets around many of the sanctions because, through cybercrime, 413 00:23:42,560 --> 00:23:45,000 Speaker 1: they're making about a billion two hundred million dollars a 414 00:23:45,080 --> 00:23:47,960 Speaker 1: year to finance their system. Shouldn't there be some way 415 00:23:48,040 --> 00:23:51,640 Speaker 1: that the NSA and others could use artificial intelligence to try 416 00:23:51,680 --> 00:23:55,240 Speaker 1: to close off that whole model of income through theft?
417 00:23:55,800 --> 00:23:58,520 Speaker 2: Yes, I think you're absolutely right, although we will never 418 00:23:58,600 --> 00:24:03,080 Speaker 2: succeed in closing off all of those avenues, because one 419 00:24:03,119 --> 00:24:06,880 Speaker 2: of the ways that countries like North Korea make that 420 00:24:07,040 --> 00:24:12,200 Speaker 2: income is through extortion, where they encrypt people's digital data 421 00:24:12,280 --> 00:24:14,479 Speaker 2: and then charge them for the key to unlock it. 422 00:24:15,000 --> 00:24:18,399 Speaker 2: And that's something that happens because a single employee might 423 00:24:18,400 --> 00:24:21,199 Speaker 2: have clicked on a malicious link in an email that 424 00:24:21,240 --> 00:24:24,120 Speaker 2: they shouldn't have opened. That's not something that AI can 425 00:24:24,200 --> 00:24:26,800 Speaker 2: help you guard against. It's something that's going to require 426 00:24:26,840 --> 00:24:29,760 Speaker 2: better education and some better digital and computer skills from 427 00:24:29,800 --> 00:24:33,359 Speaker 2: our workforce. But certainly things like denial of service attacks, 428 00:24:33,359 --> 00:24:36,800 Speaker 2: where you're trying to take down a website, AI can 429 00:24:36,840 --> 00:24:40,280 Speaker 2: be very sophisticated and useful in detecting a pattern of 430 00:24:40,359 --> 00:24:43,000 Speaker 2: usage that is an attack and not legitimate use. 431 00:24:43,240 --> 00:24:45,320 Speaker 1: The other thing I'm going to ask you about was 432 00:24:45,480 --> 00:24:49,760 Speaker 1: in February you introduced the Protecting Against Compromised Internet of Things 433 00:24:49,840 --> 00:24:52,080 Speaker 1: Technology Act. First of all, I have to say I'm very 434 00:24:52,119 --> 00:24:55,879 Speaker 1: impressed that you've obviously mastered the art of legislating very quickly. 435 00:24:56,320 --> 00:24:58,840 Speaker 1: What are you trying to achieve with this bill on 436 00:24:59,000 --> 00:25:01,560 Speaker 1: protecting against compromised Internet of Things? 437 00:25:02,280 --> 00:25:08,840 Speaker 2: Well, we have federal legislation that protects against the malicious 438 00:25:09,000 --> 00:25:15,480 Speaker 2: insertion of foreign components into things like telecommunications equipment. But 439 00:25:16,040 --> 00:25:20,840 Speaker 2: we are seeing a proliferation of Internet-enabled devices that 440 00:25:21,000 --> 00:25:23,840 Speaker 2: have really permeated every aspect of the way that we 441 00:25:23,840 --> 00:25:27,160 Speaker 2: live our lives. Our refrigerator might tell us that we're 442 00:25:27,160 --> 00:25:29,359 Speaker 2: out of milk. We've got a toaster that tells us 443 00:25:29,400 --> 00:25:31,919 Speaker 2: when the toast is done, the doorbell rings, and we 444 00:25:31,960 --> 00:25:34,000 Speaker 2: have a video of someone even though we're a thousand 445 00:25:34,080 --> 00:25:36,359 Speaker 2: miles away. So all of these, we call that the 446 00:25:36,359 --> 00:25:39,120 Speaker 2: Internet of Things, and we really don't have a federal 447 00:25:39,720 --> 00:25:43,560 Speaker 2: network capable of detecting when a malicious actor might be 448 00:25:43,600 --> 00:25:46,359 Speaker 2: able to use that in ways that are contrary to 449 00:25:46,440 --> 00:25:49,360 Speaker 2: not only data privacy but also our national security.
So 450 00:25:49,680 --> 00:25:52,600 Speaker 2: that's what this bill does, is to establish that list 451 00:25:52,600 --> 00:25:54,520 Speaker 2: of bad actors and to prohibit the use of those 452 00:25:54,520 --> 00:25:56,879 Speaker 2: components in IoT devices. 453 00:25:57,359 --> 00:26:00,280 Speaker 1: Let me say, first of all, how impressed I am with 454 00:26:00,400 --> 00:26:04,880 Speaker 1: your knowledge but also with your ability to provide leadership 455 00:26:04,920 --> 00:26:08,040 Speaker 1: in the legislative branch. I think it's just a remarkable 456 00:26:08,040 --> 00:26:10,600 Speaker 1: asset for the United States to have somebody of your 457 00:26:10,640 --> 00:26:14,000 Speaker 1: knowledge and your technical background in a position to influence 458 00:26:14,040 --> 00:26:16,439 Speaker 1: and be part of the public policy debate. And I 459 00:26:16,520 --> 00:26:19,320 Speaker 1: know how extraordinarily busy you guys are right now, so 460 00:26:19,440 --> 00:26:22,760 Speaker 1: I'm doubly grateful to you personally for having taken the 461 00:26:22,760 --> 00:26:25,000 Speaker 1: time to help educate the rest of us, and I 462 00:26:25,040 --> 00:26:27,320 Speaker 1: look forward very much to chatting with you in the 463 00:26:27,320 --> 00:26:29,160 Speaker 1: future as this continues to evolve. 464 00:26:29,280 --> 00:26:31,720 Speaker 2: Well, thank you, Newt. I very much enjoyed our discussion, 465 00:26:31,760 --> 00:26:33,320 Speaker 2: and yes, let's absolutely do it again. 466 00:26:36,600 --> 00:26:39,600 Speaker 1: Thank you to my guest, Congressman Jay Obernolte. You can 467 00:26:39,680 --> 00:26:43,320 Speaker 1: learn more about artificial intelligence and public policy on our 468 00:26:43,359 --> 00:26:46,719 Speaker 1: show page at newtsworld dot com. Newtsworld is produced by 469 00:26:46,720 --> 00:26:51,359 Speaker 1: Gingrich three sixty and iHeartMedia. Our executive producer is Guernsey Sloan. 470 00:26:51,720 --> 00:26:55,199 Speaker 1: Our researcher is Rachel Peterson. The artwork for the show 471 00:26:55,600 --> 00:26:58,760 Speaker 1: was created by Steve Penley. Special thanks to the team 472 00:26:58,960 --> 00:27:02,199 Speaker 1: at Gingrich three sixty. If you've been enjoying Newtsworld, I 473 00:27:02,240 --> 00:27:05,320 Speaker 1: hope you'll go to Apple Podcasts and both rate us 474 00:27:05,320 --> 00:27:08,840 Speaker 1: with five stars and give us a review so others 475 00:27:08,840 --> 00:27:12,040 Speaker 1: can learn what it's all about. Right now, listeners of 476 00:27:12,080 --> 00:27:15,600 Speaker 1: Newtsworld can sign up for my three free weekly columns 477 00:27:15,800 --> 00:27:20,000 Speaker 1: at Gingrich three sixty dot com slash newsletter. I'm Newt Gingrich. 478 00:27:20,440 --> 00:27:21,520 Speaker 1: This is Newtsworld.