1 00:00:04,200 --> 00:00:09,039 Speaker 1: Welcome to Newt's World podcast on the iHeart Podcast Network. 2 00:00:09,840 --> 00:00:12,879 Speaker 1: You know, one of the most interesting and I think 3 00:00:13,039 --> 00:00:18,439 Speaker 1: understudied aspects of the Trump presidency is his passion for 4 00:00:18,560 --> 00:00:21,919 Speaker 1: building things. Comes out of his childhood, of course, and 5 00:00:22,360 --> 00:00:25,080 Speaker 1: the fact that his entire career he's been out there 6 00:00:25,120 --> 00:00:29,520 Speaker 1: making money building stuff, building hotels. He still has the 7 00:00:29,840 --> 00:00:33,680 Speaker 1: only Hyatt Regency in New York City, a contract 8 00:00:33,720 --> 00:00:37,440 Speaker 1: he cleverly got done very early in his career. He 9 00:00:37,520 --> 00:00:43,240 Speaker 1: is always going around building Trump towers, Trump hotels, Trump 10 00:00:43,280 --> 00:00:47,880 Speaker 1: golf courses. That shouldn't shock us. Now that he's back 11 00:00:47,920 --> 00:00:51,680 Speaker 1: in the presidency, he is going around building things, all 12 00:00:51,720 --> 00:00:54,520 Speaker 1: of which, by the way, will have some imprint of Trump. 13 00:00:55,120 --> 00:00:58,280 Speaker 1: My two favorites right now are the fight with the 14 00:00:58,320 --> 00:01:02,560 Speaker 1: Federal Court, which I think is almost certainly wrong on 15 00:01:02,600 --> 00:01:06,039 Speaker 1: the court's side, about whether or not Trump can build a 16 00:01:06,040 --> 00:01:09,680 Speaker 1: new ballroom. He concluded that there was no place in 17 00:01:09,720 --> 00:01:12,160 Speaker 1: the White House where you could have a big enough 18 00:01:12,160 --> 00:01:16,559 Speaker 1: event to have people come and be dressed in really 19 00:01:16,640 --> 00:01:20,000 Speaker 1: fine outfits and have a big evening. And so, in 20 00:01:20,080 --> 00:01:23,200 Speaker 1: a Trump-like manner, he tore down the East Wing 21 00:01:23,800 --> 00:01:28,240 Speaker 1: and is now putting in a brand new, huge ballroom. 22 00:01:28,720 --> 00:01:34,800 Speaker 1: He also, however, is putting in an entire safe area below 23 00:01:34,959 --> 00:01:38,800 Speaker 1: the ballroom. So once they had created the space, they 24 00:01:38,840 --> 00:01:43,000 Speaker 1: talked to the military about how to create a really safe, 25 00:01:43,520 --> 00:01:48,760 Speaker 1: survivable center for the presidency in the East Wing where 26 00:01:48,800 --> 00:01:51,600 Speaker 1: one never existed before. And that's going to be part 27 00:01:51,680 --> 00:01:56,080 Speaker 1: of this. Now what's fascinating is Trump, I think, intuited 28 00:01:56,760 --> 00:02:01,240 Speaker 1: that if he followed the tradition, the tradition which, for example, 29 00:02:01,640 --> 00:02:04,919 Speaker 1: Jackie Kennedy had followed when she created the Rose Garden, 30 00:02:05,480 --> 00:02:09,480 Speaker 1: then it would all be paid for privately. So he 31 00:02:09,480 --> 00:02:12,040 Speaker 1: didn't go to the Congress and ask for money. He 32 00:02:12,120 --> 00:02:16,160 Speaker 1: simply raised the money to build this entire thing. And 33 00:02:16,160 --> 00:02:18,000 Speaker 1: it's a lot of money. It's a big project. 34 00:02:18,520 --> 00:02:21,440 Speaker 1: And now this federal judge is saying, well, you can't 35 00:02:21,440 --> 00:02:25,040 Speaker 1: do this without congressional approval.
But the fact is that 36 00:02:25,120 --> 00:02:28,680 Speaker 1: Trump's lawyers had looked at all this, and historically 37 00:02:29,120 --> 00:02:33,480 Speaker 1: there is no congressional involvement in the White House if 38 00:02:33,520 --> 00:02:38,120 Speaker 1: it's being done privately. So there's an ongoing fight developing here. 39 00:02:38,600 --> 00:02:41,080 Speaker 1: And of course, being Trump, he went ahead without 40 00:02:41,040 --> 00:02:43,840 Speaker 1: asking permission. He didn't go to court first. He just 41 00:02:43,880 --> 00:02:46,680 Speaker 1: went out, tore down the building, and began rebuilding, which 42 00:02:46,720 --> 00:02:49,160 Speaker 1: is something he's done in a number of his properties 43 00:02:49,200 --> 00:02:51,639 Speaker 1: over the years. I think his attitude is, look, there's 44 00:02:51,680 --> 00:02:54,200 Speaker 1: this huge hole in the ground, and I'm prepared to 45 00:02:54,240 --> 00:02:56,960 Speaker 1: pay to fill it, and if you guys want to mess 46 00:02:57,000 --> 00:02:59,200 Speaker 1: around with it, you're going to have just a total 47 00:02:59,240 --> 00:03:02,560 Speaker 1: disaster on your hands. So I suspect in the end, 48 00:03:03,000 --> 00:03:06,400 Speaker 1: the court will be overruled. The Supreme Court will decide 49 00:03:06,720 --> 00:03:10,600 Speaker 1: that in fact, the tradition that presidents can do with 50 00:03:10,680 --> 00:03:13,079 Speaker 1: the White House what they want to as long as 51 00:03:13,080 --> 00:03:16,720 Speaker 1: they pay for it still stands, and Trump will, before 52 00:03:16,760 --> 00:03:20,120 Speaker 1: he leaves office, have his brand new ballroom. In fact, 53 00:03:20,160 --> 00:03:22,320 Speaker 1: I wouldn't be shocked to see them get it done 54 00:03:22,360 --> 00:03:25,239 Speaker 1: in time to have at least one really big event 55 00:03:25,680 --> 00:03:30,000 Speaker 1: before he goes back into private life. Meanwhile, in the 56 00:03:30,080 --> 00:03:33,640 Speaker 1: last couple of days, his son Eric, who really runs 57 00:03:33,680 --> 00:03:37,000 Speaker 1: the day-to-day company, has announced a mock-up 58 00:03:37,560 --> 00:03:41,360 Speaker 1: of the Trump Presidential Library. Now you may know that 59 00:03:41,680 --> 00:03:46,160 Speaker 1: Obama has built a large and fairly ugly building in 60 00:03:46,280 --> 00:03:51,360 Speaker 1: Chicago, which kind of stands out. But Obama, never 61 00:03:51,440 --> 00:03:55,200 Speaker 1: having been in construction but being a lawyer by trade, 62 00:03:55,600 --> 00:03:58,960 Speaker 1: lacked the understanding of what you really could get done. 63 00:03:59,400 --> 00:04:02,480 Speaker 1: And if you look at the pictures, they're talking about putting 64 00:04:02,600 --> 00:04:06,280 Speaker 1: up a gigantic skyscraper, which will of course have at the 65 00:04:06,360 --> 00:04:10,600 Speaker 1: very top of it Trump. So the Trump Library south 66 00:04:10,680 --> 00:04:15,240 Speaker 1: of Mar-a-Lago fits the tradition of Trump Tower, Trump 67 00:04:15,360 --> 00:04:19,479 Speaker 1: National Golf Courses worldwide. My guess is there must be 68 00:04:20,160 --> 00:04:23,080 Speaker 1: eighty or ninety places now that have Trump's name on them, 69 00:04:23,560 --> 00:04:26,120 Speaker 1: and he just cheerfully goes along doing it.
And of 70 00:04:26,160 --> 00:04:29,640 Speaker 1: course the library again will be built with private funds, 71 00:04:30,040 --> 00:04:33,039 Speaker 1: so he doesn't actually care what the political system thinks 72 00:04:33,040 --> 00:04:38,279 Speaker 1: of it. But it's so perfectly, classically Trump. It's going 73 00:04:38,360 --> 00:04:41,040 Speaker 1: to be big, it's going to be bold, it's going 74 00:04:41,120 --> 00:04:43,840 Speaker 1: to be visible, and it's going to have Trump on it. 75 00:04:44,360 --> 00:04:46,760 Speaker 1: Watching all this, if you have a sense of humor, 76 00:04:47,160 --> 00:04:53,360 Speaker 1: sometimes it's pretty darn funny. Coming up: AI, hype or hope, 77 00:04:53,680 --> 00:04:58,720 Speaker 1: promise or peril, existential threat or the next great technological 78 00:04:58,800 --> 00:05:02,760 Speaker 1: leap towards prosperity and human flourishing. I'm going to 79 00:05:02,800 --> 00:05:06,479 Speaker 1: talk with Wynton Hall about his new book, Code Red. 80 00:05:07,000 --> 00:05:16,440 Speaker 1: That's next. I'm really pleased to welcome my guest, Wynton Hall, 81 00:05:16,920 --> 00:05:20,800 Speaker 1: Breitbart News Director of Social Media, Distinguished Fellow at 82 00:05:20,839 --> 00:05:26,160 Speaker 1: Peter Schweizer's Government Accountability Institute and former visiting fellow at 83 00:05:26,200 --> 00:05:31,200 Speaker 1: Stanford University's Hoover Institution. He's joining me to discuss his 84 00:05:31,320 --> 00:05:36,880 Speaker 1: new book, Code Red: The Left, the Right, China, and 85 00:05:36,920 --> 00:05:41,000 Speaker 1: the Race to Control AI. Wynton, welcome. Thank you for 86 00:05:41,080 --> 00:05:42,080 Speaker 1: joining me on Newt's World. 87 00:05:42,400 --> 00:05:44,360 Speaker 2: Oh, it's a pleasure to be with you, mister speaker. 88 00:05:44,400 --> 00:05:44,719 Speaker 2: Thank you. 89 00:05:45,760 --> 00:05:48,359 Speaker 1: I think this is a very important book. And you 90 00:05:48,560 --> 00:05:55,400 Speaker 1: describe AI as potentially the defining national security challenge of 91 00:05:55,480 --> 00:05:57,800 Speaker 1: the next decade. What do you mean by that? 92 00:05:57,680 --> 00:05:59,440 Speaker 2: Mister Speaker, 93 00:05:59,480 --> 00:06:01,599 Speaker 3: I think the thing that we have to realize is 94 00:06:01,600 --> 00:06:04,400 Speaker 3: that AI, we often hear it described as a tool. It 95 00:06:04,440 --> 00:06:07,720 Speaker 3: is a tool, but it's also political power, and increasingly 96 00:06:07,839 --> 00:06:10,280 Speaker 3: within the AI warfare space, I think it is going 97 00:06:10,360 --> 00:06:13,880 Speaker 3: to become essential. When we talk about beating China, I 98 00:06:13,880 --> 00:06:17,200 Speaker 3: think we can beat China without becoming China, maintaining our values. 99 00:06:17,240 --> 00:06:18,680 Speaker 2: We're seeing the use of 100 00:06:18,720 --> 00:06:22,480 Speaker 3: AI right now in Iran and certainly in the Maduro raid.
101 00:06:22,800 --> 00:06:24,919 Speaker 3: And I think when you talk to folks in the 102 00:06:24,920 --> 00:06:28,400 Speaker 3: military and leadership in the defense tech space, they really 103 00:06:28,480 --> 00:06:32,919 Speaker 3: understand and explain that having that speed and efficiency, and also, 104 00:06:33,000 --> 00:06:36,200 Speaker 3: as we move toward things like RSI, recursive self-improvement, 105 00:06:36,440 --> 00:06:39,720 Speaker 3: which is of course the AI potentially, theoretically, one day 106 00:06:39,800 --> 00:06:43,159 Speaker 3: being able to constantly update and improve itself, that whoever 107 00:06:43,360 --> 00:06:46,599 Speaker 3: gains that advantage is going to have full-spectrum battlefield 108 00:06:46,640 --> 00:06:51,560 Speaker 3: dominance in things like cybersecurity, encryption, hacking of missile systems, 109 00:06:51,600 --> 00:06:55,400 Speaker 3: hacking of infrastructure. So we really have to understand that 110 00:06:55,440 --> 00:06:58,000 Speaker 3: we are in a real sprint, and I think thankfully 111 00:06:58,080 --> 00:07:01,080 Speaker 3: you're seeing a lot of leadership around that, whether you're 112 00:07:01,120 --> 00:07:04,440 Speaker 3: talking about the War Department, President Trump, and just military 113 00:07:04,520 --> 00:07:09,560 Speaker 3: leadership who have long understood that technological innovation brings battlefield 114 00:07:09,640 --> 00:07:12,880 Speaker 3: opportunities for us to keep not just our soldiers, sailors, airmen, 115 00:07:12,920 --> 00:07:15,280 Speaker 3: and marines safe but here at home as well, as 116 00:07:15,360 --> 00:07:17,440 Speaker 3: asymmetric threats start to emerge. 117 00:07:17,680 --> 00:07:22,480 Speaker 1: We've already seen a pretty big jump in both Pentagon 118 00:07:22,560 --> 00:07:28,240 Speaker 1: spending and general government spending on AI-related contracts. Are 119 00:07:28,280 --> 00:07:29,720 Speaker 1: we doing enough in that direction? 120 00:07:30,640 --> 00:07:32,680 Speaker 3: I think we should do more, and I think that 121 00:07:33,320 --> 00:07:35,840 Speaker 3: one of the things that's important is a lot of 122 00:07:35,880 --> 00:07:39,640 Speaker 3: the traditional warfare machinery that we've seen, we know that 123 00:07:39,680 --> 00:07:43,520 Speaker 3: the costs are exceptionally high, and now with the AI innovation, 124 00:07:43,640 --> 00:07:46,760 Speaker 3: that cost curve can be brought down. And also the 125 00:07:46,840 --> 00:07:49,640 Speaker 3: efficiency can be brought up. One of the advantages, not 126 00:07:49,720 --> 00:07:52,400 Speaker 3: just in the actual weapons themselves, but of course, is 127 00:07:52,400 --> 00:07:55,640 Speaker 3: in the intel process for AI, because if there's one 128 00:07:55,640 --> 00:07:58,920 Speaker 3: thing that AI really excels in, as you know, it's 129 00:07:59,280 --> 00:08:02,239 Speaker 3: massive pattern recognition, and so you'll be able to take 130 00:08:02,360 --> 00:08:07,680 Speaker 3: all of that intercepted communication, whether it's satellites or images or surveillance, 131 00:08:07,960 --> 00:08:10,320 Speaker 3: and be able to sift and sort that in what 132 00:08:10,360 --> 00:08:13,440 Speaker 3: would have taken months, if not longer, with teams of 133 00:08:13,560 --> 00:08:16,480 Speaker 3: hundreds or thousands, and you're now able to do that 134 00:08:16,680 --> 00:08:19,400 Speaker 3: in a matter of days with just a smaller team.
135 00:08:19,520 --> 00:08:21,440 Speaker 2: So it's one of those opportunities. 136 00:08:21,480 --> 00:08:23,800 Speaker 3: I think it also extends further. I have a chapter in this 137 00:08:23,880 --> 00:08:26,200 Speaker 3: book about the domestic side as well, not just 138 00:08:26,240 --> 00:08:30,520 Speaker 3: the foreign space, of how we can use AI towards 139 00:08:30,600 --> 00:08:33,800 Speaker 3: a conservative vision of a smaller, more efficient government 140 00:08:33,840 --> 00:08:35,920 Speaker 3: as well. So I think there are opportunities on both 141 00:08:35,960 --> 00:08:38,000 Speaker 3: sides of it, and I think that if we can 142 00:08:38,040 --> 00:08:41,000 Speaker 3: expand on that on the military side, it will be 143 00:08:41,040 --> 00:08:42,000 Speaker 3: to our advantage. 144 00:08:42,160 --> 00:08:47,160 Speaker 1: AI has the potential both to be an empowering tool 145 00:08:48,160 --> 00:08:51,160 Speaker 1: in dramatically improving a lot of what we already do, 146 00:08:51,960 --> 00:08:57,040 Speaker 1: but also potentially inventing, if you will, a virtually new 147 00:08:57,160 --> 00:09:02,640 Speaker 1: battlefield of speed and complexity unlike anything we've ever seen before, 148 00:09:02,960 --> 00:09:06,160 Speaker 1: and that these are actually kind of two parallel areas. 149 00:09:06,200 --> 00:09:08,520 Speaker 1: That is, if you just take defense logistics, or you 150 00:09:08,520 --> 00:09:11,679 Speaker 1: could take the defense accounting system, since they've never had 151 00:09:11,720 --> 00:09:14,920 Speaker 1: a successful audit of the Pentagon, you know, with AI 152 00:09:15,000 --> 00:09:17,320 Speaker 1: you should be able to do that. But at the 153 00:09:17,360 --> 00:09:23,640 Speaker 1: same time, a military which understands the rhythm and capability 154 00:09:23,640 --> 00:09:28,920 Speaker 1: of the AI may be able to generate combat capabilities 155 00:09:29,679 --> 00:09:33,360 Speaker 1: in ways that would literally be unthinkable for people that 156 00:09:33,440 --> 00:09:36,280 Speaker 1: don't have that kind of advantage. Aren't these two sort 157 00:09:36,320 --> 00:09:39,840 Speaker 1: of parallel but very different applications? 158 00:09:40,640 --> 00:09:41,320 Speaker 2: They really are. 159 00:09:41,480 --> 00:09:43,400 Speaker 3: Right. So, well, on the one hand, you're saying and 160 00:09:43,480 --> 00:09:46,600 Speaker 3: talking about the efficiency side of just the logistics, right, 161 00:09:46,640 --> 00:09:49,839 Speaker 3: the actual functions, and there I think there's no doubt, right, 162 00:09:49,880 --> 00:09:51,960 Speaker 3: that's almost in many ways like you would think of 163 00:09:52,040 --> 00:09:55,240 Speaker 3: in a corporate setting or in a large institutional setting 164 00:09:55,440 --> 00:09:58,400 Speaker 3: being able to have just the rudimentary nuts and bolts 165 00:09:58,400 --> 00:10:01,920 Speaker 3: of motion and logistics improve, and I think there's 166 00:10:01,920 --> 00:10:03,160 Speaker 3: no question on that. 167 00:10:03,559 --> 00:10:05,040 Speaker 2: The second piece though, on 168 00:10:04,960 --> 00:10:07,520 Speaker 3: the speed part of the actual battlefield, I think that 169 00:10:07,760 --> 00:10:11,040 Speaker 3: becomes very important, particularly as we look at the race 170 00:10:11,080 --> 00:10:13,360 Speaker 3: with China that we often hear about. I mean, you 171 00:10:13,400 --> 00:10:15,800 Speaker 3: see that in two directions, right.
One, I think, is 172 00:10:15,880 --> 00:10:18,520 Speaker 3: just the economics of it. Right, one third of the 173 00:10:18,600 --> 00:10:22,000 Speaker 3: S and P five hundred is constituted around that Mag Seven, 174 00:10:22,040 --> 00:10:25,080 Speaker 3: the Magnificent Seven, our biggest seven tech companies. 175 00:10:25,240 --> 00:10:27,679 Speaker 2: And so just on an economic scale, we see that 176 00:10:27,880 --> 00:10:29,599 Speaker 2: we saw what DeepSeek's 177 00:10:29,280 --> 00:10:32,199 Speaker 3: R1 model did from China in the biggest one-day 178 00:10:32,240 --> 00:10:34,880 Speaker 3: wipeout in American history for one single company, and that 179 00:10:35,000 --> 00:10:37,920 Speaker 3: was Nvidia, at about six hundred billion dollars. So 180 00:10:37,960 --> 00:10:40,640 Speaker 3: on the economic side, it becomes very important to win. 181 00:10:40,920 --> 00:10:43,160 Speaker 3: On the military side though, we know that, you know, 182 00:10:43,240 --> 00:10:45,720 Speaker 3: China has said and put down a marker from twenty 183 00:10:45,760 --> 00:10:48,960 Speaker 3: seventeen to twenty thirty that they want to be dominant 184 00:10:48,960 --> 00:10:51,360 Speaker 3: in this space. So you're absolutely right that we can 185 00:10:51,400 --> 00:10:54,280 Speaker 3: gain those efficiencies. But if we have that leadership push 186 00:10:54,559 --> 00:10:58,080 Speaker 3: to really accelerate on the military side, both in terms 187 00:10:58,120 --> 00:11:00,880 Speaker 3: of logistics as well as battlefield dominance, I 188 00:11:00,880 --> 00:11:02,479 Speaker 3: think all that accrues economically. 189 00:11:02,760 --> 00:11:08,160 Speaker 1: From your research, how many countries are potentially at the 190 00:11:08,200 --> 00:11:12,160 Speaker 1: cutting edge of developing artificial intelligence? 191 00:11:12,040 --> 00:11:14,920 Speaker 3: Yeah, there are many players, but thankfully right now the 192 00:11:14,960 --> 00:11:18,160 Speaker 3: American enterprise labs are in the lead, and we will 193 00:11:18,160 --> 00:11:21,439 Speaker 3: hope to maintain that lead. China is usually referred to 194 00:11:21,480 --> 00:11:24,760 Speaker 3: as the major dominant competitor in that space, in that regard, 195 00:11:25,120 --> 00:11:27,760 Speaker 3: and they really are, right. It's not that other countries 196 00:11:27,800 --> 00:11:31,040 Speaker 3: don't have AI labs that are efficient and are doing 197 00:11:31,080 --> 00:11:34,760 Speaker 3: good work, but honestly, quite frankly, their regulatory schema has 198 00:11:34,800 --> 00:11:38,800 Speaker 3: really slowed those other countries who have pursued a more 199 00:11:39,160 --> 00:11:43,360 Speaker 3: regulation-heavy approach as opposed to our lighter 200 00:11:43,080 --> 00:11:45,840 Speaker 2: touch with that. Now China doesn't really have to worry. 201 00:11:46,160 --> 00:11:49,600 Speaker 3: Xi Jinping sort of has a one-way decision-making chain. 202 00:11:49,640 --> 00:11:50,920 Speaker 2: Obviously, we do not want 203 00:11:50,760 --> 00:11:55,160 Speaker 3: to pursue a CCP authoritarian-style surveillance state or anything 204 00:11:55,280 --> 00:11:59,040 Speaker 3: like it.
But nevertheless, that's why we often hear, you know, 205 00:11:59,120 --> 00:12:02,440 Speaker 3: leaders like Sam Altman, Mustafa Suleyman, Dario Amodei, 206 00:12:02,480 --> 00:12:05,680 Speaker 3: and others frame it around China, because they do know 207 00:12:05,760 --> 00:12:07,040 Speaker 3: that that's their main competitor. 208 00:12:07,360 --> 00:12:10,920 Speaker 1: It strikes me that one of the side effects of 209 00:12:10,960 --> 00:12:15,959 Speaker 1: AI may be that it actually empowers kind of middle-level 210 00:12:16,040 --> 00:12:19,959 Speaker 1: countries with levels of capability they could never have gotten otherwise. 211 00:12:20,200 --> 00:12:23,840 Speaker 1: I'm wondering to what extent we're going to see sort 212 00:12:23,880 --> 00:12:30,960 Speaker 1: of mid-sized countries actually have very formidable militaries by 213 00:12:31,000 --> 00:12:34,240 Speaker 1: the application of this kind of artificial intelligence. 214 00:12:35,000 --> 00:12:37,960 Speaker 3: Yes, mister speaker. I mean, you're the ultimate historian, and 215 00:12:38,000 --> 00:12:40,599 Speaker 3: you could probably explain it even deeper. But one of 216 00:12:40,640 --> 00:12:44,000 Speaker 3: the things you're hitting on is so important. The democratization 217 00:12:44,320 --> 00:12:47,959 Speaker 3: of artificial intelligence is going to be a force multiplier 218 00:12:48,040 --> 00:12:51,439 Speaker 3: for people that are either restrained because of scale, of 219 00:12:51,520 --> 00:12:55,800 Speaker 3: technological limits, and/or budgetary limits, and so in that way, 220 00:12:55,840 --> 00:12:58,920 Speaker 3: it becomes a catapult, right. It gets you more for less. 221 00:12:59,280 --> 00:13:01,800 Speaker 3: It helps you to be able to do more with less, 222 00:13:02,160 --> 00:13:05,319 Speaker 3: and that's both militarily as well as in things like education. 223 00:13:05,480 --> 00:13:09,480 Speaker 3: For example, I mean, having AI tutors for a developing 224 00:13:09,559 --> 00:13:12,600 Speaker 3: country for a ninety-nine-dollar tablet and having the 225 00:13:12,640 --> 00:13:15,880 Speaker 3: ability to have machine learning improvements, that's going to really 226 00:13:16,080 --> 00:13:18,959 Speaker 3: give gains and real opportunities there as well. 227 00:13:19,280 --> 00:13:21,079 Speaker 2: So I think you're exactly right. Now, 228 00:13:21,200 --> 00:13:23,679 Speaker 3: the darker side, of course, is that that 229 00:13:23,679 --> 00:13:27,960 Speaker 3: democratization, that bringing that cost scale down and access 230 00:13:28,000 --> 00:13:31,160 Speaker 3: up, will also obviously enable the potential for non-state actors 231 00:13:31,480 --> 00:13:34,920 Speaker 3: to use AI, whether you're talking about the propaganda side 232 00:13:34,920 --> 00:13:37,280 Speaker 3: and deepfakes, which we've just been seeing right now 233 00:13:37,320 --> 00:13:42,320 Speaker 3: happen in real time, or the actual building of biological or chemical weapons. 234 00:13:42,360 --> 00:13:46,679 Speaker 3: When you have PhD-level intelligence and an unmetered level 235 00:13:46,679 --> 00:13:51,160 Speaker 3: of intelligence at relatively low cost, with reasoning models and advanced 236 00:13:51,160 --> 00:13:54,480 Speaker 3: models and AI, then you would have, if you had determined, 237 00:13:54,600 --> 00:13:58,040 Speaker 3: you know, jihadist forces or things of that sort, greater fears.
238 00:13:58,080 --> 00:14:00,360 Speaker 3: The other thing, of course, is with drones and facial 239 00:14:00,400 --> 00:14:04,040 Speaker 3: recognition technology, now you're able to do what would have 240 00:14:04,080 --> 00:14:07,400 Speaker 3: taken a ten-x component in budget and skill levels 241 00:14:07,640 --> 00:14:09,960 Speaker 3: for far less, and so it really is going to 242 00:14:10,000 --> 00:14:12,360 Speaker 3: be like, you know, fire: you can either help warm 243 00:14:12,400 --> 00:14:15,080 Speaker 3: a civilization or burn it down, depending on who uses it and 244 00:14:15,080 --> 00:14:15,800 Speaker 3: how they decide. 245 00:14:15,800 --> 00:14:17,160 Speaker 2: But you're absolutely right. 246 00:14:17,280 --> 00:14:21,880 Speaker 3: Those middle- and lower-tier countries or actors or groups, they're 247 00:14:21,680 --> 00:14:23,240 Speaker 2: going to now have a lot more power for a 248 00:14:23,240 --> 00:14:24,040 Speaker 2: lot less money. 249 00:14:25,640 --> 00:14:28,480 Speaker 1: Next we're going to discuss the power of artificial intelligence 250 00:14:28,840 --> 00:14:38,360 Speaker 1: in modern warfare. I look at things like the Ukrainian 251 00:14:39,360 --> 00:14:44,640 Speaker 1: use of underwater drones. You have a fifty thousand dollar 252 00:14:44,720 --> 00:14:49,240 Speaker 1: drone taking out a thirteen million dollar ship. It's something 253 00:14:49,280 --> 00:14:51,440 Speaker 1: we have to worry about. I mean, we've had a 254 00:14:51,480 --> 00:14:56,359 Speaker 1: tendency to go to very very big, very very expensive 255 00:14:56,360 --> 00:15:01,200 Speaker 1: ships, and we only have a handful of aircraft carriers. 256 00:15:01,240 --> 00:15:04,240 Speaker 1: You could certainly run into an opponent who has an ability 257 00:15:04,280 --> 00:15:06,720 Speaker 1: to get through your various defenses. And part of the 258 00:15:06,800 --> 00:15:10,360 Speaker 1: challenge there is cost. I mean, there are drones out 259 00:15:10,400 --> 00:15:15,080 Speaker 1: there between five and fifty thousand dollars being shot at 260 00:15:15,200 --> 00:15:19,920 Speaker 1: by missiles that cost a million dollars. You can't sustain 261 00:15:20,000 --> 00:15:20,880 Speaker 1: that exchange rate. 262 00:15:21,600 --> 00:15:24,720 Speaker 3: That's exactly right. Palmer Luckey at Anduril talks about that 263 00:15:24,840 --> 00:15:27,520 Speaker 3: a lot. When you look at how much money, if, 264 00:15:27,560 --> 00:15:29,640 Speaker 3: like you say, you're sending a million dollar missile 265 00:15:29,960 --> 00:15:33,160 Speaker 3: to knock down something that you know is fifty thousand dollars, 266 00:15:33,480 --> 00:15:36,240 Speaker 3: that math doesn't work in your favor very long. And 267 00:15:36,320 --> 00:15:38,720 Speaker 3: I think it also encourages us to really stay at 268 00:15:38,720 --> 00:15:41,760 Speaker 3: that bleeding edge and realize that we've got to innovate 269 00:15:41,800 --> 00:15:45,040 Speaker 3: in these spaces because this battlescape is moving very quickly. 270 00:15:45,400 --> 00:15:47,880 Speaker 1: I'm going to give you two problems. One, that our 271 00:15:48,880 --> 00:15:54,480 Speaker 1: procurement systems aren't designed to match the speed of evolution 272 00:15:55,480 --> 00:16:01,000 Speaker 1: that we're seeing with artificial intelligence.
And second, that artificial 273 00:16:01,000 --> 00:16:04,560 Speaker 1: intelligence is potentially going to create an entirely new scale 274 00:16:04,600 --> 00:16:09,320 Speaker 1: of mass that is the opposite of how we've designed 275 00:16:09,360 --> 00:16:12,560 Speaker 1: our war-fighting capabilities. These both seem to 276 00:16:12,600 --> 00:16:15,800 Speaker 1: me really, really big challenges for the United States. 277 00:16:16,560 --> 00:16:18,360 Speaker 2: I couldn't agree with you more, mister speaker. 278 00:16:18,600 --> 00:16:21,040 Speaker 3: You want to think of being able to sprint and 279 00:16:21,120 --> 00:16:22,520 Speaker 3: chew gum at the same time. 280 00:16:22,640 --> 00:16:25,320 Speaker 2: But I think you're right. The calcification of a lot 281 00:16:25,120 --> 00:16:29,120 Speaker 3: of institutional sort of inertia that sets in prevents some 282 00:16:29,160 --> 00:16:31,480 Speaker 3: people from being able to give up certain turf and 283 00:16:31,520 --> 00:16:34,680 Speaker 3: so forth. I think we have to have this speed 284 00:16:34,840 --> 00:16:36,400 Speaker 3: mindset that you're talking about. 285 00:16:36,680 --> 00:16:38,320 Speaker 2: And I'm not suggesting 286 00:16:37,880 --> 00:16:39,880 Speaker 3: for a minute that, you know, we're going to do 287 00:16:40,000 --> 00:16:43,480 Speaker 3: away with any of what exists in totality, but I 288 00:16:43,480 --> 00:16:44,760 Speaker 3: think we can be doing both. 289 00:16:44,840 --> 00:16:45,920 Speaker 2: And when you look 290 00:16:45,720 --> 00:16:47,880 Speaker 3: at the fact of how fast this is moving, I mean, 291 00:16:48,440 --> 00:16:51,280 Speaker 3: just look at even what AI's architects, the way 292 00:16:51,320 --> 00:16:54,560 Speaker 3: they describe their own technology. Sam Altman says there 293 00:16:54,640 --> 00:16:58,760 Speaker 3: is a private betting pool among the AI billionaires of 294 00:16:58,840 --> 00:17:03,560 Speaker 3: who will create the first one-person billion-dollar company. 295 00:17:03,840 --> 00:17:05,280 Speaker 3: Now what does that suggest to you? 296 00:17:05,560 --> 00:17:08,200 Speaker 2: Right? It suggests that you can do so much more 297 00:17:08,359 --> 00:17:09,480 Speaker 2: with so much less. 298 00:17:09,560 --> 00:17:11,479 Speaker 3: And I think that if we know our enemies are 299 00:17:11,480 --> 00:17:14,000 Speaker 3: thinking that way, then we want to be thinking about 300 00:17:14,000 --> 00:17:15,800 Speaker 3: that so that we can get in front of that 301 00:17:15,880 --> 00:17:18,160 Speaker 3: curve instead of staying behind it. I think the other 302 00:17:18,200 --> 00:17:22,080 Speaker 3: thing that you're talking about is even within the procurement process, right, 303 00:17:22,440 --> 00:17:24,760 Speaker 3: let's take weapons out of it for a moment, 304 00:17:25,000 --> 00:17:28,679 Speaker 3: just the actual flow of being able to spot fraud, 305 00:17:28,960 --> 00:17:31,919 Speaker 3: double billing, cost cutting, all of that, and I go 306 00:17:32,000 --> 00:17:34,280 Speaker 3: through actually in Code Red how we can do that 307 00:17:34,359 --> 00:17:36,280 Speaker 3: and we should do that. I did a lot of research 308 00:17:36,320 --> 00:17:41,639 Speaker 3: into that procurement process.
That alone will provide enormous savings 309 00:17:41,640 --> 00:17:44,040 Speaker 3: that we can then use to do the kind of 310 00:17:44,080 --> 00:17:47,439 Speaker 3: innovations that you're talking about or put back into general 311 00:17:47,480 --> 00:17:50,480 Speaker 3: budgetary usage. So I think that we've got to get 312 00:17:50,480 --> 00:17:52,119 Speaker 3: in that mindset. I mean, one of the things, I 313 00:17:52,240 --> 00:17:55,120 Speaker 3: have been a little bit encouraged by the President's AI 314 00:17:55,200 --> 00:17:57,600 Speaker 3: Action Plan that we saw, the twenty twenty five AI 315 00:17:57,720 --> 00:18:01,440 Speaker 3: Action Plan that came out with very specific pillars inside 316 00:18:01,440 --> 00:18:04,680 Speaker 3: of that plan, and they were talking about this need 317 00:18:04,720 --> 00:18:07,920 Speaker 3: to be able to move quickly on this, because nothing 318 00:18:07,960 --> 00:18:11,000 Speaker 3: matters more than our national security, for the safety of 319 00:18:11,040 --> 00:18:14,240 Speaker 3: our country and certainly our soldiers, sailors, airmen, and marines. 320 00:18:14,480 --> 00:18:17,399 Speaker 3: So we, I think, recognize this has to happen. I 321 00:18:17,440 --> 00:18:19,600 Speaker 3: think it's going to really take people like yourself and 322 00:18:19,640 --> 00:18:22,560 Speaker 3: others who are able to speak that language with authority 323 00:18:22,640 --> 00:18:25,400 Speaker 3: to really explain this five-D chess matrix and why 324 00:18:25,440 --> 00:18:26,320 Speaker 3: we need to be doing this. 325 00:18:26,920 --> 00:18:30,639 Speaker 1: When you think about all that, a couple big issues, 326 00:18:30,680 --> 00:18:35,000 Speaker 1: as you know, one is this question about truly autonomous weapons. 327 00:18:35,440 --> 00:18:38,679 Speaker 1: Given your studies, where do you come down on the 328 00:18:38,720 --> 00:18:41,800 Speaker 1: development of genuinely autonomous systems? 329 00:18:42,480 --> 00:18:44,600 Speaker 2: Well, I would love your historical background. 330 00:18:44,640 --> 00:18:46,320 Speaker 3: One of the fascinating things that comes through in the 331 00:18:46,359 --> 00:18:49,560 Speaker 3: research is, you know, autonomous weapons, depending on how you're 332 00:18:49,600 --> 00:18:53,000 Speaker 3: defining autonomous weapons, have been around for a long time. 333 00:18:53,119 --> 00:18:55,800 Speaker 3: And one of the things that Palmer Luckey says is 334 00:18:55,880 --> 00:18:57,760 Speaker 3: that he had this great phrase. He says, you know, 335 00:18:57,800 --> 00:19:00,679 Speaker 3: I don't think there's a lot of moral authority in 336 00:19:00,760 --> 00:19:03,960 Speaker 3: a landmine, in other words, something that can be stepped 337 00:19:04,000 --> 00:19:07,479 Speaker 3: on and activated without a human input that is taking 338 00:19:07,520 --> 00:19:10,040 Speaker 3: a human life or, God forbid, one of our own. 339 00:19:10,359 --> 00:19:13,080 Speaker 2: So I think that the moral consequence needs to be there. 340 00:19:13,440 --> 00:19:16,359 Speaker 3: I think that, you know, having a kill switch is 341 00:19:16,600 --> 00:19:19,440 Speaker 3: very much a part of that sort of baked-in situation, 342 00:19:19,560 --> 00:19:22,119 Speaker 3: but being able to go to full auto.
One of 343 00:19:22,119 --> 00:19:24,159 Speaker 3: the things that was fascinating when you really look at 344 00:19:24,200 --> 00:19:26,560 Speaker 3: what defense experts will tell you, and you study this, 345 00:19:27,000 --> 00:19:30,240 Speaker 3: they say, look, if we're in a metaphorical gunslinger duel 346 00:19:30,560 --> 00:19:33,480 Speaker 3: and one side has the equivalent of a metaphorical ICBM, 347 00:19:33,560 --> 00:19:36,480 Speaker 3: in other words, full auto, and you have a slingshot, 348 00:19:36,840 --> 00:19:39,320 Speaker 3: if you have to stop and pause to pull that 349 00:19:39,400 --> 00:19:42,960 Speaker 3: trigger and have the human decision, it's over before it's begun. 350 00:19:43,160 --> 00:19:45,879 Speaker 3: And so we're going to get into this situation where, 351 00:19:46,520 --> 00:19:49,800 Speaker 3: having to pause and have the human green light, if 352 00:19:49,800 --> 00:19:51,640 Speaker 3: you're up against an enemy that is in a full 353 00:19:51,680 --> 00:19:55,880 Speaker 3: auto capacity, you've been neutralized already. I think that's where 354 00:19:55,880 --> 00:19:58,080 Speaker 3: it's going, and I think that's the reality. Personally, my 355 00:19:58,200 --> 00:19:59,840 Speaker 3: view would be that, yes, we have to be able 356 00:19:59,880 --> 00:20:02,680 Speaker 3: to ultimately have kill switches on anything, even if you've 357 00:20:02,720 --> 00:20:05,720 Speaker 3: decided to go into full autonomy. And then secondly, 358 00:20:05,760 --> 00:20:08,240 Speaker 3: I think that we have to recognize that the commander 359 00:20:08,240 --> 00:20:10,359 Speaker 3: in chief has to have that ability. That's the reason 360 00:20:10,400 --> 00:20:14,879 Speaker 3: why we haven't banned autonomous weapons, and we've had autonomous weapons, 361 00:20:14,920 --> 00:20:17,960 Speaker 3: depending upon, again, how you're defining that, for a long time. 362 00:20:18,040 --> 00:20:20,000 Speaker 3: I think it's very scary to a lot of everyday 363 00:20:20,040 --> 00:20:24,080 Speaker 3: Americans because we've been taught by Hollywood and Terminator and the 364 00:20:24,040 --> 00:20:24,600 Speaker 2: rest of it 365 00:20:24,880 --> 00:20:27,160 Speaker 3: that we're going to have these robots with laser eyes 366 00:20:27,240 --> 00:20:27,919 Speaker 3: and so forth. 367 00:20:28,040 --> 00:20:30,160 Speaker 2: And we're seeing that in the polls just last week. 368 00:20:30,240 --> 00:20:31,560 Speaker 2: I'm sure you saw it, mister speaker. 369 00:20:31,600 --> 00:20:35,400 Speaker 3: The NBC poll: twenty-six percent of Americans have a positive view 370 00:20:35,440 --> 00:20:38,480 Speaker 3: of AI. Forty-six percent of Americans have 371 00:20:38,520 --> 00:20:42,119 Speaker 3: a negative view. And so I think there's a communication gap. 372 00:20:42,160 --> 00:20:45,320 Speaker 3: I think people have to really clearly communicate and very 373 00:20:45,440 --> 00:20:49,160 Speaker 3: openly handle some of these real, genuine ethical concerns left, 374 00:20:49,240 --> 00:20:52,240 Speaker 3: right, and center, because we have to understand this. You 375 00:20:52,280 --> 00:20:54,359 Speaker 3: don't get to opt out of the AI race. It's 376 00:20:54,400 --> 00:20:57,560 Speaker 3: already here.
Ninety-nine percent of us use AI every day, 377 00:20:57,840 --> 00:21:00,880 Speaker 3: even though sixty-four percent of people don't realize 378 00:21:00,880 --> 00:21:03,840 Speaker 3: they're using it because it's baked into the algorithms of life. 379 00:21:04,040 --> 00:21:07,040 Speaker 3: So I think that Silicon Valley has got its communication 380 00:21:07,160 --> 00:21:09,680 Speaker 3: work cut out for them, because I don't think they've 381 00:21:09,680 --> 00:21:11,080 Speaker 3: really explained a lot of this. 382 00:21:12,240 --> 00:21:17,280 Speaker 1: It's, I think, very challenging to get across to people 383 00:21:18,359 --> 00:21:22,160 Speaker 1: the sheer scale of change that we see coming down 384 00:21:22,200 --> 00:21:24,720 Speaker 1: the road on every front. And one of the reasons 385 00:21:24,760 --> 00:21:28,080 Speaker 1: that I thought it was really so important that you 386 00:21:28,160 --> 00:21:30,840 Speaker 1: wrote Code Red: The Left, the Right, China, and the 387 00:21:30,920 --> 00:21:34,720 Speaker 1: Race to Control AI. You're describing what may be the 388 00:21:34,760 --> 00:21:39,399 Speaker 1: most important change agent of the next half century, and 389 00:21:39,560 --> 00:21:41,880 Speaker 1: people have got to get that into their heads. 390 00:21:42,280 --> 00:21:45,720 Speaker 3: I've written many books as a ghostwriter, and I honestly 391 00:21:45,760 --> 00:21:47,440 Speaker 3: didn't want to write this book. 392 00:21:47,440 --> 00:21:48,359 Speaker 2: I felt compelled to. 393 00:21:49,000 --> 00:21:51,560 Speaker 3: I think that when you look at conservatism, you know, 394 00:21:51,560 --> 00:21:53,840 Speaker 3: William F. Buckley, the job of a conservative is to 395 00:21:53,880 --> 00:21:57,199 Speaker 3: stand athwart history, yelling stop. I think, sadly, some 396 00:21:57,359 --> 00:22:01,200 Speaker 3: on our side think that meant technology, and of course 397 00:22:01,200 --> 00:22:01,679 Speaker 3: it didn't. 398 00:22:01,720 --> 00:22:03,080 Speaker 2: You know, he was referring to 399 00:22:03,080 --> 00:22:07,359 Speaker 3: the preservation of order and tradition and conservative values. So 400 00:22:07,400 --> 00:22:09,680 Speaker 3: I think we've got to get our side coached up 401 00:22:09,720 --> 00:22:12,680 Speaker 3: as conservatives, but even more than conservatives, I just think 402 00:22:12,720 --> 00:22:16,520 Speaker 3: everyday people, this is a seismic shift, and the disruption 403 00:22:16,640 --> 00:22:20,120 Speaker 3: we're going to see and the opportunity and the pitfalls 404 00:22:20,160 --> 00:22:21,760 Speaker 3: and all that comes with it in the next five 405 00:22:21,840 --> 00:22:24,320 Speaker 3: years is going to be unlike anything we've ever seen. 406 00:22:24,560 --> 00:22:27,320 Speaker 3: And so the speed issue is going to be the 407 00:22:27,400 --> 00:22:30,720 Speaker 3: real issue, because when you look at the industrial revolution, right, 408 00:22:30,760 --> 00:22:33,760 Speaker 3: there was time. It took time to lay roads and 409 00:22:33,800 --> 00:22:37,639 Speaker 3: build factories and electrification and so forth. Every person in 410 00:22:37,680 --> 00:22:41,560 Speaker 3: America has a phone, and so the speed with which 411 00:22:41,720 --> 00:22:46,720 Speaker 3: the proliferation of artificial intelligence can spread has been deeply accelerated.
412 00:22:47,000 --> 00:22:50,160 Speaker 3: And I think yes, people do learn over time how 413 00:22:50,160 --> 00:22:53,480 Speaker 3: to adapt to disruption, as the horse and buggy goes the 414 00:22:53,480 --> 00:22:55,880 Speaker 3: way of the car and the candle goes the way 415 00:22:56,000 --> 00:22:58,080 Speaker 3: of the light bulb. But I think that if you're 416 00:22:58,119 --> 00:23:01,040 Speaker 3: not prepared and understanding how the ground is going to 417 00:23:01,040 --> 00:23:05,200 Speaker 3: shift around you in things like education, national security, economics, 418 00:23:05,200 --> 00:23:07,040 Speaker 3: and so forth, there's going to be a lot of 419 00:23:07,080 --> 00:23:11,560 Speaker 3: opportunity for ideologically motivated people to do power grabs and 420 00:23:11,600 --> 00:23:14,199 Speaker 3: to scare and disrupt a lot of people. So this 421 00:23:14,359 --> 00:23:16,320 Speaker 3: was just a humble attempt to try to get people 422 00:23:16,400 --> 00:23:17,840 Speaker 3: really thinking about these things. 423 00:23:18,200 --> 00:23:20,919 Speaker 1: When you look at the Chinese effort, I really want 424 00:23:20,960 --> 00:23:23,959 Speaker 1: you to comment, from the study you've done, to what 425 00:23:24,160 --> 00:23:29,440 Speaker 1: degree does their ability to focus resources offset their inability 426 00:23:29,840 --> 00:23:33,960 Speaker 1: to liberate entrepreneurs to go out and take risks and 427 00:23:34,040 --> 00:23:34,520 Speaker 1: be wrong? 428 00:23:35,240 --> 00:23:37,680 Speaker 3: I think what they realize is that it goes back 429 00:23:37,680 --> 00:23:39,560 Speaker 3: to what you were talking about earlier about the speed 430 00:23:39,600 --> 00:23:40,720 Speaker 3: and cost efficiencies. 431 00:23:41,080 --> 00:23:43,360 Speaker 2: China's mindset around 432 00:23:43,080 --> 00:23:47,640 Speaker 3: AI is not as focused on this huge push, we're trying 433 00:23:47,680 --> 00:23:52,960 Speaker 3: to reach AGI, artificial general intelligence, or ASI, artificial superintelligence. 434 00:23:53,200 --> 00:23:55,560 Speaker 3: It's really about process and function. 435 00:23:55,840 --> 00:23:58,240 Speaker 2: The way I like to tell people is China kind 436 00:23:58,240 --> 00:23:59,560 Speaker 2: of thinks of it like a 437 00:23:59,480 --> 00:24:02,719 Speaker 3: Toyota. They think of AI like a Toyota, and we're looking at it 438 00:24:02,760 --> 00:24:06,280 Speaker 3: like a Bugatti, right, or a Ferrari. We're going for 439 00:24:06,359 --> 00:24:12,000 Speaker 3: the huge wins, and they're looking at supply chain improvements, efficiencies, 440 00:24:12,560 --> 00:24:15,440 Speaker 3: things that maybe aren't as sort of razzle-dazzle to 441 00:24:15,600 --> 00:24:19,840 Speaker 3: capture the imagination, but in terms of actual functionality, have 442 00:24:19,920 --> 00:24:21,120 Speaker 3: a lot more efficiency. 443 00:24:22,520 --> 00:24:26,520 Speaker 1: Coming up, we're going to discuss how ninety-nine percent of Americans are 444 00:24:26,520 --> 00:24:30,840 Speaker 1: already using AI in their everyday lives. Wynton will explain 445 00:24:31,200 --> 00:24:40,359 Speaker 1: what he means by that, next. Wynton, you say something 446 00:24:40,560 --> 00:24:45,520 Speaker 1: which is, I'm certain, true: you said that ninety-nine percent 447 00:24:45,560 --> 00:24:49,880 Speaker 1: of Americans already use AI. Walk us through that.
I'm 448 00:24:49,920 --> 00:24:53,399 Speaker 1: certain you're right, but I want to hear your version 449 00:24:53,480 --> 00:24:56,639 Speaker 1: of what are the ways in which normal, everyday people 450 00:24:57,000 --> 00:25:00,000 Speaker 1: are already immersed in artificial intelligence. 451 00:25:00,200 --> 00:25:00,840 Speaker 2: Mister speaker. 452 00:25:00,920 --> 00:25:04,000 Speaker 3: It's so important when I encounter my friends or family, 453 00:25:04,080 --> 00:25:06,159 Speaker 3: or somebody at church or something and they have a 454 00:25:06,240 --> 00:25:08,080 Speaker 3: sort of negative disposition toward AI. 455 00:25:08,520 --> 00:25:10,240 Speaker 2: I just ask them, well, did you use a weather 456 00:25:10,280 --> 00:25:13,280 Speaker 2: app today? Did you use a movie streaming service 457 00:25:13,400 --> 00:25:15,800 Speaker 3: over the weekend with the family to watch a family movie? 458 00:25:16,000 --> 00:25:18,200 Speaker 2: Did you use your GPS 459 00:25:17,640 --> 00:25:19,359 Speaker 3: when you went fishing on that trip and you had 460 00:25:19,400 --> 00:25:21,600 Speaker 3: to get around to go to the lake? And 461 00:25:21,400 --> 00:25:23,480 Speaker 2: they say, well, of course, you know, of course I did. 462 00:25:23,720 --> 00:25:25,680 Speaker 2: I say, well, guess what, you used AI. 463 00:25:25,840 --> 00:25:29,040 Speaker 3: And that's generally, depending on what application, referred to as 464 00:25:29,160 --> 00:25:32,320 Speaker 3: narrow AI. It doesn't necessarily mean they were using generative AI. 465 00:25:32,680 --> 00:25:35,720 Speaker 3: But I think it helps people to understand. Look, we've 466 00:25:35,800 --> 00:25:39,560 Speaker 3: had since the nineteen fifties, since John McCarthy coined the 467 00:25:39,560 --> 00:25:44,080 Speaker 3: public-facing term artificial intelligence. So artificial intelligence has been 468 00:25:44,119 --> 00:25:47,240 Speaker 3: with us for a long time, but obviously it has 469 00:25:47,280 --> 00:25:50,640 Speaker 3: progressed enormously in recent years. The reason why it's ninety-nine 470 00:25:50,680 --> 00:25:54,439 Speaker 3: percent is because it's baked into the daily algorithms that 471 00:25:54,600 --> 00:25:57,239 Speaker 3: power daily life, and so when you see someone on Twitter X 472 00:25:57,359 --> 00:25:59,920 Speaker 3: or on Facebook say I hate AI, the grand irony 473 00:26:00,240 --> 00:26:04,280 Speaker 3: is AI is what's allowing them to communicate that message. 474 00:26:04,440 --> 00:26:05,760 Speaker 2: What I was mentioning, 475 00:26:05,400 --> 00:26:09,560 Speaker 3: too, is that sixty-four percent of people can't identify 476 00:26:09,640 --> 00:26:12,040 Speaker 3: when they're using AI because they don't really think of 477 00:26:12,240 --> 00:26:15,160 Speaker 3: AI in that kind of daily life. So I think 478 00:26:15,200 --> 00:26:18,400 Speaker 3: it's important for us to understand there are real land 479 00:26:18,440 --> 00:26:23,720 Speaker 3: mines and real challenges ethically, morally, in security, personally. I 480 00:26:23,760 --> 00:26:26,080 Speaker 3: think politically, the bias space is going to be 481 00:26:26,080 --> 00:26:28,800 Speaker 3: a real big battle. But it's also important to recognize 482 00:26:28,800 --> 00:26:31,560 Speaker 3: there are roses of opportunity and that maybe it's not 483 00:26:31,640 --> 00:26:33,879 Speaker 3: as scary as you thought, because quite frankly, you've been 484 00:26:33,920 --> 00:26:35,600 Speaker 3: using AI for quite a while.
485 00:26:36,119 --> 00:26:39,359 Speaker 1: You get so used to things. I'm told, and I don't 486 00:26:39,440 --> 00:26:42,160 Speaker 1: understand this exactly, but I'm told that there is an 487 00:26:42,200 --> 00:26:48,800 Speaker 1: exquisite timing system worldwide which enables your automatic teller machine 488 00:26:48,800 --> 00:26:52,440 Speaker 1: to work, and that if that timing system wasn't there, 489 00:26:52,760 --> 00:26:56,480 Speaker 1: you literally couldn't match up around the world. I've been 490 00:26:56,520 --> 00:26:59,479 Speaker 1: in places in rural Africa where I got money out 491 00:26:59,520 --> 00:27:02,560 Speaker 1: of an ATM. It sort of works, which, also, by 492 00:27:02,600 --> 00:27:06,520 Speaker 1: the way, is a reminder that in certain kinds of conflicts, 493 00:27:06,920 --> 00:27:10,840 Speaker 1: you may have an attack on the very structure of 494 00:27:10,880 --> 00:27:14,119 Speaker 1: how you live, where the ATMs don't work, the phones 495 00:27:14,160 --> 00:27:17,359 Speaker 1: don't work, GPS doesn't work. And that's part of why 496 00:27:17,760 --> 00:27:20,440 Speaker 1: we have to continue to invest, I think, in being 497 00:27:20,440 --> 00:27:23,040 Speaker 1: able to defend ourselves. I have to say this has 498 00:27:23,080 --> 00:27:26,760 Speaker 1: been exhilarating. You've given me a whole bunch of new ideas. Now, Wynton, 499 00:27:26,840 --> 00:27:29,360 Speaker 1: I want to thank you for joining me. Your new book, 500 00:27:29,840 --> 00:27:32,440 Speaker 1: Code Red: The Left, the Right, China, and the Race 501 00:27:32,520 --> 00:27:36,000 Speaker 1: to Control AI, is available now on Amazon and in 502 00:27:36,000 --> 00:27:39,960 Speaker 1: bookstores everywhere. I really appreciate you taking the time to 503 00:27:39,960 --> 00:27:40,439 Speaker 1: be with us. 504 00:27:40,720 --> 00:27:42,840 Speaker 2: Thank you so much, mister Speaker. It's such an honor. 505 00:27:52,680 --> 00:27:55,160 Speaker 1: And now I'm pleased to introduce a new segment in Newt's 506 00:27:55,200 --> 00:27:58,879 Speaker 1: World where I answer listeners' questions. To ask a question, 507 00:27:59,240 --> 00:28:03,400 Speaker 1: please email me at Newt at Gingrich three sixty dot com 508 00:28:04,280 --> 00:28:07,879 Speaker 1: as an Inner Circle member. William asks, mister Speaker, my 509 00:28:08,000 --> 00:28:11,480 Speaker 1: question is about Iran. The first objective should be the 510 00:28:11,520 --> 00:28:15,240 Speaker 1: removal of all uranium. I suggest that the Israelis be 511 00:28:15,320 --> 00:28:17,639 Speaker 1: given this task and that they get to keep the 512 00:28:17,760 --> 00:28:23,040 Speaker 1: uranium that they capture. Qom is the theological capital. Removing 513 00:28:23,080 --> 00:28:27,320 Speaker 1: the libraries and the scholars is essential. What is the 514 00:28:27,359 --> 00:28:30,000 Speaker 1: support for the regime in the eastern part of the 515 00:28:30,040 --> 00:28:33,960 Speaker 1: country among other ethnic groups there? That's a lot of 516 00:28:33,960 --> 00:28:36,840 Speaker 1: different questions, William. Let me just say first of all 517 00:28:37,280 --> 00:28:40,920 Speaker 1: that I think we are either going to capture the 518 00:28:41,080 --> 00:28:44,640 Speaker 1: uranium or we're going to bury it so deep that 519 00:28:44,720 --> 00:28:47,200 Speaker 1: it will not be reachable in the next twenty or 520 00:28:47,280 --> 00:28:50,400 Speaker 1: thirty years.
And I think the question there is how 521 00:28:50,520 --> 00:28:53,600 Speaker 1: risky is it to go in and get the uranium. Remember, 522 00:28:53,600 --> 00:28:56,600 Speaker 1: it weighs a pretty good amount. It's difficult to handle, 523 00:28:57,040 --> 00:28:59,360 Speaker 1: and you can't just run in the way we did 524 00:28:59,360 --> 00:29:02,479 Speaker 1: with Maduro, pick up two human beings and hustle them 525 00:29:02,480 --> 00:29:05,520 Speaker 1: out of the country. So it's a significant question, and 526 00:29:05,640 --> 00:29:07,360 Speaker 1: there's a question of do you really want to take 527 00:29:07,720 --> 00:29:11,000 Speaker 1: casualties if you can just stand off and bomb them 528 00:29:11,040 --> 00:29:15,440 Speaker 1: so much that they're literally physically not reachable. Qom is 529 00:29:15,440 --> 00:29:19,080 Speaker 1: the theological capital. I think it's very dangerous to get 530 00:29:19,120 --> 00:29:22,719 Speaker 1: involved in people's religions and it creates a level of 531 00:29:22,720 --> 00:29:27,320 Speaker 1: emotion and a level of natural support that would not 532 00:29:27,360 --> 00:29:31,120 Speaker 1: necessarily be there otherwise. So I'd be very cautious about 533 00:29:31,160 --> 00:29:35,680 Speaker 1: any activity aimed at the religious centers. You ask about 534 00:29:35,760 --> 00:29:38,240 Speaker 1: the regime in the eastern part of the country, there's 535 00:29:38,240 --> 00:29:42,160 Speaker 1: a real question about that. A very high percentage, almost half, 536 00:29:42,440 --> 00:29:45,800 Speaker 1: of the people in Iran are not Persians. They come 537 00:29:45,840 --> 00:29:51,920 Speaker 1: from Kurds, Azerbaijanis, and other ethnic groups, including some Baluchis 538 00:29:51,960 --> 00:29:56,440 Speaker 1: from Pakistan in the southwest part of the country. The 539 00:29:56,480 --> 00:30:00,320 Speaker 1: potential is always there for them to spin apart, then 540 00:30:00,360 --> 00:30:03,560 Speaker 1: to decide that they want some kind of separatist movement, 541 00:30:04,080 --> 00:30:07,520 Speaker 1: but at the moment the dominant force is still the 542 00:30:07,960 --> 00:30:12,120 Speaker 1: slight majority that are Persians, and I suspect on the 543 00:30:12,120 --> 00:30:14,440 Speaker 1: one hand, they'd like to be rid of the dictatorship. On 544 00:30:14,480 --> 00:30:16,560 Speaker 1: the other hand, they want to keep Iran as 545 00:30:16,600 --> 00:30:20,160 Speaker 1: a single country. So it's a tightrope we're having to walk. 546 00:30:21,000 --> 00:30:23,120 Speaker 1: I look forward to hearing from you, and you can 547 00:30:23,200 --> 00:30:27,040 Speaker 1: ask a question by just emailing me at Newt at 548 00:30:27,080 --> 00:30:30,200 Speaker 1: Gingrich three sixty dot com. Thank you to my guest, 549 00:30:30,280 --> 00:30:34,320 Speaker 1: Wynton Hall. Newt's World is produced by Gingrich three sixty and iHeartMedia. 550 00:30:34,920 --> 00:30:39,240 Speaker 1: Our executive producer is Guardnsey Sloan. Our researcher is Rachel Peterson. 551 00:30:39,800 --> 00:30:43,080 Speaker 1: Special thanks to the team at Gingrich three sixty. If 552 00:30:43,120 --> 00:30:45,760 Speaker 1: you've been enjoying Newt's World, I hope you'll go to Apple 553 00:30:45,800 --> 00:30:49,080 Speaker 1: Podcasts and both rate us with five stars and give 554 00:30:49,160 --> 00:30:51,880 Speaker 1: us a review so others can learn what it's all about.
555 00:30:52,520 --> 00:30:56,040 Speaker 1: Join me on Substack at Gingrich three sixty dot net. 556 00:30:56,400 --> 00:30:58,800 Speaker 1: I'm Newt Gingrich. This is Newt's World.