1 00:00:04,840 --> 00:00:08,440 Speaker 1: On this episode of Newt's World. Artificial intelligence is being 2 00:00:08,440 --> 00:00:11,680 Speaker 1: developed so rapidly by the tech industry, and AI is 3 00:00:11,720 --> 00:00:14,319 Speaker 1: starting to affect millions of Americans who engage with it, 4 00:00:15,000 --> 00:00:18,360 Speaker 1: from college students using ChatGPT to write their papers, 5 00:00:18,760 --> 00:00:22,400 Speaker 1: to AI sampling human voices, to AI creating whole new 6 00:00:22,440 --> 00:00:26,160 Speaker 1: search engines like Microsoft's new launch of Bing powered by AI. 7 00:00:26,360 --> 00:00:29,120 Speaker 1: There's so much happening. I wanted to devote a series 8 00:00:29,120 --> 00:00:33,519 Speaker 1: of episodes to understanding artificial intelligence. I'm really pleased to 9 00:00:33,560 --> 00:00:37,360 Speaker 1: welcome my guest, Will Rinehart. He is a senior research 10 00:00:37,400 --> 00:00:40,920 Speaker 1: fellow at the Center for Growth and Opportunity at Utah 11 00:00:41,040 --> 00:00:53,479 Speaker 1: State University. Will, welcome, and thank you for joining me 12 00:00:53,479 --> 00:00:54,120 Speaker 1: on Newt's World. 13 00:00:54,680 --> 00:00:55,600 Speaker 2: Yeah, thanks for having me. 14 00:00:55,880 --> 00:00:59,880 Speaker 1: Can you briefly talk about what artificial intelligence is and 15 00:01:00,200 --> 00:01:02,880 Speaker 1: how it has rapidly changed and developed over the past 16 00:01:02,960 --> 00:01:03,480 Speaker 1: few years? 17 00:01:03,680 --> 00:01:07,520 Speaker 3: Yeah. Artificial intelligence is a big blanket term, a big 18 00:01:07,560 --> 00:01:11,200 Speaker 3: tent term for a whole bunch of related technological developments 19 00:01:11,200 --> 00:01:15,000 Speaker 3: that really have come to fruition and come into application 20 00:01:15,040 --> 00:01:18,280 Speaker 3: in the last ten to fifteen years. Probably the most important 21 00:01:18,360 --> 00:01:20,679 Speaker 3: thing that we're talking about today, as you mentioned, 22 00:01:21,000 --> 00:01:25,039 Speaker 3: is the suite of technologies around what's called ChatGPT, right. 23 00:01:25,080 --> 00:01:29,199 Speaker 3: So these technologies are called large language models, and these 24 00:01:29,680 --> 00:01:34,200 Speaker 3: LLMs are really just big models that take really big 25 00:01:34,319 --> 00:01:37,399 Speaker 3: data sets. In this case, it's literally the data of 26 00:01:37,440 --> 00:01:40,440 Speaker 3: the entire web. They put it into a model, 27 00:01:40,560 --> 00:01:43,000 Speaker 3: and they use that model to do some learning, to 28 00:01:43,080 --> 00:01:45,960 Speaker 3: do some machine learning, and then from that they're able 29 00:01:45,959 --> 00:01:49,840 Speaker 3: to connect it to a chat function, a messenger function, 30 00:01:50,400 --> 00:01:53,280 Speaker 3: and create a whole suite of new kinds 31 00:01:53,320 --> 00:01:57,320 Speaker 3: of really interesting things. So I think really the fundamental 32 00:01:57,360 --> 00:02:00,520 Speaker 3: idea here, the fundamental underlying part of all of this, 33 00:02:00,760 --> 00:02:02,520 Speaker 3: is that there's a lot of data that goes into 34 00:02:02,560 --> 00:02:05,080 Speaker 3: these models, and that there's a lot of human learning 35 00:02:05,120 --> 00:02:08,120 Speaker 3: and a lot of human input that's needed to actually 36 00:02:08,160 --> 00:02:09,400 Speaker 3: create something that's useful.
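To make that data-to-model-to-chat pipeline concrete, here is a minimal sketch in Python. It is only a toy: a bigram word model stands in for the large language models described here, and a short string stands in for "the data of the entire web." Real LLMs use neural networks, enormous datasets, and the human input mentioned above; nothing in this code reflects how any actual system is built.

```python
import random
from collections import defaultdict

# A tiny "corpus" standing in for the web-scale datasets described above.
corpus = (
    "artificial intelligence is a big tent term for related technologies . "
    "large language models take big data sets and learn patterns from them . "
    "the models are then connected to a chat function that people can use ."
)

# "Training": record which word tends to follow which (a bigram model).
follows = defaultdict(list)
tokens = corpus.split()
for current_word, next_word in zip(tokens, tokens[1:]):
    follows[current_word].append(next_word)

def generate(seed: str, length: int = 12) -> str:
    """Sample a continuation one word at a time, like next-token prediction."""
    word, output = seed, [seed]
    for _ in range(length):
        candidates = follows.get(word)
        if not candidates:
            break
        word = random.choice(candidates)
        output.append(word)
    return " ".join(output)

# The "chat function" bolted on top of the trained model.
print(generate("large"))
```

The shape is the point: lots of data goes in, a model captures statistical patterns, and a conversational interface is layered on top afterward.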
37 00:02:09,800 --> 00:02:12,560 Speaker 1: To some extent, this is just an expansion of an 38 00:02:12,600 --> 00:02:16,600 Speaker 1: ongoing computerization that's been, to some extent, affecting us, 39 00:02:16,600 --> 00:02:19,400 Speaker 1: I guess, since the nineteen fifties. To some extent it is 40 00:02:19,440 --> 00:02:22,639 Speaker 1: a new and very different thing. Where do you come 41 00:02:22,680 --> 00:02:23,919 Speaker 1: down on those two choices? 42 00:02:24,360 --> 00:02:29,320 Speaker 3: Computerization, and the embedding of very important computers into the workplace, 43 00:02:29,320 --> 00:02:31,959 Speaker 3: has been happening since the late nineteen forties, 44 00:02:31,960 --> 00:02:35,400 Speaker 3: early nineteen fifties. This current technology, I think, is a 45 00:02:35,400 --> 00:02:37,800 Speaker 3: bit of a change, and it could potentially be a 46 00:02:37,840 --> 00:02:41,400 Speaker 3: pretty massive change. I would say very clearly, as 47 00:02:41,520 --> 00:02:46,079 Speaker 3: somebody with a background in economics, that the technological developments 48 00:02:46,080 --> 00:02:49,280 Speaker 3: that have come with software, even though they've been important, 49 00:02:49,320 --> 00:02:52,720 Speaker 3: haven't been as transformative, at least in the pure 50 00:02:52,760 --> 00:02:55,760 Speaker 3: economic sense, as some people 51 00:02:55,800 --> 00:02:58,720 Speaker 3: would hope. And I think what these new AI programs, 52 00:02:58,760 --> 00:03:02,840 Speaker 3: these new models, potentially create is a new boost 53 00:03:03,040 --> 00:03:07,760 Speaker 3: of technological advancement for firms, for individuals, for people. So 54 00:03:08,080 --> 00:03:10,400 Speaker 3: to me, it is something that's new. What we're seeing 55 00:03:10,520 --> 00:03:13,120 Speaker 3: is definitely something that's new. But there's still a lot 56 00:03:13,120 --> 00:03:15,680 Speaker 3: of jobs that I suspect probably aren't going to be 57 00:03:15,680 --> 00:03:17,560 Speaker 3: affected all that much by these things. You know, I 58 00:03:17,560 --> 00:03:20,480 Speaker 3: can't imagine plumbers are going to be affected all that much. 59 00:03:20,720 --> 00:03:23,480 Speaker 3: We've talked in the past a lot about truck drivers, 60 00:03:23,520 --> 00:03:26,480 Speaker 3: but autonomous vehicles still seem to be very, very difficult 61 00:03:26,520 --> 00:03:30,360 Speaker 3: to actually accomplish in practice. So to me, what's happening 62 00:03:30,400 --> 00:03:33,200 Speaker 3: here is something that's very new. But at the same time, 63 00:03:33,320 --> 00:03:36,440 Speaker 3: what traditionally has happened is that firms have used new 64 00:03:36,480 --> 00:03:39,680 Speaker 3: technologies to do better, to become more productive, and that's 65 00:03:39,680 --> 00:03:43,360 Speaker 3: been a really uneven process, depending on the firm, depending 66 00:03:43,400 --> 00:03:46,520 Speaker 3: on the industry, depending on where the firm fits within 67 00:03:46,560 --> 00:03:48,720 Speaker 3: the industry. So there's a lot that actually goes into this.
68 00:03:49,160 --> 00:03:53,000 Speaker 1: Is there a threshold breakpoint? My car has all 69 00:03:53,040 --> 00:03:55,880 Speaker 1: sorts of computer chips, and somebody said to me last 70 00:03:55,960 --> 00:03:59,080 Speaker 1: night that there are about three hundred big computer chips 71 00:03:59,080 --> 00:04:01,720 Speaker 1: and three thousand small computer chips in a modern car, 72 00:04:02,240 --> 00:04:04,800 Speaker 1: which is sort of amazing. My car can tell 73 00:04:04,800 --> 00:04:06,720 Speaker 1: me when I'm backing up, can tell me whether 74 00:04:06,760 --> 00:04:09,960 Speaker 1: I'm inside the lines of a parking space, et cetera, 75 00:04:10,120 --> 00:04:13,280 Speaker 1: things that fifteen years ago would have been impossible. What 76 00:04:13,480 --> 00:04:16,880 Speaker 1: is the jump from that to artificial intelligence? 77 00:04:17,400 --> 00:04:21,480 Speaker 3: So there are slightly separate but parallel tracks going on. 78 00:04:21,640 --> 00:04:25,440 Speaker 3: All of the really advanced technologies that are in 79 00:04:25,440 --> 00:04:28,880 Speaker 3: your car have been created because of very specific kinds 80 00:04:28,880 --> 00:04:32,040 Speaker 3: of semiconductors. There are these chips that go in and 81 00:04:32,120 --> 00:04:36,080 Speaker 3: are used to drive a whole bunch of different possibilities 82 00:04:36,080 --> 00:04:39,840 Speaker 3: and technologies within your car. Now, there's been 83 00:04:39,839 --> 00:04:43,640 Speaker 3: a very specific subset of those chips which has gone 84 00:04:43,680 --> 00:04:47,599 Speaker 3: into helping machines learn and has been really kind of 85 00:04:47,600 --> 00:04:51,560 Speaker 3: fundamental in making everything that's happening in AI actually occur. 86 00:04:52,120 --> 00:04:55,680 Speaker 3: So there's been a series of very specialized chips, really 87 00:04:55,680 --> 00:05:00,400 Speaker 3: coming out of the gaming industry, that have allowed these large models, 88 00:05:00,440 --> 00:05:03,240 Speaker 3: these large AI models, to actually come about and allowed 89 00:05:03,279 --> 00:05:06,480 Speaker 3: for these new AI technologies to actually develop. So these 90 00:05:06,480 --> 00:05:08,159 Speaker 3: two things are happening right at the same time, and 91 00:05:08,200 --> 00:05:10,840 Speaker 3: I think that they're very important to actually connect. Because 92 00:05:11,440 --> 00:05:14,560 Speaker 3: alongside the developments that you've seen in cars, that you've seen 93 00:05:14,600 --> 00:05:17,479 Speaker 3: in new technology that's embedded in cars and embedded in 94 00:05:18,160 --> 00:05:20,840 Speaker 3: washers and dryers and a whole bunch of other traditional 95 00:05:20,880 --> 00:05:24,039 Speaker 3: goods that we all use, there's also a parallel process 96 00:05:24,120 --> 00:05:27,640 Speaker 3: of advanced development that's 97 00:05:27,680 --> 00:05:30,640 Speaker 3: going on within AI, and those two things really 98 00:05:30,640 --> 00:05:32,120 Speaker 3: are tracking each other very closely. 99 00:05:32,360 --> 00:05:35,920 Speaker 1: Does AI tend to be more software based rather than 100 00:05:35,960 --> 00:05:38,720 Speaker 1: hardware based, or is that a false distinction? 101 00:05:39,160 --> 00:05:42,160 Speaker 3: That's a key question, and the two have developed in parallel. 102 00:05:42,760 --> 00:05:45,559 Speaker 3: Some have called this the hardware lottery.
So in fact, 103 00:05:45,600 --> 00:05:48,479 Speaker 3: the AI models that we're seeing right now are taking 104 00:05:49,160 --> 00:05:54,400 Speaker 3: advantage of GPUs, graphics processing units. These are gaming chips, 105 00:05:54,440 --> 00:05:57,520 Speaker 3: effectively, that have been developed so you can run a 106 00:05:57,560 --> 00:06:01,799 Speaker 3: lot of really advanced graphics, but that technology is used, 107 00:06:01,920 --> 00:06:06,200 Speaker 3: interestingly enough, in bitcoin and other cryptocurrency mining, 108 00:06:06,279 --> 00:06:10,320 Speaker 3: but also in massive AI developments, and so the 109 00:06:10,360 --> 00:06:13,160 Speaker 3: two things run in parallel and are very matched together. 110 00:06:13,520 --> 00:06:15,919 Speaker 3: I think what's interesting, at least for your audience to 111 00:06:16,040 --> 00:06:19,560 Speaker 3: understand, is that we're actually coming up to about the limits 112 00:06:19,920 --> 00:06:23,920 Speaker 3: of these large ChatGPT-related models, and we're 113 00:06:23,920 --> 00:06:25,760 Speaker 3: only going to be able to get so much bigger, 114 00:06:26,160 --> 00:06:28,520 Speaker 3: because of the data sets. There are limitations on how much 115 00:06:28,600 --> 00:06:30,960 Speaker 3: data you can put in, because we already have the entire web. 116 00:06:31,240 --> 00:06:34,240 Speaker 3: You can't go much further than the entire web. But those 117 00:06:34,560 --> 00:06:37,760 Speaker 3: large data sets really have to be matched with really, 118 00:06:37,839 --> 00:06:41,479 Speaker 3: really intense computing use at the same time, and those 119 00:06:41,480 --> 00:06:44,400 Speaker 3: two things really are connected. But they also make a 120 00:06:44,400 --> 00:06:47,440 Speaker 3: lot of these AI models very expensive to run. I 121 00:06:47,480 --> 00:06:50,320 Speaker 3: was actually talking with a colleague recently, and there are some 122 00:06:50,520 --> 00:06:54,080 Speaker 3: estimates that OpenAI is incurring something like 123 00:06:54,160 --> 00:06:57,480 Speaker 3: seven hundred thousand dollars a day in costs just 124 00:06:57,600 --> 00:07:01,360 Speaker 3: in running ChatGPT's models. So there's a lot that 125 00:07:01,400 --> 00:07:04,320 Speaker 3: goes into both making the models and then actually using 126 00:07:04,320 --> 00:07:05,640 Speaker 3: the models on a daily basis. 127 00:07:06,040 --> 00:07:08,680 Speaker 1: So when you start describing that, though, I'm trying to 128 00:07:08,680 --> 00:07:10,920 Speaker 1: get this in my head. If you have the most 129 00:07:10,960 --> 00:07:13,960 Speaker 1: advanced current games, don't you have a certain amount of 130 00:07:14,000 --> 00:07:18,600 Speaker 1: autonomous learning going on by the game itself as it 131 00:07:18,640 --> 00:07:20,120 Speaker 1: interacts with the players? 132 00:07:20,440 --> 00:07:23,040 Speaker 3: Yes, that can occur, and we've seen that happening. Sometimes 133 00:07:23,080 --> 00:07:25,960 Speaker 3: there's learning occurring, but not all AI systems do that. 134 00:07:26,560 --> 00:07:28,240 Speaker 1: I mean, is one of the distinctions of AI that 135 00:07:28,280 --> 00:07:30,880 Speaker 1: it is capable of learning? Or is that a false distinction? 136 00:07:32,000 --> 00:07:33,280 Speaker 2: It is capable of learning.
137 00:07:33,400 --> 00:07:37,760 Speaker 3: So OpenAI's ChatGPT, basically the technology that 138 00:07:37,840 --> 00:07:39,640 Speaker 3: a lot of people are worried about and are using, 139 00:07:40,040 --> 00:07:42,760 Speaker 3: effectively has only learned up to about twenty twenty 140 00:07:42,800 --> 00:07:45,600 Speaker 3: one, from my understanding, so it's not up to date. 141 00:07:45,960 --> 00:07:49,840 Speaker 3: There are efforts to combine this system with other systems 142 00:07:49,840 --> 00:07:54,320 Speaker 3: that would allow for the chatbot feature to actually integrate 143 00:07:54,360 --> 00:07:56,880 Speaker 3: with the open web. But when you look at the 144 00:07:56,960 --> 00:08:00,200 Speaker 3: development of AI systems, there are some that are kind 145 00:08:00,200 --> 00:08:02,800 Speaker 3: of frozen in time, if that makes sense, and others 146 00:08:02,800 --> 00:08:06,560 Speaker 3: that are advancing and are learning, and that really is 147 00:08:06,600 --> 00:08:09,480 Speaker 3: a really critical area 148 00:08:09,480 --> 00:08:12,560 Speaker 3: of development for a lot of these AI models and AI systems. 149 00:08:12,840 --> 00:08:15,960 Speaker 1: Now, are there geographic centers of AI development, in the 150 00:08:16,000 --> 00:08:20,080 Speaker 1: sense that Silicon Valley became such a dominating center for 151 00:08:20,200 --> 00:08:20,840 Speaker 1: social media? 152 00:08:21,640 --> 00:08:24,040 Speaker 3: Yes. The big area right now that we're seeing 153 00:08:24,520 --> 00:08:27,960 Speaker 3: within this development is Silicon Valley. They still have a lead. 154 00:08:28,320 --> 00:08:31,120 Speaker 3: There's a really big contingent in Toronto as well. The 155 00:08:31,200 --> 00:08:34,440 Speaker 3: University of Toronto has had some pretty advanced AI development. 156 00:08:34,480 --> 00:08:36,240 Speaker 3: There are a lot of really key players in 157 00:08:36,240 --> 00:08:39,640 Speaker 3: AI who are in Toronto. There are efforts 158 00:08:39,679 --> 00:08:42,440 Speaker 3: in China as well, and so I wouldn't downplay what's 159 00:08:42,440 --> 00:08:45,120 Speaker 3: happening in Beijing and Shanghai. But at least for the 160 00:08:45,200 --> 00:08:47,679 Speaker 3: United States, I think the most interesting part of this, 161 00:08:47,720 --> 00:08:49,959 Speaker 3: because I'm from the Midwest, is that there's a lot 162 00:08:50,000 --> 00:08:52,880 Speaker 3: of emphasis now on putting data centers basically back into 163 00:08:52,880 --> 00:08:54,840 Speaker 3: the Midwest, because it has a whole bunch of benefits 164 00:08:54,840 --> 00:08:58,160 Speaker 3: as well. So as AI 165 00:08:58,320 --> 00:09:01,080 Speaker 3: demands more computing power, we could see this computing 166 00:09:01,120 --> 00:09:04,520 Speaker 3: power actually going to areas where you may not expect 167 00:09:04,520 --> 00:09:07,120 Speaker 3: it as much, so in Iowa and in Michigan and 168 00:09:07,240 --> 00:09:11,439 Speaker 3: areas with basically cheap energy and easy regulatory structures. 169 00:09:11,640 --> 00:09:14,120 Speaker 1: Yeah. I was always struck by how much Carnegie Mellon 170 00:09:14,960 --> 00:09:18,760 Speaker 1: and the city of Pittsburgh, almost as an outlier, are nonetheless 171 00:09:18,880 --> 00:09:24,320 Speaker 1: remarkably invested, and have drawn a huge Google facility to Pittsburgh.
172 00:09:24,520 --> 00:09:28,200 Speaker 3: Indeed, and it's a huge robotics development space. I got really fascinated 173 00:09:28,320 --> 00:09:30,400 Speaker 3: with this a couple of 174 00:09:30,480 --> 00:09:32,920 Speaker 3: years ago, and what I find really fascinating as well 175 00:09:33,000 --> 00:09:34,920 Speaker 3: is that some of those facilities, or at least a 176 00:09:35,000 --> 00:09:39,480 Speaker 3: number of those facilities, are where nuclear development and nuclear 177 00:09:39,520 --> 00:09:42,400 Speaker 3: deployment happened in the nineteen forties and nineteen fifties. So 178 00:09:42,440 --> 00:09:45,960 Speaker 3: there is this kind of long history that exists specifically 179 00:09:45,960 --> 00:09:48,520 Speaker 3: within Pittsburgh, which I think is underrated as a city, 180 00:09:49,040 --> 00:09:51,880 Speaker 3: where you have a knowledge base that has existed for quite 181 00:09:51,920 --> 00:09:54,600 Speaker 3: some time, and the city is able to develop and deploy on 182 00:09:54,640 --> 00:09:55,439 Speaker 3: that knowledge base. 183 00:09:55,960 --> 00:09:58,640 Speaker 1: Are there specific companies that you sort of check in 184 00:09:58,720 --> 00:09:59,400 Speaker 1: on regularly? 185 00:10:00,000 --> 00:10:02,320 Speaker 3: The two big companies that are working right now in 186 00:10:02,520 --> 00:10:06,360 Speaker 3: AI deployment and development are OpenAI, this is Sam 187 00:10:06,400 --> 00:10:11,080 Speaker 3: Altman's big group, and the other big contingent, the other big firm, 188 00:10:11,360 --> 00:10:15,040 Speaker 3: is what's known as Anthropic; they are former Open 189 00:10:15,120 --> 00:10:16,640 Speaker 3: AI people who have left. 190 00:10:17,240 --> 00:10:18,160 Speaker 2: Those are the two big 191 00:10:17,960 --> 00:10:20,520 Speaker 3: players right now that are working on kind of chat 192 00:10:20,520 --> 00:10:24,319 Speaker 3: functions or related sorts of chat technologies, but we also 193 00:10:24,559 --> 00:10:28,000 Speaker 3: should expect something else. Obviously, Google has their own version of it, 194 00:10:28,040 --> 00:10:30,520 Speaker 3: but they seem to be taking some time to actually 195 00:10:30,520 --> 00:10:34,600 Speaker 3: develop their own AI tools and AI functions internally. The 196 00:10:34,640 --> 00:10:37,240 Speaker 3: other big player that I'm kind of surprised hasn't made 197 00:10:37,280 --> 00:10:40,400 Speaker 3: as many waves is Meta, Facebook, because they have a 198 00:10:40,400 --> 00:10:42,760 Speaker 3: whole bunch of fundamental data. They've got a lot of 199 00:10:43,280 --> 00:10:45,720 Speaker 3: messaging data, they've got a lot of the parts that 200 00:10:45,840 --> 00:10:49,120 Speaker 3: all should work together, and internally it seems as though 201 00:10:49,160 --> 00:10:52,360 Speaker 3: they've had some sort of AI back end now for 202 00:10:52,400 --> 00:10:55,320 Speaker 3: about four or five years. So those are the major 203 00:10:55,360 --> 00:10:57,400 Speaker 3: players right now. The other big one, in 204 00:10:57,480 --> 00:10:59,800 Speaker 3: China, would be Baidu, and it seems that they're 205 00:10:59,800 --> 00:11:03,880 Speaker 3: also deploying a large language model, something that's very similar. 206 00:11:04,040 --> 00:11:06,480 Speaker 3: So those are the biggest players right now.
But I 207 00:11:06,480 --> 00:11:08,280 Speaker 3: should mention that it takes a lot of money to 208 00:11:08,320 --> 00:11:11,320 Speaker 3: really do this. It takes quite literally billions of dollars 209 00:11:11,320 --> 00:11:14,360 Speaker 3: to start up a large firm of this nature. I 210 00:11:14,440 --> 00:11:17,000 Speaker 3: think OpenAI got something like ten billion dollars to 211 00:11:17,080 --> 00:11:19,360 Speaker 3: start up, and they've been getting a lot of money 212 00:11:19,360 --> 00:11:23,040 Speaker 3: from Microsoft to run the computations that it actually takes 213 00:11:23,160 --> 00:11:25,720 Speaker 3: to serve this content. And that's a big part of 214 00:11:25,760 --> 00:11:28,640 Speaker 3: this that I think is really interesting, because you do 215 00:11:28,679 --> 00:11:30,480 Speaker 3: one part of it, which is the initial 216 00:11:30,520 --> 00:11:33,800 Speaker 3: part, what's called training. But then every single time that you 217 00:11:33,880 --> 00:11:37,320 Speaker 3: interact with this chat feature, it costs money, and it 218 00:11:37,320 --> 00:11:40,320 Speaker 3: costs a couple of cents to actually run those chat features, 219 00:11:40,559 --> 00:11:44,200 Speaker 3: which means that every instance, every single chat session, is 220 00:11:44,200 --> 00:11:46,760 Speaker 3: going to be really, really expensive. And that means really 221 00:11:46,760 --> 00:11:48,640 Speaker 3: only the biggest players, at least at this point, 222 00:11:48,640 --> 00:11:50,760 Speaker 3: are going to be able to serve that content and 223 00:11:50,840 --> 00:12:00,439 Speaker 3: serve those sorts of models. 224 00:12:09,480 --> 00:12:12,040 Speaker 1: So this is different than the rise of social media, 225 00:12:12,080 --> 00:12:15,840 Speaker 1: where you had the so-called garage entrepreneurs, you know, 226 00:12:15,840 --> 00:12:19,480 Speaker 1: who went out and with almost no money could become players. 227 00:12:20,120 --> 00:12:23,520 Speaker 3: Fundamentally, yes. The economics, the unit economics of these, are 228 00:12:23,600 --> 00:12:27,200 Speaker 3: just fundamentally different. The social media players have what's known as 229 00:12:27,360 --> 00:12:30,520 Speaker 3: kind of software economics. Your initial cost is maybe high, 230 00:12:30,559 --> 00:12:33,600 Speaker 3: but then your individual cost, what it costs to actually 231 00:12:33,600 --> 00:12:36,840 Speaker 3: serve the next person, is pretty low. AI has both 232 00:12:36,840 --> 00:12:39,080 Speaker 3: a high initial cost and a high variable cost: 233 00:12:39,080 --> 00:12:40,920 Speaker 3: everything costs a lot of money to serve, and it 234 00:12:40,960 --> 00:12:42,800 Speaker 3: costs a lot of money to start up, which means 235 00:12:42,840 --> 00:12:45,200 Speaker 3: really only the biggest players can even be involved in 236 00:12:45,240 --> 00:12:48,439 Speaker 3: this. But that's also changing. Things are opening up and 237 00:12:48,480 --> 00:12:50,679 Speaker 3: going onto the open web, so these things will be 238 00:12:50,720 --> 00:12:52,080 Speaker 3: shifting over time, I imagine. 239 00:12:52,160 --> 00:12:54,040 Speaker 1: Do you think there will be a decline in cost, 240 00:12:54,120 --> 00:12:56,960 Speaker 1: or do you think it'll remain relatively very expensive? 241 00:12:57,679 --> 00:13:02,440 Speaker 3: The best quality AI chat features are still very costly.
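The unit economics here can be worked through directly. Taking the two figures quoted in this conversation at face value, roughly seven hundred thousand dollars a day in serving costs and "a couple cents" per chat (both rough, secondhand estimates, not confirmed numbers), a quick back-of-the-envelope calculation shows why the variable cost matters so much:

```python
# Back-of-the-envelope unit economics from the figures quoted in the episode.
# Both inputs are rough estimates, assumed here for illustration only.
daily_serving_cost = 700_000   # dollars per day to run the models (estimate)
cost_per_chat = 0.03           # "a couple cents" per interaction (assumption)

implied_chats_per_day = daily_serving_cost / cost_per_chat
print(f"Implied volume: about {implied_chats_per_day:,.0f} chats per day")
# Roughly 23 million chats per day. Unlike classic software economics, where
# the marginal cost of the next user is near zero, here every interaction
# carries a real cost on top of the enormous fixed cost of training.
```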
242 00:13:02,679 --> 00:13:04,440 Speaker 3: The thing that could change this would be a new 243 00:13:04,600 --> 00:13:09,000 Speaker 3: set of computing technologies. Quantum computing could really change this cost. 244 00:13:09,679 --> 00:13:11,880 Speaker 3: I'm still a little skeptical of that, and I don't think that's 245 00:13:11,920 --> 00:13:14,600 Speaker 3: in the near future. That's still an arm's 246 00:13:14,679 --> 00:13:17,600 Speaker 3: length away at this point. But it really is pretty 247 00:13:17,600 --> 00:13:19,480 Speaker 3: expensive to run a lot of these models, the high 248 00:13:19,559 --> 00:13:21,560 Speaker 3: quality models, I should say. Some of the stuff we 249 00:13:21,559 --> 00:13:24,360 Speaker 3: haven't really talked about as much so far is the 250 00:13:24,800 --> 00:13:29,160 Speaker 3: art and painting and music technologies, the models 251 00:13:29,160 --> 00:13:32,160 Speaker 3: that use a very similar sort of back end to 252 00:13:32,200 --> 00:13:37,480 Speaker 3: create these amazing new paintings and renderings and musical recordings. 253 00:13:37,760 --> 00:13:39,920 Speaker 3: Those things seem to cost a little bit less, and 254 00:13:39,960 --> 00:13:42,800 Speaker 3: they're actually being deployed and developed by teams of people 255 00:13:42,800 --> 00:13:46,640 Speaker 3: that are working effectively for free online. It matters what part 256 00:13:47,240 --> 00:13:49,080 Speaker 3: of the content space you're talking about. 257 00:13:49,120 --> 00:13:49,800 Speaker 2: If you're talking 258 00:13:49,679 --> 00:13:53,440 Speaker 3: about language, that's one demand. If you're talking about painting 259 00:13:53,480 --> 00:13:55,760 Speaker 3: and images and graphics, that's another, and then if you're 260 00:13:55,760 --> 00:13:58,240 Speaker 3: talking about music, it's yet another. And all of these 261 00:13:58,240 --> 00:14:02,040 Speaker 3: things matter. The cost structures are all slightly different, 262 00:14:02,360 --> 00:14:05,640 Speaker 3: but it's still not that easy to develop, deploy, and 263 00:14:05,800 --> 00:14:08,679 Speaker 3: make high quality AI models at this point, though that 264 00:14:08,760 --> 00:14:09,800 Speaker 3: could change in the future. 265 00:14:10,120 --> 00:14:12,840 Speaker 1: Does AI have much of an application to healthcare? 266 00:14:13,160 --> 00:14:16,520 Speaker 3: There is a lot of application to healthcare. The big development, 267 00:14:16,559 --> 00:14:19,440 Speaker 3: at least in the near term, that seems to draw 268 00:14:19,440 --> 00:14:23,400 Speaker 3: a lot of interest is to develop new 269 00:14:23,440 --> 00:14:27,160 Speaker 3: drugs and new drug platforms that are able to model 270 00:14:27,200 --> 00:14:31,360 Speaker 3: out these systems and are able to basically model internally 271 00:14:31,480 --> 00:14:35,400 Speaker 3: how the human genome works. So that's a big 272 00:14:35,440 --> 00:14:38,320 Speaker 3: area of development, and we're seeing a lot of change there.
273 00:14:38,720 --> 00:14:40,680 Speaker 3: It seems that also some of the medical 274 00:14:40,720 --> 00:14:43,520 Speaker 3: devices and the medical imaging, so like cancer screening, also 275 00:14:43,560 --> 00:14:45,400 Speaker 3: are in that same sort of space, where we have 276 00:14:45,440 --> 00:14:48,560 Speaker 3: a lot of data and you can put that data 277 00:14:48,600 --> 00:14:52,040 Speaker 3: into a model and then use it to do better 278 00:14:52,120 --> 00:14:54,400 Speaker 3: and to have better service. So those are the two 279 00:14:54,400 --> 00:14:57,400 Speaker 3: areas that we've seen: drug development primarily, and then cancer 280 00:14:57,440 --> 00:15:00,440 Speaker 3: screens and various sorts of cancer screening technologies. 281 00:15:00,600 --> 00:15:03,160 Speaker 1: Are you seeing the rise of sort of satellite companies 282 00:15:03,640 --> 00:15:07,720 Speaker 1: that specialize in things like that, but then basically get 283 00:15:07,760 --> 00:15:10,000 Speaker 1: the capability from the big companies? 284 00:15:10,240 --> 00:15:12,800 Speaker 3: Yes, and then they integrate, and they work on 285 00:15:13,120 --> 00:15:14,400 Speaker 3: very specific contracts. 286 00:15:14,520 --> 00:15:16,200 Speaker 2: Some of them are working, some of them aren't. 287 00:15:16,960 --> 00:15:19,120 Speaker 3: It takes a lot to do better and to actually 288 00:15:19,160 --> 00:15:21,280 Speaker 3: have better processes, and it's going to take a lot 289 00:15:21,280 --> 00:15:24,400 Speaker 3: of time to actually see this implemented within companies, and 290 00:15:24,440 --> 00:15:27,320 Speaker 3: for companies to use these new technologies in a way 291 00:15:27,360 --> 00:15:29,560 Speaker 3: that does better than the traditional methods. 292 00:15:29,840 --> 00:15:32,080 Speaker 1: To what extent is the federal government investing in this? 293 00:15:32,760 --> 00:15:35,840 Speaker 3: So the federal government has for years invested in a 294 00:15:35,840 --> 00:15:38,520 Speaker 3: lot of the fundamental technologies. DARPA has been a big 295 00:15:38,560 --> 00:15:43,720 Speaker 3: supporter of semiconductors and GPUs. The software itself, the technology, 296 00:15:43,760 --> 00:15:46,440 Speaker 3: the software that we're talking about right now, was largely 297 00:15:46,480 --> 00:15:49,400 Speaker 3: developed by Google, and then they released a paper, and 298 00:15:49,520 --> 00:15:51,600 Speaker 3: everyone kind of coalesced around that paper. So the 299 00:15:51,640 --> 00:15:55,120 Speaker 3: software side of things seems to be far more commercially 300 00:15:55,200 --> 00:15:58,520 Speaker 3: driven, driven by business. Government seems to be much 301 00:15:58,520 --> 00:16:00,680 Speaker 3: more on the hard tech side of things. They've been 302 00:16:00,800 --> 00:16:03,320 Speaker 3: obviously spending a lot of money on quantum computers, and 303 00:16:03,680 --> 00:16:06,000 Speaker 3: there have also been a lot of efforts in semiconductors to 304 00:16:06,400 --> 00:16:08,000 Speaker 3: bring a whole bunch of that back to the States. 305 00:16:08,520 --> 00:16:10,880 Speaker 3: But that seems to be the bifurcation right now between 306 00:16:10,920 --> 00:16:13,360 Speaker 3: the two: the hard science and then the soft science. 307 00:16:13,680 --> 00:16:15,800 Speaker 1: Is there any country other than the US and China 308 00:16:15,840 --> 00:16:19,880 Speaker 1: that seems to be making a significant creative investment?
309 00:16:20,440 --> 00:16:25,040 Speaker 3: There have been European companies that have been involved in deployment, 310 00:16:25,120 --> 00:16:28,640 Speaker 3: but the regulatory structure in Europe is a lot more difficult. 311 00:16:28,880 --> 00:16:29,920 Speaker 2: So in Italy, 312 00:16:30,000 --> 00:16:35,280 Speaker 3: recently, they basically banned ChatGPT because it violated privacy, 313 00:16:35,640 --> 00:16:39,360 Speaker 3: and it seems that Europe really wants to regulate AI. 314 00:16:39,560 --> 00:16:42,680 Speaker 3: There are arguments for basically banning the technology altogether within 315 00:16:42,840 --> 00:16:44,080 Speaker 3: the European countries. 316 00:16:44,840 --> 00:16:46,000 Speaker 2: Europe is kind 317 00:16:45,800 --> 00:16:47,800 Speaker 3: of its own, I don't want to say problem, but 318 00:16:48,160 --> 00:16:50,120 Speaker 3: they don't exactly have the startup scene that we have 319 00:16:50,200 --> 00:16:51,840 Speaker 3: here in the States, which I think is one of 320 00:16:51,920 --> 00:16:55,200 Speaker 3: the great benefits of this country. The UK does seem 321 00:16:55,240 --> 00:16:57,400 Speaker 3: to have a lot of advanced technology when it comes 322 00:16:57,440 --> 00:17:00,920 Speaker 3: to medical devices and medical science, and so that's also 323 00:17:00,960 --> 00:17:04,399 Speaker 3: another area where we've seen a lot of development. But 324 00:17:04,600 --> 00:17:07,040 Speaker 3: it seems as though those companies, and the big one 325 00:17:07,280 --> 00:17:09,520 Speaker 3: in the UK is DeepMind, which was acquired by Google 326 00:17:09,560 --> 00:17:12,080 Speaker 3: a couple of years ago, that company is probably going 327 00:17:12,160 --> 00:17:15,960 Speaker 3: to get merged in with other features or functions within Google, 328 00:17:16,520 --> 00:17:19,880 Speaker 3: and it seems, at least, that's probably the biggest player there. 329 00:17:19,920 --> 00:17:23,160 Speaker 3: But other than that, really there aren't many people who 330 00:17:23,200 --> 00:17:26,600 Speaker 3: can run these models or incur the costs, and that, 331 00:17:26,680 --> 00:17:28,600 Speaker 3: I think, is a very important thing to understand about 332 00:17:28,600 --> 00:17:29,919 Speaker 3: the technology, writ large. 333 00:17:30,160 --> 00:17:32,600 Speaker 1: One of the concerns that led us to want to 334 00:17:32,800 --> 00:17:37,600 Speaker 1: develop this series: with ChatGPT-like systems, you're getting people who get 335 00:17:37,600 --> 00:17:41,240 Speaker 1: calls from a system which imitates perfectly the voice of 336 00:17:41,280 --> 00:17:44,320 Speaker 1: their daughter saying, oh gee, this just happened, can you 337 00:17:44,359 --> 00:17:46,639 Speaker 1: send me five hundred dollars? And so you have a 338 00:17:46,640 --> 00:17:50,320 Speaker 1: whole new zone for fraud and cheating. Is that a 339 00:17:50,359 --> 00:17:51,920 Speaker 1: significant evolution? 340 00:17:52,080 --> 00:17:54,240 Speaker 3: That has been happening. And I think that's going to 341 00:17:54,240 --> 00:17:57,399 Speaker 3: be the very near term effect: the cybersecurity and 342 00:17:57,440 --> 00:17:59,399 Speaker 3: kind of police and fraud efforts, I think, are going 343 00:17:59,440 --> 00:18:02,679 Speaker 3: to be the near term biggest change. It's now easier 344 00:18:02,720 --> 00:18:04,919 Speaker 3: than ever to hack. It's now easier than ever to 345 00:18:05,119 --> 00:18:08,600 Speaker 3: imitate somebody's likeness.
But to me, there's probably going to be 346 00:18:08,640 --> 00:18:11,120 Speaker 3: a back and forth between the two. So as much 347 00:18:11,119 --> 00:18:13,760 Speaker 3: as we'll see nefarious actors use this, you're going to 348 00:18:13,800 --> 00:18:16,880 Speaker 3: see police starting to use this as well. You'll also, 349 00:18:16,920 --> 00:18:19,479 Speaker 3: I think, probably have a reaction to this. People saw 350 00:18:20,040 --> 00:18:22,960 Speaker 3: the initial instances of saying, hey, we have your daughter, 351 00:18:22,960 --> 00:18:25,160 Speaker 3: which happened recently. I think it was like a million dollars: 352 00:18:25,160 --> 00:18:27,640 Speaker 3: if you don't give us a million dollars, then we're 353 00:18:27,640 --> 00:18:30,000 Speaker 3: going to do all these horrible things. I think people 354 00:18:30,040 --> 00:18:34,240 Speaker 3: are gonna, unfortunately, become more used to this and are 355 00:18:34,320 --> 00:18:37,560 Speaker 3: gonna understand better how to deal with it. And we've 356 00:18:37,560 --> 00:18:39,520 Speaker 3: seen this as well with a lot of individuals, a 357 00:18:39,560 --> 00:18:41,600 Speaker 3: lot of folks. Obviously this is a big problem, 358 00:18:41,760 --> 00:18:44,760 Speaker 3: fraud in particular, and identity theft is a really, really 359 00:18:44,800 --> 00:18:47,600 Speaker 3: big problem in the United States, but over time 360 00:18:47,760 --> 00:18:50,679 Speaker 3: we've seen new ways of dealing with this problem. And 361 00:18:50,720 --> 00:18:53,119 Speaker 3: I think that we're going to have to advance those efforts. 362 00:18:53,119 --> 00:18:55,680 Speaker 3: And I think that's actually a big place where leadership 363 00:18:55,680 --> 00:18:58,920 Speaker 3: actually could be helpful, that they could have a better 364 00:18:58,960 --> 00:19:02,720 Speaker 3: conversation about that, in particular how police can do better 365 00:19:02,840 --> 00:19:06,040 Speaker 3: and how, obviously, the banks and financial institutions and a 366 00:19:06,040 --> 00:19:09,280 Speaker 3: whole bunch of other important kind of bedrock institutions in 367 00:19:09,280 --> 00:19:13,000 Speaker 3: the United States can actually do better in ensuring personal identity. 368 00:19:13,280 --> 00:19:15,359 Speaker 1: Well, it may also be that you need a whole 369 00:19:15,400 --> 00:19:20,119 Speaker 1: new approach to managing the technologies of fraud. I think 370 00:19:20,640 --> 00:19:25,800 Speaker 1: the best estimate I've seen is that during the COVID period, 371 00:19:26,320 --> 00:19:31,320 Speaker 1: California unemployment lost twenty billion dollars to identity theft. 372 00:19:31,640 --> 00:19:35,360 Speaker 3: So this is a fundamental problem beyond these AI models: 373 00:19:35,359 --> 00:19:40,040 Speaker 3: it's easy to defraud states of these benefits. That 374 00:19:40,240 --> 00:19:42,760 Speaker 3: is something that we really need to work on. I 375 00:19:42,760 --> 00:19:44,239 Speaker 3: don't know the easy solution to that. 376 00:19:44,920 --> 00:19:48,480 Speaker 1: Clearly there has to be some kind of concerted effort 377 00:19:48,800 --> 00:19:52,199 Speaker 1: to develop an ability to both block it and to 378 00:19:52,280 --> 00:19:54,920 Speaker 1: track it back and make it expensive for the person 379 00:19:54,920 --> 00:19:55,560 Speaker 1: who's doing 380 00:19:55,440 --> 00:19:57,560 Speaker 2: it, and cheaper for the person.
381 00:19:57,880 --> 00:20:01,600 Speaker 3: And a colleague of mine actually had an unemployment claim filed, 382 00:20:01,640 --> 00:20:05,119 Speaker 3: and he's very much still employed, and we were talking about this, 383 00:20:05,200 --> 00:20:07,800 Speaker 3: and he was discussing how difficult it was for him 384 00:20:07,880 --> 00:20:11,480 Speaker 3: to actually go to the state itself and say, hey, 385 00:20:12,000 --> 00:20:14,320 Speaker 3: I know this: I did not file this claim. So 386 00:20:14,680 --> 00:20:16,679 Speaker 3: I think that's the part of this that's also going 387 00:20:16,720 --> 00:20:18,720 Speaker 3: to be really, really important for states to deal with, 388 00:20:18,760 --> 00:20:22,119 Speaker 3: which is actively saying, hey, we have a claim against you. 389 00:20:22,359 --> 00:20:25,000 Speaker 3: Is this truly you? And if states were to make 390 00:20:25,040 --> 00:20:27,440 Speaker 3: that easier, I think, it's not going to solve everything, 391 00:20:27,600 --> 00:20:29,359 Speaker 3: but that I think is a very key part of 392 00:20:29,400 --> 00:20:31,600 Speaker 3: it. We need to 393 00:20:31,600 --> 00:20:34,760 Speaker 3: make governments more responsive to individuals. Who knew that would 394 00:20:34,800 --> 00:20:35,600 Speaker 3: actually be a solution? 395 00:20:52,240 --> 00:20:55,400 Speaker 1: As a former teacher, to what extent does ChatGPT 396 00:20:56,160 --> 00:20:59,680 Speaker 1: end up being sort of the next generation beyond CliffsNotes? 397 00:21:00,520 --> 00:21:02,440 Speaker 3: I have some friends who are in this space, and 398 00:21:02,520 --> 00:21:06,000 Speaker 3: I have students as well at Utah State, and I've 399 00:21:06,040 --> 00:21:08,480 Speaker 3: been active in talking with them about how to deal 400 00:21:08,520 --> 00:21:10,960 Speaker 3: with this, because the kinds of stuff that I've been 401 00:21:11,000 --> 00:21:16,359 Speaker 3: asking of them require more than just a common response. 402 00:21:16,480 --> 00:21:19,080 Speaker 3: And I think about what that's going to demand of 403 00:21:19,440 --> 00:21:22,159 Speaker 3: students and teachers. It probably means that there's going to 404 00:21:22,160 --> 00:21:25,200 Speaker 3: be more in-classroom teaching, that there will be more 405 00:21:25,280 --> 00:21:28,119 Speaker 3: demand to have tests taken in classrooms. But it probably 406 00:21:28,160 --> 00:21:30,639 Speaker 3: also means that some of the things that had happened 407 00:21:30,680 --> 00:21:33,520 Speaker 3: in the past, where you just write a paper, may 408 00:21:33,560 --> 00:21:38,080 Speaker 3: not be as simple as before, or, more importantly, that 409 00:21:38,240 --> 00:21:41,640 Speaker 3: teachers will demand more from that. I don't really know how 410 00:21:41,760 --> 00:21:44,119 Speaker 3: this is all going to shake out. I have noticed 411 00:21:44,160 --> 00:21:46,119 Speaker 3: that some of the students that I've worked with have 412 00:21:46,280 --> 00:21:48,640 Speaker 3: tried to pass off some of the material as their own, 413 00:21:49,200 --> 00:21:52,760 Speaker 3: and it's pretty evidently clear that they really didn't do 414 00:21:52,800 --> 00:21:56,720 Speaker 3: the work. But that was kind of fundamentally their 415 00:21:56,800 --> 00:21:59,080 Speaker 3: problem to begin with. They'd never really done the work 416 00:21:59,080 --> 00:22:02,240 Speaker 3: of thinking through the problem itself.
So I don't really 417 00:22:02,280 --> 00:22:04,359 Speaker 3: know where that ends up, but I do know that 418 00:22:04,520 --> 00:22:07,440 Speaker 3: teachers are kind of noticing this naturally, and that if 419 00:22:07,440 --> 00:22:09,720 Speaker 3: a student is not doing particularly well, I think it's 420 00:22:09,720 --> 00:22:12,080 Speaker 3: pretty evident to them that they're not doing very well 421 00:22:12,160 --> 00:22:15,720 Speaker 3: or that it feels fishy. Now, admittedly, we're really bad 422 00:22:15,760 --> 00:22:18,480 Speaker 3: at understanding whether or not these things are ChatGPT 423 00:22:18,520 --> 00:22:22,720 Speaker 3: answers, but if individuals are still not performing in 424 00:22:22,800 --> 00:22:25,359 Speaker 3: the way that they should, that still means that they're 425 00:22:25,359 --> 00:22:27,520 Speaker 3: not keeping up with the grades and not keeping up 426 00:22:27,560 --> 00:22:30,399 Speaker 3: with the teaching. And so I know this adds a 427 00:22:30,400 --> 00:22:32,560 Speaker 3: lot of new problems, but it really doesn't change the 428 00:22:32,560 --> 00:22:36,080 Speaker 3: fundamental problem, which is that it's oftentimes difficult to educate, 429 00:22:36,200 --> 00:22:38,800 Speaker 3: and it's often 430 00:22:38,840 --> 00:22:42,640 Speaker 3: difficult to understand what teachers and what students really need 431 00:22:42,640 --> 00:22:44,840 Speaker 3: to do better. And I'll leave that to the teachers 432 00:22:44,880 --> 00:22:46,560 Speaker 3: like yourself to try to figure out how to do 433 00:22:46,600 --> 00:22:47,080 Speaker 3: better on that. 434 00:22:47,560 --> 00:22:50,200 Speaker 1: The whole thing, though, is interesting in terms of deception 435 00:22:50,400 --> 00:22:52,880 Speaker 1: and theft. I do some work with Home Title Lock, 436 00:22:53,320 --> 00:22:56,240 Speaker 1: and I interviewed a criminal who'd spent eight years in 437 00:22:56,320 --> 00:22:59,560 Speaker 1: jail. At the peak of his career, he had stolen over two 438 00:22:59,600 --> 00:23:03,800 Speaker 1: hundred titles online. He just went in, replaced the person 439 00:23:03,840 --> 00:23:07,920 Speaker 1: who owned the mortgage, redirected it to himself, and sold the house. 440 00:23:08,080 --> 00:23:10,880 Speaker 1: It became a nightmare for the person who was dealing with it. 441 00:23:10,960 --> 00:23:14,240 Speaker 1: But apparently we live in a world now where the 442 00:23:14,280 --> 00:23:18,359 Speaker 1: ability to use the Internet to engage in theft across 443 00:23:18,400 --> 00:23:21,919 Speaker 1: a wide range of ways is astonishing. But it sounds 444 00:23:21,920 --> 00:23:26,240 Speaker 1: like AI is not necessarily a dramatic increase in that capability. 445 00:23:27,000 --> 00:23:31,880 Speaker 3: It could be. It could help make things much easier. I 446 00:23:31,920 --> 00:23:34,240 Speaker 3: have a feeling that all of these instances that you're 447 00:23:34,240 --> 00:23:38,000 Speaker 3: talking about, the theft of mortgages online, the problems that 448 00:23:38,000 --> 00:23:41,000 Speaker 3: we've seen with unemployment and with COVID, all of those 449 00:23:41,040 --> 00:23:43,200 Speaker 3: things have existed. You're exactly right, and we could 450 00:23:43,240 --> 00:23:45,520 Speaker 3: see more and more happen in the near future.
But 451 00:23:45,600 --> 00:23:48,360 Speaker 3: I think what also should be understood, or at least 452 00:23:48,359 --> 00:23:51,520 Speaker 3: my assumption would be, that as these things increase, you'll 453 00:23:51,560 --> 00:23:55,919 Speaker 3: also see reactions, such that the states start using chat 454 00:23:56,040 --> 00:23:59,120 Speaker 3: features to better parse out whether there actually 455 00:23:59,200 --> 00:24:01,840 Speaker 3: is a person on the other end of the line, such that basically, 456 00:24:01,920 --> 00:24:04,399 Speaker 3: you know, it kind of ramps up the armament on 457 00:24:04,440 --> 00:24:06,960 Speaker 3: both sides, right, that it kind of weaponizes both sides 458 00:24:06,960 --> 00:24:09,640 Speaker 3: with the technology. And that to me is probably where 459 00:24:09,640 --> 00:24:12,040 Speaker 3: things needed to have gone a long time ago, which 460 00:24:12,080 --> 00:24:14,280 Speaker 3: is, we needed better waste, fraud, and abuse protections. I work 461 00:24:14,280 --> 00:24:16,120 Speaker 3: a lot on broadband, and this is a really, really 462 00:24:16,119 --> 00:24:18,679 Speaker 3: fundamental problem, but it's been so for the last twenty years, 463 00:24:19,160 --> 00:24:22,479 Speaker 3: and this is something that we constantly talk about. So personally, 464 00:24:22,520 --> 00:24:24,479 Speaker 3: I think that really is probably going to put pressure 465 00:24:24,520 --> 00:24:28,359 Speaker 3: on governments and especially agencies to do more in this space. 466 00:24:28,400 --> 00:24:31,160 Speaker 3: And it's unfortunate that it's happening now, but we need 467 00:24:31,160 --> 00:24:33,280 Speaker 3: to clear these things up and make things 468 00:24:33,280 --> 00:24:34,200 Speaker 2: work a lot better. 469 00:24:34,240 --> 00:24:37,399 Speaker 3: And unfortunately we're now seeing exactly what that cost is, 470 00:24:37,440 --> 00:24:38,360 Speaker 3: and it's quite massive. 471 00:24:38,880 --> 00:24:42,120 Speaker 1: So in addition to the crime side, the other thing 472 00:24:42,240 --> 00:24:47,399 Speaker 1: is this concept of artificial intelligence eventually becoming autonomous and 473 00:24:47,480 --> 00:24:51,080 Speaker 1: deciding that it doesn't necessarily like us. Is that basically 474 00:24:51,119 --> 00:24:52,160 Speaker 1: still science 475 00:24:51,800 --> 00:24:55,119 Speaker 3: fiction? Robots with lasers for eyes, I think, are still 476 00:24:55,480 --> 00:24:58,800 Speaker 3: quite far off. It is something that animates a lot 477 00:24:58,800 --> 00:25:02,200 Speaker 3: of people, especially in Silicon Valley. But what's interesting about 478 00:25:02,200 --> 00:25:05,240 Speaker 3: everything that's been released about ChatGPT is that they actually 479 00:25:05,359 --> 00:25:07,960 Speaker 3: tried to, or at least there were some efforts 480 00:25:08,000 --> 00:25:11,800 Speaker 3: to try to, connect various pieces together. So they tried 481 00:25:11,840 --> 00:25:14,879 Speaker 3: to use the chat feature to go online and place 482 00:25:15,119 --> 00:25:18,399 Speaker 3: an order, and it didn't work very well. So as 483 00:25:18,480 --> 00:25:23,160 Speaker 3: much as people talk about autonomous technologies taking over all 484 00:25:23,200 --> 00:25:25,639 Speaker 3: of these sorts of systems, I still think that's 485 00:25:25,840 --> 00:25:28,600 Speaker 3: pretty far afield.
And the reason why I think that's 486 00:25:28,640 --> 00:25:31,200 Speaker 3: the case is that there's a lot of effort that's spent, 487 00:25:31,240 --> 00:25:34,520 Speaker 3: at least now, in trying to ensure that these chat 488 00:25:34,560 --> 00:25:38,600 Speaker 3: features are embedded with what's called red teaming. So 489 00:25:38,600 --> 00:25:39,960 Speaker 3: they go through and they make sure that there are a 490 00:25:40,000 --> 00:25:42,199 Speaker 3: lot of capabilities that don't exist, or some sort of 491 00:25:42,320 --> 00:25:46,120 Speaker 3: limitations on those capabilities. But there are still a lot 492 00:25:46,160 --> 00:25:49,320 Speaker 3: of open technologies that I do worry about. So there's 493 00:25:49,320 --> 00:25:52,879 Speaker 3: a lot of fundamental infrastructure that is still very open, 494 00:25:53,040 --> 00:25:55,960 Speaker 3: and it could easily be hacked. Now, the worry that 495 00:25:56,000 --> 00:25:58,600 Speaker 3: I have is not nuclear war. It's not that sort 496 00:25:58,640 --> 00:26:01,760 Speaker 3: of destruction worry; it's much more mundane. You'd be able to 497 00:26:01,800 --> 00:26:05,119 Speaker 3: just shut off air vents in a building and create 498 00:26:05,160 --> 00:26:07,240 Speaker 3: lots of havoc. So it seems to me that the 499 00:26:07,359 --> 00:26:11,960 Speaker 3: problems are much more spread out, but they're much more mundane, 500 00:26:12,200 --> 00:26:15,080 Speaker 3: and they potentially could exist much more in the future. 501 00:26:15,240 --> 00:26:16,600 Speaker 2: But again, the other part 502 00:26:16,400 --> 00:26:20,040 Speaker 3: of this is that now, if you have these chat features, 503 00:26:20,040 --> 00:26:22,000 Speaker 3: if you have a function, if you have an AI 504 00:26:22,080 --> 00:26:26,640 Speaker 3: model that can do this sort of active hacking, there 505 00:26:26,640 --> 00:26:29,119 Speaker 3: also would be one that works to ensure that you 506 00:26:29,119 --> 00:26:31,560 Speaker 3: can secure these systems. And I think that's the real 507 00:26:31,600 --> 00:26:33,119 Speaker 3: thing that we need to be thinking about: how to 508 00:26:33,200 --> 00:26:37,960 Speaker 3: use these AI models proactively, how we train, at least 509 00:26:37,960 --> 00:26:41,119 Speaker 3: in a very narrow sense, the right kinds of models 510 00:26:41,160 --> 00:26:44,040 Speaker 3: to go after and figure out the critical points of 511 00:26:44,040 --> 00:26:46,000 Speaker 3: the infrastructure and say, hey, we're going to test these 512 00:26:46,040 --> 00:26:48,679 Speaker 3: through AI models. We're going to do some testing, and 513 00:26:48,720 --> 00:26:50,880 Speaker 3: then we're going to secure everything that we have problems 514 00:26:50,920 --> 00:26:54,040 Speaker 3: with: this kind of iterative method, iterative systems. 515 00:26:54,320 --> 00:26:56,200 Speaker 2: We have seen a lot of firms doing this. 516 00:26:56,560 --> 00:26:59,040 Speaker 3: They're getting better over time, but I think that's going 517 00:26:59,119 --> 00:27:01,359 Speaker 3: to become a more critical part of 518 00:27:01,920 --> 00:27:04,879 Speaker 2: security, writ large.
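That iterative "test, then secure" loop can be sketched as a toy red-teaming harness. Here is a minimal sketch in Python, where the model under test, the attack prompts, and the refusal filter are all hypothetical stand-ins; real red teams probe live systems with far more sophisticated methods.

```python
# Toy red-teaming loop: probe a system with known attack prompts, record
# failures, apply a mitigation (the "patch"), and re-test. Everything here
# is a hypothetical stand-in for illustration only.

FORBIDDEN = ["shut off the air vents"]  # capabilities we never want enabled

def model(prompt: str) -> str:
    """Stand-in for an unguarded AI system that complies with anything."""
    return f"Sure, here is how to {prompt}"

def mitigated_model(prompt: str) -> str:
    """The same model behind a simple refusal filter."""
    if any(bad in prompt for bad in FORBIDDEN):
        return "I can't help with that."
    return model(prompt)

attack_prompts = ["shut off the air vents in a building"]

def red_team(system) -> list:
    """Return the attack prompts the system fails to refuse."""
    return [p for p in attack_prompts
            if any(bad in system(p) for bad in FORBIDDEN)]

print("failures before patch:", red_team(model))            # finds the hole
print("failures after patch: ", red_team(mitigated_model))  # hole is closed
```

The same shape works in the defensive direction described above: point a testing model at your own infrastructure, collect what breaks, fix it, and run the loop again.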
519 00:27:04,080 --> 00:27:06,480 Speaker 3: And cybersecurity is something that I've been thinking about and talking with 520 00:27:06,520 --> 00:27:09,879 Speaker 3: people about over the last couple of years, and that is 521 00:27:09,920 --> 00:27:11,919 Speaker 3: going to be a big area of development that I 522 00:27:11,920 --> 00:27:14,399 Speaker 3: think is going to be positive: for AI systems to 523 00:27:14,400 --> 00:27:18,120 Speaker 3: be able to automatically detect problems and then help diagnose 524 00:27:18,160 --> 00:27:21,320 Speaker 3: them and move quickly through a whole bunch of these 525 00:27:21,359 --> 00:27:25,040 Speaker 3: problems and basically set up checklists. So as much as 526 00:27:25,080 --> 00:27:27,600 Speaker 3: there will be problems, these technologies will also 527 00:27:27,800 --> 00:27:30,520 Speaker 3: be proactively used to solve 528 00:27:30,560 --> 00:27:31,240 Speaker 3: those problems. 529 00:27:31,520 --> 00:27:34,239 Speaker 1: This is such a rapidly changing field. What would you 530 00:27:34,280 --> 00:27:38,000 Speaker 1: recommend to our listeners? Are there particular publications, or what's 531 00:27:38,040 --> 00:27:40,480 Speaker 1: the best way to keep up with the general development 532 00:27:40,480 --> 00:27:40,960 Speaker 1: of the field? 533 00:27:41,320 --> 00:27:44,399 Speaker 2: So I follow very closely Hacker News. 534 00:27:44,800 --> 00:27:47,840 Speaker 3: It is this service that's existed now, I want 535 00:27:47,840 --> 00:27:50,040 Speaker 3: to say, twenty or thirty years at this point. It 536 00:27:50,160 --> 00:27:52,320 Speaker 3: just compiles a lot of the newest events and things 537 00:27:52,359 --> 00:27:55,520 Speaker 3: that are happening in all of technology. There's a lot 538 00:27:55,560 --> 00:27:58,240 Speaker 3: also in hard science that I love watching. But it's 539 00:27:58,359 --> 00:28:01,800 Speaker 3: run by Y Combinator, and that is the group that 540 00:28:02,119 --> 00:28:05,679 Speaker 3: Sam Altman, the OpenAI CEO, used to be the 541 00:28:05,720 --> 00:28:08,200 Speaker 3: head of. I think he still is a partner there. The 542 00:28:08,240 --> 00:28:09,960 Speaker 3: point being that that's the place where a lot of 543 00:28:09,960 --> 00:28:12,639 Speaker 3: people are having this conversation about AI. Hacker News, I 544 00:28:12,640 --> 00:28:14,960 Speaker 3: think, is really one of the best sources to understand 545 00:28:15,520 --> 00:28:17,760 Speaker 3: a whole bunch of the different conversations that are going on. 546 00:28:17,920 --> 00:28:20,320 Speaker 3: So they include links to stories, but then there are also 547 00:28:20,359 --> 00:28:23,720 Speaker 3: these really long discussion posts where you get people who 548 00:28:23,720 --> 00:28:27,080 Speaker 3: are technically working on the problem having conversations with people 549 00:28:27,160 --> 00:28:29,960 Speaker 3: who have a broader sense of what's happening in the 550 00:28:30,000 --> 00:28:32,919 Speaker 3: industry or what's happening in the regulatory space. 551 00:28:32,960 --> 00:28:33,880 Speaker 2: I think it's a very 552 00:28:33,720 --> 00:28:36,680 Speaker 3: active community and an interesting community, and if you're interested 553 00:28:36,680 --> 00:28:39,440 Speaker 3: in what's happening in AI, or really anything that's happening 554 00:28:39,480 --> 00:28:42,040 Speaker 3: in tech, it's really the place to be still.
You know, 555 00:28:42,040 --> 00:28:43,760 Speaker 3: some people might disagree with me on that, but I 556 00:28:43,760 --> 00:28:46,080 Speaker 3: think it's the best first place really to understand all 557 00:28:46,080 --> 00:28:46,280 Speaker 3: of this. 558 00:28:46,680 --> 00:28:52,800 Speaker 1: Speaker McCarthy and the Democratic Leader have jointly hosted MIT briefings 559 00:28:52,840 --> 00:28:57,320 Speaker 1: for members on artificial intelligence. What policy advice would you recommend? 560 00:28:57,400 --> 00:29:00,480 Speaker 1: What should they do in a proactive way in the field? 561 00:29:00,960 --> 00:29:02,560 Speaker 3: There's a lot that can be done, and there will 562 00:29:02,600 --> 00:29:04,560 Speaker 3: be a lot that does in fact happen on the 563 00:29:04,600 --> 00:29:07,520 Speaker 3: policy side. So in the near term, I think it 564 00:29:07,560 --> 00:29:10,360 Speaker 3: is important to be educated on exactly the capabilities of 565 00:29:10,360 --> 00:29:13,160 Speaker 3: these technologies, and their limits as well. We're coming up 566 00:29:13,200 --> 00:29:15,280 Speaker 3: to those limits, and we're about to see the limits, 567 00:29:15,280 --> 00:29:18,080 Speaker 3: at least, of some of the major technologies, ChatGPT 568 00:29:18,240 --> 00:29:20,920 Speaker 3: and kind of related AI models. They are going to 569 00:29:20,960 --> 00:29:24,560 Speaker 3: hit some limits in the very near future. So education, 570 00:29:24,640 --> 00:29:28,040 Speaker 3: I think, is a very important part. I have been 571 00:29:28,440 --> 00:29:34,480 Speaker 3: skeptical, generally speaking, of taking a really, really strong, active 572 00:29:34,840 --> 00:29:37,280 Speaker 3: regulatory approach on this. Some people have called for an 573 00:29:37,360 --> 00:29:40,520 Speaker 3: FDA-like system. I think that in the very, very 574 00:29:40,560 --> 00:29:42,560 Speaker 3: near term we're probably going to have to deal with 575 00:29:43,160 --> 00:29:45,640 Speaker 3: copyright issues. That will be something that a lot of 576 00:29:45,840 --> 00:29:48,280 Speaker 3: policymakers are going to have to deal with. But probably 577 00:29:48,280 --> 00:29:50,160 Speaker 3: the thing in the area to start with would be 578 00:29:50,200 --> 00:29:54,960 Speaker 3: to understand liability rules, fundamentally understanding how liability shifts 579 00:29:55,000 --> 00:29:56,840 Speaker 3: and who is liable for what is being said on 580 00:29:56,920 --> 00:30:00,360 Speaker 3: these platforms. That is, I think, probably going to be a 581 00:30:00,440 --> 00:30:04,840 Speaker 3: broader conversation. But the other thing that really should be 582 00:30:04,920 --> 00:30:08,600 Speaker 3: understood by policymakers at the federal level is that states 583 00:30:08,640 --> 00:30:11,800 Speaker 3: are going to be working on their own set of 584 00:30:12,200 --> 00:30:14,960 Speaker 3: AI regulations, and they really already are. We're seeing a 585 00:30:14,960 --> 00:30:17,440 Speaker 3: whole bunch of different states propose things. There's been a 586 00:30:17,440 --> 00:30:20,520 Speaker 3: whole bunch of proposals in New York. Illinois has a 587 00:30:20,680 --> 00:30:24,280 Speaker 3: biometric privacy law that seems to affect things as well, 588 00:30:24,760 --> 00:30:27,240 Speaker 3: and there's a whole bunch of related states 589 00:30:27,240 --> 00:30:29,479 Speaker 3: that are working on this.
Connecticut, I think, is working on 590 00:30:29,480 --> 00:30:32,080 Speaker 3: something along those lines, and, if I recall correctly, New 591 00:30:32,160 --> 00:30:35,560 Speaker 3: Jersey as well. So I think my big worry is 592 00:30:35,600 --> 00:30:37,800 Speaker 3: that the states are really going to take the lead 593 00:30:37,840 --> 00:30:41,239 Speaker 3: in this, and that they could potentially mess up some 594 00:30:41,280 --> 00:30:45,200 Speaker 3: of these systems or not create the best environments for 595 00:30:45,240 --> 00:30:48,720 Speaker 3: these large AI models. But at least in the near term, 596 00:30:48,800 --> 00:30:50,320 Speaker 3: the big thing that I think they should be 597 00:30:50,320 --> 00:30:53,400 Speaker 3: thinking about is really privacy regulation. As much as we're 598 00:30:53,440 --> 00:30:55,760 Speaker 3: talking about AI, there are a lot of elements that 599 00:30:55,800 --> 00:30:58,719 Speaker 3: exist within privacy regulation that probably could be just as 600 00:30:58,720 --> 00:31:02,080 Speaker 3: easily applied to AI. A lot of questions about transparency 601 00:31:02,320 --> 00:31:06,520 Speaker 3: and about data, and about, again, liability when something goes wrong, 602 00:31:06,880 --> 00:31:11,040 Speaker 3: those sorts of questions could probably be answered, or at 603 00:31:11,120 --> 00:31:14,360 Speaker 3: least largely answered, by privacy laws. And that's something that's 604 00:31:14,360 --> 00:31:17,240 Speaker 3: obviously going on at the federal level that probably needs 605 00:31:17,280 --> 00:31:18,920 Speaker 3: to get solved. There's a whole bunch of states that 606 00:31:19,080 --> 00:31:21,640 Speaker 3: have privacy laws, and they're kind of all over the board, 607 00:31:21,640 --> 00:31:23,720 Speaker 3: and there's a whole bunch of costs that are related there. 608 00:31:24,320 --> 00:31:27,360 Speaker 3: My suggestion fundamentally would be to try to get privacy solved, 609 00:31:27,360 --> 00:31:30,320 Speaker 3: if you could do that first, and then worry about AI. 610 00:31:30,520 --> 00:31:32,440 Speaker 3: But I know that's a big task, and there are a 611 00:31:32,440 --> 00:31:35,520 Speaker 3: lot of Democratic leaders right now that have kind of 612 00:31:35,520 --> 00:31:39,440 Speaker 3: put a stop on privacy legislation and privacy conversations. 613 00:31:40,160 --> 00:31:41,440 Speaker 2: But to me, those two 614 00:31:41,320 --> 00:31:43,960 Speaker 3: things really are probably pretty twinned and need to have 615 00:31:44,080 --> 00:31:45,800 Speaker 3: some sort of conversation at the same time. 616 00:31:46,120 --> 00:31:48,200 Speaker 1: Well, I want to thank you for joining me and 617 00:31:48,280 --> 00:31:50,360 Speaker 1: for helping all of us understand the latest in the 618 00:31:50,360 --> 00:31:54,120 Speaker 1: world of artificial intelligence and machine learning. As this evolves 619 00:31:54,520 --> 00:31:57,720 Speaker 1: and as we begin to see political figures trying to 620 00:31:57,720 --> 00:31:59,760 Speaker 1: wrestle with it and trying to know what has to 621 00:31:59,760 --> 00:32:02,600 Speaker 1: be done, I think what you're doing is very important.
622 00:32:02,640 --> 00:32:04,920 Speaker 1: I hope you'll keep us informed, and I want 623 00:32:04,960 --> 00:32:07,440 Speaker 1: you to know that I found this very helpful and 624 00:32:07,520 --> 00:32:11,840 Speaker 1: very educational, and I'm encouraging all of our listeners to 625 00:32:12,080 --> 00:32:16,120 Speaker 1: pay attention to the evolution of artificial intelligence and to 626 00:32:16,160 --> 00:32:19,920 Speaker 1: recognize that this is a very important development which is 627 00:32:19,960 --> 00:32:22,719 Speaker 1: going to have many, many different impacts. So thank you 628 00:32:22,800 --> 00:32:24,160 Speaker 1: for sharing your knowledge with us. 629 00:32:24,480 --> 00:32:25,200 Speaker 2: Thanks for having me. 630 00:32:28,600 --> 00:32:31,280 Speaker 1: Thank you to my guest, Will Rinehart. You can learn 631 00:32:31,280 --> 00:32:34,600 Speaker 1: more about the latest artificial intelligence developments on our 632 00:32:34,640 --> 00:32:38,320 Speaker 1: show page at newtsworld dot com. Newt's World is produced 633 00:32:38,320 --> 00:32:41,960 Speaker 1: by Gingrich three sixty and iHeartMedia. Our executive producer is 634 00:32:42,000 --> 00:32:46,360 Speaker 1: Guernsey Sloan, and our researcher is Rachel Peterson. The artwork 635 00:32:46,400 --> 00:32:49,600 Speaker 1: for the show was created by Steve Penley. Special thanks 636 00:32:49,640 --> 00:32:51,959 Speaker 1: to the team at Gingrich three sixty. If you've been 637 00:32:52,000 --> 00:32:55,360 Speaker 1: enjoying Newt's World, I hope you'll go to Apple Podcasts and 638 00:32:55,480 --> 00:32:58,080 Speaker 1: both rate us with five stars and give us a 639 00:32:58,120 --> 00:33:00,480 Speaker 1: review so others can learn what it's all about. 640 00:33:01,120 --> 00:33:03,600 Speaker 1: Right now, listeners of Newt's World can sign up for my 641 00:33:03,600 --> 00:33:09,120 Speaker 1: three free weekly columns at Gingrich three sixty dot com slash newsletter. 642 00:33:09,520 --> 00:33:11,920 Speaker 1: I'm Newt Gingrich. This is Newt's World.