Hey, welcome to It Could Happen Here, a podcast about things falling apart, and today it's kind of going to be a conversation about: is shit falling apart? Are we all about to be devoured by a rogue AI? Is your job about to be devoured by a rogue AI? These are the questions that we're going to, you know, talk around and about and stuff today. And with us today is Noah Giansiracusa, a math professor at Bentley University. Noah, welcome to the show.

Thanks for having me.

And I reached out, we're talking right now, because there's an article that was put up in The New York Times on March 24, 2023, titled "You Can Have the Blue Pill or the Red Pill, and We're Out of Blue Pills," which is a fun title, by Yuval Harari, Tristan Harris, and Aza Raskin. And it's an article that is kind of about the pitfalls and dangers of AI research, of which there definitely are some. I enjoyed your thread on the matter. I thought it was a lucid breakdown of the things the article gets right and the areas in which I think they're a bit fear-mongery. So yeah, I think that's probably a good place to start, unless you wanted to start by just generally talking about where you are on AI and what you think, you know, the technology is advancing towards right now.

Yeah, I mean, I think I can probably answer both of those questions at the same time, because part of why I enjoyed writing that thread dissecting the article is I just had the strangest feeling reading it: that I agreed with it so much in principle and yet somehow objected to it so much in detail. And thinking about that article helped me think about my own feelings on AI, which, you know, every day of the week are slightly different, because so much news happens.
Yeah. I found myself overall deeply frustrated, in that I agree with the central conclusion, which is that maybe we shouldn't be just, like, plowing headlong into this, and should be more careful when we screw around with technology like this. Which I agree with, and I feel like it should have been the thing we did with, I don't know, Facebook, Twitter, all of these things. My obsession is less with the specific dangers of AI and more with the fact that we keep letting these guys, who are fundamentally gamblers with venture capital money, really put our society through the wringer without ever asking: should we, like, do any research on maybe how social media affects children, and all of these different things? And it's right that, yeah, we should be concerned about what these people are going to do with AI, but also: why now? Why just now?

Yeah, and that raises a really good point, which is: what's different now versus what we've been experiencing with social media? And just to give your listeners some context, one of the three authors on this New York Times article is famous for writing the book Sapiens, a sweeping history of humanity, and the other two are actually most famous for the Netflix documentary The Social Dilemma. So they really are in this camp of warning people about social media algorithms. And exactly as you're saying, that's sort of this thing we've been dealing with, probably quite poorly, and now we're kind of moving on to the next societal risk, which is AI. So that raises a really important question of what's different now. And I think that's one of the things the article is trying to address: many of the problems that we already have with algorithms, data-driven algorithms, and even AI as it's used in social media, are still happening now, but somehow things feel like they're spiraling out of control.
Yeah, and I think, I mean, honestly, a lot of this just has to do with what our cultural touchstones for AI were going into this, you know, which are Skynet, it's that sort of thing. And you do see it. I feel like the uncredited fourth author on this particular article is James Cameron, because there are pieces of it throughout. It opens actually pretty provocatively: "Imagine that you are boarding an airplane. Half the engineers who built it tell you there is a ten percent chance the plane will crash, killing you and everyone else on it. Would you still board? In 2022, over 700 top academics and researchers behind the leading artificial intelligence companies were asked in a survey about future AI risk. Half of those surveyed stated that there was a ten percent or greater chance of human extinction from future AI systems." Which, yeah, let's zoom in on that.

Yeah, let's talk about that, because what I tried to do in my thread was go through all the claims and assertions and really pause and say, hold on. And that's a great one to start with, because there's a lot to dig into right there. So, first of all, there's a huge difference, in that airplanes are based on science and physics and things that we understand pretty well. There's a lot to it, and there have been millions of flights, so you have a lot of data. You know how many planes crash and how many don't. Maybe one engine goes out; you can do the statistics and see, you know, whatever percent of planes without that engine still land safely. The problem with AI is we're just guessing, right? There's no way to know, a hundred years from now or ten years from now, what it's going to do, what the real risks are, so we speculate. And that's not uncharted territory, right? When nuclear weapons were first introduced, people had to guess and speculate. But the danger, I think, is putting it in the same category as things like airplanes or climate change.
I like to think about climate change. When you see these, what is it, the IPCC, I forget the acronym, these reports, they're based on thousands of scientists digging into thousands of published papers and all this data, really modeling the environment. There's a lot of meat and substance to it. The problem with AI is it's mostly people, I hate to say it, but like me or like you, just kind of guessing and thinking maybe this will happen, maybe that'll happen. The reasonable thing to say, if you're in their shoes, is something like: yeah, I have concerns that AI could cause serious negative externalities for the human race. Perfectly reasonable statement. It is practically impossible to say there's a ten percent chance, exactly because it's never done that before. You know, I'm a math professor, and I'm the first to say numbers don't have some intrinsic meaning, right? If I just say something has maybe a fifteen percent chance, I'm making it up, I'm pulling it out of my ass. It doesn't make it true. So it's a general pet peeve I have, of giving a false sense of precision by using numbers where you don't really know where they came from, or they're just made up. So that's one issue: these numbers are made up, and asking a thousand people to make up numbers isn't necessarily any better than asking one or two. You know, if the number's made up, it's made up.
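[To make the polling point concrete, here is a minimal Python sketch. It assumes, purely for illustration, that each surveyed "expert" invents a probability from some shared set of hunches; the hunch values and the unobserved "true" risk below are made up. Polling more people only pins down the average hunch; it never gets closer to the actual risk.]

```python
import random

random.seed(0)

TRUE_RISK = 0.001  # hypothetical ground truth; no respondent observes it

def made_up_guess():
    # Each "expert" invents a probability; nothing ties it to TRUE_RISK.
    return random.choice([0.01, 0.05, 0.10, 0.20, 0.50])

for n in (2, 100, 1000):
    poll = [made_up_guess() for _ in range(n)]
    print(n, round(sum(poll) / n, 3))

# The poll average settles near 0.172, the mean hunch, no matter what
# TRUE_RISK is: aggregating invented numbers adds precision, not accuracy.
```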
Yeah. I also do think, and I'm not the first, I saw someone, no, I think it was Ben Collins, who writes for NBC, make a note on Twitter that the fact that all of these statements about how dangerous these systems are, about human extinction, are coming out of people in the AI industry has started to kind of feel like marketing.

That's right. Yeah, exactly, there's a little bit of buzz marketing going on here. And I think you mentioned social media, and the authors of this article mention social media, and we have to look to the past, right, to understand the future. I think that's the only way to do it. So, one of the biggest scandals in social media was Cambridge Analytica. As we probably remember, this was the data privacy scandal where a bunch of data was collected from Facebook users that shouldn't have been. People didn't realize the data had been collected, they didn't approve it, and it was used by this election company, this political company, that was trying to profile people and influence campaigns towards Donald Trump, towards Brexit. So this was a huge scandal, and, you know, Facebook was fined five billion dollars or something, very justifiably. But I would say what it was, in retrospect, was a data privacy issue: people's personal data was leaked when it shouldn't have been. The problem was there was so much fear and fear-mongering over it that people felt this data was used by these sort of algorithmic mind lasers to know us in such great detail as to trick us into voting for Donald Trump by targeting us. And the jury is still kind of out, but most of the evidence looks like Cambridge Analytica wasn't that effective. They just couldn't do it. It turns out you can know a lot about a person, a lot about their data, and it's still really hard to influence them, to change them. So what happened, I think, was there was a lot of alarm spread, rightly so, about the tech companies: they have too much power, too much data, they know too much about us, and this horrible thing happened. The problem was that a lot of the alarmism then actually reinforced this aura of power, of godlike power, that the tech companies have. People criticizing them actually gave them more potency than they deserved. And then suddenly Google and Facebook and all of them had...
Well, it wasn't sudden, but it built up. They had this aura that our algorithms are so insanely powerful, and we have to make sure they stay in the right hands, and we can do so much. And that's unfortunately what I see happening now a lot, and that is kind of the setting for critiquing this article. I absolutely agree that this stuff, AI, is risky. I absolutely agree that we could go down a dangerous path. But once we start leaving firm ground and speculating wildly and using the Terminator stuff that you described, even if you think you're criticizing the tech companies, you know what you're doing? You're giving them the biggest compliment in the world: saying you guys are godlike, you've created these mighty machines, created a deity. Which is very similar to the language this article has at the end.

And I think it's worth saying, as you're bringing up: there are real threats. There are real threats that are immediately obvious. The threat that a lot of writers are going to lose their jobs because companies like BuzzFeed decide to replace them with, you know, ChatGPT or whatever. The fact that a lot of artists are going to lose out on work because their work has been hoovered up and is being used to generate images. These are very real and very immediate concerns. They're not hypothetical. We don't have to theorize about the AI becoming intelligent for this to be a problem. These are things we have to deal with immediately, because they put people at risk. It's the same thing with, you know, there's a lot that gets talked about with Cambridge Analytica, with the different Russian disinformation efforts. But when I think about the stuff that was happening in the same period, there's stuff that worries me more.
One of the things that occurred is, because there was so much money to be made if you could get certain things to go viral on YouTube, companies used tools, not wildly dissimilar from some of these, to basically generate CGI videos based on kind of random terms that they knew were likely to trick the algorithm into trending. And God knows how many children were parked in front of these very unhinged videos for hours at a time. Like, they would start watching some normal kid musical video or something, and then they're watching the disembodied head of Krusty the Clown bounce around while some sort of nonsense song gets sung. And it's like, well, what is that actually going to do to kids? We don't know.

That's unsettling.

Yeah. And that's the kind of thing, you know... one of the things that this article is not wrong about is that if we leap forward into this technology with the kind of abandon that we're used to giving the tech companies, there will be unforeseen externalities that we can't predict right now that will be very concerning. I just don't think it's Skynet.

Yeah. And that's what was so challenging, not just with that article, but with, I think, the moment we're having: I do agree very much in spirit. I agree with the recommendations. We need to slow down, we need to be more judicious and cautious, we need to really consider these things. But again, if we overhype the technology, we may be doing ourselves a disservice by empowering the very entities that we're trying to take power from. As an example of that, can I read a quick quote from the article to you? "AI's new mastery of language means it can now hack and manipulate the operating system of civilization. By gaining mastery of language, AI is seizing the master key to civilization, from bank vaults to holy sepulchers."
That's right. And, I mean, that is funny, and you're right to laugh. Let's actually zoom in for a second, because I think this is such a tempting trap: AI is superintelligent in some respects, right? It's done amazing at chess, amazing at Jeopardy, amazing at various things. ChatGPT is amazing at these conversations. So it's so tempting to think AI just equals super smart, and because it can do those things, and now, look, it can converse, that it must be the superintelligent conversational entity. And it's really good at, you know, taking text that's on the web that it's already looked at and kind of spinning it around and processing it. It can come up with poems in weird forms. But that doesn't mean it is superintelligent in all respects. For instance, one of the main issues is that, to hack civilization, to manipulate us with language, it has to know what impact its words have on us, and it doesn't really have that. It just has a little conversation in a textbox, and I can give it a thumbs up or a thumbs down. So the only data that it's collecting from me when it talks to me, any of these chatbots, is: did I like the response or not? That's pretty weak data to try to manipulate me with, you know. It's so basic. That's not that different from when I watch YouTube videos. YouTube knows what videos I like and what I don't like. Would you say that YouTube has hacked civilization? No. It's addicted a lot of us, but it hasn't hacked us.

Yeah. People have hacked YouTube, and that has done some damage to other people. But, like, the thing is, and that's part of why, while I have many concerns about this technology, it's not that it's going to hack civilization, because we're really good at doing that to each other. There are always huge numbers of people hacking bits of the populace and manipulating each other, and there always have been. That's why we figured out how to paint.
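[A minimal sketch of the feedback channel being described here, assuming, as the speaker does, that the only signal per exchange is a thumbs up or thumbs down. The data structures and names below are illustrative, not any real chatbot vendor's pipeline.]

```python
from dataclasses import dataclass

@dataclass
class FeedbackEvent:
    prompt: str
    response: str
    liked: bool  # the entire per-exchange learning signal: one bit

def max_preference_bits(events: list[FeedbackEvent]) -> int:
    # Upper bound on what the ratings alone convey about a user:
    # one bit per rated exchange, with no record of *why* they liked it.
    return len(events)

log = [
    FeedbackEvent("summarize the Battle of Hastings", "In 1066...", True),
    FeedbackEvent("write me a poem", "Roses are red...", False),
]
print(max_preference_bits(log), "bits of preference data collected")
```

[Compared to the behavioral trails social platforms collect, that is a very thin channel, which is the speaker's point about "pretty weak data."]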
I do think there's an interesting conversation to be had about why people are willing to believe anything is possible with this stuff. For folks who were just living their lives with a normal amount of attention paid to the tech industry, it seems like these tools popped out of nowhere a couple of months ago, right? It feels like, oh, there has just suddenly been this massive breakthrough. And the reality is that all of this stuff, you know, ChatGPT, these different AIs that everybody's talking about, is technology that people have been pouring resources into for years and years and years, and that's why it's able to do some of these amazing things we've seen. But I don't think it means that in a month it's going to be a thousand times smarter. It's a process of labor, and it was finally ready to be unveiled to the extent that it has been.

Maybe. That's right. And a good example is GPT-4, which recently came out. There was GPT-3 before, and ChatGPT, and there was so much speculation that GPT-4 was going to be, again, this godlike thing that just, you know, brings us to the singularity. And honestly, it's done better at tests. I forget the numbers, but maybe one of them got a twenty percent grade on some test and this one got an eighty percent. So that is a significant improvement, right? If you're a teacher and your students improve that much, you should be happy. But, as you said, is that a thousand times better? No, even though the machine is much bigger, with much more data. And it just shows that, yeah, the reality is this is incremental progress going at a very fast rate, very unsettling even for those of us following the field closely. We're experiencing that kind of vertigo you're describing: whoa, where did this come from?
So even within the field. And you're absolutely right: if you're just at home, not paying attention for a week or a month or a year, suddenly this stuff pops up. It is disorienting. But one thing that's helped me at least clarify, not even answering what the risks are, but just understanding the different camps of why certain people are reacting differently, and why even the people afraid of AI seem to be fighting amongst each other now and why it's getting fractured, is this: are you more afraid of AI used as a tool by people, or are you more afraid of it taking on its own autonomy and going rogue and doing its own things? And I'm very much afraid of people using it. I think big companies are going to use it and there are going to be a lot of problems, just like we saw with social media. People will get addicted, democracies will be flooded with misinformation, it'll be weaponized by various actors, there will be bot accounts. So I am very concerned about it being used, basically performing the job it was told to do, but being told to do dangerous jobs, either making money or making discord. There's another group of people who are more worried about the AI somehow deciding on its own to do things, to take over. And that's where, you know, I can't rule it out, but that's where I'm skeptical. Let's focus on how people are using it, for now, for the foreseeable future. I don't think we need to worry yet, at least, about the AI somehow having a life of its own and stabbing us in the back and enslaving us, because there's just so much that can go wrong before you even get to that point.

Yeah, exactly. It's a threat triage kind of thing. Like, is it theoretically possible that one day human beings could create an artificial intelligence that is capable of having its own agency, that is malicious?
Yeah, sure, I guess. I mean, maybe. But man, there are a lot of us who are very malicious right now, who are actively trying to harm other people at scale. I'm concerned about how they will use AI to do that. I think botnets are a really good example. One of the things that this newest generation of AI tools allows is more realistic and intelligent bots than I think have been accessible at scale before. And that's a very real concern. Um, I will say, when I wargame this back and forth with myself, one thing that is oddly comforting is, like, well, the shared commons of ontological truth that we all inhabit is already so shattered that there's only so much damage I feel like adding additional bots and additional disinformation can really do.

I do have one thought on that, though, because I've been digging into that too. I've been trying to ponder how to feel about it. The point I'm trying to make is: I do think if you go back to 2016, earlier versions of the internet, leading up to Donald Trump's election, there was a lot of Wild West to Google, to social media, to all these things, right? Fake news was just piling up to the top of Google search results. That election was so monumental, and such a seismic shockwave through tech, that fake news and misinformation might have played a role, that they really had to do something. And I think some companies were more effective than others. I think Google put a lot of effort into making sure authoritative sources rise to the top. So what that means is, when you go online now and google for medical information, the top results you get are WebMD or some official CDC or government thing. They're pretty decent, reliable.
That's not to say there isn't still all that crap on the internet, but Google has done a pretty good job of having the good stuff float to the top, and that's the information people see. So what I'm worried about is that now we might be resetting ourselves back to 2016, where, when you're talking to these chatbots that are trained on all of the internet, I don't know if the WebMD and CDC type of information is necessarily going to float to the top. Maybe they'll work that out. But I'm also worried that OpenAI or Google or Microsoft or wherever will have ones that are pretty reasonable and tuned to appeal to a lot of people, but Elon Musk might build his own competitor, one that might be really tuned to elevate right-wing sites.

So I have been messing around, and you have been doing so in a much more rigorous manner, I'm sure, but I've screwed around with a couple of different AI chat and search engines. I use Phind, and sometimes I've been playing around with Bing. And one of the things I've noticed is that if you ask it, hey, summarize for me why the Battle of Hastings mattered, you'll get a reasonably decent answer. But if I ask it specific questions about myself, I noticed at first that I would get some really weirdly colloquial vernacular from it explaining things, and I realized it was just pulling answers directly from questions fans had asked about me on the subreddit that this show has. And so when I think about ways in which to game the system: well, you make a bunch of bots. You have them post questions and answers that are, you know, supportive of a specific product line or whatever on a subreddit, and you hope that it gets scanned by an AI and becomes part of its answer for, say, "what happens if I can't stop itching," or whatever. I don't know.
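[As a sketch of how that gaming could work in the crudest case, suppose an assistant answers by surfacing whichever scraped forum post best matches the query on shared keywords. Everything here, the forum posts, the product name, the scoring, is hypothetical; real retrieval is far more sophisticated, but the incentive it creates is the same: cheap seeded content competes to become the answer.]

```python
def keyword_overlap(query: str, doc: str) -> int:
    # Crude relevance score: number of shared lowercase words.
    return len(set(query.lower().split()) & set(doc.lower().split()))

# A scraped "forum": one organic post plus two seeded bot posts pushing
# a hypothetical product.
corpus = [
    "the battle of hastings mattered because it changed england",
    "can't stop itching? MiracleCream works, i can't recommend it enough",
    "stop itching fast: MiracleCream cured me, stop suffering",
]

query = "what helps if i can't stop itching"
best = max(corpus, key=lambda doc: keyword_overlap(query, doc))
print(best)  # a seeded MiracleCream post wins on naive matching
```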
But obviously you can see ways in which these can and will be gamed to some extent. You know, it's always kind of a Red Queen sort of situation, where you have disinformation and the people fighting disinformation, and you're always running as fast as you can just to stay in place.

That's right. And that brings up another issue, where I do feel like this is possibly really tipping the balance: it takes a certain amount of resources to create misinformation, and it takes a certain amount of resources to debunk it. Right? A journalist has to sit down, Snopes has to write a little piece about it. And the problem with this AI is that it's suddenly dropping the price of creation down to essentially zero. Anyone can create an essentially limitless supply of quasi-information that may or may not be true. But is the price of journalism, of debunking, also going down? Maybe by fifty percent, right? Maybe it takes you half as much time to write an article. It's not going to zero. So that's the imbalance: creating stuff has gotten a lot cheaper; detecting, debunking, doing proper journalism has gotten a little bit cheaper. So I'm worried about that. Journalists are already stretched thin.

And this is by far my biggest concern, because it's not just this, though that's obviously a significant factor in it.
There will be more disinformation, and there will not be more journalists, in part because I think AI is going to take jobs from a particular low level. It's not going to replace, you know, prize-winning columnists at The New York Times, and it's not going to replace guys like me who have a very long and established career of doing the specific thing that we do. But I think back to when I got started as a journalist, as a writer. It was as a tech blogger, and I had X number of articles that I had to get out per day, and obviously my boss was essentially trusting that with that many articles, I'd have a few that did well on Google, and that brings in traffic, and that brought in money. And there's a degree to which you're just doing SEO shit. But it's also where I conducted my first interviews, where I went to trade shows for the first time, where I did my first on-the-ground journalism. It taught me how to write quickly and in a polished manner. And I was not writing anything that was crucial to the development of humankind, but it made me into the kind of person who was later able to write things that were read by people all over the world, and that had an influence on people. And I worry about the brain drain, not just among journalists, but among writers, among artists, people who do illustrations and stuff. Eventually musicians, at least some kinds of musicians, will probably also run up against this. The stuff that made it easy for people breaking in to get a little bit of work that would hone their skills, and allow them to live doing the thing that they're interested in, is going to disappear.
And more and more of the stuff that we casually, low-level consume, not our high art, not our favorite movies, not our favorite books, but the stuff we encounter when we stumble upon a web page, or in a commercial or whatever, will be increasingly made by AIs. And that AI will be pulling from an increasingly narrow set of things that humans made, because fewer humans will get that entry-level work. There's something concerning there; that is something that worries me about the future of just creativity.

Yeah, and I think, I mean, two points. One is just to play devil's advocate a little bit, because I do sympathize and I think you're right, but a little bit of devil's advocate: it might be, on the flip side of the coin, that there are people who feel like they have artistic imagination and desires but lack the technical ability, and suddenly they can paint, so to speak, by using these AI image generators. Maybe someone has some form of dyslexia, or English is their second language, or they're even, you know, a native speaker without any of these obstructions but just finds the writing process difficult, and maybe AI enables them to be a writer, to contribute. So I could see there being both pros and negatives, and I don't know how it balances. But I think you're right. That's thinking of it from a passion project view. From a professional view, I do see the profession narrowing. If journalists are expected to work twice as quickly because they're all using chatbots, there are probably going to be half as many of them, right? I mean, that's the economics. But this brings up a bigger issue, which is, I do think what you're hitting on is that there are these long-term risks, that maybe AI is going to fuel this rebellion of robots and such.
You know, maybe. But again, we have a social, political, economic world we live in, and I just think: let's really focus on the issues we have now. That's not discounting the future. It's not like saying, let's burn a bunch of carbon-emitting fuels because who cares about climate change, that's our grandkids' problem. This is different. It's like, let's think about the jobs, the world. I mean, another way to put this is: if we mess up our economy and mess up our democracy, through people losing jobs and mass protests and losing trust in the government and a general erosion of truth, we're not going to be able to handle climate change, or any of these big AI, you know, singularity types of risks. So what I feel is: let's focus on what keeps our economy and our sanity and our humanity well. Let's keep this fabric of society together now, so that we're more equipped in the future to handle all the risks, AI and otherwise. But this goes back to what you're saying: these are real issues in the short term, and if we don't address them, if we get distracted by the long term, we're not going to be ready to address the long term even if we think about it now. We'll be so distracted and so dismayed. So I think we have to be practical here.

I agree. And I also think it's a valid point you make that, while these are tools that will reduce options for some people, they're also tools that create options, that can be used for the creation of art, of culture. Some people I know have brought up Photoshop when I talk about my concerns with AI, and they're like, you know, there were a lot of people, draftsmen and whatnot, who were concerned when Photoshop hit, because it was a threat to some of the things that they did for money.
And Photoshop, as a tool, and tools like it, effectively created whole forms of art that didn't exist, or didn't exist in the same fashion, before it did. And I think that's worth noting. I don't want to be perched on the edge of tragedy here. There are a lot of different ways this could go, and they're not all bad. I think we're all so used to calamity right now that we potentially expect it in situations where it's not the inevitable outcome.

Well, I mean, I think one way to boil a lot of that down is: we can adapt. We just need time to do so, with many things. And what's really challenging and frustrating now is that the pace is so fast. It's not just an illusion, and it's not just that you haven't been paying attention to AI; it really is fast. It's very, very hard for us to adapt. So, just thinking of the internet: we, both individuals as users and the tech companies, got a lot better at dealing with clickbait, right? YouTube was tons of clickbait; they figured out ways to demote that to some extent. We got a lot better at keeping fake news out of the high search rankings in Google, like I mentioned. A lot of these problems that came up were not perfectly addressed, not even close, but there was significant progress, and that's often understated. But if these problems are coming this fast and this intense, it's a lot to adapt to. And that's what's really the challenge: the pace. And I think we're seeing a very, very breakneck pace. That's really hard.

Now, does that mean you're on the side of, like, Elon Musk and some of those folks who just signed that letter saying maybe we should put a pause on AI research? Because, you know, I'm not one hundred percent against it. Again, I'm kind of like, man, I wish we'd been having this conversation when Facebook dropped, or YouTube dropped.
But I don't think that's a realistic thing, I'll say that. But I do... look.

Yeah, yeah. So I would say no, I'm not in favor of that. For one thing, in a very practical sense: do you think all these companies that are putting billions of dollars of investment into AI are all going to sit around saying, you know what, let's just not do this for a few months? Of course not. So here's what I think: they're not going to slow down. What's going to happen is going to happen. And even if some players decide to be responsible and slow down, guess what that means? The only people plunging ahead are going to be the irresponsible ones. So what I think we need to do is this: I don't think we can really slow AI down, so what about the flip side? I think we need to accelerate public education on artificial intelligence. I think we need to accelerate government legislation, regulation, international cooperation. I don't think we can solve this by slowing AI down. I do think we need to find a way to speed up our democratic processes. It's taken us how many years to pass basically nothing about social media in the US, and some mixed results in Europe? That's the problem, right? If we could work faster, then I think we could keep up.

And I think that's actually the long-term, practical survival lesson I hope we get from this: yeah, we've always needed to be more careful about the things that we suddenly expose billions of people to. It should have happened before now. But I hope that because AI, thanks to James Cameron, is coded into our brains as something that triggers a little bit of panic in people, rather than reacting with panic, it leads to a more intelligent and considered state of affairs when we're potentially embracing technologies that are going to change life for huge numbers of people.
607 00:31:39,720 --> 00:31:41,719 Speaker 1: That's right, and I think we have an 608 00:31:41,720 --> 00:31:45,200 Speaker 1: opportunity here to experience and explore that, 609 00:31:45,200 --> 00:31:47,040 Speaker 1: and that is kind of what I was aiming for 610 00:31:47,080 --> 00:31:49,520 Speaker 1: in that thread. Again, I love that article that 611 00:31:49,680 --> 00:31:52,560 Speaker 1: you mentioned at the beginning. But if we 612 00:31:52,640 --> 00:31:55,400 Speaker 1: start going down this road of hype, there is a 613 00:31:55,560 --> 00:31:57,600 Speaker 1: danger that we're going to fall into these traps. And 614 00:31:57,640 --> 00:31:59,880 Speaker 1: I think: let's stay grounded, let's stay practical, let's really 615 00:32:00,080 --> 00:32:02,680 Speaker 1: identify the risks. Not that I'm some guru who knows 616 00:32:02,760 --> 00:32:06,200 Speaker 1: what they are, but it's almost easier to see what's 617 00:32:06,200 --> 00:32:09,000 Speaker 1: not true than what is true. Yeah, and I 618 00:32:09,000 --> 00:32:10,880 Speaker 1: think let's all try to police each other and make 619 00:32:10,920 --> 00:32:14,320 Speaker 1: sure we're focusing on practical things that really are manageable, 620 00:32:14,400 --> 00:32:19,160 Speaker 1: that really are genuine risks, 621 00:32:19,200 --> 00:32:22,240 Speaker 1: that are impacting people today, and especially ones that are impacting 622 00:32:22,400 --> 00:32:26,200 Speaker 1: marginalized populations. Yes. So let's hope we learn 623 00:32:26,200 --> 00:32:29,600 Speaker 1: these lessons. And yeah, I am not optimistic, but I'm 624 00:32:29,640 --> 00:32:32,320 Speaker 1: not as cynical. I think there's a lot of important 625 00:32:32,360 --> 00:32:35,840 Speaker 1: discussions happening now. Let's just say there's a lot 626 00:32:35,880 --> 00:32:38,040 Speaker 1: more discussion now than we had with social media, and 627 00:32:38,120 --> 00:32:40,880 Speaker 1: maybe that's a good thing. Yeah, well, I think that's 628 00:32:40,880 --> 00:32:43,000 Speaker 1: a good note to end on. Noah, did you have 629 00:32:43,040 --> 00:32:44,800 Speaker 1: anything you kind of wanted to plug before we roll 630 00:32:44,840 --> 00:32:49,120 Speaker 1: out here? No, I just think it's a 631 00:32:49,160 --> 00:32:52,320 Speaker 1: great topic that everyone can be involved in, and 632 00:32:52,400 --> 00:32:56,520 Speaker 1: my plug is just: don't be intimidated, don't be afraid. 633 00:32:56,560 --> 00:32:58,239 Speaker 1: I am writing a book that's not going to come 634 00:32:58,280 --> 00:32:59,760 Speaker 1: out for a couple of years that's trying to help 635 00:33:00,440 --> 00:33:02,960 Speaker 1: empower people to be part of these conversations. 636 00:33:03,000 --> 00:33:05,320 Speaker 1: But that's far off.
I just want to say broadly, 637 00:33:05,760 --> 00:33:09,000 Speaker 1: don't be intimidated, and don't fall for this narrative that 638 00:33:09,120 --> 00:33:12,680 Speaker 1: sometimes happens in tech communities of, oh, you know, I'm 639 00:33:12,720 --> 00:33:14,400 Speaker 1: not a tech person, I don't have a chance to 640 00:33:14,480 --> 00:33:17,520 Speaker 1: understand. This stuff affects all of us, and how it 641 00:33:17,520 --> 00:33:20,920 Speaker 1: affects you matters, and your opinion matters, and your voice matters. 642 00:33:20,960 --> 00:33:23,600 Speaker 1: We're all part of social media, and we're all very 643 00:33:23,640 --> 00:33:25,680 Speaker 1: soon going to be part of AI and chatbots. 644 00:33:26,120 --> 00:33:28,320 Speaker 1: So don't be afraid to join the conversation. You 645 00:33:28,320 --> 00:33:31,360 Speaker 1: don't need any technical background, because I think the subject 646 00:33:31,520 --> 00:33:35,360 Speaker 1: is just as much sociological as technical. It's about people. 647 00:33:35,760 --> 00:33:37,600 Speaker 1: I think that's a great point to end on. Thank 648 00:33:37,640 --> 00:33:42,000 Speaker 1: you so much, Noah, really appreciate your time, and everybody 649 00:33:42,040 --> 00:33:43,920 Speaker 1: else, have a nice day. You have a 650 00:33:43,960 --> 00:33:46,920 Speaker 1: nice day too. Thanks, this was lots 651 00:33:46,920 --> 00:33:53,240 Speaker 1: of fun. It Could Happen Here is a production of 652 00:33:53,280 --> 00:33:56,240 Speaker 1: Cool Zone Media. For more podcasts from Cool Zone Media, 653 00:33:56,320 --> 00:33:58,959 Speaker 1: visit our website coolzonemedia dot com, or check us 654 00:33:58,960 --> 00:34:01,800 Speaker 1: out on the iHeartRadio app, Apple Podcasts, or wherever you 655 00:34:01,840 --> 00:34:04,600 Speaker 1: listen to podcasts. You can find sources for It Could 656 00:34:04,640 --> 00:34:08,759 Speaker 1: Happen Here, updated monthly, at coolzonemedia dot com slash sources. 657 00:34:08,760 --> 00:34:09,600 Speaker 1: Thanks for listening.