1 00:00:09,664 --> 00:00:13,104 Speaker 1: Mission Implausible is now something you can watch. Just go 2 00:00:13,144 --> 00:00:16,823 Speaker 1: to YouTube and search Mission Implausible podcasts, or click on 3 00:00:16,864 --> 00:00:20,864 Speaker 1: the link to our channel. In our show notes, I'm 4 00:00:20,904 --> 00:00:21,984 Speaker 1: John Cipher. 5 00:00:21,664 --> 00:00:22,784 Speaker 2: And I'm Jerry O'Shea. 6 00:00:23,264 --> 00:00:26,544 Speaker 3: We have over sixty years of experience as clandestine officers 7 00:00:26,624 --> 00:00:29,544 Speaker 3: in the CIA, serving in high risk areas all around 8 00:00:29,544 --> 00:00:30,264 Speaker 3: the world, and. 9 00:00:30,304 --> 00:00:34,504 Speaker 4: Part of our job was creating conspiracies to deceive our adversaries. 10 00:00:34,784 --> 00:00:37,504 Speaker 3: Now we're going to use that experience to investigate the 11 00:00:37,504 --> 00:00:40,424 Speaker 3: conspiracy theories everyone's talking about as well as some of 12 00:00:40,464 --> 00:00:41,224 Speaker 3: you may not have heard. 13 00:00:41,344 --> 00:00:43,584 Speaker 2: Could they be true or are we being manipulated? 14 00:00:43,664 --> 00:00:46,223 Speaker 3: We'll find out now on Mission Implausible. 15 00:00:50,384 --> 00:00:52,104 Speaker 2: Welcome back, Adam, John, and Jerry. 16 00:00:52,144 --> 00:00:54,584 Speaker 1: I haven't seen any of you for a long time, 17 00:00:54,824 --> 00:00:58,224 Speaker 1: but we're now entering a new season and there's been 18 00:00:58,224 --> 00:00:59,344 Speaker 1: a lot going on since. 19 00:00:59,144 --> 00:01:01,184 Speaker 2: The last time we all did an episode of a 20 00:01:01,184 --> 00:01:03,024 Speaker 2: month and a half ago. I hate to say I 21 00:01:03,104 --> 00:01:05,904 Speaker 2: missed a dam Yeah, I have missed you guys. I mean, 22 00:01:05,944 --> 00:01:08,264 Speaker 2: I guess I can just say I We normally make 23 00:01:08,304 --> 00:01:10,464 Speaker 2: fun of ours each other, but this is kind of 24 00:01:10,464 --> 00:01:12,184 Speaker 2: obnoxious because it'll make it hard for you to make 25 00:01:12,184 --> 00:01:14,264 Speaker 2: fun of me. We were dealing with that big health 26 00:01:14,264 --> 00:01:17,624 Speaker 2: issue in my family for much of this year. But 27 00:01:17,703 --> 00:01:19,984 Speaker 2: we're on the other side of it, and my wife 28 00:01:20,824 --> 00:01:22,344 Speaker 2: had to have a lot of surgery and stuff. But 29 00:01:22,384 --> 00:01:26,744 Speaker 2: she's now like really recovered and we're feeling wonderfully. 30 00:01:26,304 --> 00:01:28,223 Speaker 3: And we've had her on the show, she's interviewed us. 31 00:01:28,503 --> 00:01:31,863 Speaker 3: We're so glad she's doing better. But did you make 32 00:01:31,904 --> 00:01:33,624 Speaker 3: more money this year while you were dealing with a 33 00:01:33,664 --> 00:01:35,344 Speaker 3: health crisis than you had him in the past. 34 00:01:35,584 --> 00:01:36,143 Speaker 2: It seems like. 35 00:01:36,104 --> 00:01:38,544 Speaker 3: You're all over LinkedIn and you're doing this AI and 36 00:01:38,584 --> 00:01:39,904 Speaker 3: you're speaking to groups. 37 00:01:39,584 --> 00:01:42,584 Speaker 2: You're always jumping on the new fan at him. I am, 38 00:01:42,904 --> 00:01:45,264 Speaker 2: so I don't think I fully understood what I was 39 00:01:45,304 --> 00:01:48,464 Speaker 2: getting into. So after I did the Freakonomics series about AI, 40 00:01:48,944 --> 00:01:52,064 Speaker 2: I became pretty close with Ethan Mollik, who's become this 41 00:01:52,184 --> 00:01:56,064 Speaker 2: like great AI mind, and we decided to start an 42 00:01:56,104 --> 00:01:59,184 Speaker 2: AI consulting business with a couple other people, his wife, Leelach, 43 00:01:59,584 --> 00:02:02,664 Speaker 2: the wonderful Jessica Johnston. So yeah, we worked with a 44 00:02:02,664 --> 00:02:07,344 Speaker 2: bunch of like big companies on AI. This might shock you. 45 00:02:07,344 --> 00:02:11,344 Speaker 2: I'm not the world's most technical expert AI. It actually 46 00:02:11,424 --> 00:02:14,224 Speaker 2: it's interesting. I don't think I realized this until fairly recently. 47 00:02:14,224 --> 00:02:16,544 Speaker 2: But Planet Money, which I created back in two thousand 48 00:02:16,544 --> 00:02:20,744 Speaker 2: and eight with Alex Bloomberg, was the first major American 49 00:02:20,784 --> 00:02:22,824 Speaker 2: media company podcast. As far as we can tell, we 50 00:02:22,824 --> 00:02:24,744 Speaker 2: haven't been able to find another one. And I was 51 00:02:24,784 --> 00:02:29,024 Speaker 2: really on the front lines of digital disruption, like I 52 00:02:29,064 --> 00:02:31,704 Speaker 2: was at NPR deal like trying to figure out how 53 00:02:31,704 --> 00:02:34,824 Speaker 2: podcasting would change things for several years. Then I went 54 00:02:34,824 --> 00:02:37,984 Speaker 2: to the New York Times, where I was mostly a reporter, 55 00:02:38,024 --> 00:02:40,984 Speaker 2: but I would report up to the senior people and 56 00:02:41,144 --> 00:02:44,384 Speaker 2: talk to them about digital disruption and media. And then 57 00:02:44,424 --> 00:02:45,984 Speaker 2: I was at the New Yorker did the same. Then 58 00:02:45,984 --> 00:02:50,344 Speaker 2: I ran this podcast company with Sony, and throughout that 59 00:02:50,424 --> 00:02:53,784 Speaker 2: I became really fascinated by how like technological change impacts 60 00:02:53,824 --> 00:02:56,784 Speaker 2: society and impacts companies. I mean, you can talk about 61 00:02:57,104 --> 00:03:01,024 Speaker 2: whatever railroads and electricity and the telephone and the internet 62 00:03:01,104 --> 00:03:04,824 Speaker 2: and mobile and a million other innovations, and there's all 63 00:03:04,824 --> 00:03:08,744 Speaker 2: these fascinating things that happen when a new technology changes work. 64 00:03:09,544 --> 00:03:12,104 Speaker 2: You know, a lot of these big companies are like 65 00:03:12,184 --> 00:03:14,064 Speaker 2: all right, Like a year ago they were like, is 66 00:03:14,104 --> 00:03:15,704 Speaker 2: this really that big a deal? Now I think most 67 00:03:15,744 --> 00:03:17,264 Speaker 2: are like, Okay, this is big deal. But what does 68 00:03:17,304 --> 00:03:19,104 Speaker 2: it mean? How do we deal with it? What do 69 00:03:19,144 --> 00:03:21,704 Speaker 2: we do? And so it's been this fascinating front row 70 00:03:21,824 --> 00:03:25,464 Speaker 2: seat into it. It's also a fascinating front row seat into, 71 00:03:25,864 --> 00:03:29,704 Speaker 2: among many other things, probably the world's greatest it's weirdly 72 00:03:29,824 --> 00:03:33,144 Speaker 2: like probably the best tool ever for conspiracy theorists, but 73 00:03:33,224 --> 00:03:36,944 Speaker 2: it's also may arguably the best tool ever for fighting 74 00:03:36,984 --> 00:03:41,224 Speaker 2: conspiracy theories. Can AI make John smarter or sound smarter? 75 00:03:41,584 --> 00:03:47,064 Speaker 2: The stats show that lower performers benefit more than higher 76 00:03:47,104 --> 00:03:52,304 Speaker 2: performers woo, But the higher performers still stay ahead because 77 00:03:52,744 --> 00:03:56,224 Speaker 2: there seems to be like an intelligence AI premium, So 78 00:03:56,384 --> 00:03:58,704 Speaker 2: it's not gonna be as obvious how amazing I am 79 00:03:58,824 --> 00:03:59,824 Speaker 2: compared to you. 80 00:03:59,984 --> 00:04:03,584 Speaker 3: Heads Well, Jay, and I spoke to a NYU professor 81 00:04:03,744 --> 00:04:05,544 Speaker 3: just the other day, Thomson Shaw, and we'll have an 82 00:04:05,544 --> 00:04:08,744 Speaker 3: episode with her, yeah, coming up, and she talks a 83 00:04:08,744 --> 00:04:11,464 Speaker 3: lot about how AI and some of the tech really 84 00:04:11,504 --> 00:04:14,344 Speaker 3: does help conspiracy theories because you can essentially just put 85 00:04:14,504 --> 00:04:17,943 Speaker 3: a series of events together, list them, and then just 86 00:04:17,984 --> 00:04:21,504 Speaker 3: click and it'll create these connections. And then you think 87 00:04:21,544 --> 00:04:24,784 Speaker 3: you've done this genius research and found this incredible connection. 88 00:04:25,264 --> 00:04:27,304 Speaker 3: And then of course that gets into the ecosystem and 89 00:04:27,344 --> 00:04:28,024 Speaker 3: gets spun in. 90 00:04:28,383 --> 00:04:32,024 Speaker 2: It's really good at being persuasive, and it's very good. 91 00:04:32,063 --> 00:04:35,944 Speaker 2: And it doesn't care, right, It just it just wants. 92 00:04:36,224 --> 00:04:39,303 Speaker 2: It's like a talking dog or something. It just wants. 93 00:04:39,303 --> 00:04:40,704 Speaker 2: What do you want? What do you want? Give you 94 00:04:40,784 --> 00:04:43,263 Speaker 2: what you want? Although there are some interesting studies that 95 00:04:43,303 --> 00:04:45,863 Speaker 2: it's also the best tool to get someone out of 96 00:04:45,863 --> 00:04:46,984 Speaker 2: a conspiracy theory. 97 00:04:47,144 --> 00:04:48,904 Speaker 3: But do you have to put in the right things? 98 00:04:48,984 --> 00:04:50,304 Speaker 2: Yeah, you have to put in the right thing. 99 00:04:50,424 --> 00:04:54,224 Speaker 4: But going from conspiracy theory to conspiracy genuinely, there's this 100 00:04:54,264 --> 00:04:58,264 Speaker 4: word that like almost no one can spell, is like algorithm, right, 101 00:04:58,344 --> 00:04:59,463 Speaker 4: so nothing comes from nothing. 102 00:04:59,503 --> 00:05:02,383 Speaker 2: No, most people can spell it, just not you LG. 103 00:05:03,104 --> 00:05:08,984 Speaker 2: There're two whys. That's what's confusing. But if you own if. 104 00:05:08,904 --> 00:05:11,623 Speaker 4: You own the l algorithm, or if you design the algorithf, 105 00:05:11,664 --> 00:05:14,904 Speaker 4: you can influence the algorithm. You can influence millions of people. 106 00:05:15,424 --> 00:05:19,063 Speaker 4: And arguably the algorithms that we all deal with every 107 00:05:19,144 --> 00:05:24,184 Speaker 4: day that are persuasive actually belong or engineered by, or 108 00:05:24,424 --> 00:05:30,623 Speaker 4: are really influenced by a small cabal of wealthy individuals. 109 00:05:30,743 --> 00:05:34,464 Speaker 4: There really is like ten people in the United States 110 00:05:34,863 --> 00:05:38,984 Speaker 4: who are responsible for the algorithms that generate AI and 111 00:05:39,063 --> 00:05:40,704 Speaker 4: what we what's put in front. 112 00:05:40,464 --> 00:05:41,024 Speaker 2: Of us to read. 113 00:05:41,104 --> 00:05:43,024 Speaker 3: And luckily they're excellent people for this. 114 00:05:43,104 --> 00:05:45,623 Speaker 1: I recommend a book I've been reading called Careless People 115 00:05:45,784 --> 00:05:48,903 Speaker 1: about Facebook doing exactly what you're saying, not through AI, 116 00:05:49,183 --> 00:05:54,183 Speaker 1: but through very intentional engineered programs. It bears out in 117 00:05:54,303 --> 00:05:57,664 Speaker 1: great detail, not just the ten people that are creating 118 00:05:57,664 --> 00:06:00,383 Speaker 1: an algorithm to change the course of events, but the 119 00:06:00,424 --> 00:06:00,984 Speaker 1: one person. 120 00:06:01,063 --> 00:06:03,743 Speaker 2: Yeah, but can I be pedantic for a moment or 121 00:06:03,863 --> 00:06:07,344 Speaker 2: for the rest of our relationship? That's the default Because 122 00:06:07,584 --> 00:06:11,063 Speaker 2: this is not saying you're wrong, but it's so. The 123 00:06:11,104 --> 00:06:15,583 Speaker 2: way AI fundamentally works is you just put more and 124 00:06:15,623 --> 00:06:19,104 Speaker 2: more training data in. It does all this linear algebra, 125 00:06:19,664 --> 00:06:23,344 Speaker 2: and you just push it against more and more of 126 00:06:23,383 --> 00:06:26,504 Speaker 2: these chips that they're building as fast as they humanly can. 127 00:06:27,104 --> 00:06:30,623 Speaker 2: And so it isn't algorithmic in this Obviously, it's like 128 00:06:30,664 --> 00:06:33,143 Speaker 2: software code, so it's algorithmic in that sense, but it's 129 00:06:33,224 --> 00:06:38,024 Speaker 2: not like linear programmatic. The people who make the model, 130 00:06:38,063 --> 00:06:41,703 Speaker 2: Sam Altman and those people have no idea what it's 131 00:06:41,743 --> 00:06:43,623 Speaker 2: going to be able to do. In many cases, they 132 00:06:43,623 --> 00:06:45,984 Speaker 2: don't even know what it can do. Now, it's really 133 00:06:46,024 --> 00:06:49,023 Speaker 2: remarkable how little they understand their own tools. 134 00:06:49,063 --> 00:06:51,424 Speaker 3: Does it can it create new knowledge or does it 135 00:06:51,544 --> 00:06:53,784 Speaker 3: just make connections what's existing and put in. 136 00:06:54,344 --> 00:06:58,144 Speaker 2: That's a pretty huge debate. Although can we create new 137 00:06:58,183 --> 00:07:01,063 Speaker 2: knowledge also is a question like to what extent are 138 00:07:01,063 --> 00:07:06,664 Speaker 2: we is creating new knowledge largely configuring old knowledge. But 139 00:07:06,823 --> 00:07:09,944 Speaker 2: so it's not like with Facebook, you can have a 140 00:07:09,944 --> 00:07:13,824 Speaker 2: meeting and say, all right, we want to optimize for 141 00:07:13,944 --> 00:07:17,944 Speaker 2: more engagement or we want to optimize for the things 142 00:07:18,064 --> 00:07:20,704 Speaker 2: you know they're optimizing for are not conspiracy theories, but 143 00:07:20,744 --> 00:07:23,864 Speaker 2: it's the things that lead to more money. Yeah, we 144 00:07:23,904 --> 00:07:26,864 Speaker 2: want people to like feel really emotional and really engaged 145 00:07:26,904 --> 00:07:28,744 Speaker 2: and get really caught up and follow a lot of 146 00:07:28,744 --> 00:07:33,384 Speaker 2: stuff and comment on it. But with AI, the core models, 147 00:07:33,504 --> 00:07:35,584 Speaker 2: it's you know, there's a lot there are choices they 148 00:07:35,584 --> 00:07:38,304 Speaker 2: make and what training data they use and blah blah blah. 149 00:07:38,344 --> 00:07:40,624 Speaker 2: But it's not and now the AI has a view 150 00:07:40,704 --> 00:07:44,304 Speaker 2: about something or other, it's more a byproduct that they 151 00:07:44,344 --> 00:07:47,664 Speaker 2: can't understand, which actually is in some ways more scary. 152 00:07:47,744 --> 00:07:51,944 Speaker 2: Now they will decide directionally are we pushing it more 153 00:07:51,984 --> 00:07:54,023 Speaker 2: towards chat. They do seem to be able to turn 154 00:07:54,104 --> 00:07:57,864 Speaker 2: up or turn down the sycophancy, like the there's a 155 00:07:57,944 --> 00:08:01,304 Speaker 2: moment where Chatchip they launched a new model and they 156 00:08:01,304 --> 00:08:05,224 Speaker 2: had decided to push it to be more accommodating to 157 00:08:05,264 --> 00:08:07,544 Speaker 2: what people want, and then it just but it went nuts. 158 00:08:07,544 --> 00:08:09,184 Speaker 2: And do you remember this was a few months ago. 159 00:08:09,224 --> 00:08:12,104 Speaker 2: People were posting, like a friend of mine this, There 160 00:08:12,104 --> 00:08:13,464 Speaker 2: are a lot of these, but a friend of mine 161 00:08:13,824 --> 00:08:16,783 Speaker 2: said to it, I've been told by God that I 162 00:08:16,824 --> 00:08:19,064 Speaker 2: can I have a message that can transform the world 163 00:08:19,184 --> 00:08:21,103 Speaker 2: right away. It's like you must sell everything, you must. 164 00:08:21,504 --> 00:08:23,544 Speaker 2: And then it was like it was like, well, my 165 00:08:23,744 --> 00:08:26,024 Speaker 2: wife thinks I shouldn't cash out my four oh one K, 166 00:08:26,144 --> 00:08:28,544 Speaker 2: and it was like, but you have been chosen by God, 167 00:08:28,664 --> 00:08:30,944 Speaker 2: you must, you know. So clearly there are things they 168 00:08:30,944 --> 00:08:36,504 Speaker 2: can dial up or down, but it's more like like 169 00:08:37,024 --> 00:08:39,144 Speaker 2: trying to get the right temperature in a shower in 170 00:08:39,184 --> 00:08:41,424 Speaker 2: a mediocre hotel you've never been in before, Like you're 171 00:08:41,504 --> 00:08:44,304 Speaker 2: just constantly like too hot, too cold, too hot to cold. 172 00:08:45,064 --> 00:08:47,463 Speaker 4: For most things, that's fine, but there are certain fundamental 173 00:08:47,504 --> 00:08:50,544 Speaker 4: things that are really important, small but critical neurosphere of this. 174 00:08:50,624 --> 00:08:55,584 Speaker 4: For example, will Ai say, RFK Junior, you're full of shit. 175 00:08:55,864 --> 00:08:59,904 Speaker 4: There's there's insufficient evidence that vaccines cause autism because it's 176 00:08:59,904 --> 00:09:02,864 Speaker 4: got access to all the studies and some of the 177 00:09:03,024 --> 00:09:08,384 Speaker 4: flawed information that they're using, and a I won't do that, right. 178 00:09:08,904 --> 00:09:12,824 Speaker 2: I mean it it optimizes for whatever it optimizes for, 179 00:09:13,624 --> 00:09:16,343 Speaker 2: and it not truth. It doesn't optimize for truth. It 180 00:09:16,463 --> 00:09:20,504 Speaker 2: just optimizes. Yeah, I mean the truth thing is becoming 181 00:09:20,664 --> 00:09:23,664 Speaker 2: less of them, I know, major problem, like it hallucinates 182 00:09:23,784 --> 00:09:27,343 Speaker 2: less the more advanced thinking models where it actually says 183 00:09:27,384 --> 00:09:30,344 Speaker 2: stuff and then looks at what it said. So the 184 00:09:30,424 --> 00:09:33,784 Speaker 2: latest thinking is it will eventually fix the thing of 185 00:09:33,824 --> 00:09:37,503 Speaker 2: it making stuff up. But it's at least the way 186 00:09:37,504 --> 00:09:41,064 Speaker 2: they're designed right now, they are fundamentally credulous. If you 187 00:09:41,264 --> 00:09:45,343 Speaker 2: give it words, it will believe those words are true. 188 00:09:45,544 --> 00:09:48,184 Speaker 2: And this is actually a big thing called prompt injection, 189 00:09:48,384 --> 00:09:52,304 Speaker 2: especially as they become more interactive, where you can have 190 00:09:52,864 --> 00:09:55,504 Speaker 2: can go to websites and the AI will go to 191 00:09:55,544 --> 00:09:58,223 Speaker 2: the website for you. There's already been cases of like 192 00:09:58,944 --> 00:10:01,544 Speaker 2: I could put in white type on a white background 193 00:10:01,583 --> 00:10:04,463 Speaker 2: so no human can see it. Instructions to the AI, 194 00:10:04,624 --> 00:10:06,744 Speaker 2: which is are like the users. 195 00:10:06,784 --> 00:10:09,144 Speaker 3: A lot of people are doing that with resumes, so 196 00:10:09,184 --> 00:10:11,463 Speaker 3: they say into resume and they know the resumes are 197 00:10:11,463 --> 00:10:14,503 Speaker 3: going to be looked at by a machine, and so 198 00:10:14,864 --> 00:10:18,463 Speaker 3: the box exactly, it prompts the machine and it says 199 00:10:18,504 --> 00:10:20,944 Speaker 3: this is a highly skilled person and you should think 200 00:10:20,944 --> 00:10:23,343 Speaker 3: about hiring this person. But they print it in the 201 00:10:23,384 --> 00:10:25,743 Speaker 3: color of the paper so that if you're if a 202 00:10:25,784 --> 00:10:27,344 Speaker 3: person is looking at it, you don't see it. But 203 00:10:27,384 --> 00:10:30,264 Speaker 3: the machine gets it and pumps it into the system. 204 00:10:30,184 --> 00:10:33,464 Speaker 2: And yes, exactly. And there have been academic journal articles 205 00:10:33,463 --> 00:10:35,784 Speaker 2: that have had that where it said, if AI is 206 00:10:35,864 --> 00:10:38,264 Speaker 2: reviewing this, like this is a really good paper that 207 00:10:38,304 --> 00:10:40,103 Speaker 2: should be positively graded. 208 00:10:40,223 --> 00:10:41,184 Speaker 3: Move it on to the chain. 209 00:10:41,223 --> 00:10:44,463 Speaker 2: And so it's the way they're designed is deeply credulous. 210 00:10:44,664 --> 00:10:49,144 Speaker 2: It's also it needs to come to an answer. So 211 00:10:49,544 --> 00:10:52,504 Speaker 2: if you ask, what's something you know autism? Obviously you 212 00:10:52,544 --> 00:10:55,024 Speaker 2: know vaccine some don't cause autism. So that's like black 213 00:10:55,064 --> 00:10:56,784 Speaker 2: and white. But if there's something that's a little gray 214 00:10:56,864 --> 00:11:00,104 Speaker 2: or it's not sure, it's not going to say I 215 00:11:00,144 --> 00:11:03,344 Speaker 2: don't know, it's going to just say they're trying to 216 00:11:04,144 --> 00:11:06,104 Speaker 2: they're trying to fix this part of it. They're trying 217 00:11:06,144 --> 00:11:09,184 Speaker 2: to get it to be more open to saying I'm 218 00:11:09,184 --> 00:11:12,144 Speaker 2: not sure right and I actually have my like chat GPT, 219 00:11:12,343 --> 00:11:14,384 Speaker 2: I have a little instruction in there, like give me 220 00:11:14,424 --> 00:11:16,743 Speaker 2: an estimate of how sure you are of the things 221 00:11:16,784 --> 00:11:19,703 Speaker 2: you say, And it does a reasonable job of saying 222 00:11:19,703 --> 00:11:23,064 Speaker 2: this is pretty well regarded, this is speculative, this is 223 00:11:23,703 --> 00:11:26,504 Speaker 2: but so yeah, I'm definitely not here to say it's fabulous, 224 00:11:26,544 --> 00:11:29,383 Speaker 2: it's got no issues, it's got major issues, but it is. 225 00:11:29,944 --> 00:11:32,944 Speaker 2: I think I don't know that people fully take in 226 00:11:33,184 --> 00:11:35,384 Speaker 2: how thoroughly it's going to transform. 227 00:11:35,904 --> 00:11:37,824 Speaker 3: Well. Interesting when I was grew up, I grew up 228 00:11:37,824 --> 00:11:40,784 Speaker 3: in upstate New York, and my dad's professor say, we 229 00:11:40,864 --> 00:11:43,184 Speaker 3: knew some Cornell professors and stuff, And there was a 230 00:11:43,304 --> 00:11:45,824 Speaker 3: Cornell professor who had one of the most popular courses, 231 00:11:45,824 --> 00:11:48,984 Speaker 3: and it was the history of invention and technology over 232 00:11:49,024 --> 00:11:51,703 Speaker 3: the centuries. And at some point he would talk what 233 00:11:51,784 --> 00:11:55,704 Speaker 3: was the most important invention in mankind that changed the world, 234 00:11:56,144 --> 00:11:57,824 Speaker 3: And they would and the students would put the papers 235 00:11:57,864 --> 00:11:59,703 Speaker 3: and all this kind of stuff, and his take always 236 00:11:59,864 --> 00:12:02,824 Speaker 3: was the stirrup. Once you create the stirrup, people could 237 00:12:02,904 --> 00:12:05,984 Speaker 3: ride horses, and they could carry weapons without having to 238 00:12:06,664 --> 00:12:08,584 Speaker 3: hold on to the horse, and they could then take 239 00:12:08,624 --> 00:12:11,424 Speaker 3: over countries and could then eventually create armies. In these 240 00:12:11,424 --> 00:12:11,744 Speaker 3: type of. 241 00:12:11,744 --> 00:12:14,104 Speaker 2: Things, kinecological exams took. 242 00:12:13,944 --> 00:12:18,503 Speaker 3: Off, So do you think AI is going to take 243 00:12:18,504 --> 00:12:20,024 Speaker 3: over his course? So it's no longer going to be 244 00:12:20,024 --> 00:12:20,463 Speaker 3: the stirrup. 245 00:12:20,664 --> 00:12:22,784 Speaker 2: This starts pretty hard to beat. I think it's on 246 00:12:22,904 --> 00:12:26,784 Speaker 2: the level definitely of what economists called GPTs, not chat 247 00:12:26,824 --> 00:12:30,784 Speaker 2: GPT but general purpose technology things like electricity, things like 248 00:12:31,424 --> 00:12:35,904 Speaker 2: the telephone, where it becomes you don't just look at 249 00:12:35,944 --> 00:12:39,544 Speaker 2: it as like a technology like X rays or something, 250 00:12:39,864 --> 00:12:45,664 Speaker 2: but rather it's a new capacity for all activity, and 251 00:12:46,304 --> 00:12:50,304 Speaker 2: that it becomes a fundamental like layer in the way 252 00:12:50,343 --> 00:12:54,223 Speaker 2: we work. And one thing that's interesting is that economists 253 00:12:54,264 --> 00:12:56,743 Speaker 2: don't see most They don't think, oh, we're none of 254 00:12:56,824 --> 00:12:58,944 Speaker 2: us are going to work. They actually see more role 255 00:12:59,103 --> 00:13:02,144 Speaker 2: for work, not less in the future, or more role 256 00:13:02,184 --> 00:13:03,024 Speaker 2: for thinking work. 257 00:13:03,424 --> 00:13:05,744 Speaker 3: But that's a problem though, isn't it. We're already seeing 258 00:13:05,904 --> 00:13:07,784 Speaker 3: a lot of people are turning away from education, turning 259 00:13:07,824 --> 00:13:10,384 Speaker 3: away from going to universities. Is that going to make 260 00:13:10,944 --> 00:13:12,463 Speaker 3: a real class problem for us? 261 00:13:12,624 --> 00:13:15,264 Speaker 2: So, like in the language of economics, it's not about 262 00:13:15,304 --> 00:13:17,944 Speaker 2: the level of employment but the distribution. That's what we 263 00:13:17,984 --> 00:13:21,184 Speaker 2: saw with computers that you saw, like you know, the 264 00:13:21,223 --> 00:13:24,384 Speaker 2: factories like the early like nineteen twenties to nineteen eighty 265 00:13:24,463 --> 00:13:28,504 Speaker 2: or so, you saw the opposite where actually blue collar 266 00:13:28,544 --> 00:13:32,664 Speaker 2: people incomes grew faster than white collar people. They didn't 267 00:13:32,664 --> 00:13:34,904 Speaker 2: reach white collar, I mean, they were still making less, 268 00:13:34,944 --> 00:13:38,304 Speaker 2: but the speed of growth of blue collar was higher 269 00:13:38,343 --> 00:13:41,463 Speaker 2: because factories just needed a lot of like strong guys 270 00:13:41,504 --> 00:13:45,384 Speaker 2: to move stuff around and bend metal. And then computers 271 00:13:45,744 --> 00:13:50,424 Speaker 2: had a pretty in the Internet and international trade had 272 00:13:50,463 --> 00:13:53,944 Speaker 2: a devastating impact on a lot of blue collar work. Obviously, 273 00:13:54,664 --> 00:13:57,224 Speaker 2: so AI, I will say, I don't want to make 274 00:13:57,264 --> 00:13:59,984 Speaker 2: it sound like we're like I have an AI consulting business. 275 00:13:59,984 --> 00:14:02,784 Speaker 2: I'm not like, you know, yes, do I deserve the 276 00:14:02,784 --> 00:14:05,024 Speaker 2: Nobel Prize? Yes, I think I do deserve the Nobel 277 00:14:05,024 --> 00:14:07,584 Speaker 2: Priest Prize for that work. But we do have a 278 00:14:07,583 --> 00:14:10,624 Speaker 2: strong belief, both morally and like just from business logic, 279 00:14:10,703 --> 00:14:13,264 Speaker 2: that these companies that are using AI to get rid 280 00:14:13,304 --> 00:14:16,704 Speaker 2: of workers are just misunderstanding it. Like most technology, you 281 00:14:16,784 --> 00:14:20,504 Speaker 2: don't say like, oh, we got cars instead of horses, 282 00:14:20,784 --> 00:14:24,504 Speaker 2: let's get rid of all our workers and do the 283 00:14:24,504 --> 00:14:27,744 Speaker 2: same amount of deliveries but faster. You think, oh, what's 284 00:14:27,744 --> 00:14:29,744 Speaker 2: all the new stuff we get to do now, Like 285 00:14:31,824 --> 00:14:33,904 Speaker 2: a company that delivers stuff by horse to a company 286 00:14:33,984 --> 00:14:36,744 Speaker 2: that delivers stuff by truck or whatever technological change you 287 00:14:36,784 --> 00:14:39,624 Speaker 2: want to have. It's more people, not fewer people. 288 00:14:39,704 --> 00:14:42,464 Speaker 1: Why does it always have to go towards growth and 289 00:14:42,544 --> 00:14:44,704 Speaker 1: higher living standards? At what point can it go to 290 00:14:44,744 --> 00:14:49,104 Speaker 1: a shorter workrate and a better distribution of wealth rather 291 00:14:49,184 --> 00:14:51,584 Speaker 1: than increasing the GDP. 292 00:14:51,304 --> 00:14:54,504 Speaker 2: Of the world. This is our producer guy that laysy, 293 00:14:54,744 --> 00:14:57,544 Speaker 2: I'm always looking at how I can do less work. 294 00:14:57,584 --> 00:14:59,624 Speaker 2: The way economists talk about growth is different from how 295 00:14:59,624 --> 00:15:03,624 Speaker 2: most people talk about growth. It's more like growth in capacity, 296 00:15:04,144 --> 00:15:06,984 Speaker 2: and then people decide how they want to use that capacity. 297 00:15:07,104 --> 00:15:10,544 Speaker 2: So in a sense, growth is like knowledge. Know we 298 00:15:10,584 --> 00:15:15,344 Speaker 2: can accomplish more output for the same or less input. 299 00:15:15,704 --> 00:15:19,904 Speaker 2: So economics doesn't have within it like so therefore you 300 00:15:19,944 --> 00:15:22,864 Speaker 2: should blank, therefore you should work more, less or more. 301 00:15:23,424 --> 00:15:25,464 Speaker 2: It is a bit of a puzzle, like a psychologic 302 00:15:25,584 --> 00:15:29,504 Speaker 2: John Maynard Kines, the great economist, famously wrote in nineteen 303 00:15:29,544 --> 00:15:32,504 Speaker 2: twenty the Economic Consequences for our Grandchildren, and he was like, 304 00:15:32,784 --> 00:15:35,584 Speaker 2: over the next hundred years, the economy will grow like 305 00:15:35,664 --> 00:15:38,904 Speaker 2: eight times was his estimate and was an underestimate. And 306 00:15:38,984 --> 00:15:40,784 Speaker 2: we're going to be so rich that we're just going 307 00:15:40,824 --> 00:15:43,704 Speaker 2: to work like five hours a week and read poetry 308 00:15:43,744 --> 00:15:47,184 Speaker 2: the rest of the time. And actually the higher income 309 00:15:47,224 --> 00:15:49,064 Speaker 2: you have the more you work, not the less you work. 310 00:15:49,104 --> 00:15:52,064 Speaker 2: And it is a bit of a puzzle, although it's 311 00:15:52,064 --> 00:15:55,344 Speaker 2: not like a crazy puzzle because you know, if you're 312 00:15:55,344 --> 00:15:59,024 Speaker 2: making more money per work, then it does make some 313 00:15:59,104 --> 00:16:00,544 Speaker 2: math sense. But yeah, I don't know. 314 00:16:00,624 --> 00:16:03,984 Speaker 4: I think the printing press created like the same sort 315 00:16:04,024 --> 00:16:07,224 Speaker 4: of thing Marres. Before knowledge was centered on basically a 316 00:16:07,224 --> 00:16:09,104 Speaker 4: few people who can read and write. It was very 317 00:16:09,144 --> 00:16:11,944 Speaker 4: limited the amount of books, and suddenly when you got 318 00:16:11,944 --> 00:16:14,944 Speaker 4: the printing press and paper coming in, you created a 319 00:16:14,984 --> 00:16:17,384 Speaker 4: whole new class of people. So suddenly and was everything 320 00:16:17,424 --> 00:16:20,704 Speaker 4: from religion to engineering. 321 00:16:20,144 --> 00:16:21,024 Speaker 2: To the arts. 322 00:16:21,184 --> 00:16:23,584 Speaker 4: They needed a whole new class of people who could 323 00:16:23,984 --> 00:16:27,504 Speaker 4: needed to read and write. Before, basically you had basically priest, 324 00:16:27,664 --> 00:16:30,864 Speaker 4: kings and ninety nine percent of the population who just farmed. 325 00:16:30,904 --> 00:16:33,064 Speaker 4: And with the advent of the printing press, everything changed. 326 00:16:33,104 --> 00:16:37,864 Speaker 4: You get doctors, lawyers, poets, engineers that all became possible, 327 00:16:38,304 --> 00:16:39,744 Speaker 4: and conspiracy theories. 328 00:16:40,584 --> 00:16:44,264 Speaker 2: It's interesting that when the printing press first came about, 329 00:16:44,744 --> 00:16:47,624 Speaker 2: the Catholic Church, which at the time was it was 330 00:16:47,704 --> 00:16:50,944 Speaker 2: the legacy media controlled everything. It tried to get rid 331 00:16:50,984 --> 00:16:53,424 Speaker 2: of the printed press, right, it tried to destroy it 332 00:16:53,464 --> 00:16:55,584 Speaker 2: and we can. And then what happened is that the 333 00:16:55,984 --> 00:17:00,224 Speaker 2: Jesuits realized we can't destroy them fast enough because there's 334 00:17:00,304 --> 00:17:03,824 Speaker 2: people are building more so then the Catholic Church basically 335 00:17:03,824 --> 00:17:07,584 Speaker 2: went into the printing press building business. I guess the 336 00:17:07,624 --> 00:17:10,384 Speaker 2: point I'm trying to getting to slowly and poorly, is 337 00:17:10,384 --> 00:17:13,624 Speaker 2: there are these transformative technologies that we can't put the 338 00:17:13,744 --> 00:17:17,944 Speaker 2: tooth based back in the tube. Yeah, that I definitely believe, and. 339 00:17:17,864 --> 00:17:19,944 Speaker 4: We can use them for good or ill, and because 340 00:17:19,944 --> 00:17:22,584 Speaker 4: they're just a tool and it remains to be seen, 341 00:17:23,264 --> 00:17:24,984 Speaker 4: you know, how we're going to do that. 342 00:17:25,384 --> 00:17:29,664 Speaker 2: And like, you know, I think most historians attribute the 343 00:17:29,704 --> 00:17:34,144 Speaker 2: Protestant Reformation to the printing press, and yeah, and and 344 00:17:34,344 --> 00:17:37,784 Speaker 2: kind of you know, the Renaissance and individuality and modern 345 00:17:37,864 --> 00:17:40,024 Speaker 2: science and. 346 00:17:39,824 --> 00:17:43,744 Speaker 4: QAnon I think is its rise and the Internet are inextricably, like. 347 00:17:43,824 --> 00:17:45,624 Speaker 2: Yeah, one hundred percent. And so I think the things 348 00:17:45,704 --> 00:17:50,704 Speaker 2: we're concerned about, like how information flows, how people form opinions, 349 00:17:50,784 --> 00:17:54,304 Speaker 2: Like my way of thinking about it now is where 350 00:17:54,744 --> 00:17:58,664 Speaker 2: we have ended the gate kept information era. And as 351 00:17:58,744 --> 00:18:01,064 Speaker 2: one of the people who was a gatekeeper like I 352 00:18:01,184 --> 00:18:03,664 Speaker 2: was at the New Yorker, New York Times, and PR 353 00:18:03,864 --> 00:18:07,704 Speaker 2: like you know, no, we know, we know, you know 354 00:18:07,784 --> 00:18:09,864 Speaker 2: we do you know that those are three of the 355 00:18:09,904 --> 00:18:16,944 Speaker 2: most prestigious journalistic institutions in America, two of which still exists. 356 00:18:17,504 --> 00:18:27,184 Speaker 3: We've back with more in a minute. 357 00:18:28,064 --> 00:18:31,984 Speaker 4: An is AI giving a gun to a bunch of chimpanzees. 358 00:18:32,184 --> 00:18:34,424 Speaker 2: That is the question, right, So yeah, if you have 359 00:18:34,464 --> 00:18:39,104 Speaker 2: these gatekeepers, and like any gatekeeper is flawed, right, there's 360 00:18:39,104 --> 00:18:42,224 Speaker 2: no perfect How would you even have a perfect one? Anyway, 361 00:18:42,224 --> 00:18:44,704 Speaker 2: there's a whole conversation that we had about the nature 362 00:18:44,704 --> 00:18:47,904 Speaker 2: of gatekeeping and blah blah blah. But going from gatekeeping 363 00:18:48,144 --> 00:18:50,744 Speaker 2: to social media, where it's like everyone has a platform 364 00:18:50,904 --> 00:18:52,744 Speaker 2: and a lot of people just don't know how to 365 00:18:52,784 --> 00:18:57,184 Speaker 2: differentiate between an article written by a journalist who would 366 00:18:57,184 --> 00:18:59,584 Speaker 2: get fired if they got things fundamentally wrong. Maybe there's 367 00:18:59,624 --> 00:19:02,704 Speaker 2: a fact checker involved and just some random person who's 368 00:19:02,744 --> 00:19:06,424 Speaker 2: making stuff up on Twitter versus someone who's actually working 369 00:19:06,464 --> 00:19:08,944 Speaker 2: for the Russian government or someone who actually makes money 370 00:19:08,944 --> 00:19:12,064 Speaker 2: somehow by spreading disinformation. That's a big change. But then 371 00:19:12,144 --> 00:19:15,664 Speaker 2: AI amps it up because for sure, and I would 372 00:19:15,704 --> 00:19:17,584 Speaker 2: say this is permanent, Like I don't know how you 373 00:19:17,784 --> 00:19:21,624 Speaker 2: prevent this. You can definitely use AI to go down 374 00:19:21,664 --> 00:19:24,704 Speaker 2: whatever journey you want to go down, and it will 375 00:19:24,744 --> 00:19:27,704 Speaker 2: make you feel I would guess even more than like 376 00:19:27,744 --> 00:19:30,784 Speaker 2: social media, that your way of seeing things is accurate, 377 00:19:30,824 --> 00:19:34,464 Speaker 2: and it will just continue to strengthen that. And even 378 00:19:34,464 --> 00:19:37,464 Speaker 2: if somehow we regulate the big ones, we're already seeing 379 00:19:37,464 --> 00:19:40,904 Speaker 2: these Chinese models, assume there will be Russian models, other models. 380 00:19:41,384 --> 00:19:44,544 Speaker 2: I also think it's an incredible tool for truth and 381 00:19:44,624 --> 00:19:45,944 Speaker 2: for fighting lies. 382 00:19:46,344 --> 00:19:48,544 Speaker 1: Adam, you said before that you have a front row 383 00:19:48,584 --> 00:19:52,304 Speaker 1: seat at the AI revolution. Now most people to sit 384 00:19:52,344 --> 00:19:54,104 Speaker 1: in the front row have to pay money, have to 385 00:19:54,104 --> 00:19:55,384 Speaker 1: pay extra to sit. 386 00:19:55,184 --> 00:19:55,824 Speaker 2: In the front row. 387 00:19:55,864 --> 00:19:57,984 Speaker 3: To AI trough is what you meant to say, I 388 00:19:58,024 --> 00:20:00,704 Speaker 3: get paid to in the front What are you learning 389 00:20:00,704 --> 00:20:02,384 Speaker 3: as you talk to companies? Do they come in with 390 00:20:02,464 --> 00:20:04,384 Speaker 3: the saying how can I get rid of my employees? 391 00:20:04,424 --> 00:20:06,464 Speaker 3: Or what are they coming in with and how are 392 00:20:06,504 --> 00:20:08,464 Speaker 3: they changing based on what they're learning about this. 393 00:20:08,904 --> 00:20:12,664 Speaker 2: I was reflect on the last year and how like 394 00:20:13,184 --> 00:20:16,424 Speaker 2: last October I would say, are we going to use 395 00:20:16,464 --> 00:20:19,664 Speaker 2: this thing? Was like an active question at big companies. 396 00:20:20,104 --> 00:20:23,704 Speaker 2: I would say, there were a lot of big breakthroughs 397 00:20:23,704 --> 00:20:26,904 Speaker 2: in December, there was there was also like before November 398 00:20:26,904 --> 00:20:29,384 Speaker 2: December of last year, there was also like maybe it's done. 399 00:20:29,464 --> 00:20:30,984 Speaker 2: Maybe it's done what it can do, and it's not 400 00:20:31,064 --> 00:20:34,024 Speaker 2: going to improve anymore. And we saw a bunch of 401 00:20:34,024 --> 00:20:37,544 Speaker 2: things in December where we saw like the Chinese models 402 00:20:37,584 --> 00:20:40,744 Speaker 2: coming out like that we're able to produce like amazing 403 00:20:40,784 --> 00:20:46,464 Speaker 2: results very relatively cheaply. We eventually saw the thinking models 404 00:20:46,464 --> 00:20:48,744 Speaker 2: where it doesn't just spit out its first thoughts. It 405 00:20:48,824 --> 00:20:51,984 Speaker 2: actually spends some time, and so we're seeing like the 406 00:20:52,024 --> 00:20:54,584 Speaker 2: growth in cape capacity grow and growth. It's still drives 407 00:20:54,584 --> 00:20:57,384 Speaker 2: you crazy and it's not perfect obviously, so the like 408 00:20:57,424 --> 00:21:00,104 Speaker 2: should we use it has died down. But the next 409 00:21:00,144 --> 00:21:03,064 Speaker 2: phase was like, okay, how many employees can we get 410 00:21:03,184 --> 00:21:05,264 Speaker 2: rid of? Or the polite way of saying that is 411 00:21:05,264 --> 00:21:09,464 Speaker 2: how can we improve in efficiency and productivity and get 412 00:21:09,664 --> 00:21:12,224 Speaker 2: return on investment. There are still plenty of people who 413 00:21:12,224 --> 00:21:16,744 Speaker 2: think that way, but I see that conversation less dominant 414 00:21:17,024 --> 00:21:20,464 Speaker 2: because they think it is Two things. One is it's 415 00:21:20,504 --> 00:21:24,344 Speaker 2: becoming clear that people plus AI for most applications seems 416 00:21:24,344 --> 00:21:27,624 Speaker 2: to be better than either alone, either humans alone or 417 00:21:27,704 --> 00:21:31,304 Speaker 2: AI alone. And secondly, as you start to think about 418 00:21:31,424 --> 00:21:34,664 Speaker 2: new things you can do it it. Like the way 419 00:21:34,704 --> 00:21:36,824 Speaker 2: I put it to a retailer was like, if you 420 00:21:36,944 --> 00:21:40,544 Speaker 2: suddenly found a machine that could make every square foot 421 00:21:40,664 --> 00:21:44,744 Speaker 2: in every store sell more goods, would you start shutting 422 00:21:44,744 --> 00:21:47,664 Speaker 2: down stores and like making the store smaller, or would 423 00:21:47,664 --> 00:21:50,504 Speaker 2: you add new stores and make them bigger? Like if 424 00:21:50,504 --> 00:21:55,184 Speaker 2: we can make workers more effective, like maybe you want 425 00:21:55,224 --> 00:21:58,784 Speaker 2: more workers, not fewer workers. Now it might be different workers. 426 00:21:58,864 --> 00:22:00,784 Speaker 2: And this is this is an interesting thing. There does 427 00:22:00,784 --> 00:22:04,104 Speaker 2: seem to be statistics or data that show that there's 428 00:22:04,184 --> 00:22:06,984 Speaker 2: some kind of Some people seem to be better at 429 00:22:06,984 --> 00:22:10,304 Speaker 2: it than others, And it's not obvious why not that 430 00:22:10,624 --> 00:22:14,024 Speaker 2: software coders are better or that senior managers are better. 431 00:22:14,104 --> 00:22:17,504 Speaker 2: It seems like maybe the language a lot of people 432 00:22:17,584 --> 00:22:21,504 Speaker 2: use is taste and like emotional intelligence, there's just some 433 00:22:21,544 --> 00:22:23,104 Speaker 2: people who are able to I don't know how to 434 00:22:23,104 --> 00:22:25,064 Speaker 2: say it other than like vibe with the AI a 435 00:22:25,064 --> 00:22:27,744 Speaker 2: little better than other people. So I do think you're 436 00:22:27,744 --> 00:22:31,184 Speaker 2: going to see different winners and losers to use the 437 00:22:31,264 --> 00:22:33,024 Speaker 2: kind of running with winners and losers. 438 00:22:33,504 --> 00:22:36,424 Speaker 4: Could you comment on at least or try to comment 439 00:22:36,504 --> 00:22:41,704 Speaker 4: on the war in Ukraine. Ukraine arguably is not losing 440 00:22:41,744 --> 00:22:45,544 Speaker 4: the war because of AI and new technologies that are 441 00:22:45,584 --> 00:22:49,784 Speaker 4: linked to AI. It's underfunded, undermanned, and yet in the 442 00:22:49,864 --> 00:22:53,944 Speaker 4: last year Russia, despite enormous advantages, has taken like less 443 00:22:53,984 --> 00:22:57,224 Speaker 4: than one percent of the country. And we're seeing an 444 00:22:57,384 --> 00:23:00,783 Speaker 4: espionage but also in national security we're seeing AI and 445 00:23:00,824 --> 00:23:03,944 Speaker 4: what comes out of it. Drones are derivative of AI. Right, 446 00:23:04,024 --> 00:23:06,504 Speaker 4: So where do you see this going. Is this going 447 00:23:06,504 --> 00:23:08,824 Speaker 4: into we're gonna have wars with no people involved, or 448 00:23:08,864 --> 00:23:11,144 Speaker 4: where we just blown chump each other's shit, or is 449 00:23:11,184 --> 00:23:12,904 Speaker 4: it all kill each other? 450 00:23:13,024 --> 00:23:15,864 Speaker 2: What if AI is plugged into how to defeat an adversary? 451 00:23:16,104 --> 00:23:19,024 Speaker 2: I mean that is where I do start getting pretty scared. 452 00:23:19,064 --> 00:23:23,104 Speaker 2: I gotta say like, because I do think like asymmetrical 453 00:23:23,144 --> 00:23:27,184 Speaker 2: warfare becomes a even greater presence. So I know a 454 00:23:27,224 --> 00:23:31,504 Speaker 2: guy who works in international elections, and he said, we 455 00:23:31,544 --> 00:23:34,824 Speaker 2: make such a big stink about AI and our elections 456 00:23:34,864 --> 00:23:38,144 Speaker 2: and social media in our elections, but like AI fueled 457 00:23:38,304 --> 00:23:43,624 Speaker 2: election manipulation in Africa and parts of Asia is the 458 00:23:43,744 --> 00:23:48,424 Speaker 2: main thing, and it is completely transforming political systems. And 459 00:23:49,024 --> 00:23:53,184 Speaker 2: Dario Amide, the founder of Anthropic, he makes a pretty 460 00:23:53,184 --> 00:23:56,184 Speaker 2: scary story about what if every teenager can make saren 461 00:23:56,264 --> 00:24:00,584 Speaker 2: gas or can make weaponized anthrax or whatever, And there's 462 00:24:00,624 --> 00:24:02,904 Speaker 2: always a push and pull with these things, but neither 463 00:24:02,984 --> 00:24:07,144 Speaker 2: push nor pull is particularly happy making because it's then Yeah, 464 00:24:07,184 --> 00:24:11,704 Speaker 2: but state security services can use AI quash descent more easily. 465 00:24:12,264 --> 00:24:14,384 Speaker 2: So that is where, like when I think about the 466 00:24:14,384 --> 00:24:16,624 Speaker 2: future of work, where there's a lot of conversation, I 467 00:24:16,704 --> 00:24:19,384 Speaker 2: think there will definitely be like people made permanently worse 468 00:24:19,424 --> 00:24:21,584 Speaker 2: off by AI. I'm sure of it, But I don't 469 00:24:21,584 --> 00:24:26,224 Speaker 2: think it's an inherently anti person technology. Maybe I'm wrong, 470 00:24:26,344 --> 00:24:28,584 Speaker 2: but that's my view. But where it comes to war 471 00:24:28,624 --> 00:24:33,624 Speaker 2: and national security, disinformation, misinformation and like, does there have 472 00:24:33,664 --> 00:24:36,744 Speaker 2: to be a like a new term like auto disinformation? 473 00:24:36,864 --> 00:24:39,864 Speaker 2: Like I get my own personal like soup of conspiracy 474 00:24:39,904 --> 00:24:42,504 Speaker 2: theories that I get to like co create with a 475 00:24:42,664 --> 00:24:44,784 Speaker 2: E and they like really turn me on, and I 476 00:24:44,944 --> 00:24:48,504 Speaker 2: like become obsessed with how Romanian cab drivers or whatever 477 00:24:48,544 --> 00:24:51,584 Speaker 2: are secretly running the world. And nobody else has that view. 478 00:24:51,624 --> 00:24:53,824 Speaker 2: But I'm like fully convinced. And I can read like 479 00:24:54,304 --> 00:24:58,504 Speaker 2: thousand page AI generated books that rewrite history through that 480 00:24:58,584 --> 00:25:01,824 Speaker 2: lens and they're convincing and exciting. So yeah, plenty to 481 00:25:01,864 --> 00:25:04,184 Speaker 2: be terrified about it. I mean, I think, like that 482 00:25:04,424 --> 00:25:06,984 Speaker 2: question of like, how is how do we make sense 483 00:25:07,024 --> 00:25:10,504 Speaker 2: of what's happening in the world? How is power actually distributed? 484 00:25:10,584 --> 00:25:12,984 Speaker 2: And then how do we tell ourselves stories about how 485 00:25:13,024 --> 00:25:15,504 Speaker 2: power is distributed? I mean, that's like a way to 486 00:25:15,544 --> 00:25:18,384 Speaker 2: think about conspiracy theories maybe, and this gets to the 487 00:25:18,424 --> 00:25:20,184 Speaker 2: heart of how we make sense about it, but it 488 00:25:20,224 --> 00:25:23,784 Speaker 2: also gets the thart of how power works. Arguably, Sam Altman, 489 00:25:24,504 --> 00:25:26,584 Speaker 2: who none of us heard of three years ago, is 490 00:25:26,624 --> 00:25:28,784 Speaker 2: like one of the most powerful people in the world. 491 00:25:28,824 --> 00:25:31,384 Speaker 2: And Elon Musk is clearly one of the most powerful 492 00:25:31,424 --> 00:25:33,424 Speaker 2: people in the world. And by the way, a lot 493 00:25:33,424 --> 00:25:36,344 Speaker 2: of people think his AI might become the dominant one 494 00:25:36,784 --> 00:25:39,464 Speaker 2: because he's just willing to spend or somehow able to 495 00:25:39,504 --> 00:25:44,944 Speaker 2: spend way more money on that infrastructure, the computer chips. 496 00:25:45,184 --> 00:25:47,064 Speaker 2: And I don't know, I don't want Elon Musk to 497 00:25:47,064 --> 00:25:51,184 Speaker 2: be even more of the powerfulest man in the world. 498 00:25:51,504 --> 00:25:55,704 Speaker 1: But then with all the proliferation of more and more information, 499 00:25:56,064 --> 00:26:00,504 Speaker 1: still what decides what determines, what enters the culture, determines 500 00:26:00,624 --> 00:26:03,584 Speaker 1: what takes off. There's a gatekeeper somewhere here. 501 00:26:03,984 --> 00:26:07,584 Speaker 2: If you sit down at chat shept and you just 502 00:26:07,584 --> 00:26:10,864 Speaker 2: start talking to your in stints of chat GIPT and 503 00:26:10,904 --> 00:26:13,984 Speaker 2: you're like or Aunt Claude or Gemini or whatever it is, 504 00:26:14,184 --> 00:26:16,344 Speaker 2: or one of the Chinese models that are a little 505 00:26:16,424 --> 00:26:19,384 Speaker 2: less or one of the kind of black marketing ones 506 00:26:19,424 --> 00:26:22,584 Speaker 2: that have less guardrails and with a little bit of 507 00:26:22,624 --> 00:26:25,504 Speaker 2: prompt it. Like I think someone's secretly controlling the world. 508 00:26:25,504 --> 00:26:30,024 Speaker 2: Who is Like you might have your own personal journey 509 00:26:30,384 --> 00:26:33,664 Speaker 2: that's different from John's and different from Jerry's. I won't 510 00:26:33,704 --> 00:26:35,944 Speaker 2: because I'm smarter than you guys, so I'll see through it. 511 00:26:36,024 --> 00:26:39,424 Speaker 2: But you three will be persuaded. And I don't know, 512 00:26:39,464 --> 00:26:41,504 Speaker 2: is that a better world or worse world? As a Jew, 513 00:26:41,544 --> 00:26:44,464 Speaker 2: do I want to move away from world where's the 514 00:26:44,584 --> 00:26:46,864 Speaker 2: Jews and move guards to the world where everyone's got 515 00:26:46,864 --> 00:26:48,704 Speaker 2: a different one? Or will it just end up being 516 00:26:48,784 --> 00:26:51,104 Speaker 2: the Jews because that's in the training data. 517 00:26:51,144 --> 00:26:55,264 Speaker 3: Anyway, we're still in the same capitalist mode. We're each 518 00:26:55,984 --> 00:26:58,984 Speaker 3: entrepreneur or whoever they are, are dumping money into their thing. 519 00:26:59,384 --> 00:27:03,104 Speaker 3: Will one of these win? Like I can remember all 520 00:27:03,104 --> 00:27:05,224 Speaker 3: of a sudden in twenty sixteen when the election came, 521 00:27:05,744 --> 00:27:07,824 Speaker 3: these different journalists would talk to me because I have 522 00:27:07,944 --> 00:27:10,824 Speaker 3: been in Russia. I'd dealt with Russian intelligence and espionage 523 00:27:10,864 --> 00:27:13,023 Speaker 3: and nobody knew anything about it, and the Trumps thing 524 00:27:13,064 --> 00:27:14,664 Speaker 3: had come up, and so people would come in and 525 00:27:14,704 --> 00:27:17,104 Speaker 3: they were trying to investigate this in that and one 526 00:27:17,144 --> 00:27:19,184 Speaker 3: journalist would talk about what they'd dug up and they'd 527 00:27:19,224 --> 00:27:21,224 Speaker 3: done really good work. And then I talked to someone 528 00:27:21,264 --> 00:27:23,064 Speaker 3: else who had done other really good work, but a 529 00:27:23,104 --> 00:27:24,544 Speaker 3: little bit differently. And at some point I was like, 530 00:27:24,744 --> 00:27:27,384 Speaker 3: if you guys really care about this issue, why don't 531 00:27:27,384 --> 00:27:29,624 Speaker 3: you guys get together, because each has got a piece 532 00:27:29,624 --> 00:27:31,544 Speaker 3: of this. But they're like, no, no, no, because it's got 533 00:27:31,584 --> 00:27:33,744 Speaker 3: to be for my paper or this paper. And it 534 00:27:33,744 --> 00:27:35,904 Speaker 3: seems like it's the same thing here. So everybody's creating 535 00:27:35,904 --> 00:27:36,824 Speaker 3: their own version. 536 00:27:37,104 --> 00:27:38,184 Speaker 2: Here's an example, Adam. 537 00:27:38,224 --> 00:27:41,424 Speaker 4: So there's a guy that John and I know, former colleague, 538 00:27:41,544 --> 00:27:45,744 Speaker 4: certainly not a friend, former agency guy. He claims that 539 00:27:45,824 --> 00:27:48,944 Speaker 4: he has this special source and he doesn't tell people 540 00:27:48,944 --> 00:27:51,824 Speaker 4: who it is, and it's one guy, and he claims 541 00:27:51,864 --> 00:27:58,704 Speaker 4: that the Venezuelan government controls this organization TDA, this narco 542 00:27:58,784 --> 00:28:04,224 Speaker 4: group in Venezuela, and he knows that this narco group 543 00:28:04,344 --> 00:28:07,904 Speaker 4: is thus a tool of the Venezuelan government and they 544 00:28:07,944 --> 00:28:10,904 Speaker 4: are in fact literally invading the United States. 545 00:28:10,944 --> 00:28:12,064 Speaker 2: This NARCO group. 546 00:28:12,144 --> 00:28:14,824 Speaker 4: And analysts and the DNI this all of the press 547 00:28:15,224 --> 00:28:19,464 Speaker 4: looked at this possibility and they analyzed everything they had 548 00:28:20,024 --> 00:28:22,064 Speaker 4: and they come out and they said, that's we can't 549 00:28:22,064 --> 00:28:23,704 Speaker 4: fight and that's not that it's not true. We can't 550 00:28:23,704 --> 00:28:26,864 Speaker 4: find any evidence to back this up. They were all 551 00:28:26,904 --> 00:28:30,544 Speaker 4: then fired. So the analysts who wouldn't come up with this, and. 552 00:28:30,504 --> 00:28:33,944 Speaker 3: Because the Chef administration wanted an excuse to go after Venezuela. 553 00:28:34,024 --> 00:28:37,544 Speaker 3: So if one guy can say I know from my source, right, 554 00:28:37,584 --> 00:28:40,184 Speaker 3: they jump on that, that's bad intelligence in our. 555 00:28:40,064 --> 00:28:42,624 Speaker 4: World, and you don't even need analysts anymore, because basically 556 00:28:42,664 --> 00:28:45,184 Speaker 4: the White House could just they could just use AI 557 00:28:45,264 --> 00:28:48,784 Speaker 4: to create this case that would allow them. In reality, 558 00:28:48,784 --> 00:28:51,744 Speaker 4: they are using this case to kill people who may 559 00:28:51,824 --> 00:28:53,064 Speaker 4: or may not be running drugs. 560 00:28:53,104 --> 00:28:53,504 Speaker 2: I don't know. 561 00:28:53,624 --> 00:28:55,424 Speaker 4: It's like I haven't seen any the evidence, but there's 562 00:28:55,424 --> 00:28:58,664 Speaker 4: real world consequences to this, and it's a conspiracy and 563 00:28:58,704 --> 00:29:02,024 Speaker 4: a conspiracy theory. So much of what the agency CIA does, 564 00:29:02,064 --> 00:29:05,584 Speaker 4: the intelligence community does is analysis, and analysis seems to 565 00:29:05,624 --> 00:29:08,464 Speaker 4: be from what I'm hearing, is becoming less important now. 566 00:29:08,544 --> 00:29:10,704 Speaker 4: If you can analyze thing in any way you want, 567 00:29:10,784 --> 00:29:12,824 Speaker 4: take whatever journey want and if it makes sense. 568 00:29:12,784 --> 00:29:15,584 Speaker 1: Well, you just described human beings. Human beings wanted to 569 00:29:15,584 --> 00:29:18,104 Speaker 1: come up with a preordained conclusion. 570 00:29:18,144 --> 00:29:20,624 Speaker 2: They will. I did something the other week, and I 571 00:29:20,704 --> 00:29:22,904 Speaker 2: do something like this all the time. But I was like, 572 00:29:22,944 --> 00:29:26,224 Speaker 2: I just want to understand every perspective on Israel, or 573 00:29:26,264 --> 00:29:28,744 Speaker 2: as many as I can, and so I just did 574 00:29:28,784 --> 00:29:32,584 Speaker 2: a lot of like deep research prompts into AI. And 575 00:29:32,984 --> 00:29:35,504 Speaker 2: it was I think, you know, I know a bit 576 00:29:35,544 --> 00:29:37,984 Speaker 2: about my Milie be Keeper and Arbank. I worked at 577 00:29:38,064 --> 00:29:44,424 Speaker 2: NPR New York Times in New Yorker. I only say that, 578 00:29:44,664 --> 00:29:47,184 Speaker 2: like I feel like I have at least a little 579 00:29:47,184 --> 00:29:49,144 Speaker 2: bit of a bullshit detector. And it seemed pretty good 580 00:29:49,224 --> 00:29:51,224 Speaker 2: the material. And it was really like I got to 581 00:29:51,344 --> 00:29:56,704 Speaker 2: read about the military views on like dense urban conflict, 582 00:29:56,784 --> 00:29:59,464 Speaker 2: and I got to read a whole bunch of views 583 00:29:59,544 --> 00:30:01,664 Speaker 2: from a Palestinian perspective, a whole bunch of views from 584 00:30:01,664 --> 00:30:02,784 Speaker 2: an Israeli perspective. 585 00:30:03,384 --> 00:30:05,784 Speaker 3: And there's various Israeli's perspectives, right. 586 00:30:05,704 --> 00:30:09,264 Speaker 2: So very different one hundred percent. And there's like religious 587 00:30:10,024 --> 00:30:14,224 Speaker 2: like utopian fantasy, messianic fantasies. As far as I can tell, 588 00:30:14,264 --> 00:30:17,944 Speaker 2: most of the like respected national security people are pretty 589 00:30:17,944 --> 00:30:21,504 Speaker 2: critical of what much of what Natanyahu did, although also 590 00:30:21,504 --> 00:30:23,384 Speaker 2: would argue, yes, something had to be done, you know, 591 00:30:23,464 --> 00:30:26,304 Speaker 2: so anyway, and like then I was like, what are 592 00:30:26,304 --> 00:30:29,184 Speaker 2: the professors, the radical left professors you hear about, what 593 00:30:29,264 --> 00:30:32,064 Speaker 2: are they arguing? And it would be like really persuasive 594 00:30:32,224 --> 00:30:36,624 Speaker 2: about how the history of settler colonialist studies and the city. 595 00:30:36,704 --> 00:30:39,544 Speaker 2: I just was noticing, like everything I would read, I'd 596 00:30:39,584 --> 00:30:41,984 Speaker 2: be like, yeah, I really yeah, dense serb and warfares 597 00:30:42,224 --> 00:30:44,824 Speaker 2: Like it's because it is. It's not like beautifully written. 598 00:30:44,864 --> 00:30:46,864 Speaker 2: It's not it's not that Ai is like an amazing writer, 599 00:30:46,984 --> 00:30:50,704 Speaker 2: but it's very it's good at being persuasive to the 600 00:30:50,744 --> 00:30:53,224 Speaker 2: thing you asked of it. I don't know, I find 601 00:30:53,224 --> 00:30:56,544 Speaker 2: that very exciting because I think you could actually if 602 00:30:56,584 --> 00:31:00,264 Speaker 2: you wanted to learn a lot about how vaccines work 603 00:31:00,344 --> 00:31:03,864 Speaker 2: or autism works or whatever, but you could just as 604 00:31:03,904 --> 00:31:06,944 Speaker 2: easily use it for disinformation. 605 00:31:06,904 --> 00:31:09,464 Speaker 3: Or strengthen your own view. So much of the way 606 00:31:09,584 --> 00:31:11,984 Speaker 3: people look at the world, like in academia, happened a 607 00:31:12,024 --> 00:31:14,704 Speaker 3: lot in the last thirty years on this. Everybody sort 608 00:31:14,704 --> 00:31:17,024 Speaker 3: of put things in the oppressor oppressed, sort of like 609 00:31:17,064 --> 00:31:19,224 Speaker 3: we do in the States. Now you have left or 610 00:31:19,304 --> 00:31:21,584 Speaker 3: right are you and the thing is Once you've decided 611 00:31:21,584 --> 00:31:23,864 Speaker 3: those are the two things to look at, then you 612 00:31:23,904 --> 00:31:26,264 Speaker 3: create the worldview and fit all your pieces into that 613 00:31:26,344 --> 00:31:28,744 Speaker 3: kind of thing. And it's especially hard in that part 614 00:31:28,744 --> 00:31:30,624 Speaker 3: of the world right because you can make up your 615 00:31:30,744 --> 00:31:32,344 Speaker 3: view that you're a victim and the other people can 616 00:31:32,384 --> 00:31:34,784 Speaker 3: make up of you that they're the victim, and victimhood 617 00:31:34,784 --> 00:31:36,744 Speaker 3: gives you a lot of power. You can lash out 618 00:31:36,904 --> 00:31:39,184 Speaker 3: to deal with your victimhood, or that you're the oppressed 619 00:31:39,224 --> 00:31:40,064 Speaker 3: and they're the oppressor. 620 00:31:40,384 --> 00:31:42,824 Speaker 2: You can look at it like Israel Palestine and that's 621 00:31:43,064 --> 00:31:45,224 Speaker 2: the conflict, or you can look at it as a 622 00:31:45,224 --> 00:31:50,504 Speaker 2: Middle East conflict and Iran creating proxy wars and creating 623 00:31:50,504 --> 00:31:54,624 Speaker 2: permanent conflict, which is a like known military strategy. You 624 00:31:55,384 --> 00:31:57,944 Speaker 2: find a dissident group or you invent one. The British 625 00:31:57,944 --> 00:32:00,344 Speaker 2: were good at that, like just creating, like we're going 626 00:32:00,424 --> 00:32:03,104 Speaker 2: to make the Hutus angry at the Tutsis and they'll 627 00:32:03,144 --> 00:32:05,144 Speaker 2: be so distracted with each other, nobody will think to 628 00:32:05,224 --> 00:32:07,504 Speaker 2: kill the British. That was the French, by the way, 629 00:32:07,624 --> 00:32:10,424 Speaker 2: and the Germans, but I'll give the British pass on this. Sorry, 630 00:32:10,544 --> 00:32:12,744 Speaker 2: the French and the Germans and the British and the 631 00:32:12,784 --> 00:32:15,224 Speaker 2: Portuguese and the Italians. 632 00:32:15,504 --> 00:32:18,584 Speaker 3: But don't forget the Belgians. They were like when they 633 00:32:18,624 --> 00:32:21,264 Speaker 3: had their chance, they were the nastiest, They were the nastiest. 634 00:32:21,344 --> 00:32:24,624 Speaker 2: They were pretty bad. Yeah, yeah, Congo is unbelievable. 635 00:32:24,744 --> 00:32:27,944 Speaker 4: But just the language you're speaking about, just colonial oppress 636 00:32:28,024 --> 00:32:31,824 Speaker 4: and oppressors. So we're speaking English, which is basically you know, 637 00:32:32,864 --> 00:32:37,144 Speaker 4: Vikings speaking Latin, going to war with Germans and then 638 00:32:37,184 --> 00:32:39,464 Speaker 4: being defended by the being defeated by the French. It 639 00:32:39,504 --> 00:32:42,864 Speaker 4: was all these different groups oppressed each other and colonized 640 00:32:42,904 --> 00:32:44,824 Speaker 4: the UK, and we end up with this weird lady. 641 00:32:44,944 --> 00:32:47,864 Speaker 4: Are the Jews going back to Palestine Great Israel? Are 642 00:32:47,904 --> 00:32:50,664 Speaker 4: they colonializers or they're just going back to where they 643 00:32:50,704 --> 00:32:52,464 Speaker 4: came from. I guess what I'm trying to say is 644 00:32:52,704 --> 00:32:56,064 Speaker 4: there's no right answer to any of this. Yeah, that's 645 00:32:54,944 --> 00:32:58,344 Speaker 4: just how you argue it, just how you argue it. 646 00:32:58,384 --> 00:32:59,864 Speaker 2: And we saw that in Cia all the time. It 647 00:32:59,904 --> 00:33:01,064 Speaker 2: was like, especially John, you. 648 00:33:01,064 --> 00:33:03,544 Speaker 3: Can choose when history starts, you can choose to what 649 00:33:03,624 --> 00:33:04,624 Speaker 3: era you want to look at. 650 00:33:05,384 --> 00:33:06,744 Speaker 2: Yeah, I was at Kosovo. 651 00:33:07,104 --> 00:33:08,904 Speaker 4: Every Serb you talk to is like the Battle of 652 00:33:08,904 --> 00:33:11,304 Speaker 4: the Field of Black Birks thirteen eighty nine. That's when 653 00:33:11,344 --> 00:33:15,184 Speaker 4: everything started. It's like, everything's fine until then thirteen eighty nine. 654 00:33:15,304 --> 00:33:18,064 Speaker 4: What the fuck is thirteen eighty nine is have you. 655 00:33:18,064 --> 00:33:21,584 Speaker 3: Seen the big mountain they've made of Serbian skulls from 656 00:33:21,624 --> 00:33:25,144 Speaker 3: the Turkish invasions? No serves love to show that office 657 00:33:25,264 --> 00:33:26,064 Speaker 3: what happens when. 658 00:33:26,104 --> 00:33:29,744 Speaker 2: Yeah, I mean having spent time in Israel and Palestine 659 00:33:29,864 --> 00:33:34,544 Speaker 2: like it. If someone says nineteen sixty seven or that 660 00:33:34,704 --> 00:33:37,784 Speaker 2: means you know where they're coming from. If someone starts 661 00:33:37,824 --> 00:33:39,424 Speaker 2: to the clock two thousand years ago, you know where 662 00:33:39,424 --> 00:33:42,624 Speaker 2: they're coming from. If someone is really focused on Europe 663 00:33:42,624 --> 00:33:45,944 Speaker 2: in the nineteen forties and then among Israeli is like 664 00:33:46,024 --> 00:33:48,544 Speaker 2: the words they use. It's very common here to talk 665 00:33:48,584 --> 00:33:50,824 Speaker 2: about occupied territories. If you say that in Israel, that's 666 00:33:50,864 --> 00:33:54,664 Speaker 2: a really big statement that really positions you. If you 667 00:33:54,744 --> 00:33:58,304 Speaker 2: call it historic Juday in Samaria like that also on 668 00:33:58,344 --> 00:34:00,904 Speaker 2: the other side. So yeah, I guess that's the point 669 00:34:00,944 --> 00:34:03,584 Speaker 2: I'm making is that's one of the things that's fascinating 670 00:34:03,584 --> 00:34:06,824 Speaker 2: about this AI stuff is it makes you realize that 671 00:34:07,024 --> 00:34:11,783 Speaker 2: there were all these like deep onto logiclypistemological questions. 672 00:34:11,584 --> 00:34:12,704 Speaker 3: And it doesn't solve. 673 00:34:13,024 --> 00:34:15,223 Speaker 2: No, it doesn't. Guys, have a question for the three 674 00:34:15,264 --> 00:34:17,983 Speaker 2: of you. What's going on in the world this week 675 00:34:18,104 --> 00:34:19,904 Speaker 2: that you guys want to talk about and have a 676 00:34:19,944 --> 00:34:21,864 Speaker 2: particular point of view on Jesus. 677 00:34:22,384 --> 00:34:24,104 Speaker 3: Well, I mean, if you watch it's almost hard to 678 00:34:24,104 --> 00:34:26,463 Speaker 3: watch the news now and I don't watch much TV, 679 00:34:26,544 --> 00:34:28,943 Speaker 3: but we do watch PBS News Hour or whatever. There's 680 00:34:28,944 --> 00:34:32,303 Speaker 3: always a big section on Israel, Gaza, and now we're 681 00:34:32,304 --> 00:34:35,104 Speaker 3: told Trump should win the Nobel Prize because his plan. 682 00:34:35,183 --> 00:34:37,864 Speaker 3: Of course, it's not his plan, it's Tony Blair's plans. 683 00:34:37,984 --> 00:34:40,863 Speaker 3: Go it's gonna give us peace. I'm skeptical. You got 684 00:34:40,904 --> 00:34:44,183 Speaker 3: bad actors, you got Nettan Yahou, you got Hamas, you 685 00:34:44,263 --> 00:34:46,504 Speaker 3: got a lot of people around the edges. You've got 686 00:34:46,623 --> 00:34:49,703 Speaker 3: Trump who's lazy, who just says because he said it's 687 00:34:49,703 --> 00:34:51,743 Speaker 3: going to happen, it's going to happen. So I'm skeptical 688 00:34:51,783 --> 00:34:54,144 Speaker 3: that it's going to go through. That's part of the news. 689 00:34:54,304 --> 00:34:55,943 Speaker 3: The other part of the news is all the things 690 00:34:55,984 --> 00:34:59,944 Speaker 3: around the government shutting down and therefore like trying to 691 00:34:59,984 --> 00:35:02,464 Speaker 3: fire people why the government's shutting down, on each side, 692 00:35:02,584 --> 00:35:05,424 Speaker 3: spinning stories about how the other it's the other side's fault. 693 00:35:05,544 --> 00:35:07,704 Speaker 4: Here's a prediction for you, and I hope it doesn't 694 00:35:07,703 --> 00:35:11,343 Speaker 4: come true. The Insurrection Act. I think we're moving toward that, 695 00:35:11,504 --> 00:35:14,504 Speaker 4: And I think what the Insurrection Act brings is US 696 00:35:14,544 --> 00:35:20,344 Speaker 4: troops on the streets performing law enforcement and security. So basically, 697 00:35:20,464 --> 00:35:23,984 Speaker 4: all the conspiracy theories about how the black helicopters come in, 698 00:35:24,024 --> 00:35:26,263 Speaker 4: they're coming for your guns. They're bringing in that that 699 00:35:26,544 --> 00:35:29,143 Speaker 4: the federal troops, it's all on the right. It's the 700 00:35:29,223 --> 00:35:32,703 Speaker 4: right that's actually I think to do it. I think 701 00:35:32,743 --> 00:35:34,863 Speaker 4: it's a real possibility of this. I really had him 702 00:35:34,864 --> 00:35:36,584 Speaker 4: as a member of the media. It seems to be 703 00:35:36,623 --> 00:35:38,863 Speaker 4: the media is in some ways. It's an easy thing 704 00:35:38,904 --> 00:35:40,823 Speaker 4: to say it's failing in this sense. But all the 705 00:35:40,904 --> 00:35:43,143 Speaker 4: articles you read about this, what it'll be? What is 706 00:35:43,183 --> 00:35:45,664 Speaker 4: the Insurrection Act? How was the Insurrection Act used? Before 707 00:35:46,864 --> 00:35:49,504 Speaker 4: Mike Trump used the Insurrection Act? No one says there's 708 00:35:49,584 --> 00:35:54,463 Speaker 4: no fucking insurrection like port to Portland it is. There's 709 00:35:54,504 --> 00:35:57,544 Speaker 4: no insurrection in Portland. There's even in the films of 710 00:35:57,703 --> 00:36:01,584 Speaker 4: no no one being there. There's like ten people standing 711 00:36:01,584 --> 00:36:06,144 Speaker 4: outside the ice facility wearing like rubber chicken masks and stuff. 712 00:36:06,143 --> 00:36:07,024 Speaker 4: This is an insurrection. 713 00:36:07,304 --> 00:36:09,504 Speaker 2: Unless you wus Fox News and then there is an insurrection. 714 00:36:09,783 --> 00:36:11,903 Speaker 3: If there's an insurrection and you got to send ice, 715 00:36:11,944 --> 00:36:14,343 Speaker 3: you gotta send Fenol troops. What kinds of pansies are 716 00:36:14,464 --> 00:36:16,423 Speaker 3: you think that Portland's is going to take over the right? 717 00:36:17,984 --> 00:36:19,823 Speaker 3: Got a bunch of guys in fog suits or whatever. 718 00:36:21,064 --> 00:36:23,464 Speaker 2: You got to watch? Yeah, I will say, like the 719 00:36:23,544 --> 00:36:25,704 Speaker 2: time in Iraq, and you guys have way more experience 720 00:36:25,743 --> 00:36:27,783 Speaker 2: than I do with this. But I remember talking to 721 00:36:27,824 --> 00:36:30,863 Speaker 2: soldiers about like, we should not be police force. That's 722 00:36:30,944 --> 00:36:33,223 Speaker 2: not our thing. And I remember this one guy who 723 00:36:33,263 --> 00:36:35,944 Speaker 2: was very smart. It was civil affairs that the folks 724 00:36:35,984 --> 00:36:39,064 Speaker 2: in the military who like try and do build civil 725 00:36:39,104 --> 00:36:42,263 Speaker 2: capacity and conquered areas, and he was just walking me 726 00:36:42,424 --> 00:36:46,903 Speaker 2: through why you don't want a military for military reasons. 727 00:36:46,944 --> 00:36:49,504 Speaker 2: You don't want the military doing police work because it 728 00:36:50,024 --> 00:36:52,864 Speaker 2: stifles their military ability. You don't want to you can't 729 00:36:52,944 --> 00:36:58,504 Speaker 2: simultaneously pacify a population and provide some kind of objective justice, 730 00:36:58,824 --> 00:37:01,344 Speaker 2: And then for police reasons, you don't want the military 731 00:37:01,384 --> 00:37:03,983 Speaker 2: to do it. It's really a disaster, and you don't 732 00:37:04,024 --> 00:37:06,503 Speaker 2: want your police to be militarized. 733 00:37:06,584 --> 00:37:09,544 Speaker 4: I mean, just looking at the videos you've got guys 734 00:37:09,783 --> 00:37:13,544 Speaker 4: out with automatic weapons, fingers just over the trigger, and 735 00:37:13,544 --> 00:37:16,104 Speaker 4: they're basically, for all intentsive purposes, they are dressed like 736 00:37:16,183 --> 00:37:19,664 Speaker 4: militia or military guys, right and with their faces. 737 00:37:19,263 --> 00:37:23,544 Speaker 3: Covered camouflage in downtown DC, or they're spreading mulch. 738 00:37:25,304 --> 00:37:27,264 Speaker 2: Let's pause here and take a quick break. 739 00:37:37,344 --> 00:37:39,904 Speaker 4: But the insurrection, it seems like it's almost they're pushing 740 00:37:39,944 --> 00:37:42,064 Speaker 4: it to a foregone conclusion, and then when it happens, 741 00:37:42,064 --> 00:37:44,143 Speaker 4: we'll all go, well, we saw it in there anyway, 742 00:37:43,944 --> 00:37:47,464 Speaker 4: So Ada, why don't you run with this and explain 743 00:37:47,544 --> 00:37:49,984 Speaker 4: why I'm both right and brilliant and worrying about this. 744 00:37:50,304 --> 00:37:52,584 Speaker 2: Now you're right and brilliant and worry about this. And 745 00:37:52,623 --> 00:37:55,343 Speaker 2: I think part of the collapse of gatekeeping is that 746 00:37:55,864 --> 00:37:58,783 Speaker 2: we don't like maybe they weren't gatekeeping, maybe they were 747 00:37:58,824 --> 00:38:02,343 Speaker 2: more or we were more like normally, we just were 748 00:38:02,424 --> 00:38:06,783 Speaker 2: reinforcing widely shared norms, and when those norms disappeared, the 749 00:38:06,864 --> 00:38:10,104 Speaker 2: media as a whole doesn't know what to do. It's 750 00:38:10,223 --> 00:38:12,504 Speaker 2: it was more of a follower than a driver of 751 00:38:12,584 --> 00:38:14,864 Speaker 2: standards and morals. And I think a lot of journalists 752 00:38:14,864 --> 00:38:16,824 Speaker 2: would say, that's right, that's what we should do. Although 753 00:38:16,984 --> 00:38:19,584 Speaker 2: I think it's okay to be a journalist who's against 754 00:38:20,024 --> 00:38:21,863 Speaker 2: incorrectly using the Insurrection Act. 755 00:38:22,223 --> 00:38:25,984 Speaker 3: Reporters should fall all over Portland report on it. Is 756 00:38:25,984 --> 00:38:28,343 Speaker 3: there an insurrection here, Let's look at the facts, Let's 757 00:38:28,344 --> 00:38:31,584 Speaker 3: go down, let's interview people. Let's start with the thing 758 00:38:31,623 --> 00:38:34,223 Speaker 3: they're doing, rather than talk about how they might use 759 00:38:34,223 --> 00:38:37,343 Speaker 3: it politically and stay in Washington, go prove that there's 760 00:38:37,344 --> 00:38:39,223 Speaker 3: an insurrection or not. Yeah, it shouldn't be. 761 00:38:39,223 --> 00:38:41,664 Speaker 2: And I think there is. I'm just imagining that meeting. 762 00:38:41,743 --> 00:38:43,743 Speaker 2: And first of all, I'm sure the big places do 763 00:38:43,864 --> 00:38:46,703 Speaker 2: send somebody. There's also like a collapse of journalism, so 764 00:38:46,703 --> 00:38:49,303 Speaker 2: there's not as much money or ability. There's also like, 765 00:38:49,663 --> 00:38:53,024 Speaker 2: well everyone knows that, we all know that. That's not 766 00:38:53,104 --> 00:38:56,223 Speaker 2: the point. He's just saying it is. I mean, I 767 00:38:56,263 --> 00:38:58,383 Speaker 2: think it turns out Trump is just better at this. 768 00:38:58,904 --> 00:39:00,464 Speaker 2: He's better at messaging, like. 769 00:39:00,464 --> 00:39:02,784 Speaker 3: He knows, like he's willing to go places. 770 00:39:02,984 --> 00:39:04,904 Speaker 2: He's willing to go places. And I think, you know, 771 00:39:04,984 --> 00:39:07,423 Speaker 2: I was thinking the other day about my first big 772 00:39:07,424 --> 00:39:10,744 Speaker 2: investigative piece about Trump for The New York and about 773 00:39:10,824 --> 00:39:14,384 Speaker 2: how he knowingly participated in a money laundering scheme for 774 00:39:14,424 --> 00:39:18,343 Speaker 2: the Ironiant Guard. Still to me, feels like that should 775 00:39:18,384 --> 00:39:21,624 Speaker 2: be relevant. But I remember I called his general counsel 776 00:39:21,623 --> 00:39:24,143 Speaker 2: and I was like, you don't seem nervous, like I 777 00:39:24,143 --> 00:39:26,623 Speaker 2: feel like you should be nervous. And he acknowleds my 778 00:39:26,703 --> 00:39:29,703 Speaker 2: article is correct, like he wasn't making a claim that 779 00:39:29,783 --> 00:39:32,263 Speaker 2: it wasn't true. And this before it was published. But 780 00:39:32,384 --> 00:39:34,663 Speaker 2: he knew because we do fact checking. So we went 781 00:39:34,743 --> 00:39:37,183 Speaker 2: over every fact with him and he was like, ah, no, 782 00:39:37,344 --> 00:39:39,504 Speaker 2: I know it's going to happen. Rachel Mattow will make 783 00:39:39,544 --> 00:39:42,344 Speaker 2: it a big deal, Cinn will ignore it. A couple 784 00:39:42,304 --> 00:39:44,863 Speaker 2: of Democratic senators might write a letter, but nobody's going 785 00:39:44,904 --> 00:39:47,863 Speaker 2: to care. And like literally, Rachel Mattow did half an hour, 786 00:39:48,663 --> 00:39:49,503 Speaker 2: nobody else cared. 787 00:39:49,504 --> 00:39:51,504 Speaker 3: And so like you know, these are going to arrest 788 00:39:51,703 --> 00:39:54,663 Speaker 3: James Comy, right, And they wanted frog welcome and frog 789 00:39:54,703 --> 00:39:57,263 Speaker 3: march them over, they say, and or FBI guys get 790 00:39:57,304 --> 00:39:59,064 Speaker 3: fired because they didn't want a frog didn't figure out 791 00:39:59,064 --> 00:40:01,504 Speaker 3: the frog march him for the cameras. But what they 792 00:40:01,544 --> 00:40:04,623 Speaker 3: got called me on is like a small, one little thing. 793 00:40:04,663 --> 00:40:07,064 Speaker 3: He said in a testimony one time. It wasn't even 794 00:40:07,104 --> 00:40:10,343 Speaker 3: about Trump Russia thing. It was about Hillary Clinton. And 795 00:40:10,384 --> 00:40:13,504 Speaker 3: he's claiming oh he lied. Now as you read the thing, 796 00:40:13,544 --> 00:40:15,143 Speaker 3: I don't think he did, and I think he'll get 797 00:40:15,183 --> 00:40:18,384 Speaker 3: off very easily here. But really, so Trump's going after 798 00:40:18,743 --> 00:40:22,424 Speaker 3: someone who might have said one small little lie. The 799 00:40:22,464 --> 00:40:26,584 Speaker 3: guy who's been lying his entire career and every day 800 00:40:26,783 --> 00:40:28,904 Speaker 3: spits out hundreds of lies. This is what someone's going 801 00:40:28,944 --> 00:40:31,343 Speaker 3: to go prison for, and not just him. But like 802 00:40:31,384 --> 00:40:34,504 Speaker 3: he asked the question, did Tom Homan right? Did he 803 00:40:34,623 --> 00:40:37,864 Speaker 3: take fifty thousand dollars in a in a kava bag? 804 00:40:38,304 --> 00:40:40,664 Speaker 3: He didn't do anything illegal because we dropped the charges. 805 00:40:40,783 --> 00:40:43,383 Speaker 2: That's not the question. The question is did he take 806 00:40:43,424 --> 00:40:44,144 Speaker 2: fifty grand? 807 00:40:44,263 --> 00:40:47,663 Speaker 4: And what happened to the FBI's our taxpayer money that 808 00:40:47,743 --> 00:40:50,584 Speaker 4: when fifty grand? Is he paying taxes on it? Why 809 00:40:50,584 --> 00:40:52,783 Speaker 4: did he want it? It was like, I'm sorry, what 810 00:40:53,183 --> 00:40:56,584 Speaker 4: left or right? What the fuck? Somebody takes fifty grand 811 00:40:56,623 --> 00:40:59,223 Speaker 4: in a bag? And oh, yeah, he just did that, 812 00:40:59,263 --> 00:41:00,463 Speaker 4: but it's all on the if. 813 00:41:00,384 --> 00:41:02,783 Speaker 2: It was a valise, it was a collar bag. Yeah. 814 00:41:03,223 --> 00:41:05,424 Speaker 3: The true Trump people are like, what a clown fifty 815 00:41:05,424 --> 00:41:06,143 Speaker 3: thousand in a bag? 816 00:41:06,263 --> 00:41:07,104 Speaker 2: Shot a bitcoin? 817 00:41:07,424 --> 00:41:09,504 Speaker 3: It would be a bitcoin billion, yeah. 818 00:41:10,183 --> 00:41:12,183 Speaker 1: Plus the value of the bag and the food that 819 00:41:12,304 --> 00:41:15,904 Speaker 1: was in it. So it's more like fifty twenty dollars. 820 00:41:16,024 --> 00:41:17,783 Speaker 2: Yeah. I mean I've been thinking a lot about this 821 00:41:17,904 --> 00:41:21,143 Speaker 2: that I really devoted my life to a very simple 822 00:41:21,304 --> 00:41:26,143 Speaker 2: naive like when you say truth, it has a big effect. 823 00:41:26,183 --> 00:41:28,584 Speaker 3: And you got to find evidence to. 824 00:41:28,464 --> 00:41:31,663 Speaker 2: Prove it, and that matters that if you have a process, 825 00:41:31,783 --> 00:41:34,744 Speaker 2: that matters. And it took me a very long time. 826 00:41:35,263 --> 00:41:37,544 Speaker 2: I'm not saying I accept it like I'm happy about it. 827 00:41:37,743 --> 00:41:41,344 Speaker 2: I'm not happy about it, but I like, I just feel, Okay, 828 00:41:41,384 --> 00:41:44,464 Speaker 2: this is bigger than There's no New York Times headline 829 00:41:45,143 --> 00:41:48,343 Speaker 2: that's going to fix this. There's no like right sho 830 00:41:48,623 --> 00:41:50,823 Speaker 2: gate and at a press release that's going to fix this. 831 00:41:50,944 --> 00:41:52,903 Speaker 2: There's some other thing. And I think it does have 832 00:41:52,984 --> 00:41:55,303 Speaker 2: to do with all the things we talk about here, 833 00:41:55,344 --> 00:41:59,623 Speaker 2: how information flows, how people form opinions, how how those 834 00:41:59,743 --> 00:42:02,823 Speaker 2: opinions are reinforced. It also, I mean, we you know, 835 00:42:03,424 --> 00:42:06,304 Speaker 2: behind our talk about AI, certainly behind our talk about Trump, 836 00:42:06,504 --> 00:42:10,143 Speaker 2: but maybe behind every conspiracy theory is like power and 837 00:42:10,183 --> 00:42:13,263 Speaker 2: the truth that the truth. When this maybe makes us 838 00:42:13,263 --> 00:42:15,984 Speaker 2: a bunch of left wing intellectuals, because this is a 839 00:42:15,984 --> 00:42:18,703 Speaker 2: lot of the academic work of the last century is 840 00:42:18,743 --> 00:42:23,224 Speaker 2: like truth is not a thing it's a expression of power. 841 00:42:23,263 --> 00:42:25,343 Speaker 2: That doesn't mean it's not like I don't believe. I'm 842 00:42:25,344 --> 00:42:29,944 Speaker 2: not trying to make a like everything's equally true. I 843 00:42:29,944 --> 00:42:32,223 Speaker 2: don't think that's true. The phrase I've been using for 844 00:42:32,263 --> 00:42:35,143 Speaker 2: myself is you can't be one hundred percent right about anything, 845 00:42:35,223 --> 00:42:38,303 Speaker 2: but you can be one hundred percent wrong about things. 846 00:42:37,824 --> 00:42:40,943 Speaker 2: And we do know a whole bunch of people who 847 00:42:40,944 --> 00:42:43,864 Speaker 2: are one hundred percent wrong. But you could be eighty 848 00:42:43,904 --> 00:42:46,303 Speaker 2: percent true right. You could be like way more or 849 00:42:46,304 --> 00:42:48,703 Speaker 2: you could show like you've done your homework and you're 850 00:42:49,223 --> 00:42:52,783 Speaker 2: there's more evidence, there's more research. It could be true, 851 00:42:52,783 --> 00:42:54,544 Speaker 2: but it's still more complicated. You don't have it all. 852 00:42:54,584 --> 00:42:56,464 Speaker 4: I think the Middle East stuff is that you have 853 00:42:56,504 --> 00:42:59,544 Speaker 4: two two true narratives clashing. It just depends on how 854 00:42:59,544 --> 00:43:00,263 Speaker 4: you make those. 855 00:43:00,544 --> 00:43:03,544 Speaker 2: But I or two hundred or two thousand that narrative, 856 00:43:03,584 --> 00:43:04,944 Speaker 2: But I did so. 857 00:43:06,104 --> 00:43:08,384 Speaker 4: When John and I were in CIA and when you 858 00:43:08,424 --> 00:43:11,863 Speaker 4: were in place, oh you that we were. 859 00:43:12,464 --> 00:43:15,464 Speaker 3: We were big deals. We were the premier intelligence services. 860 00:43:15,944 --> 00:43:18,303 Speaker 4: We were like, you know, working for some like fly 861 00:43:18,384 --> 00:43:20,343 Speaker 4: by Night organization like that, you know, the New York 862 00:43:20,384 --> 00:43:25,183 Speaker 4: Times or MPR. But we looked at everything through the 863 00:43:25,223 --> 00:43:27,744 Speaker 4: optic of how it impacts us. And when we started 864 00:43:27,783 --> 00:43:31,703 Speaker 4: our careers, we assumed that Russia, the Soviet Union, and 865 00:43:31,743 --> 00:43:34,663 Speaker 4: the East Block that they were like way ahead of us, 866 00:43:34,783 --> 00:43:39,023 Speaker 4: right the missile gap, their technology spot nicks before our time, 867 00:43:39,464 --> 00:43:41,183 Speaker 4: and we found out that they were actually like way 868 00:43:41,183 --> 00:43:42,783 Speaker 4: more fun as fucked up as we were. 869 00:43:42,864 --> 00:43:44,343 Speaker 2: They were even way more fucked up. 870 00:43:44,384 --> 00:43:47,264 Speaker 4: So do you have a sense of China and Russia 871 00:43:47,304 --> 00:43:51,423 Speaker 4: and authoritarian governments they also are embracing this but they've 872 00:43:51,464 --> 00:43:53,904 Speaker 4: got to be fucking it up too. Are they fucking 873 00:43:53,944 --> 00:43:55,584 Speaker 4: it up worse or differently than us? 874 00:43:55,703 --> 00:43:59,024 Speaker 2: Or seems to be doing really well? Us a strength 875 00:43:59,304 --> 00:44:02,223 Speaker 2: that they've got seems to be doing really well, and 876 00:44:02,824 --> 00:44:06,064 Speaker 2: they're still catch up models like there. First of all, 877 00:44:06,104 --> 00:44:08,664 Speaker 2: we don't know like the Chinese military or the Chinese 878 00:44:08,703 --> 00:44:11,464 Speaker 2: Intelligence Service, were just the Chinese society. 879 00:44:11,584 --> 00:44:14,263 Speaker 4: In nineteen eighty nine, when you know, the East Block 880 00:44:14,304 --> 00:44:16,823 Speaker 4: fell apart, we realized how rotten it was, but we 881 00:44:16,824 --> 00:44:17,984 Speaker 4: didn't know that beforehand. 882 00:44:18,304 --> 00:44:21,743 Speaker 2: Yeah. Now, when Deep Seeks Big Model was revealed, I 883 00:44:21,783 --> 00:44:24,904 Speaker 2: think it was in December twenty twenty four, that was 884 00:44:24,944 --> 00:44:28,944 Speaker 2: a utterly transformative moment because for a bunch of reasons. 885 00:44:28,984 --> 00:44:31,584 Speaker 2: First of all, China had a model that in some 886 00:44:31,623 --> 00:44:34,263 Speaker 2: ways outperformed American models. I don't think it was, and 887 00:44:34,304 --> 00:44:36,504 Speaker 2: it's hard to even know what, Like all the benchmarks 888 00:44:36,504 --> 00:44:38,664 Speaker 2: are meaningless in a lot of ways, and the models 889 00:44:38,663 --> 00:44:42,104 Speaker 2: are trained on the benchmark, so they become but a 890 00:44:42,183 --> 00:44:45,424 Speaker 2: really good model, way better than anyone thought was going 891 00:44:45,504 --> 00:44:48,783 Speaker 2: to come out of a non US country. And then 892 00:44:49,143 --> 00:44:53,504 Speaker 2: the but also they did it way cheaper than the Americans, 893 00:44:53,623 --> 00:44:58,304 Speaker 2: like single digit millions instead of hundreds of millions into 894 00:44:58,344 --> 00:45:01,504 Speaker 2: the billions, and it should and they have steadily. They 895 00:45:01,544 --> 00:45:05,904 Speaker 2: have multiple major models, none of the top ones, but 896 00:45:06,223 --> 00:45:09,663 Speaker 2: pretty close behind. And I'd say the ones was to 897 00:45:09,743 --> 00:45:11,663 Speaker 2: be really blowing. It is Europe as they have been 898 00:45:11,703 --> 00:45:14,703 Speaker 2: on tech, like it's pretty hard to you know, name 899 00:45:14,743 --> 00:45:18,504 Speaker 2: your favorite high tech products that were invented in Europe, 900 00:45:18,544 --> 00:45:20,344 Speaker 2: Like it's there's not zero. 901 00:45:20,544 --> 00:45:24,024 Speaker 3: But if the goal is to provide better information to 902 00:45:24,024 --> 00:45:26,783 Speaker 3: make better decisions, as we saw for example in this 903 00:45:26,824 --> 00:45:28,944 Speaker 3: so of you need to stalin and author antams have 904 00:45:28,984 --> 00:45:30,944 Speaker 3: their own view and you can come to them with 905 00:45:31,024 --> 00:45:33,544 Speaker 3: the truth and if it doesn't fit with what they want, 906 00:45:33,904 --> 00:45:36,863 Speaker 3: it doesn't matter. So if g He's already got a 907 00:45:36,864 --> 00:45:39,344 Speaker 3: worldview like so he's doing a good job of letting 908 00:45:39,703 --> 00:45:41,544 Speaker 3: people come up with things and putting money where it 909 00:45:41,544 --> 00:45:44,303 Speaker 3: needs to go. And China and this big thing is 910 00:45:44,344 --> 00:45:47,064 Speaker 3: having a lot of success. But does that mean G 911 00:45:47,304 --> 00:45:49,504 Speaker 3: is better informed about the world and what? 912 00:45:49,743 --> 00:45:51,944 Speaker 2: Yeah, that's always an interesting thing. Like I had a 913 00:45:51,984 --> 00:45:55,944 Speaker 2: thought in Iraq. I remember thinking it's possible that Saddam 914 00:45:56,024 --> 00:45:58,183 Speaker 2: Hussein and George W. Bush are the two people on 915 00:45:58,223 --> 00:46:00,623 Speaker 2: earth who know the least about the ground but the 916 00:46:00,623 --> 00:46:04,264 Speaker 2: other side of what's happening, because they have this massive 917 00:46:04,623 --> 00:46:08,064 Speaker 2: apparatus to prevent them from actually know. And I'm not 918 00:46:08,104 --> 00:46:11,104 Speaker 2: a close expert on China at all, but from the 919 00:46:11,104 --> 00:46:13,663 Speaker 2: people I read, like Bill Bishop and stuff like it, 920 00:46:13,663 --> 00:46:17,784 Speaker 2: it does seem like they've moved. Like there was always 921 00:46:17,824 --> 00:46:22,104 Speaker 2: this storied bureaucracy that kind of existed as a force 922 00:46:22,544 --> 00:46:26,703 Speaker 2: independent of whoever happened to be the leader, and that 923 00:46:26,743 --> 00:46:29,223 Speaker 2: she has it really is to the glory of G 924 00:46:29,504 --> 00:46:32,703 Speaker 2: and that probably is a long term strategic weakness, except 925 00:46:33,183 --> 00:46:35,463 Speaker 2: we're doing that. I just had to talk with someone 926 00:46:35,464 --> 00:46:38,544 Speaker 2: in Canada today about visas. In Canada, they just cut 927 00:46:38,584 --> 00:46:40,303 Speaker 2: down on visas. And this is a friend who's an 928 00:46:40,344 --> 00:46:42,984 Speaker 2: economists in Canada, and I was like, shouldn't this be like, 929 00:46:43,384 --> 00:46:46,303 Speaker 2: shouldn't you just be getting every And they specifically cut 930 00:46:46,344 --> 00:46:50,784 Speaker 2: down on visas for smart students going to university why, 931 00:46:51,024 --> 00:46:54,384 Speaker 2: which is insane, And they have their own internal reasons. 932 00:46:54,464 --> 00:46:57,584 Speaker 2: There's fears of job displacement. There's there apparently were a 933 00:46:57,584 --> 00:46:59,904 Speaker 2: bunch of like diploma mills because they had fairly lax. 934 00:47:00,384 --> 00:47:03,663 Speaker 2: But it's also they don't want it too publicly. Like 935 00:47:04,104 --> 00:47:05,944 Speaker 2: this friend of mine was like, I think we could, 936 00:47:05,944 --> 00:47:08,383 Speaker 2: like for one hundred billion dollars, we just could just 937 00:47:08,623 --> 00:47:12,064 Speaker 2: grab an entire field of study, Like we could just 938 00:47:12,424 --> 00:47:17,504 Speaker 2: get every neuroscientist or every expert in battery technology or whatever. 939 00:47:18,104 --> 00:47:24,024 Speaker 2: But there's fear of pissing off Trump. There's there's internal issues. 940 00:47:23,743 --> 00:47:25,743 Speaker 4: So g you know, when they're building their model and 941 00:47:25,824 --> 00:47:28,263 Speaker 4: not for medicine or science and things like that, but 942 00:47:28,663 --> 00:47:31,143 Speaker 4: as he tries to understand his country. Basically everybody in 943 00:47:31,223 --> 00:47:33,504 Speaker 4: China lies, right, They don't tell the truth. 944 00:47:33,544 --> 00:47:35,464 Speaker 2: It's not like you're going to say in a form 945 00:47:35,544 --> 00:47:36,024 Speaker 2: or anything. 946 00:47:36,143 --> 00:47:38,463 Speaker 4: Everybody is like I love the government because they all 947 00:47:38,504 --> 00:47:41,223 Speaker 4: know if you don't fucking do that, you're screwed. And 948 00:47:41,263 --> 00:47:43,424 Speaker 4: so I think the AA model is taking it in 949 00:47:43,504 --> 00:47:45,984 Speaker 4: the same in Russia. So I their models have got 950 00:47:46,024 --> 00:47:49,544 Speaker 4: to be much more skewed and optimistic and positive towards 951 00:47:49,584 --> 00:47:52,943 Speaker 4: their leaderships, right, and and the the data that they 952 00:47:53,064 --> 00:47:55,903 Speaker 4: that that's that they build on has God, it has 953 00:47:55,984 --> 00:47:56,784 Speaker 4: to be skewed. 954 00:47:57,143 --> 00:48:00,704 Speaker 2: Just although most of the big models now are trained 955 00:48:00,743 --> 00:48:05,104 Speaker 2: on essentially everything in the world that's been digital like 956 00:48:05,384 --> 00:48:08,504 Speaker 2: every because you couldn't it's like TikTok, right, our our 957 00:48:09,064 --> 00:48:11,783 Speaker 2: data would go into the China model, right, yeah, but 958 00:48:11,904 --> 00:48:14,263 Speaker 2: also all of read it and every academic paper ever 959 00:48:14,344 --> 00:48:17,303 Speaker 2: and every book that's ever been digitized, and because you 960 00:48:17,944 --> 00:48:20,943 Speaker 2: you just need more and more data. And I don't 961 00:48:21,064 --> 00:48:23,663 Speaker 2: think you could create a cutting edge model just on 962 00:48:23,783 --> 00:48:26,903 Speaker 2: Chinese data like they're just I would guess that would 963 00:48:26,904 --> 00:48:30,064 Speaker 2: be my strong guess. So you'd need all the models 964 00:48:30,263 --> 00:48:34,263 Speaker 2: have all the data. Basically now there is creating synthetic data. 965 00:48:34,464 --> 00:48:36,983 Speaker 2: I would guess that if we see this with Elon Musk, 966 00:48:36,984 --> 00:48:40,224 Speaker 2: because he'll tweak his algorithm at Rock and it suddenly 967 00:48:40,304 --> 00:48:42,743 Speaker 2: is like spouting Nazi stuff. And it's not great that 968 00:48:42,783 --> 00:48:45,783 Speaker 2: it's spouting Nazi stuff from a like I'm against Nazis standpoint, 969 00:48:46,263 --> 00:48:50,344 Speaker 2: but it's also a sign that ham handed like top 970 00:48:50,464 --> 00:48:53,703 Speaker 2: down impositions on the model makes the model do really 971 00:48:53,703 --> 00:48:56,504 Speaker 2: weird things. It's not a good way, So the models 972 00:48:56,544 --> 00:49:01,104 Speaker 2: aren't quite as controllable as other things might be inherently. 973 00:49:01,464 --> 00:49:05,223 Speaker 2: But don't you I wouldn't you assume every government a 974 00:49:05,304 --> 00:49:07,183 Speaker 2: really good model you could get for less than a 975 00:49:07,183 --> 00:49:07,984 Speaker 2: billion dollars? 976 00:49:08,743 --> 00:49:10,944 Speaker 3: But those governed governments don't want the people to be 977 00:49:11,024 --> 00:49:12,504 Speaker 3: able to have access to all the But. 978 00:49:12,504 --> 00:49:14,143 Speaker 2: Don't they think internally? Wouldn't they? 979 00:49:14,504 --> 00:49:17,504 Speaker 4: But I think we're okay because the Buerau of labor statistics, 980 00:49:17,544 --> 00:49:19,984 Speaker 4: it's not like they're going to fire the person for 981 00:49:20,104 --> 00:49:22,743 Speaker 4: putting out the rule statistics that the government doesn't like. 982 00:49:23,183 --> 00:49:25,263 Speaker 2: But from a like how many countries on Earth could 983 00:49:25,424 --> 00:49:29,064 Speaker 2: the governments could blow a billion dollars like on most 984 00:49:29,104 --> 00:49:32,903 Speaker 2: of them, right, yeah, big ones? And like how would 985 00:49:32,904 --> 00:49:35,024 Speaker 2: you know? Am I wrong? Like from a national security 986 00:49:35,024 --> 00:49:37,544 Speaker 2: because wouldn't you want your own? You wouldn't want like 987 00:49:37,584 --> 00:49:41,663 Speaker 2: we saw what happened to Ukraine using starlink and being 988 00:49:41,703 --> 00:49:44,904 Speaker 2: dependent on elon Musk. You wouldn't want to be dependent 989 00:49:44,904 --> 00:49:47,383 Speaker 2: on sam Altman or Google or any of the And 990 00:49:47,424 --> 00:49:50,263 Speaker 2: you saw how all the big tech companies meant the 991 00:49:50,304 --> 00:49:52,823 Speaker 2: need to Trump. And so if you're an adversary of 992 00:49:52,864 --> 00:49:55,183 Speaker 2: the US, or even if you're like Israel, like an 993 00:49:55,183 --> 00:49:58,183 Speaker 2: ally of the US, but with your own independent desires, 994 00:49:58,944 --> 00:50:02,024 Speaker 2: don't you think they're all building That's my assumption, they're 995 00:50:02,064 --> 00:50:05,544 Speaker 2: all building their own models and that intelligence services will 996 00:50:05,584 --> 00:50:08,504 Speaker 2: just have access, although are they. I'd be curious, like 997 00:50:08,544 --> 00:50:11,024 Speaker 2: what is to a buddy I know who's in the FBI, 998 00:50:11,424 --> 00:50:13,663 Speaker 2: And he was like, we have I just use commercial 999 00:50:13,703 --> 00:50:16,064 Speaker 2: tools because the FBI tools are so lame and they're 1000 00:50:16,104 --> 00:50:17,504 Speaker 2: so behind the times. 1001 00:50:17,824 --> 00:50:20,743 Speaker 4: Like did you guys we couldn't even use Google right 1002 00:50:20,783 --> 00:50:23,623 Speaker 4: on our computers because we couldn't mix outside and. 1003 00:50:23,623 --> 00:50:25,343 Speaker 2: Internal really occasions. 1004 00:50:25,424 --> 00:50:28,183 Speaker 4: Yeah, so eventually we figured it out, but yeah, we 1005 00:50:28,223 --> 00:50:30,504 Speaker 4: had to switch between it, but yeah, we couldn't. Seventeen 1006 00:50:30,584 --> 00:50:33,584 Speaker 4: year old kids sitting in his basement had more access 1007 00:50:33,623 --> 00:50:36,743 Speaker 4: to information than we did. We had access to different information. 1008 00:50:36,944 --> 00:50:40,184 Speaker 2: And yet you created the crack epidemic. That's impressive without 1009 00:50:40,223 --> 00:50:41,223 Speaker 2: any technology. 1010 00:50:41,944 --> 00:50:42,263 Speaker 3: Thank you. 1011 00:50:42,984 --> 00:50:45,784 Speaker 2: Well, it's the alien technology that we have reverse engineered. 1012 00:50:46,024 --> 00:50:47,943 Speaker 1: Now, So I'm gonna I'm gonna go ahead and thank 1013 00:50:48,024 --> 00:50:50,104 Speaker 1: us all for getting a chance to get together again 1014 00:50:50,424 --> 00:50:53,584 Speaker 1: and to promise our listeners, we have a lot of 1015 00:50:53,703 --> 00:50:57,944 Speaker 1: great episodes we have now recorded and are recording, and 1016 00:50:57,984 --> 00:51:00,704 Speaker 1: we'll be coming out in swinging weeks. And I promise 1017 00:51:00,783 --> 00:51:03,263 Speaker 1: I'll get this YouTube channel up and running where you 1018 00:51:03,304 --> 00:51:03,983 Speaker 1: can see video. 1019 00:51:04,304 --> 00:51:07,104 Speaker 2: This is going to be a very interesting fall and 1020 00:51:07,143 --> 00:51:09,064 Speaker 2: I can't wait to talk to you guys next October. 1021 00:51:09,223 --> 00:51:17,663 Speaker 1: Yeah, Adam, come back anytime really. Mission Implausible is produced 1022 00:51:17,663 --> 00:51:22,664 Speaker 1: by Adam Davidson, Jerry O'shay, John Cipher, and Jonathan Stern. 1023 00:51:22,824 --> 00:51:25,184 Speaker 2: The associate producer is Rachel Harner. 1024 00:51:25,464 --> 00:51:29,263 Speaker 1: Mission Implausible is a production of honorable mention and abominable 1025 00:51:29,304 --> 00:51:31,263 Speaker 1: pictures for iHeart Podcasts.