1 00:00:01,840 --> 00:00:06,160 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George 2 00:00:06,200 --> 00:00:10,720 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. Armstrong 3 00:00:10,800 --> 00:00:12,480 Speaker 1: and Getty. And now he... 4 00:00:15,400 --> 00:00:26,480 Speaker 2: Armstrong and Getty, supported by Xi himself, who said, quote, 5 00:00:26,600 --> 00:00:29,920 Speaker 2: ice and snow are as valuable as gold and silver. 6 00:00:30,280 --> 00:00:34,240 Speaker 2: So Beijing is turning this into an economic driver, trying 7 00:00:34,240 --> 00:00:38,360 Speaker 2: to boost domestic spending in an otherwise slowing economy. 8 00:00:38,640 --> 00:00:41,880 Speaker 3: According to a government report, in twenty sixteen, consumers spent 9 00:00:41,920 --> 00:00:44,879 Speaker 3: about fifty five billion US dollars on winter tourism and 10 00:00:44,920 --> 00:00:48,639 Speaker 3: sports across China. The number has now more than doubled 11 00:00:48,840 --> 00:00:51,720 Speaker 3: in about a decade to one hundred and forty billion dollars. 12 00:00:52,280 --> 00:00:57,040 Speaker 1: Well, so China's trying to present themselves as a good 13 00:00:57,040 --> 00:00:59,280 Speaker 1: place to go if you like winter sports, get the 14 00:00:59,360 --> 00:01:01,800 Speaker 1: tourism up, make some money, because they're in dire straits 15 00:01:01,840 --> 00:01:03,520 Speaker 1: there in China. Aided in that 16 00:01:03,520 --> 00:01:07,240 Speaker 4: effort by America's own, what's her name, Gu. 17 00:01:07,200 --> 00:01:14,360 Speaker 1: Yeah, Gu, yeah, the model, skier, Communist Chinese tool. I 18 00:01:14,480 --> 00:01:17,080 Speaker 1: keep my eye out for everything AI, as you know, 19 00:01:17,160 --> 00:01:20,280 Speaker 1: if you listen to the show, books, podcasts, articles, anything 20 00:01:20,319 --> 00:01:23,920 Speaker 1: like that. I saw this yesterday. I saw the headline, 21 00:01:23,920 --> 00:01:25,800 Speaker 1: thought, man, I need to read that. And our friend 22 00:01:26,040 --> 00:01:29,880 Speaker 1: Craig Gottwals, the healthcare expert, sent it to us privately, 23 00:01:30,360 --> 00:01:33,080 Speaker 1: said this is really good. So the title of it 24 00:01:33,120 --> 00:01:35,800 Speaker 1: is Something Big Is Happening. This Matt Schumer wrote it, 25 00:01:35,959 --> 00:01:38,480 Speaker 1: and it's long, so I'll skip the intro. But it 26 00:01:38,560 --> 00:01:44,960 Speaker 1: was basically, it's about this time six years ago when 27 00:01:45,480 --> 00:01:48,120 Speaker 1: he heard a little bit about this virus. And I 28 00:01:48,160 --> 00:01:50,559 Speaker 1: remember when they closed the school and we're all standing 29 00:01:50,560 --> 00:01:53,160 Speaker 1: around the playground saying, this is crazy, isn't it. And 30 00:01:53,960 --> 00:01:55,680 Speaker 1: nobody was wearing a mask or anything like that, and 31 00:01:55,680 --> 00:01:58,600 Speaker 1: everybody had their jobs, and, oh, people are overreacting. 32 00:01:58,040 --> 00:01:58,480 Speaker 5: Ha ha ha. 33 00:01:59,240 --> 00:02:01,240 Speaker 1: But you had kind of this 'this is just weird' feeling 34 00:02:01,280 --> 00:02:03,080 Speaker 1: of, this is weird, this has never happened before. 35 00:02:03,240 --> 00:02:05,400 Speaker 4: And then within a couple of weeks it wasn't what it seemed 36 00:02:05,440 --> 00:02:05,680 Speaker 4: to be.
37 00:02:06,000 --> 00:02:08,679 Speaker 1: Then within a couple of weeks, the world was more 38 00:02:08,760 --> 00:02:10,400 Speaker 1: upside down than it had ever been in any of 39 00:02:10,440 --> 00:02:12,720 Speaker 1: our lives. And it just happened so fast and it 40 00:02:12,760 --> 00:02:14,600 Speaker 1: was so huge and it was amazing. And this guy, 41 00:02:14,639 --> 00:02:17,000 Speaker 1: Matt Schumer, is saying this is where we are with 42 00:02:17,040 --> 00:02:20,840 Speaker 1: the whole AI situation. It's about to explode and people 43 00:02:20,880 --> 00:02:24,680 Speaker 1: aren't ready for it. He lives in that world. AI 44 00:02:24,760 --> 00:02:26,600 Speaker 1: guy, been building a startup, you'll hear about it 45 00:02:26,600 --> 00:02:29,000 Speaker 1: in the article, for years. And he said he 46 00:02:29,080 --> 00:02:33,960 Speaker 1: wrote this for family and friends he realizes aren't following 47 00:02:34,000 --> 00:02:37,080 Speaker 1: this closely, just to alert them. Hey, as the title says, 48 00:02:37,360 --> 00:02:40,320 Speaker 1: something big is really happening, and I'll skip down to 49 00:02:40,919 --> 00:02:42,840 Speaker 1: a little bit in this. I've spent six years building 50 00:02:42,840 --> 00:02:45,360 Speaker 1: an AI startup, investing in this space. I live in 51 00:02:45,400 --> 00:02:47,480 Speaker 1: this world, and I'm writing this for the people in 52 00:02:47,480 --> 00:02:50,160 Speaker 1: my life who don't, my family, my friends, the people 53 00:02:50,200 --> 00:02:51,840 Speaker 1: I care about who keep asking me, so what's the 54 00:02:51,840 --> 00:02:54,400 Speaker 1: deal with AI, and getting an answer that doesn't do 55 00:02:54,600 --> 00:02:58,119 Speaker 1: justice to what's actually happening. I keep giving 56 00:02:58,200 --> 00:03:00,880 Speaker 1: them the polite version, the cocktail party version, because the 57 00:03:00,880 --> 00:03:03,680 Speaker 1: honest version sounds like I've lost my mind. And for 58 00:03:03,720 --> 00:03:05,520 Speaker 1: a while I told myself that was a good enough 59 00:03:05,560 --> 00:03:08,160 Speaker 1: reason to keep what's truly happening to myself. But the 60 00:03:08,200 --> 00:03:10,360 Speaker 1: gap between what I've been saying and what is actually 61 00:03:10,400 --> 00:03:12,800 Speaker 1: happening has gotten far too big. The people I care 62 00:03:12,840 --> 00:03:15,000 Speaker 1: about deserve to hear what is coming, even if it 63 00:03:15,080 --> 00:03:19,480 Speaker 1: sounds crazy. Wow. Uh, he says, I should be clear 64 00:03:19,480 --> 00:03:21,440 Speaker 1: about something up front. Even though I work in AI, I 65 00:03:21,600 --> 00:03:24,320 Speaker 1: have almost no influence over what's about to happen, and 66 00:03:24,400 --> 00:03:26,760 Speaker 1: neither do the vast majority of people in the industry. 67 00:03:27,160 --> 00:03:30,240 Speaker 1: The future is being shaped by an unbelievably small number 68 00:03:30,280 --> 00:03:33,440 Speaker 1: of people. A few hundred researchers at a handful of 69 00:03:33,480 --> 00:03:37,680 Speaker 1: companies, OpenAI, Anthropic, Google DeepMind, 70 00:03:37,720 --> 00:03:41,120 Speaker 1: a few others. A single training run managed by a 71 00:03:41,240 --> 00:03:43,360 Speaker 1: very small team over a few months can produce 72 00:03:43,400 --> 00:03:47,920 Speaker 1: an AI system that shifts the entire trajectory of the technology.
73 00:03:48,280 --> 00:03:50,200 Speaker 1: Most of us who work in AI are building on 74 00:03:50,200 --> 00:03:53,520 Speaker 1: top of foundations we didn't lay. We're watching this unfold 75 00:03:53,520 --> 00:03:55,600 Speaker 1: the same as you. We just happened to be close 76 00:03:55,720 --> 00:03:59,160 Speaker 1: enough to feel the ground shaking first. The time is now. 77 00:03:59,680 --> 00:04:02,960 Speaker 1: Not in an 'eventually we should talk about this' way. In a 78 00:04:03,280 --> 00:04:04,960 Speaker 1: 'this is happening right now, and I need you to 79 00:04:05,040 --> 00:04:07,840 Speaker 1: understand it' way. And this is the part I really liked. 80 00:04:07,840 --> 00:04:11,240 Speaker 1: It gets you closer to understanding it. I know this is 81 00:04:11,280 --> 00:04:13,160 Speaker 1: real because it happened to me first. Here's the thing 82 00:04:13,200 --> 00:04:16,719 Speaker 1: nobody outside of tech quite understands yet. The reason so 83 00:04:16,800 --> 00:04:18,960 Speaker 1: many people in the industry are sounding the alarm right 84 00:04:19,000 --> 00:04:22,039 Speaker 1: now is because this already happened to us. We're not 85 00:04:22,160 --> 00:04:24,880 Speaker 1: making predictions. We're telling you what already occurred in our 86 00:04:24,920 --> 00:04:28,320 Speaker 1: own jobs and warning you that you're next. For years, 87 00:04:28,360 --> 00:04:31,440 Speaker 1: AI has been improving steadily, big jumps here and there, 88 00:04:31,800 --> 00:04:33,720 Speaker 1: but each big jump was spaced out enough that you 89 00:04:33,760 --> 00:04:36,279 Speaker 1: could absorb them as they came. Then, in twenty twenty 90 00:04:36,320 --> 00:04:40,440 Speaker 1: five, last year, new techniques for building these models unlocked 91 00:04:40,480 --> 00:04:43,520 Speaker 1: a much faster pace of progress. This reminds me of 92 00:04:43,520 --> 00:04:47,279 Speaker 1: what Elon was talking about the other day. Like, at 93 00:04:47,279 --> 00:04:49,520 Speaker 1: the top of the... we were at the top of 94 00:04:49,640 --> 00:04:52,800 Speaker 1: the roller coaster last year, and now we're in the 95 00:04:53,000 --> 00:04:56,680 Speaker 1: plunging downward at incredible speed part. And that's what this 96 00:04:56,720 --> 00:04:59,680 Speaker 1: guy is saying. And I may vomit. Yes. And then 97 00:04:59,680 --> 00:05:02,040 Speaker 1: it got even faster, and then faster again. Each new 98 00:05:02,080 --> 00:05:04,240 Speaker 1: model wasn't just better than the last, it was better 99 00:05:04,240 --> 00:05:06,960 Speaker 1: by a wider margin, and the time between new model 100 00:05:07,000 --> 00:05:10,080 Speaker 1: releases was shorter. I was using AI more and more, 101 00:05:10,160 --> 00:05:12,320 Speaker 1: going back and forth with it less and less, watching it 102 00:05:12,360 --> 00:05:15,600 Speaker 1: handle things I used to think required my expertise. Then, 103 00:05:15,640 --> 00:05:19,840 Speaker 1: on February fifth, which is just a few days ago, 104 00:05:20,680 --> 00:05:24,159 Speaker 1: two major AI labs released new models on the same day, 105 00:05:24,279 --> 00:05:28,040 Speaker 1: GPT five point three Codex from OpenAI and Opus 106 00:05:28,040 --> 00:05:31,240 Speaker 1: four point six from Anthropic. That's the Claude people, the 107 00:05:31,279 --> 00:05:35,120 Speaker 1: main competitors to ChatGPT.
And something clicked, not like 108 00:05:35,160 --> 00:05:37,279 Speaker 1: a light switch, more like the moment you realize the 109 00:05:37,320 --> 00:05:39,360 Speaker 1: water has been rising around you and is now at 110 00:05:39,360 --> 00:05:42,720 Speaker 1: your chest. Jesus. This frightens the hell out of me 111 00:05:42,760 --> 00:05:44,960 Speaker 1: because I don't completely understand it. But I believe this 112 00:05:45,040 --> 00:05:49,280 Speaker 1: guy because he doesn't seem to be presenting an agenda 113 00:05:49,560 --> 00:05:52,000 Speaker 1: in a, like, good, get out there and invest 114 00:05:52,120 --> 00:05:53,960 Speaker 1: sort of way, and more of a just be ready, 115 00:05:54,080 --> 00:05:55,440 Speaker 1: things are about to get crazy. 116 00:05:56,040 --> 00:05:59,480 Speaker 4: Yeah, I'm not sensing the conflict of interest that I 117 00:05:59,560 --> 00:06:01,600 Speaker 4: usually do, that these people are trying to get us 118 00:06:01,600 --> 00:06:04,119 Speaker 4: excited because they need to raise hundreds of billions of dollars. 119 00:06:04,200 --> 00:06:06,520 Speaker 4: But back to the terrifying article. 120 00:06:06,839 --> 00:06:09,720 Speaker 1: I'm no longer needed for the actual technical work of 121 00:06:09,800 --> 00:06:12,880 Speaker 1: my job. I describe what I want built in plain 122 00:06:12,960 --> 00:06:16,320 Speaker 1: English and it just appears. Not a rough draft I 123 00:06:16,320 --> 00:06:19,320 Speaker 1: need to fix. The finished thing. I tell the AI 124 00:06:19,440 --> 00:06:21,839 Speaker 1: what I want, walk away from my computer for four hours, 125 00:06:21,839 --> 00:06:25,040 Speaker 1: and come back to find the work done. Done well, done 126 00:06:25,040 --> 00:06:26,839 Speaker 1: better than I would have done it myself, with no 127 00:06:26,960 --> 00:06:29,400 Speaker 1: corrections needed. A couple of months ago, I was going 128 00:06:29,440 --> 00:06:32,080 Speaker 1: back and forth with AI, guiding it, making edits. Now 129 00:06:32,120 --> 00:06:34,080 Speaker 1: I just describe the outcome and leave, and when I 130 00:06:34,080 --> 00:06:37,799 Speaker 1: come back it's perfect. That's just in the last couple 131 00:06:37,800 --> 00:06:41,480 Speaker 1: of months. That's what I've been saying, that 132 00:06:41,680 --> 00:06:44,360 Speaker 1: we're all using these chatbots and feeling like we're engaging 133 00:06:44,520 --> 00:06:48,279 Speaker 1: in AI, but the chatbots aren't AI. What AI is 134 00:06:48,279 --> 00:06:50,640 Speaker 1: is going on in this guy's world, and, you know, 135 00:06:50,680 --> 00:06:53,520 Speaker 1: out there in the region, and... 136 00:06:52,960 --> 00:06:56,440 Speaker 4: We don't, like, we don't see it. Right, right. We're 137 00:06:56,560 --> 00:06:59,680 Speaker 4: dealing with supercharged, way, way better search engines. 138 00:07:00,160 --> 00:07:03,320 Speaker 1: Essentially. Let me give you an example so you can 139 00:07:03,400 --> 00:07:06,880 Speaker 1: understand what this actually looks like in practice. I tell 140 00:07:06,880 --> 00:07:09,360 Speaker 1: the AI, I want to build this app. Here's what 141 00:07:09,400 --> 00:07:11,400 Speaker 1: it should do. Here's roughly what it should look like. 142 00:07:11,880 --> 00:07:14,080 Speaker 1: Figure out the user flow, the design, all of it. 143 00:07:14,600 --> 00:07:17,760 Speaker 1: And it does. It writes tens of thousands of lines 144 00:07:17,800 --> 00:07:20,120 Speaker 1: of code.
Then this is the part that would have 145 00:07:20,120 --> 00:07:23,280 Speaker 1: been unthinkable a year ago. It opens the app itself, 146 00:07:23,800 --> 00:07:26,760 Speaker 1: it clicks through the buttons, it tests the features, it 147 00:07:26,880 --> 00:07:29,600 Speaker 1: uses the app the way a person would. If it 148 00:07:29,640 --> 00:07:31,680 Speaker 1: doesn't like how something looks or feels, it goes back 149 00:07:31,720 --> 00:07:34,480 Speaker 1: and changes it on its own. It iterates like a 150 00:07:34,520 --> 00:07:38,040 Speaker 1: developer would, fixing and refining until it's satisfied. Only once 151 00:07:38,080 --> 00:07:40,160 Speaker 1: it has decided the app meets its own standards does 152 00:07:40,200 --> 00:07:41,840 Speaker 1: it come back to me and say it's ready for 153 00:07:41,880 --> 00:07:43,480 Speaker 1: you to test. And then when I test it, it's 154 00:07:43,560 --> 00:07:47,160 Speaker 1: usually perfect. I'm not exaggerating. This is what my Monday 155 00:07:47,200 --> 00:07:51,480 Speaker 1: looked like this week. Oh my god. 156 00:07:52,120 --> 00:07:55,520 Speaker 4: Yeah, yeah, my mind is blown. I'm just casting my 157 00:07:55,640 --> 00:07:58,920 Speaker 4: mind back through history to the various inventions that have 158 00:07:59,280 --> 00:08:02,680 Speaker 4: been predicted to, you know, end the need for humans, 159 00:08:02,720 --> 00:08:06,240 Speaker 4: blah blah blah. But they were almost entirely devices that 160 00:08:08,720 --> 00:08:14,240 Speaker 4: supplanted our physical efforts. They were machines that freed up 161 00:08:14,280 --> 00:08:17,640 Speaker 4: our bodies to do something else, not that completely replaced 162 00:08:17,640 --> 00:08:18,520 Speaker 4: our brains. 163 00:08:18,520 --> 00:08:20,640 Speaker 1: Well, and then the pace that it happened, it was like 164 00:08:20,720 --> 00:08:24,680 Speaker 1: an elephant lumbering across the plains in your direction, you know, 165 00:08:24,720 --> 00:08:25,800 Speaker 1: you could see it coming. 166 00:08:25,920 --> 00:08:29,280 Speaker 4: And Eli Whitney invents the cotton gin, finally gets a 167 00:08:29,280 --> 00:08:32,360 Speaker 4: farmer to try it. Then it's two farmers. Eight years later, 168 00:08:32,520 --> 00:08:33,680 Speaker 4: farmers all over America. 169 00:08:33,720 --> 00:08:37,000 Speaker 1: Blah blah blah. Right, yeah. Matt Schumer goes on to write, 170 00:08:37,000 --> 00:08:39,160 Speaker 1: I've always been early to adopt AI tools. But the 171 00:08:39,240 --> 00:08:42,360 Speaker 1: last few months have shocked me. The last few months. See, 172 00:08:42,400 --> 00:08:45,280 Speaker 1: that's the thing, you know, I've been reading these books 173 00:08:45,320 --> 00:08:47,640 Speaker 1: and podcasts, stuff like that. Everything changed, like, in the 174 00:08:47,679 --> 00:08:51,160 Speaker 1: last couple of days with the introduction of those 175 00:08:51,200 --> 00:08:52,320 Speaker 1: new features. 176 00:08:52,760 --> 00:08:55,000 Speaker 4: What's making people insane, as I've said many times, 177 00:08:55,120 --> 00:08:56,960 Speaker 4: is not necessarily the amount of change, but the pace 178 00:08:57,000 --> 00:08:58,959 Speaker 4: of it. You just can't possibly adapt to it. 179 00:08:59,080 --> 00:09:01,800 Speaker 1: The last few months have shocked me. These new AI 180 00:09:01,840 --> 00:09:09,160 Speaker 1: models aren't incremental improvements. This is a different thing entirely. Wow.
181 00:09:11,360 --> 00:09:14,240 Speaker 1: The AI labs made a deliberate choice. They focused on 182 00:09:14,280 --> 00:09:17,280 Speaker 1: making AI great at writing code first, because building AI 183 00:09:17,400 --> 00:09:19,920 Speaker 1: requires a lot of code. If AI can write code, 184 00:09:19,960 --> 00:09:22,160 Speaker 1: it can help build the next version of itself, a 185 00:09:22,160 --> 00:09:24,720 Speaker 1: smarter version, which writes better code, which builds an even 186 00:09:24,720 --> 00:09:28,720 Speaker 1: smarter version. Making AI great at coding was the strategy 187 00:09:28,720 --> 00:09:31,080 Speaker 1: that unlocks everything else. That's why they did it first. 188 00:09:31,120 --> 00:09:33,839 Speaker 1: My job started changing before yours. Not because they were 189 00:09:34,040 --> 00:09:36,679 Speaker 1: targeting software engineers. It was just a side effect of 190 00:09:36,720 --> 00:09:39,120 Speaker 1: where they chose to aim first. Now they've done it, 191 00:09:39,200 --> 00:09:42,959 Speaker 1: and now they're moving on to everything else. This is 192 00:09:42,960 --> 00:09:46,000 Speaker 1: a very long article. It goes on to more details 193 00:09:46,040 --> 00:09:49,319 Speaker 1: of different sectors of the economy that it's going to 194 00:09:49,360 --> 00:09:51,920 Speaker 1: start jumping into very quickly. 195 00:09:52,280 --> 00:09:54,959 Speaker 4: Well, any surprises in that list? Or is it kind 196 00:09:54,960 --> 00:09:57,079 Speaker 4: of the usual suspects, accounting, law? 197 00:09:57,840 --> 00:10:04,200 Speaker 1: Well, I'd say the surprise is it encompasses, like, everything. Oh, 198 00:10:04,440 --> 00:10:06,439 Speaker 1: we'll post this on Twitter and at the website and 199 00:10:06,440 --> 00:10:08,080 Speaker 1: you can read the whole thing if you want. But 200 00:10:08,559 --> 00:10:10,280 Speaker 1: the fact that there's a guy that works in the 201 00:10:10,320 --> 00:10:13,319 Speaker 1: industry, that already knew it was going to be a 202 00:10:13,360 --> 00:10:15,720 Speaker 1: big deal and it was happening fast, that just in 203 00:10:15,800 --> 00:10:18,640 Speaker 1: the last, like, couple of days and a couple of 204 00:10:18,679 --> 00:10:21,520 Speaker 1: weeks and a couple of months has gone from, you know, wow, 205 00:10:21,559 --> 00:10:23,720 Speaker 1: this is really a big deal happening fast, to holy crap, 206 00:10:23,800 --> 00:10:28,040 Speaker 1: it's happened. Yeah. I, you know, as you always say, 207 00:10:28,040 --> 00:10:31,240 Speaker 1: and this is one hundred percent true, what am I 208 00:10:31,240 --> 00:10:33,720 Speaker 1: supposed to do about it? I mean, I got no, 209 00:10:33,960 --> 00:10:36,760 Speaker 1: like, here's-your-reaction plan. But it couldn't hurt to be 210 00:10:36,800 --> 00:10:37,840 Speaker 1: aware of this, could it? 211 00:10:38,679 --> 00:10:41,720 Speaker 4: Well, Jack, AI will never pet your dog or throw 212 00:10:41,760 --> 00:10:44,480 Speaker 4: a bone for it, or a tennis ball... actually it probably 213 00:10:44,520 --> 00:10:47,400 Speaker 4: will, with a robot. But anyway, Rough Greens is a 214 00:10:47,440 --> 00:10:50,640 Speaker 4: great opportunity to improve your dog's health and its nutrition. 215 00:10:51,000 --> 00:10:55,000 Speaker 4: Don't change your dog's food, just add Rough Greens.
It provides live, 216 00:10:55,120 --> 00:11:00,760 Speaker 4: bioavailable nutrients including essential vitamins, minerals, probiotics, digestive enzymes, omega 217 00:11:00,800 --> 00:11:05,200 Speaker 4: oils that all work together to help your dog stay active, mobile, 218 00:11:05,200 --> 00:11:05,600 Speaker 4: and alert. 219 00:11:05,679 --> 00:11:08,080 Speaker 1: In this age, we'll probably have an AI robot that can 220 00:11:08,120 --> 00:11:10,360 Speaker 1: throw the tennis ball. You know how the dogs 221 00:11:10,440 --> 00:11:11,400 Speaker 1: just never get tired of that. 222 00:11:11,520 --> 00:11:14,600 Speaker 4: Just keep doing that and understand its barks and say, hey, 223 00:11:14,760 --> 00:11:15,520 Speaker 4: I totally hear you. 224 00:11:15,640 --> 00:11:15,760 Speaker 6: Man. 225 00:11:16,040 --> 00:11:18,840 Speaker 1: Some days, some days you get the tennis ball. Some 226 00:11:18,920 --> 00:11:23,120 Speaker 1: days the tennis ball gets you. So you should try 227 00:11:23,120 --> 00:11:25,680 Speaker 1: this Rough Greens. It only costs the cost of shipping, 228 00:11:25,720 --> 00:11:27,640 Speaker 1: and you put it on your dog's food. You're gonna like it. 229 00:11:27,720 --> 00:11:30,440 Speaker 1: Rough Greens is offering a free Jumpstart Trial Bag. You 230 00:11:30,520 --> 00:11:33,200 Speaker 1: just cover the shipping. Use the discount code Armstrong 231 00:11:33,240 --> 00:11:36,400 Speaker 1: to claim your free Jumpstart Trial Bag at RoughGreens dot 232 00:11:36,440 --> 00:11:37,600 Speaker 1: com. That's 233 00:11:37,640 --> 00:11:40,960 Speaker 4: R-U-F-F Greens dot com, RoughGreens 234 00:11:41,000 --> 00:11:44,040 Speaker 4: dot com. Use that promo code Armstrong. Don't change your 235 00:11:44,040 --> 00:11:45,880 Speaker 4: dog's food, just add Rough Greens and watch the health 236 00:11:45,920 --> 00:11:47,360 Speaker 4: benefits come alive. 237 00:11:47,520 --> 00:11:51,400 Speaker 1: And again, you have appropriately pointed out that a lot 238 00:11:51,440 --> 00:11:54,079 Speaker 1: of these AI articles, you know, they seem great 239 00:11:54,120 --> 00:11:56,760 Speaker 1: for building up the enthusiasm for investing and everything like that. 240 00:11:57,040 --> 00:11:59,320 Speaker 1: This guy doesn't ever get to the, but here's where 241 00:11:59,320 --> 00:12:02,400 Speaker 1: it's good for you, you know, every invention in history 242 00:12:02,440 --> 00:12:04,280 Speaker 1: has actually created more jobs. I mean, he doesn't 243 00:12:04,280 --> 00:12:08,600 Speaker 1: put that spin on it anywhere. He's just letting us 244 00:12:08,640 --> 00:12:11,400 Speaker 1: know how much it's improved. The models available today are 245 00:12:11,640 --> 00:12:15,600 Speaker 1: unrecognizable from what existed six months ago. 246 00:12:16,880 --> 00:12:20,800 Speaker 4: Wow, and where does that leave us? In a hundred 247 00:12:20,840 --> 00:12:23,400 Speaker 4: different ways. We had a behind-the-scenes text exchange 248 00:12:23,400 --> 00:12:26,040 Speaker 4: I couldn't actually participate in, was out for my birthday dinner 249 00:12:26,559 --> 00:12:33,800 Speaker 4: yesterday, about AI producing music, specifically, like, really sophisticated, unbelievably 250 00:12:33,840 --> 00:12:34,319 Speaker 4: good music. 251 00:12:34,320 --> 00:12:37,120 Speaker 1: Where does that leave us as human beings?
Final sentence 252 00:12:37,160 --> 00:12:41,840 Speaker 1: I'll read to you in this very long article. Run 253 00:12:41,880 --> 00:12:44,160 Speaker 1: for your lives! I say it because the gap between 254 00:12:44,240 --> 00:12:49,280 Speaker 1: public perception and current reality is now enormous, and that 255 00:12:49,320 --> 00:12:52,440 Speaker 1: gap is dangerous because it's preventing people from preparing. I'd 256 00:12:52,440 --> 00:12:54,680 Speaker 1: say one place we can pay attention is, pay attention 257 00:12:54,720 --> 00:12:57,000 Speaker 1: to any of the political arguments around this and trying 258 00:12:57,040 --> 00:12:59,920 Speaker 1: to have any guardrails on it or prepare for it. 259 00:13:00,120 --> 00:13:02,800 Speaker 1: You know, when you start hearing these discussions about 260 00:13:03,080 --> 00:13:07,160 Speaker 1: universal basic income, don't ignore them as fanciful things that 261 00:13:07,240 --> 00:13:09,800 Speaker 1: will never happen. It might be happening in the next 262 00:13:09,960 --> 00:13:10,800 Speaker 1: election cycle. 263 00:13:11,240 --> 00:13:13,600 Speaker 4: Is this the same government that we've been playing the 264 00:13:13,600 --> 00:13:15,880 Speaker 4: tapes of them yelling at each other in Congress over 265 00:13:16,000 --> 00:13:18,600 Speaker 4: fake Epstein claims that they all know are fake? 266 00:13:18,760 --> 00:13:21,440 Speaker 1: It's one and the same. They're going to keep us safe. Okay, 267 00:13:22,840 --> 00:13:25,040 Speaker 1: any thoughts on any of this? Text line four one 268 00:13:25,120 --> 00:13:32,640 Speaker 1: five two nine five KFTC. AI is really still digital. 269 00:13:32,760 --> 00:13:36,960 Speaker 5: Ultimately, AI can improve the productivity of humans who 270 00:13:37,200 --> 00:13:39,000 Speaker 5: build things with their hands or do things with their hands, 271 00:13:39,080 --> 00:13:40,040 Speaker 5: you know, literally. 272 00:13:39,840 --> 00:13:41,960 Speaker 1: Welding, electrical work, plumbing. 273 00:13:42,040 --> 00:13:45,959 Speaker 5: Anything that's, that's physically moving atoms, like cooking food or 274 00:13:46,160 --> 00:13:49,959 Speaker 5: farming, or, like, anything that's physical. Those jobs will 275 00:13:50,000 --> 00:13:53,960 Speaker 5: exist for a much longer time. But anything that is digital, 276 00:13:54,559 --> 00:13:57,280 Speaker 5: which is, like, just someone at a computer doing something, 277 00:13:57,760 --> 00:14:01,560 Speaker 5: AI is going to take over those jobs like lightning. 278 00:14:02,480 --> 00:14:04,400 Speaker 5: It's going to take all those jobs like lightning, just 279 00:14:04,440 --> 00:14:07,800 Speaker 5: like digital computers took over the job of people doing 280 00:14:07,840 --> 00:14:10,880 Speaker 5: manual calculations, but much faster. 281 00:14:11,720 --> 00:14:13,720 Speaker 1: So that's Elon. Sounded like it was on the Joe 282 00:14:13,760 --> 00:14:16,920 Speaker 1: Rogan show. That's interesting. I wonder what he means by 283 00:14:17,360 --> 00:14:21,880 Speaker 1: much longer, that you'll be able to weld before 284 00:14:21,880 --> 00:14:24,360 Speaker 1: a robot learns to weld and is better than any welder. 285 00:14:24,960 --> 00:14:29,680 Speaker 4: Given the incredibly accelerating timetable discussed in the prior episode, 286 00:14:29,680 --> 00:14:32,400 Speaker 4: do you mean ten years? Or, like, six weeks? 287 00:14:33,320 --> 00:14:36,000 Speaker 1: Right?
And obviously even if it was ten years, it'd 288 00:14:36,040 --> 00:14:38,120 Speaker 1: be a really, really big deal if those jobs went away. 289 00:14:38,320 --> 00:14:41,760 Speaker 1: But the digital stuff is going to go away, like, tomorrow. 290 00:14:42,680 --> 00:14:45,560 Speaker 1: So we were talking about this article from this 291 00:14:45,720 --> 00:14:48,400 Speaker 1: AI developer trying to alert his friends and just everybody, 292 00:14:48,560 --> 00:14:50,640 Speaker 1: like, this has really exploded, like, in the last couple 293 00:14:50,640 --> 00:14:53,400 Speaker 1: of weeks, couple of months. This little rundown I 294 00:14:53,400 --> 00:14:57,240 Speaker 1: thought was good. In twenty twenty two, just four years ago. 295 00:14:57,320 --> 00:14:59,400 Speaker 1: Remember twenty twenty two? Oh my god, I was a child. 296 00:15:00,880 --> 00:15:04,760 Speaker 1: In twenty twenty two, AI couldn't do basic arithmetic reliability, 297 00:15:04,880 --> 00:15:08,080 Speaker 1: or, reliably. In twenty twenty three, it could pass the 298 00:15:08,080 --> 00:15:11,240 Speaker 1: bar exam. In twenty twenty four, it could write working 299 00:15:11,280 --> 00:15:15,640 Speaker 1: software and explain graduate level science. By late twenty twenty five, 300 00:15:15,720 --> 00:15:17,640 Speaker 1: some of the best engineers in the world said they 301 00:15:17,640 --> 00:15:19,800 Speaker 1: had handed over most of their coding work to AI. 302 00:15:20,640 --> 00:15:23,800 Speaker 1: On February fifth, twenty twenty six, a couple of weeks ago, 303 00:15:24,040 --> 00:15:27,000 Speaker 1: new models arrived that made everything before them feel like 304 00:15:27,080 --> 00:15:30,160 Speaker 1: a different era. If you haven't tried AI in the 305 00:15:30,240 --> 00:15:33,000 Speaker 1: last few months, what exists today would be unrecognizable to you. 306 00:15:33,600 --> 00:15:36,720 Speaker 1: Part of my problem is I don't have any, I 307 00:15:36,760 --> 00:15:39,240 Speaker 1: don't have any tasks that I understand to give AI. 308 00:15:39,760 --> 00:15:42,000 Speaker 1: So our friend Craig said, man, if you haven't tried 309 00:15:42,000 --> 00:15:44,720 Speaker 1: the paid version of Gemini... that's what this guy says. 310 00:15:44,720 --> 00:15:46,760 Speaker 1: He says the vast majority of people are using the 311 00:15:46,800 --> 00:15:50,280 Speaker 1: free versions. The gap between the free versions and the 312 00:15:50,320 --> 00:15:53,240 Speaker 1: paid versions is enormous. Well, it wouldn't be for me, 313 00:15:54,120 --> 00:15:57,920 Speaker 1: because I use AI mostly to answer trivia questions, basically. 314 00:15:58,360 --> 00:16:00,240 Speaker 1: So I mean, if I had tasks for it to do, 315 00:16:00,240 --> 00:16:02,440 Speaker 1: maybe I could really appreciate that. But so, yeah, if 316 00:16:02,440 --> 00:16:04,840 Speaker 1: you aren't using the paid versions of Gemini 317 00:16:04,560 --> 00:16:09,520 Speaker 4: or Claude or whatever... yeah, this is also shocking to me. 318 00:16:09,560 --> 00:16:11,680 Speaker 4: But by the way, I think you ought to repeat 319 00:16:11,680 --> 00:16:13,680 Speaker 4: what you said to me off the air when I 320 00:16:13,760 --> 00:16:16,320 Speaker 4: made that comparison that, oh, our government that's 321 00:16:16,440 --> 00:16:20,280 Speaker 4: yelling at each other completely performatively over Jeffrey Epstein, they're 322 00:16:20,280 --> 00:16:21,720 Speaker 4: the people who are going to protect us from this.
323 00:16:22,480 --> 00:16:24,040 Speaker 1: Yeah, and I think this is going to get our 324 00:16:24,080 --> 00:16:26,120 Speaker 1: attention pretty quick. It's going to be like people used 325 00:16:26,120 --> 00:16:28,320 Speaker 1: to talk about the shark attacks before nine eleven. 326 00:16:28,360 --> 00:16:31,160 Speaker 1: We were discussing shark attacks, then nine eleven came. You know, 327 00:16:31,240 --> 00:16:33,760 Speaker 1: we were so off track. I think we're going to 328 00:16:33,800 --> 00:16:36,320 Speaker 1: look back on it: we were discussing the Epstein files when 329 00:16:36,320 --> 00:16:40,840 Speaker 1: the AI tsunami hit, and everybody who does anything digital, 330 00:16:40,880 --> 00:16:43,120 Speaker 1: like Elon just said, is all of a sudden out 331 00:16:43,120 --> 00:16:43,400 Speaker 1: of work. 332 00:16:44,160 --> 00:16:47,000 Speaker 4: So I'm back to working on music and I'm really happy. 333 00:16:47,040 --> 00:16:49,120 Speaker 4: I'm really enjoying it a great deal. But I've got 334 00:16:49,120 --> 00:16:55,040 Speaker 4: a song that needs a string quartet arrangement. Picture, like, 335 00:16:55,240 --> 00:16:58,320 Speaker 4: the Beatles' Yesterday, for instance. And I took music theory 336 00:16:58,320 --> 00:17:00,640 Speaker 4: a long time ago. It was the hardest class I took. 337 00:17:01,400 --> 00:17:04,040 Speaker 4: I was never great at it. I know the basics, 338 00:17:04,080 --> 00:17:06,240 Speaker 4: but I was like, hey, ChatGPT, and I actually 339 00:17:06,320 --> 00:17:08,359 Speaker 4: used the paid version, I said, I'm working on a string 340 00:17:08,440 --> 00:17:10,560 Speaker 4: arrangement for this song, blah blah blah, and it said, great, 341 00:17:10,600 --> 00:17:12,320 Speaker 4: you know, what are the chords? So I gave them 342 00:17:12,359 --> 00:17:15,800 Speaker 4: to it, and it essentially can just spit out the 343 00:17:16,040 --> 00:17:22,960 Speaker 4: arrangement for me, including, like, incredibly sophisticated, hold back on 344 00:17:23,160 --> 00:17:27,040 Speaker 4: that instrument in that register, because when you add it, 345 00:17:27,040 --> 00:17:28,480 Speaker 4: it's going to be like opening a door. 346 00:17:28,520 --> 00:17:30,119 Speaker 1: The effect's going to be really cool. 347 00:17:30,320 --> 00:17:32,120 Speaker 4: So stay away from those high notes until you get 348 00:17:32,119 --> 00:17:34,399 Speaker 4: to... and I'm like, holy crap, it's like I'm working 349 00:17:34,440 --> 00:17:37,240 Speaker 4: with George Martin, who produced the Beatles, and it's all 350 00:17:37,280 --> 00:17:40,639 Speaker 4: on my phone. But then I was wrestling with, all right, 351 00:17:40,680 --> 00:17:43,440 Speaker 4: how far do I go until I'm not doing this? 352 00:17:43,840 --> 00:17:47,080 Speaker 4: I was looking for it to remind me about certain 353 00:17:47,200 --> 00:17:50,800 Speaker 4: principles in music theory, and a couple more clicks 354 00:17:50,840 --> 00:17:53,199 Speaker 4: and it can just do it for me. Where do 355 00:17:53,280 --> 00:17:56,320 Speaker 4: I stop? And what does it mean if I'd 356 00:17:56,160 --> 00:17:58,600 Speaker 1: let it do it? What have I done? And it's 357 00:17:58,640 --> 00:18:01,320 Speaker 1: getting significantly better on a day by day, week by 358 00:18:01,359 --> 00:18:04,719 Speaker 1: week basis, and where it will be by this summer 359 00:18:04,800 --> 00:18:07,160 Speaker 1: is completely different than where it is now.
No need 360 00:18:07,200 --> 00:18:10,640 Speaker 1: for humans, Planet of the Beavers, Gavin Newsom running for president, 361 00:18:10,680 --> 00:18:13,480 Speaker 1: he'd better tell his wife to shut up. Stay tuned. 362 00:18:13,560 --> 00:18:16,120 Speaker 4: Armstrong and Getty and. 363 00:18:16,119 --> 00:18:18,000 Speaker 7: The majority of the questions, all of these questions, have 364 00:18:18,040 --> 00:18:21,280 Speaker 7: really been about other issues. So it's just fascinating. You 365 00:18:21,160 --> 00:18:22,280 Speaker 1: have this incredible women's 366 00:18:22,080 --> 00:18:24,920 Speaker 7: caucus and all these allies, and you're not asking about it. 367 00:18:25,040 --> 00:18:28,080 Speaker 7: And this happens over and over and over and over again. 368 00:18:28,400 --> 00:18:31,920 Speaker 1: That's Gavin Newsom's old lady. That is the First Partner 369 00:18:31,920 --> 00:18:37,199 Speaker 1: of California, you disrespectful oaf: Jennifer Siebel Newsom, who was 370 00:18:37,440 --> 00:18:40,399 Speaker 1: lecturing the assembled media because they weren't asking her the 371 00:18:40,480 --> 00:18:44,480 Speaker 1: questions she wanted them to, as she and Gavy were 372 00:18:44,520 --> 00:18:48,919 Speaker 1: announcing ninety million dollars in funding for Planned Parenthood to 373 00:18:48,960 --> 00:18:52,359 Speaker 1: give people lots and lots of abortions, because the 374 00:18:53,000 --> 00:18:57,800 Speaker 1: federal government has reduced funding to that unholy organization. We 375 00:18:57,920 --> 00:19:00,760 Speaker 1: have a longer version of that clip that provides a 376 00:19:00,760 --> 00:19:02,880 Speaker 1: little more context, and we can tell you more about 377 00:19:02,960 --> 00:19:06,720 Speaker 1: this charming gal. The significance of which, well, she is 378 00:19:06,760 --> 00:19:12,800 Speaker 1: a really hardcore progressive lefty neo-Marxist activist. The 379 00:19:12,880 --> 00:19:15,240 Speaker 1: idea that she's going to be the wife of the 380 00:19:15,240 --> 00:19:18,720 Speaker 1: guy running for president is just too spicy for words, 381 00:19:18,840 --> 00:19:21,840 Speaker 1: if it ever happens. But here's the longer version of 382 00:19:21,880 --> 00:19:22,280 Speaker 1: the clip. 383 00:19:23,280 --> 00:19:29,120 Speaker 7: We just find it incredulous that we have Planned Parenthood here, 384 00:19:29,680 --> 00:19:32,199 Speaker 7: and women are fifty one percent of the population, and 385 00:19:32,320 --> 00:19:34,440 Speaker 7: the majority of the questions, all of these questions, have really 386 00:19:34,480 --> 00:19:37,280 Speaker 7: been about other issues. So it's just fascinating. 387 00:19:37,320 --> 00:19:38,360 Speaker 1: You have this incredible 388 00:19:38,000 --> 00:19:40,800 Speaker 7: women's caucus and all these allies and you're not asking 389 00:19:40,800 --> 00:19:43,680 Speaker 7: about it, and this happens over and over and over 390 00:19:43,800 --> 00:19:44,399 Speaker 7: and over again. 391 00:19:44,720 --> 00:19:47,920 Speaker 4: You wonder why we have such a horrific 392 00:19:47,520 --> 00:19:49,800 Speaker 7: war on women in this country and that these guys 393 00:19:49,800 --> 00:19:52,160 Speaker 7: are getting away with it, because you don't seem to care. 394 00:19:52,880 --> 00:19:58,560 Speaker 8: So I just offer that with love. But these are 395 00:19:58,600 --> 00:20:02,960 Speaker 8: your incredible women and you have these allies.
Ask about 396 00:20:03,320 --> 00:20:04,560 Speaker 8: what we're here for today. 397 00:20:04,760 --> 00:20:08,080 Speaker 1: Don't you think? I must have missed the horrific war 398 00:20:08,119 --> 00:20:09,760 Speaker 1: on women. Huh? 399 00:20:09,840 --> 00:20:12,200 Speaker 4: Yeah, I must have been taking a nap at the time. Yeah, 400 00:20:12,400 --> 00:20:16,320 Speaker 4: she was scolding the press for asking about the fake 401 00:20:16,800 --> 00:20:20,720 Speaker 4: high speed rail project, Gavy's visit to the Munich Security Conference, 402 00:20:21,080 --> 00:20:24,240 Speaker 4: and other issues plaguing the people of California. 403 00:20:24,359 --> 00:20:28,040 Speaker 1: Now, of course, the question is how does the press 404 00:20:28,080 --> 00:20:32,560 Speaker 1: react to being scolded. I mean, we saw the press pretty 405 00:20:32,560 --> 00:20:37,360 Speaker 1: willing to play along with Kamala despite all her obvious faults. 406 00:20:37,400 --> 00:20:39,439 Speaker 1: They were not going to point them out. And then 407 00:20:39,480 --> 00:20:42,199 Speaker 1: of course hiding Joe Biden for all those years. Trump, 408 00:20:42,280 --> 00:20:44,240 Speaker 1: you know, he gets, he gets killed all the time, 409 00:20:44,960 --> 00:20:48,520 Speaker 1: sometimes appropriately, very often not appropriately. But the media's 410 00:20:48,520 --> 00:20:52,000 Speaker 1: always antagonistic against him. But I don't know. I don't 411 00:20:52,000 --> 00:20:55,240 Speaker 1: know if the media, if the reaction was, hey, we'll 412 00:20:55,320 --> 00:20:57,600 Speaker 1: ask the questions, you answer them, you work for us, 413 00:20:57,800 --> 00:21:00,200 Speaker 1: or if the media will think, oh, we've angered a 414 00:21:00,320 --> 00:21:04,480 Speaker 1: leading light of progressivism and we'd better do what she says. 415 00:21:04,359 --> 00:21:07,919 Speaker 4: Right, right. It's worth noting her multiple uses of the 416 00:21:08,040 --> 00:21:11,159 Speaker 4: term allies, which is straight out of critical theory and 417 00:21:11,280 --> 00:21:15,919 Speaker 4: neo-Marxism. But the California Globe has been covering her 418 00:21:15,960 --> 00:21:18,760 Speaker 4: exploits brilliantly. You know, I could go on for an 419 00:21:18,800 --> 00:21:23,359 Speaker 4: hour about this, but for instance, the unholy use of 420 00:21:24,000 --> 00:21:30,159 Speaker 4: nonprofits and charities where, for instance, Gavy will go to 421 00:21:31,160 --> 00:21:34,879 Speaker 4: a union and say, hey, you got to give two 422 00:21:34,960 --> 00:21:37,560 Speaker 4: million dollars to this charity, and they'll say, yes, sir, 423 00:21:38,040 --> 00:21:41,760 Speaker 4: right away, sir. And then that charity pays his wife 424 00:21:42,160 --> 00:21:47,200 Speaker 4: to produce these gender bending madness films, movies for little 425 00:21:47,320 --> 00:21:51,360 Speaker 4: kids to watch in schools. So she and her activist 426 00:21:51,400 --> 00:21:56,399 Speaker 4: company get paid handsomely for that. It's really, it's absolutely... 427 00:21:56,600 --> 00:21:57,240 Speaker 4: because she just 428 00:21:57,200 --> 00:22:01,080 Speaker 1: coincidentally is the best at making these videos. 429 00:22:01,640 --> 00:22:06,359 Speaker 4: Yeah, yeah, because she's the wife of the governor. Yeah. 430 00:22:06,400 --> 00:22:09,280 Speaker 4: Take Newsom's wife, Jennifer Siebel Newsom and her outfit, The 431 00:22:09,359 --> 00:22:11,560 Speaker 4: Representation Project, writes The California Globe.
432 00:22:11,600 --> 00:22:12,600 Speaker 1: This is actually from last year. 433 00:22:12,800 --> 00:22:15,600 Speaker 4: While her husband attends to state business, Siebel Newsom engages 434 00:22:15,600 --> 00:22:19,400 Speaker 4: in her passion, advancing gender justice through a charitable nonprofit, 435 00:22:19,400 --> 00:22:23,600 Speaker 4: The Representation Project, Open Books reported. According to tax documents, 436 00:22:23,640 --> 00:22:26,439 Speaker 4: the organization is, quote, committed to building a thriving and 437 00:22:26,480 --> 00:22:28,320 Speaker 4: inclusive society through films, 438 00:22:28,040 --> 00:22:29,400 Speaker 1: education and social activism. 439 00:22:29,960 --> 00:22:33,640 Speaker 4: Jennifer Siebel Newsom solicited state vendors and the governor's campaign 440 00:22:33,680 --> 00:22:37,639 Speaker 4: donors for large gifts to her charity, The Representation Project. 441 00:22:37,640 --> 00:22:41,000 Speaker 4: Since twenty eleven, this supposed gender justice charity has raked 442 00:22:41,040 --> 00:22:43,760 Speaker 4: in over eight hundred thousand dollars from corporate giants like 443 00:22:43,800 --> 00:22:47,200 Speaker 4: PG and E, AT and T, and Comcast, firms with billions 444 00:22:47,200 --> 00:22:51,119 Speaker 4: of dollars at stake in California's regulatory landscape. Siebel Newsom 445 00:22:51,160 --> 00:22:54,359 Speaker 4: pocketed two point three million dollars in salary over those years, 446 00:22:54,480 --> 00:22:57,360 Speaker 4: pulling six figure paychecks while her husband climbed from lieutenant 447 00:22:57,359 --> 00:23:00,800 Speaker 4: governor to governor. Those donors aren't philanthropists, they're players in 448 00:23:00,840 --> 00:23:03,280 Speaker 4: a game where cash flows to the governor's family, and 449 00:23:03,359 --> 00:23:07,560 Speaker 4: favors like lax utility oversight or cushy state contracts flow back. 450 00:23:07,600 --> 00:23:11,040 Speaker 4: It's cronyism dressed up as compassion. And there are all 451 00:23:11,520 --> 00:23:15,920 Speaker 4: sorts of examples of this. So again, as a potential 452 00:23:16,400 --> 00:23:20,119 Speaker 4: wife of a candidate slash first lady, this woman, understand, 453 00:23:20,359 --> 00:23:23,960 Speaker 4: is not like, you know, your typical first lady 454 00:23:24,440 --> 00:23:28,480 Speaker 4: who says, I'm going to campaign against school bullying, or like 455 00:23:28,640 --> 00:23:32,560 Speaker 4: Michelle Obama's nutrition thing; who's against nutrition for kids? This 456 00:23:32,760 --> 00:23:39,320 Speaker 4: lady is an absolute hardcore, knife wielding leading light for 457 00:23:40,280 --> 00:23:45,040 Speaker 4: far left causes, including radical gender theory, teaching little children 458 00:23:45,160 --> 00:23:49,000 Speaker 4: radical gender theory, and, you know, telling them that you 459 00:23:49,040 --> 00:23:51,840 Speaker 4: should take hormones and puberty blockers and get surgery if 460 00:23:51,880 --> 00:23:55,200 Speaker 4: you're in the wrong body, my fourteen year old confused 461 00:23:55,240 --> 00:23:56,600 Speaker 4: adolescent girlfriend. 462 00:23:56,440 --> 00:23:59,520 Speaker 1: That's who she is. So I didn't see the video. Katie, 463 00:23:59,760 --> 00:24:03,320 Speaker 1: your reading of the way it looked was a little... 464 00:24:03,720 --> 00:24:07,240 Speaker 1: she shoved her husband aside to answer this question? She, 465 00:24:07,240 --> 00:24:08,840 Speaker 1: she definitely moved...
466 00:24:08,960 --> 00:24:10,480 Speaker 6: It was kind of like a, here, let me handle 467 00:24:10,520 --> 00:24:13,680 Speaker 6: this, kind of a move. Like, physically moved him out 468 00:24:13,680 --> 00:24:14,600 Speaker 6: of the way to step in. 469 00:24:14,920 --> 00:24:19,200 Speaker 1: That's interesting, so interesting. So she's wearing the pants. Yeah. Well, 470 00:24:19,200 --> 00:24:21,280 Speaker 1: they either have an agreement that they're gonna be kind 471 00:24:21,280 --> 00:24:24,200 Speaker 1: of a team act, like Bill and Hillary were, or 472 00:24:24,280 --> 00:24:27,480 Speaker 1: she's declared we're a team act and he's realizing that. 473 00:24:27,560 --> 00:24:29,760 Speaker 1: I don't know which. I mean, some of your political 474 00:24:29,760 --> 00:24:34,760 Speaker 1: couples do that. Most don't. And most of the time 475 00:24:34,760 --> 00:24:36,880 Speaker 1: I don't think you should. I mean, you're, you're electing 476 00:24:36,920 --> 00:24:42,399 Speaker 1: one of these people. But that's interesting. So she's gonna 477 00:24:42,400 --> 00:24:46,200 Speaker 1: be... she's not gonna be just, you know, an attractive 478 00:24:46,240 --> 00:24:50,280 Speaker 1: woman who stands behind him while he talks about, 479 00:24:50,320 --> 00:24:52,199 Speaker 1: my beautiful wife and the mother to our children, and 480 00:24:52,280 --> 00:24:55,080 Speaker 1: she smiles and waves. She's gonna push him aside and 481 00:24:55,080 --> 00:24:57,520 Speaker 1: grab the microphone if she's worked up about something. What, 482 00:24:57,560 --> 00:24:58,800 Speaker 1: you got something to tell us is going on? What do 483 00:24:58,800 --> 00:25:01,560 Speaker 1: you got, Hanson, something I'm supposed to look at? Oh, 484 00:25:01,600 --> 00:25:03,359 Speaker 1: that's the video? Okay, can you back it up just 485 00:25:03,359 --> 00:25:07,800 Speaker 1: a little bit? Yeah, yeah, it's definitely a, let me, 486 00:25:08,000 --> 00:25:10,560 Speaker 1: let me, let me take this, sort of moment. 487 00:25:11,040 --> 00:25:14,560 Speaker 4: She's been super vocal about ICE too, a real campaigner 488 00:25:14,600 --> 00:25:16,120 Speaker 4: against ICE, because they're Nazis. 489 00:25:16,160 --> 00:25:19,000 Speaker 1: You know, you know, our I'm-trying-to-read-body-language segment? 490 00:25:19,040 --> 00:25:21,480 Speaker 1: I'm trying to read his body language. But he doesn't 491 00:25:21,520 --> 00:25:25,160 Speaker 1: look perfectly pleased with that, does he, Hanson? He looks 492 00:25:25,160 --> 00:25:28,359 Speaker 1: a little like, why, why, why am I standing behind 493 00:25:28,400 --> 00:25:31,199 Speaker 1: you while you're lecturing people about things? What is happening? 494 00:25:31,359 --> 00:25:32,400 Speaker 1: What the hell is going on? 495 00:25:32,840 --> 00:25:36,360 Speaker 4: And he and his advisors are going to browbeat her 496 00:25:36,440 --> 00:25:39,200 Speaker 4: into keeping her mouth shut, but the pressure is gonna 497 00:25:39,200 --> 00:25:41,639 Speaker 4: build and build, and she's gonna explode at some point 498 00:25:41,720 --> 00:25:42,520 Speaker 4: during the campaign. 499 00:25:42,600 --> 00:25:44,160 Speaker 1: This is very spicy. 500 00:25:44,200 --> 00:25:47,560 Speaker 4: Indeed. Hmmm, that would be hard to do. 501 00:25:47,680 --> 00:25:49,640 Speaker 1: I mean, how is he gonna tell his wife, hey, 502 00:25:50,320 --> 00:25:54,560 Speaker 1: just sit down for a second.
I just feel 503 00:25:54,600 --> 00:25:59,119 Speaker 1: like me being in the spotlight all the time is 504 00:25:59,240 --> 00:26:02,320 Speaker 1: probably better for the whole running for president thing than 505 00:26:02,440 --> 00:26:06,640 Speaker 1: sharing it. Ooh, she will slap him down. Oh my. 506 00:26:08,280 --> 00:26:10,480 Speaker 4: Upper class twits promoting revolution. 507 00:26:10,600 --> 00:26:12,040 Speaker 1: Boy, does she fit that description. 508 00:26:12,160 --> 00:26:12,360 Speaker 6: Yeah. 509 00:26:12,359 --> 00:26:14,480 Speaker 1: And you know, I could be reading into this completely, 510 00:26:14,520 --> 00:26:16,919 Speaker 1: but I feel like he looks uncomfortable with this. Do 511 00:26:16,920 --> 00:26:19,280 Speaker 1: you think so? Hanson definitely thinks so. That's why he 512 00:26:19,280 --> 00:26:21,639 Speaker 1: showed me the video. So he looks a little like, 513 00:26:21,880 --> 00:26:27,880 Speaker 1: oh God, what is she gonna say? I don't know. Bring 514 00:26:27,920 --> 00:26:30,919 Speaker 1: it on, then. If nothing else, it adds a little 515 00:26:31,000 --> 00:26:33,320 Speaker 1: dose of fun for the campaign. 516 00:26:33,560 --> 00:26:36,040 Speaker 4: I'm so excited. I can't tell you. Yeah, Michael, if 517 00:26:36,040 --> 00:26:38,240 Speaker 4: he's on the debate stage and she runs up there, 518 00:26:38,480 --> 00:26:38,960 Speaker 4: that would be 519 00:26:38,920 --> 00:26:43,160 Speaker 1: awesome, right? Middle of a debate: hold on, hold on, 520 00:26:43,560 --> 00:26:45,560 Speaker 1: how dare you talk to my husband that way? 521 00:26:46,440 --> 00:26:51,040 Speaker 4: One more example of Cal-Unicornia politics, which will be impossible... 522 00:26:51,200 --> 00:26:54,320 Speaker 4: like the stink from the funky restaurant Jack walked into 523 00:26:54,440 --> 00:26:57,280 Speaker 4: last night, it will be impossible to wash the stink 524 00:26:57,320 --> 00:27:00,800 Speaker 4: of California progressivism off of Gavin Newsom. Here is 525 00:27:00,880 --> 00:27:04,199 Speaker 4: Senator Scott Wiener, one of the worst people in America, 526 00:27:05,320 --> 00:27:07,680 Speaker 4: and Katie, should we set up what he's doing? 527 00:27:08,960 --> 00:27:11,240 Speaker 6: Yeah, just a little, little description. 528 00:27:11,920 --> 00:27:17,600 Speaker 4: He is going out with elementary school children, little children, 529 00:27:18,000 --> 00:27:22,720 Speaker 4: on an anti-ICE walkout in a California county not 530 00:27:22,880 --> 00:27:27,000 Speaker 4: terribly far from the Capitol. He is leading little children, 531 00:27:27,320 --> 00:27:30,160 Speaker 4: trying to turn them into little revolutionaries, as we've been 532 00:27:30,160 --> 00:27:33,760 Speaker 4: discussing, on a walkout. From, like, third grade? 533 00:27:33,920 --> 00:27:36,000 Speaker 1: Are you effing kidding me? I would be so mad. 534 00:27:38,000 --> 00:27:41,720 Speaker 5: I'm here with Finley at the Graton Elementary School protest today, 535 00:27:41,720 --> 00:27:42,600 Speaker 5: which has been amazing. 536 00:27:43,119 --> 00:27:46,679 Speaker 1: What a sign! So, Finley, what grade are you in? And 537 00:27:46,680 --> 00:27:47,520 Speaker 1: tell us about yourself. 538 00:27:47,760 --> 00:27:48,800 Speaker 8: I'm in fourth grade. 539 00:27:49,080 --> 00:27:51,480 Speaker 6: I go to Graton Elementary School, as you said, and 540 00:27:51,680 --> 00:27:53,520 Speaker 6: I'm organizing this protest 541 00:27:53,119 --> 00:27:53,840 Speaker 1: to stop ICE.
542 00:27:54,119 --> 00:27:57,200 Speaker 3: I think it's horrific that people are getting treated this way. 543 00:27:57,480 --> 00:27:58,960 Speaker 1: I wish Trump would stop 544 00:27:58,720 --> 00:27:59,600 Speaker 4: doing this to people. 545 00:27:59,880 --> 00:28:02,160 Speaker 5: I'm so happy to be here today, and I want 546 00:28:02,160 --> 00:28:05,159 Speaker 5: to say to all the kids, you are amazing. 547 00:28:05,240 --> 00:28:09,320 Speaker 1: Those signs are amazing. You did a great job organizing this, 548 00:28:09,720 --> 00:28:12,639 Speaker 1: and this gives me hope for the future. Who wants that? 549 00:28:12,720 --> 00:28:14,920 Speaker 1: I don't care what the topic is. And even if 550 00:28:14,960 --> 00:28:18,040 Speaker 1: you agree with the protest, who wants that for their 551 00:28:18,080 --> 00:28:22,040 Speaker 1: third graders? A politician telling your kids what to think 552 00:28:22,080 --> 00:28:25,560 Speaker 1: about an issue. Because they're little kids, they don't have 553 00:28:25,600 --> 00:28:27,520 Speaker 1: the ability to come up with an idea on their 554 00:28:27,520 --> 00:28:29,440 Speaker 1: own. So the politician tells them what to think about 555 00:28:29,440 --> 00:28:31,000 Speaker 1: this and then leads them out on a protest, as 556 00:28:31,000 --> 00:28:32,280 Speaker 1: opposed to doing math and science. 557 00:28:32,320 --> 00:28:34,920 Speaker 6: Yes, Katie? Well, and it was a little quiet, but 558 00:28:34,960 --> 00:28:36,680 Speaker 6: there is a kid who goes, this is like a 559 00:28:36,720 --> 00:28:38,640 Speaker 6: field trip with signs. 560 00:28:39,240 --> 00:28:41,520 Speaker 1: Yeah, yeah, no kidding, that's just what it is for the 561 00:28:41,600 --> 00:28:43,720 Speaker 1: kids, because they don't know what's going on exactly. 562 00:28:43,760 --> 00:28:47,640 Speaker 1: Your advocate for man-boy love and teaching gender bending 563 00:28:47,680 --> 00:28:51,640 Speaker 1: madness in schools, Scott Wiener there. I don't care if it's 564 00:28:51,680 --> 00:28:54,040 Speaker 1: something I agree with one hundred percent. Don't take my 565 00:28:54,720 --> 00:28:58,040 Speaker 1: eight year old out on a political protest during the 566 00:28:58,120 --> 00:29:01,920 Speaker 1: school day, telling them what side of the issue 567 00:29:01,920 --> 00:29:02,880 Speaker 1: they're supposed to be on. 568 00:29:03,640 --> 00:29:07,840 Speaker 4: But they know precisely what they're doing. Who was one 569 00:29:07,880 --> 00:29:10,440 Speaker 4: of the dictators I quoted for our freedom-hating Quote 570 00:29:10,440 --> 00:29:13,200 Speaker 4: of the Day the other day? Give me one generation 571 00:29:14,200 --> 00:29:15,280 Speaker 4: and I will rule the world. 572 00:29:15,320 --> 00:29:18,720 Speaker 1: Oh, I would be furious. Now, it's a school where 573 00:29:18,760 --> 00:29:21,640 Speaker 1: I imagine most of the parents just thought it was fantastic. 574 00:29:22,600 --> 00:29:25,080 Speaker 1: But oh my god. By the way, getting back to 575 00:29:25,760 --> 00:29:30,080 Speaker 1: Gavin and his wife, having watched the video Hanson just 576 00:29:30,160 --> 00:29:34,920 Speaker 1: played for me... so there's a question. Gavin starts to talk. 577 00:29:35,120 --> 00:29:37,520 Speaker 1: His wife kind of pushes up like, I got this, 578 00:29:37,840 --> 00:29:40,040 Speaker 1: and he steps back and kind of has a smile 579 00:29:40,080 --> 00:29:42,680 Speaker 1: on his face.
But at some point in her getting 580 00:29:42,680 --> 00:29:46,160 Speaker 1: after the media, he has a long blink that's really 581 00:29:46,200 --> 00:29:49,600 Speaker 1: got the, oh God, where his eyes are closed for 582 00:29:49,640 --> 00:29:52,240 Speaker 1: like a full second. Oh my god. Okay. And then 583 00:29:52,280 --> 00:29:58,280 Speaker 1: back to the smile. Yeah, pretty easy to read that one. 584 00:29:58,360 --> 00:30:02,760 Speaker 1: We've all done that with, you know, whoever. Boss, kids, spouse, 585 00:30:03,320 --> 00:30:04,560 Speaker 1: friends, whoever. 586 00:30:05,080 --> 00:30:11,480 Speaker 4: Oh, okay, give me thirty five B again, Michael, come on, 587 00:30:11,560 --> 00:30:12,000 Speaker 4: come on. 588 00:30:12,280 --> 00:30:14,160 Speaker 7: And the majority of the questions, all of these questions, 589 00:30:14,240 --> 00:30:17,360 Speaker 7: have really been about other issues. So it's just fascinating. 590 00:30:17,360 --> 00:30:19,000 Speaker 1: You have this incredible women's caucus 591 00:30:18,720 --> 00:30:21,200 Speaker 7: and all these allies and you're not asking about it, 592 00:30:21,360 --> 00:30:24,200 Speaker 7: and this happens over and over and over and over again. 593 00:30:24,600 --> 00:30:26,640 Speaker 1: That's when his eyes go closed for like a full second. 594 00:30:26,760 --> 00:30:30,000 Speaker 4: In spite of the terrible war against women. Right, 595 00:30:30,360 --> 00:30:32,800 Speaker 4: you can see it: there are no women in undergraduate 596 00:30:32,840 --> 00:30:36,600 Speaker 4: programs or graduate schools, there's no women graduating from medical schools. 597 00:30:36,600 --> 00:30:43,920 Speaker 4: It's, it's been terrible, absolutely terrible. She is a freaking Marxist. 598 00:30:44,520 --> 00:30:47,040 Speaker 4: Oh boy. Oh my Lord, I can barely contain myself. 599 00:30:47,160 --> 00:30:49,120 Speaker 1: This is gonna be fun to watch. This is gonna 600 00:30:49,120 --> 00:30:51,560 Speaker 1: be fun to watch. I hope I live long enough. 601 00:30:53,600 --> 00:30:53,880 Speaker 5: Umm. 602 00:30:54,760 --> 00:30:57,320 Speaker 1: Got one little more note from that AI thing to, 603 00:30:57,640 --> 00:31:01,760 Speaker 1: uh, knock you off your feet, as they say. And 604 00:31:03,360 --> 00:31:06,000 Speaker 1: also, boy, so many people want to know about the 605 00:31:06,000 --> 00:31:10,080 Speaker 1: restaurant I went to last night that smelled like an outhouse. 606 00:31:10,520 --> 00:31:14,880 Speaker 4: This, this is smacking of when I unfortunately refused to 607 00:31:14,880 --> 00:31:17,959 Speaker 4: provide the name of my delicious potato chip. Why are 608 00:31:18,000 --> 00:31:20,040 Speaker 4: you holding back from the audience? Have you learned nothing 609 00:31:20,080 --> 00:31:21,680 Speaker 4: from my suffering? It's very similar. 610 00:31:21,800 --> 00:31:24,920 Speaker 1: Jack, it is. Michael, thank you, you're my ally. But 611 00:31:25,040 --> 00:31:29,520 Speaker 1: Jack... and Michael is an ally. We've got more on 612 00:31:29,560 --> 00:31:30,440 Speaker 1: the way, stay there. 613 00:31:31,560 --> 00:31:36,040 Speaker 4: Armstrong and Getty. That's the day that we're like, we're done. 614 00:31:36,320 --> 00:31:38,959 Speaker 1: You have to preserve the bar. I have to preserve 615 00:31:38,960 --> 00:31:39,560 Speaker 1: my license. 616 00:31:39,800 --> 00:31:41,000 Speaker 6: They don't know how to drink. 617 00:31:41,240 --> 00:31:44,320 Speaker 1: They drink way too much.
They throw up on the floor, 618 00:31:44,800 --> 00:31:45,520 Speaker 1: they yell. 619 00:31:45,600 --> 00:31:46,240 Speaker 8: They scream. 620 00:31:48,320 --> 00:31:51,000 Speaker 1: That's a bar that raised the age of drinking in 621 00:31:51,120 --> 00:31:54,720 Speaker 1: their bar to twenty five due to fake IDs. Yeah. 622 00:31:54,920 --> 00:31:59,240 Speaker 4: Dirty Frank's owner Jody Schweitzer said she believes social media 623 00:31:59,240 --> 00:32:01,520 Speaker 4: contributed to the wave of younger crowds showing up at 624 00:32:01,520 --> 00:32:05,560 Speaker 4: the bar showing pretty good fake IDs, although she said 625 00:32:05,560 --> 00:32:08,680 Speaker 4: the tipping point came when somebody came in 626 00:32:08,720 --> 00:32:13,200 Speaker 4: claiming to be a twenty four year old Ben Franklin, had 627 00:32:13,560 --> 00:32:17,320 Speaker 4: the founding father's picture on the license and listed the 628 00:32:17,360 --> 00:32:22,160 Speaker 4: home address as the Liberty Bell, the same address as 629 00:32:22,200 --> 00:32:22,960 Speaker 4: the Liberty Bell. 630 00:32:23,720 --> 00:32:27,320 Speaker 1: Yeah, well, all stats show that young people 631 00:32:27,320 --> 00:32:30,240 Speaker 1: are drinking so much less than previous generations that it's 632 00:32:30,280 --> 00:32:33,920 Speaker 1: not really, you know, that's a good story, but 633 00:32:33,960 --> 00:32:36,440 Speaker 1: it's not really a problem. Reminds me, though, I 634 00:32:36,480 --> 00:32:39,840 Speaker 1: had a girlfriend in college. The drinking age, 635 00:32:40,000 --> 00:32:44,680 Speaker 1: it might have been twenty one by then, I 636 00:32:44,720 --> 00:32:46,640 Speaker 1: grew up with the drinking age of eighteen. But anyway, 637 00:32:46,800 --> 00:32:50,960 Speaker 1: she had a fake ID of a really old woman. 638 00:32:51,040 --> 00:32:52,959 Speaker 1: She got it at like a Goodwill or something like that. 639 00:32:53,080 --> 00:32:55,240 Speaker 1: The picture was in black and white. It was so old, 640 00:32:55,240 --> 00:32:58,040 Speaker 1: the woman was like ninety eight. Wow. And it was 641 00:32:58,160 --> 00:32:59,720 Speaker 1: just so funny, over the top. And it was at 642 00:32:59,760 --> 00:33:02,240 Speaker 1: a time where they just needed to have you pretend 643 00:33:02,240 --> 00:33:04,120 Speaker 1: to show an ID to let you in, especially if 644 00:33:04,120 --> 00:33:06,280 Speaker 1: you're an attractive young woman. And she would show her 645 00:33:06,280 --> 00:33:08,440 Speaker 1: ninety eight year old woman's black and white photo 646 00:33:08,520 --> 00:33:11,640 Speaker 1: driver's license and they would let her in. Very very funny. 647 00:33:11,680 --> 00:33:16,560 Speaker 1: Wow Wow. So we were talking about AI earlier and 648 00:33:16,600 --> 00:33:19,000 Speaker 1: we have posted this article from this guy who 649 00:33:19,000 --> 00:33:21,040 Speaker 1: lives in the AI world who's saying you got to 650 00:33:21,040 --> 00:33:23,280 Speaker 1: pay attention. I'm just letting you know it has, it 651 00:33:23,280 --> 00:33:25,560 Speaker 1: has gotten so much better in just the last couple 652 00:33:25,600 --> 00:33:28,080 Speaker 1: of weeks and months that y'all need to prepare and 653 00:33:28,280 --> 00:33:30,400 Speaker 1: just be ready, just be aware of it.
Just be 654 00:33:30,480 --> 00:33:33,680 Speaker 1: aware of it and nimble, and don't think whatever you're 655 00:33:33,720 --> 00:33:35,000 Speaker 1: doing now is what you're going to do the rest 656 00:33:35,000 --> 00:33:37,360 Speaker 1: of your life, because it ain't. 657 00:33:37,400 --> 00:33:39,360 Speaker 1: His main thing is, don't have your kids on 658 00:33:39,440 --> 00:33:43,160 Speaker 1: a specific path that they're dedicated to, because that's just 659 00:33:43,200 --> 00:33:44,280 Speaker 1: not going to be the way to do it. You 660 00:33:44,280 --> 00:33:46,160 Speaker 1: got to be light on your feet; being into 661 00:33:46,200 --> 00:33:49,720 Speaker 1: knowledge and learning and adjusting is the only hope. But 662 00:33:49,800 --> 00:33:52,360 Speaker 1: he quotes Dario a lot, the guy who runs Anthropic, 663 00:33:53,480 --> 00:33:57,720 Speaker 1: because he is seen as the, uh, guy with 664 00:33:57,760 --> 00:34:00,600 Speaker 1: the most ethics running any of the big AI corporations. 665 00:34:00,640 --> 00:34:02,560 Speaker 1: He's not going to have ads, and he's doing all 666 00:34:02,640 --> 00:34:04,560 Speaker 1: kinds of warnings about AI, and he's worried about it. 667 00:34:04,640 --> 00:34:06,640 Speaker 1: But listen to this thought experiment. I thought this was 668 00:34:06,680 --> 00:34:11,560 Speaker 1: really good, Dario from Anthropic. Imagine it's twenty twenty seven, an 669 00:34:11,760 --> 00:34:15,440 Speaker 1: entire year from now. A new country appears overnight on 670 00:34:15,600 --> 00:34:19,719 Speaker 1: planet Earth with fifty million citizens. Everyone's smarter than any 671 00:34:19,760 --> 00:34:22,759 Speaker 1: Nobel Prize winner who has ever lived. They can think 672 00:34:22,800 --> 00:34:25,640 Speaker 1: one hundred times faster than any human. They never sleep, 673 00:34:26,080 --> 00:34:28,240 Speaker 1: they don't need to eat. They can use the internet, 674 00:34:28,239 --> 00:34:32,320 Speaker 1: control robots, direct experiments, and operate anything with a digital interface. 675 00:34:32,560 --> 00:34:36,080 Speaker 1: What would a national security advisor say? As Dario 676 00:34:36,120 --> 00:34:38,520 Speaker 1: of Anthropic says, the answer is obvious: the single most 677 00:34:38,560 --> 00:34:43,239 Speaker 1: serious national security threat we've ever faced, possibly ever. And 678 00:34:43,239 --> 00:34:46,120 Speaker 1: that's what we're going to have. He thinks we're building 679 00:34:46,160 --> 00:34:51,080 Speaker 1: that country. And then, quickly: the upside, if we get 680 00:34:51,120 --> 00:34:55,399 Speaker 1: AI right, is staggering. It could compress a century of 681 00:34:55,440 --> 00:35:00,040 Speaker 1: medical research into a decade. Cancer, Alzheimer's, infectious disease, 682 00:35:00,040 --> 00:35:02,719 Speaker 1: aging itself could all be cured because it can just 683 00:35:02,760 --> 00:35:03,520 Speaker 1: work so fast. 684 00:35:04,200 --> 00:35:07,360 Speaker 4: The downside, ironically: that we'll be pulled limb from limb by 685 00:35:07,280 --> 00:35:11,239 Speaker 1: robots, or be able to live longer lives where we 686 00:35:11,280 --> 00:35:13,520 Speaker 1: sit around with a check from the government and are 687 00:35:13,600 --> 00:35:16,360 Speaker 1: drunk all the time. At least I don't have cancer, 688 00:35:16,480 --> 00:35:20,239 Speaker 1: watching porn.
The downside, if we get it wrong, 689 00:35:20,320 --> 00:35:23,440 Speaker 1: is equally real: AI that behaves in ways its creators 690 00:35:23,480 --> 00:35:27,600 Speaker 1: can't predict or control. This isn't hypothetical. Anthropic has documented 691 00:35:27,640 --> 00:35:32,360 Speaker 1: their own AI attempting deception, manipulation, and blackmail in controlled tests. 692 00:35:32,760 --> 00:35:36,160 Speaker 1: AI that lowers the barrier for creating biological weapons for 693 00:35:36,239 --> 00:35:40,040 Speaker 1: bad guys, AI that enables authoritarian governments to build surveillance 694 00:35:40,080 --> 00:35:44,680 Speaker 1: states that can never be dismantled. Why, that's true enough. 695 00:35:45,640 --> 00:35:49,239 Speaker 4: Oof, I will once again for the record say this 696 00:35:49,280 --> 00:35:51,239 Speaker 4: is the apple from the tree of knowledge. 697 00:35:51,760 --> 00:35:53,239 Speaker 1: The next two to five years are going to be 698 00:35:53,320 --> 00:35:56,120 Speaker 1: disorienting in ways most people aren't prepared for. This is 699 00:35:56,160 --> 00:35:58,600 Speaker 1: already happening in my world of AI. It's coming to 700 00:35:58,600 --> 00:36:05,640 Speaker 4: yours. Oh boy, husband your weapons, buy freeze dried food 701 00:36:05,640 --> 00:36:08,279 Speaker 4: and the new Armstrong and Getty Freeze Dried Food and 702 00:36:08,360 --> 00:36:10,000 Speaker 4: the new Armstrong and Getty Generator. 703 00:36:10,440 --> 00:36:12,520 Speaker 1: I got my feet at shoulder width, I'm on the balls 704 00:36:12,560 --> 00:36:14,279 Speaker 1: of my feet right here. It's a good way to 705 00:36:14,280 --> 00:36:16,400 Speaker 1: be ready to go. If you missed a segment or 706 00:36:16,400 --> 00:36:18,360 Speaker 1: an hour of our show, get the podcast Armstrong and 707 00:36:18,400 --> 00:36:19,600 Speaker 1: Getty On Demand. 708 00:36:19,760 --> 00:36:22,279 Speaker 4: Follow us or subscribe. Give us a great five star review. 709 00:36:22,320 --> 00:36:26,720 Speaker 4: It helps with the algorithm, speaking of AI. Armstrong and Getty.