1 00:00:01,880 --> 00:00:06,160 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio the George 2 00:00:06,200 --> 00:00:07,640 Speaker 1: Washington Broadcast Center. 3 00:00:07,800 --> 00:00:11,720 Speaker 2: Jack Armstrong and Joe Getty. I'm Strong and Getty and 4 00:00:12,440 --> 00:00:17,280 Speaker 2: he Armstrong and Getty. 5 00:00:23,600 --> 00:00:26,639 Speaker 3: In a stunning development, Canada has declared war on the 6 00:00:26,720 --> 00:00:27,400 Speaker 3: United States. 7 00:00:27,480 --> 00:00:29,720 Speaker 2: Let's go to Joe Braxton, who is live at the border. 8 00:00:29,880 --> 00:00:32,280 Speaker 2: I'm currently at the border, but there is no war. 9 00:00:32,720 --> 00:00:36,320 Speaker 1: Mom, Dad, I know this looks kind of real, but 10 00:00:36,360 --> 00:00:37,200 Speaker 1: it's all AI. 11 00:00:37,600 --> 00:00:40,879 Speaker 2: Grandpa, I'm fine, this is just AI. You don't need 12 00:00:40,960 --> 00:00:43,520 Speaker 2: to wire money to anyone. I am not in love 13 00:00:43,600 --> 00:00:46,680 Speaker 2: with you, and I do not need your money for 14 00:00:46,760 --> 00:00:47,560 Speaker 2: a plane ticket. 15 00:00:47,960 --> 00:00:48,680 Speaker 4: I am AI. 16 00:00:49,040 --> 00:00:51,800 Speaker 1: This fake moon landing footage isn't classified. 17 00:00:52,159 --> 00:00:52,760 Speaker 5: It's AI. 18 00:00:53,280 --> 00:00:57,240 Speaker 6: There could be aliens, but I'm not one of them. 19 00:00:57,640 --> 00:01:01,680 Speaker 2: I'm AI, Uncle Fred. Government has not been taken over 20 00:01:01,720 --> 00:01:04,160 Speaker 2: by lizard people. You don't need to send us five 21 00:01:04,240 --> 00:01:07,880 Speaker 2: thousand dollars in gift cards because you missed jury duty. Grandma, 22 00:01:09,240 --> 00:01:09,960 Speaker 2: it's not real. 23 00:01:10,920 --> 00:01:12,319 Speaker 5: It's AI. It's AI. 24 00:01:13,400 --> 00:01:13,800 Speaker 2: AI. 25 00:01:13,959 --> 00:01:16,199 Speaker 1: It's kind of funny, if only, but that's not the case, 26 00:01:16,240 --> 00:01:18,319 Speaker 1: as we all know, in all those instances. 27 00:01:19,080 --> 00:01:24,080 Speaker 5: Boy, got to run that on every TV service that exists, 28 00:01:24,120 --> 00:01:25,039 Speaker 5: and people see it. 29 00:01:25,360 --> 00:01:28,160 Speaker 1: Well, so I'm about to talk about this new AI 30 00:01:28,240 --> 00:01:30,280 Speaker 1: book in just a second, and one of his big 31 00:01:30,280 --> 00:01:33,480 Speaker 1: points is governments need to start regulating AI like yesterday 32 00:01:33,920 --> 00:01:37,840 Speaker 1: and coming up with some guardrails before it gets unleashed 33 00:01:37,840 --> 00:01:40,200 Speaker 1: on the world in all kinds of different ways. Luckily, 34 00:01:40,240 --> 00:01:42,760 Speaker 1: the state of California is already on it, Katie. 35 00:01:43,040 --> 00:01:46,679 Speaker 3: Yes, California lawmakers moving forward with a bill to regulate 36 00:01:46,800 --> 00:01:52,880 Speaker 3: companionship chatbots. It requires reminders every three hours that 37 00:01:52,920 --> 00:01:54,760 Speaker 3: the chatbots are not real. 38 00:01:58,160 --> 00:02:03,440 Speaker 1: So first, there's several things I like about this.
So 39 00:02:03,520 --> 00:02:07,120 Speaker 1: you have a relationship with the chatbot that's obviously already 40 00:02:07,800 --> 00:02:10,480 Speaker 1: like weird at a level of dystopian that we only 41 00:02:10,520 --> 00:02:13,600 Speaker 1: could imagine just a few years ago, and that's already happening, 42 00:02:13,639 --> 00:02:14,600 Speaker 1: And it's already happening. 43 00:02:14,639 --> 00:02:15,760 Speaker 2: And I know people who know people. 44 00:02:15,800 --> 00:02:18,040 Speaker 1: I don't know anybody who's actually doing the relationship thing, 45 00:02:18,280 --> 00:02:20,919 Speaker 1: but I know people who say they have friends who 46 00:02:21,000 --> 00:02:25,880 Speaker 1: are like in some cases like lonely old farmers, which 47 00:02:25,919 --> 00:02:28,200 Speaker 1: I find really disturbing, like the last sort of people 48 00:02:28,240 --> 00:02:31,160 Speaker 1: you would expect that would you know, get emotionally attached 49 00:02:31,160 --> 00:02:34,560 Speaker 1: to a chat bot. But anyway, it's so prevalent that 50 00:02:34,600 --> 00:02:36,919 Speaker 1: the state of California felt like they needed to put 51 00:02:36,919 --> 00:02:39,440 Speaker 1: in guardrails and step in and have a warning. Every 52 00:02:39,520 --> 00:02:42,080 Speaker 1: two hours, No two and a half hours, No four hours, 53 00:02:42,120 --> 00:02:45,280 Speaker 1: that's not long enough, Jack, Okay, three hours, we'll compromise. 54 00:02:45,639 --> 00:02:48,280 Speaker 1: Every three hours it will warn you that you're talking 55 00:02:48,280 --> 00:02:51,960 Speaker 1: to a computer, you weirdo. 56 00:02:52,280 --> 00:02:54,920 Speaker 5: So there, I am trading intimacies. Oh that reminds me 57 00:02:55,000 --> 00:02:57,119 Speaker 5: coming up later this hour how to fall in love. 58 00:02:57,240 --> 00:03:01,560 Speaker 5: Science has figured it out. Wow, I'm trading intimacies with 59 00:03:01,639 --> 00:03:06,359 Speaker 5: my sexy, sexy computer. And or I suppose a sex 60 00:03:06,360 --> 00:03:09,560 Speaker 5: bot in the future, and the State of California is 61 00:03:09,600 --> 00:03:12,440 Speaker 5: gonna say, we hate to interrupt your sexy chat with 62 00:03:12,560 --> 00:03:15,280 Speaker 5: your computer, but like to remind you it's a computer 63 00:03:15,600 --> 00:03:17,959 Speaker 5: and not a real human being. This message brought to 64 00:03:18,000 --> 00:03:21,040 Speaker 5: you by Gavin Newsom and the State of California, see 65 00:03:21,120 --> 00:03:22,360 Speaker 5: you in three hours. 66 00:03:22,160 --> 00:03:26,040 Speaker 2: And the super majority Democratic Party. I would love to have. 67 00:03:26,040 --> 00:03:28,480 Speaker 1: Been involved in that conversation for a number of reasons, 68 00:03:28,560 --> 00:03:31,080 Speaker 1: including I'm as much an expert as anybody 69 00:03:31,080 --> 00:03:34,160 Speaker 1: who made that decision, because nobody has any idea where 70 00:03:34,200 --> 00:03:36,400 Speaker 1: any of this is going, So my opinion would have 71 00:03:36,440 --> 00:03:38,760 Speaker 1: been as valid as anybody who you know, wrote that 72 00:03:38,880 --> 00:03:44,560 Speaker 1: and voted for it. But first of all, you could 73 00:03:44,600 --> 00:03:49,880 Speaker 1: start with, do we need the government jumping in and 74 00:03:49,920 --> 00:03:50,800 Speaker 1: telling us. 75 00:03:50,880 --> 00:03:53,119 Speaker 2: I know, you feel really good right now.
76 00:03:52,960 --> 00:03:57,000 Speaker 1: And you're really enjoying yourself and you're being you're getting 77 00:03:57,160 --> 00:04:00,560 Speaker 1: enjoyment at a deep, deep, human level, but we want 78 00:04:00,560 --> 00:04:02,640 Speaker 1: to tell you it's fake because we're the state and 79 00:04:02,680 --> 00:04:03,480 Speaker 1: that's our job. 80 00:04:03,880 --> 00:04:05,200 Speaker 2: I don't know what I think about that. 81 00:04:05,880 --> 00:04:08,600 Speaker 5: I mean, I think that's a great question. That's the 82 00:04:08,640 --> 00:04:12,000 Speaker 5: thirty thousand foot question. I mean, as we've seen a 83 00:04:12,040 --> 00:04:17,960 Speaker 5: steep decline in social belonging, just in general, friendships, civic organizations, 84 00:04:18,320 --> 00:04:23,840 Speaker 5: church attendance, all of the guard rails of our behavior, 85 00:04:24,640 --> 00:04:24,920 Speaker 5: or a. 86 00:04:24,880 --> 00:04:26,640 Speaker 2: Lot of them have gone away. 87 00:04:27,360 --> 00:04:29,880 Speaker 5: And so the state of California said, well, let's say 88 00:04:29,880 --> 00:04:32,640 Speaker 5: you're an atheist with no friends. We'll tell you what's 89 00:04:32,680 --> 00:04:33,520 Speaker 5: crazy and what's not. 90 00:04:34,040 --> 00:04:35,480 Speaker 2: I don't want the government doing that. 91 00:04:35,560 --> 00:04:39,040 Speaker 1: Well why wouldn't the So the government doesn't announce, you know, 92 00:04:39,120 --> 00:04:41,400 Speaker 1: every three hours at a strip club, the music stops, 93 00:04:41,440 --> 00:04:43,960 Speaker 1: the girls stop dancing, and announcement from the state comes up. 94 00:04:44,240 --> 00:04:46,240 Speaker 1: These girls do not actually like you. They are going 95 00:04:46,279 --> 00:04:48,039 Speaker 1: to pretend to love you, but they are getting paid 96 00:04:48,080 --> 00:04:50,360 Speaker 1: to do that. Now, back to the hot chicks dancing. 97 00:04:50,640 --> 00:04:53,280 Speaker 1: I mean, right, the state doesn't step in to let 98 00:04:53,360 --> 00:04:55,159 Speaker 1: you know this is phony emotion. 99 00:04:55,960 --> 00:04:58,640 Speaker 5: Right before we begin the fourth quarter of the forty 100 00:04:58,720 --> 00:05:00,640 Speaker 5: nine ers game, we'd like to re mind you that 101 00:05:00,720 --> 00:05:04,440 Speaker 5: these young gentlemen are actually rented millionaires and couldn't give 102 00:05:04,480 --> 00:05:06,960 Speaker 5: a crap about the Bay Area beyond finding a nice 103 00:05:07,000 --> 00:05:09,960 Speaker 5: place to live. They'll be gone the moment there contract 104 00:05:10,000 --> 00:05:12,440 Speaker 5: has ended. Now back to the game, or. 105 00:05:12,400 --> 00:05:14,479 Speaker 1: At the end of a lottery commercial that the state 106 00:05:14,600 --> 00:05:17,919 Speaker 1: benefits from very few people, We'd like to remind you 107 00:05:18,040 --> 00:05:20,480 Speaker 1: very few people actually win, and studies show that people 108 00:05:20,480 --> 00:05:22,560 Speaker 1: who do win end up less happy than before. 109 00:05:22,800 --> 00:05:25,360 Speaker 2: I mean, when is it the state's role to jump 110 00:05:25,400 --> 00:05:26,640 Speaker 2: in on this? So there's that. 111 00:05:27,320 --> 00:05:29,560 Speaker 1: Then the other part, if you're going to accept that 112 00:05:29,600 --> 00:05:32,400 Speaker 1: they should, how did they come up with three hours? 113 00:05:32,440 --> 00:05:34,800 Speaker 1: I would have loved to have heard that back and forth. 
114 00:05:34,960 --> 00:05:36,240 Speaker 2: I mean, that's hilarious. 115 00:05:37,200 --> 00:05:40,159 Speaker 5: I mean, you don't want to interrupt people's intimacies with 116 00:05:40,200 --> 00:05:43,000 Speaker 5: their computers too often because that'll be obnoxious. 117 00:05:43,480 --> 00:05:45,400 Speaker 1: But if you wait five hours, they'll be so hooked 118 00:05:45,400 --> 00:05:47,279 Speaker 1: there's no turning them around or something. 119 00:05:47,360 --> 00:05:49,160 Speaker 5: Yes, there's a sweet spot here. What do you think, 120 00:05:49,240 --> 00:05:52,600 Speaker 5: Jim? Three. Okay, you know, there's part of me that 121 00:05:52,760 --> 00:05:57,080 Speaker 5: thinks I don't really think this, but I've had this 122 00:05:57,120 --> 00:05:59,200 Speaker 5: thought pop into my head that the state of California 123 00:05:59,279 --> 00:06:03,239 Speaker 5: is Trumpian in that it proposes all this ridiculous crap 124 00:06:03,880 --> 00:06:07,200 Speaker 5: to distract from the fact that now the bullet train 125 00:06:07,320 --> 00:06:11,200 Speaker 5: and the tension time I'm sorry, the pension time bomb, 126 00:06:11,279 --> 00:06:13,960 Speaker 5: and you know, a dozen of the soon to be 127 00:06:14,160 --> 00:06:17,279 Speaker 5: incredibly even more expensive gas prices and the rest of it. 128 00:06:17,320 --> 00:06:17,720 Speaker 2: I don't know. 129 00:06:18,600 --> 00:06:22,680 Speaker 1: So I'm reading another AI book. I'm fascinated by the 130 00:06:22,680 --> 00:06:25,000 Speaker 1: whole thing. I think everybody should be. I think it's 131 00:06:25,040 --> 00:06:28,920 Speaker 1: going to be a giant deal. But I do need 132 00:06:28,960 --> 00:06:31,360 Speaker 1: to read my next book. I swore last night it has 133 00:06:31,400 --> 00:06:34,680 Speaker 1: got to be one that is from the other side. And 134 00:06:34,680 --> 00:06:36,440 Speaker 1: there are a lot of people on the other side 135 00:06:36,440 --> 00:06:38,840 Speaker 1: that say it's not going to be that devastating to 136 00:06:39,000 --> 00:06:41,320 Speaker 1: mankind or society. So I need to read one of 137 00:06:41,320 --> 00:06:45,039 Speaker 1: those books next, because I've been leaning toward the people 138 00:06:45,040 --> 00:06:46,640 Speaker 1: who think it's going to be a big deal. But anyway, 139 00:06:46,680 --> 00:06:51,120 Speaker 1: here's the review of this book from Bill Gates. Guy 140 00:06:51,160 --> 00:06:53,080 Speaker 1: who ran Microsoft, used to be the richest man in the 141 00:06:53,080 --> 00:06:55,599 Speaker 1: world for a very very long time and spent a 142 00:06:55,640 --> 00:06:58,719 Speaker 1: ton of, is currently spending a ton of money on 143 00:06:58,839 --> 00:07:03,360 Speaker 1: his own AI, you know, creation that can compete against 144 00:07:03,400 --> 00:07:06,000 Speaker 1: ChatGPT and Grok and all the others. Here's a 145 00:07:06,040 --> 00:07:07,839 Speaker 1: review of the book in which he mentions the book. 146 00:07:08,200 --> 00:07:11,240 Speaker 1: When people ask me about artificial intelligence, their questions often 147 00:07:11,240 --> 00:07:13,520 Speaker 1: boil down to this, what should I be worried about? 148 00:07:13,520 --> 00:07:16,040 Speaker 1: And how worried should I be? For the past year.
149 00:07:16,080 --> 00:07:17,760 Speaker 1: This book came out a year and a half ago, 150 00:07:19,280 --> 00:07:21,960 Speaker 1: which unfortunately is a little dated by AI standards, But 151 00:07:22,640 --> 00:07:25,400 Speaker 1: this review is from December, so the review is only 152 00:07:25,480 --> 00:07:28,400 Speaker 1: seven months old. For the past year, I've responded by 153 00:07:28,400 --> 00:07:32,160 Speaker 1: telling them to read The Coming Wave by Mustafa Suleyman. 154 00:07:33,080 --> 00:07:35,680 Speaker 1: It's the book I recommend more than any other 155 00:07:35,960 --> 00:07:38,840 Speaker 1: on AI to heads of state, business leaders, or anyone 156 00:07:38,920 --> 00:07:41,880 Speaker 1: else who asks, because it offers something rare, a clear 157 00:07:41,920 --> 00:07:45,600 Speaker 1: eyed view of both the extraordinary opportunities and genuine risks ahead. 158 00:07:45,840 --> 00:07:49,400 Speaker 1: And I'm two chapters in and it's already fantastic. What 159 00:07:49,520 --> 00:07:51,960 Speaker 1: sets this, uh, blah blah blah blah. What sets 160 00:07:51,960 --> 00:07:55,000 Speaker 1: this book apart from the others is Mustafa's insight that 161 00:07:55,120 --> 00:07:59,600 Speaker 1: AI is only one part of an unprecedented convergence of 162 00:07:59,680 --> 00:08:03,960 Speaker 1: scientific breakthroughs happening at the same time. Gene editing, 163 00:08:04,520 --> 00:08:09,560 Speaker 1: DNA synthesis, and other advances in biotechnology are racing forward 164 00:08:09,800 --> 00:08:13,640 Speaker 1: in parallel with AI. As the title suggests, these changes 165 00:08:13,680 --> 00:08:17,000 Speaker 1: are building like a wave far out at sea, invisible 166 00:08:17,480 --> 00:08:21,000 Speaker 1: to many, but gathering force. Each one of these individual 167 00:08:21,080 --> 00:08:25,840 Speaker 1: things could be game changing for mankind on its own. Obviously, 168 00:08:25,960 --> 00:08:28,680 Speaker 1: this is supposed to be reassuring me? No, this is 169 00:08:28,720 --> 00:08:35,679 Speaker 1: a scary book. Any of those obviously, gene editing, DNA synthesis, 170 00:08:35,720 --> 00:08:38,800 Speaker 1: whatever that is, advances in biotechnology where you start screwing 171 00:08:38,840 --> 00:08:41,040 Speaker 1: with food and cattle and all kinds of different sorts 172 00:08:41,080 --> 00:08:44,160 Speaker 1: of things. Any of these could be life altering for Earth, 173 00:08:45,240 --> 00:08:48,720 Speaker 1: each of them game changing. Together with AI, they're poised 174 00:08:48,720 --> 00:08:53,720 Speaker 1: to reshape every aspect of society. And yeah, this is 175 00:08:53,760 --> 00:08:56,760 Speaker 1: the first book where he's combined, Hey, AI 176 00:08:57,080 --> 00:08:59,319 Speaker 1: isn't the only thing out there. It's all these other things, 177 00:08:59,360 --> 00:09:02,640 Speaker 1: and it's gonna combine with AI to where countries 178 00:09:02,760 --> 00:09:06,640 Speaker 1: or non state actors can alter crops. 179 00:09:06,280 --> 00:09:07,360 Speaker 2: Maybe in ways that are great. 180 00:09:07,400 --> 00:09:10,000 Speaker 1: They don't need much water but provide as much wheat, 181 00:09:10,440 --> 00:09:12,320 Speaker 1: you know, twice as much wheat on half the water 182 00:09:12,480 --> 00:09:13,839 Speaker 1: or whatever could be fantastic. 183 00:09:14,000 --> 00:09:15,520 Speaker 2: But also Molio will. 184 00:09:15,320 --> 00:09:17,199 Speaker 5: Soon be dealing with an obesity problem.
185 00:09:17,960 --> 00:09:21,640 Speaker 1: But he also could develop some sort of a germ 186 00:09:21,679 --> 00:09:22,800 Speaker 1: that will kill off wheat. 187 00:09:22,880 --> 00:09:24,199 Speaker 2: And you, you know, let it. 188 00:09:24,200 --> 00:09:28,160 Speaker 1: loose on Kansas and Nebraska if you're the Chinese or whatever. 189 00:09:28,240 --> 00:09:31,000 Speaker 5: Yeah, if anybody needs me, I'll be in the fetal position. 190 00:09:33,080 --> 00:09:35,040 Speaker 1: Oh God, you get into gene editing and all that 191 00:09:35,080 --> 00:09:38,200 Speaker 1: sort of stuff. Just it boggles the mind. I can't 192 00:09:38,200 --> 00:09:40,880 Speaker 1: wait to get further along in this book. And he 193 00:09:41,679 --> 00:09:42,920 Speaker 1: so he started. 194 00:09:43,000 --> 00:09:45,040 Speaker 2: I forget which company it was, but it's. 195 00:09:44,880 --> 00:09:48,480 Speaker 1: The one that got bought by Microsoft and Bill Gates 196 00:09:48,480 --> 00:09:51,720 Speaker 1: and he works, he had been working for him, but 197 00:09:51,960 --> 00:09:55,480 Speaker 1: he talks many times, and Bill Gates actually says his 198 00:09:55,559 --> 00:09:58,760 Speaker 1: review similar. Look, I'm an optimist. I've always been an 199 00:09:58,760 --> 00:10:03,840 Speaker 1: optimist about technology. I've always believed that its positives outweigh 200 00:10:03,880 --> 00:10:06,640 Speaker 1: its negatives blah blah blah. But not in this case. 201 00:10:06,679 --> 00:10:09,800 Speaker 1: It's basically what he says, And he said, I hope 202 00:10:09,840 --> 00:10:14,079 Speaker 1: I'm wrong. I'd love nothing more than to have people look 203 00:10:14,160 --> 00:10:15,880 Speaker 1: back on this book years from now and say I was 204 00:10:15,960 --> 00:10:16,760 Speaker 1: completely wrong. 205 00:10:16,840 --> 00:10:17,640 Speaker 2: But I don't think so. 206 00:10:19,480 --> 00:10:25,679 Speaker 5: Yeah, I feel like we're giving chimpanzees handguns. I just 207 00:10:25,840 --> 00:10:28,959 Speaker 5: I think it is a tool that we cannot handle. 208 00:10:29,080 --> 00:10:34,240 Speaker 1: I think it's probably significantly worse than that. Yeah, I mean, 209 00:10:35,960 --> 00:10:38,360 Speaker 1: so I did adopt this. I heard Charlie Cooke from 210 00:10:38,480 --> 00:10:43,599 Speaker 1: National Review say this the other day, why he embraces 211 00:10:43,640 --> 00:10:48,000 Speaker 1: technology as opposed to rejecting it. I'm a guy who's 212 00:10:48,000 --> 00:10:50,560 Speaker 1: always rejected technology. I may have changed my mind based 213 00:10:50,600 --> 00:10:52,559 Speaker 1: on what he just said, on what he said the 214 00:10:52,600 --> 00:10:55,000 Speaker 1: other day. He said, if it's gonna happen anyway, you 215 00:10:55,080 --> 00:10:57,120 Speaker 1: might as well embrace it. There's no point in sitting 216 00:10:57,160 --> 00:11:00,760 Speaker 1: around and complaining you wish the internet had never happened, 217 00:11:01,120 --> 00:11:02,720 Speaker 1: and talk about all the ways that life would be 218 00:11:02,720 --> 00:11:05,440 Speaker 1: better if it didn't. It did, and everybody's got it, 219 00:11:05,480 --> 00:11:06,679 Speaker 1: and the same thing is with AI. 220 00:11:06,800 --> 00:11:07,320 Speaker 2: It's coming. 221 00:11:07,920 --> 00:11:09,880 Speaker 1: Non state actors are gonna have it, China's gonna have it, 222 00:11:09,920 --> 00:11:12,680 Speaker 1: everybody's gonna have it.
So the I wish this wouldn't happen, 223 00:11:12,800 --> 00:11:14,480 Speaker 1: or it'd be better if it didn't, or sticking your 224 00:11:14,520 --> 00:11:15,920 Speaker 1: head in the sand and pretending it's. 225 00:11:15,800 --> 00:11:17,320 Speaker 2: Not doesn't do any good. 226 00:11:18,679 --> 00:11:21,560 Speaker 5: No, there's no benefit in that whatsoever. So yeah, I 227 00:11:21,600 --> 00:11:25,320 Speaker 5: think those are two different cases. I absolutely see the point. 228 00:11:25,520 --> 00:11:27,640 Speaker 5: I think a lot of the times when I am 229 00:11:27,880 --> 00:11:31,840 Speaker 5: complaining about the effects of the Internet, it's not some 230 00:11:31,920 --> 00:11:36,280 Speaker 5: sort of denial that it's real, or it's an affirmative 231 00:11:36,320 --> 00:11:39,360 Speaker 5: statement of values that I think are more important or 232 00:11:39,760 --> 00:11:44,000 Speaker 5: activities that are more important and healthy than being on 233 00:11:44,160 --> 00:11:48,840 Speaker 5: the internet. So I mean, I don't I do bemoan 234 00:11:48,880 --> 00:11:52,000 Speaker 5: the fact that it exists, but it's it's a tangent 235 00:11:52,080 --> 00:11:55,760 Speaker 5: to the fact that, look, it is an it's entertainment 236 00:11:56,280 --> 00:12:00,719 Speaker 5: and information, but don't don't spend too much of your 237 00:12:00,760 --> 00:12:02,439 Speaker 5: life on it or it will ruin you. 238 00:12:02,480 --> 00:12:05,319 Speaker 1: Do you think life define life however you want, society, 239 00:12:05,440 --> 00:12:09,240 Speaker 1: the country, whatever, is better because of the Internet or worse? 240 00:12:12,120 --> 00:12:14,760 Speaker 5: Uh, in complete grade. 241 00:12:14,840 --> 00:12:18,240 Speaker 2: I think worse. I mean, I know, I think I'm 242 00:12:18,280 --> 00:12:19,160 Speaker 2: a minority on that. 243 00:12:19,280 --> 00:12:25,360 Speaker 5: But no, no, I it's it's obviously a complicated picture. 244 00:12:25,440 --> 00:12:29,000 Speaker 5: But the question you have to answer before you answer 245 00:12:29,160 --> 00:12:34,200 Speaker 5: that question is by what standard are we gonna decide this? 246 00:12:34,720 --> 00:12:36,160 Speaker 2: People? Is are a measure? 247 00:12:36,559 --> 00:12:38,559 Speaker 1: Oh, I think the only measure. It's like the measure 248 00:12:38,559 --> 00:12:39,840 Speaker 1: you should have for the government. 249 00:12:39,880 --> 00:12:40,760 Speaker 2: And Paul's. 250 00:12:42,480 --> 00:12:46,640 Speaker 1: What's the Thomas Jefferson phrase? You know, pursuit of happiness. Yeah, 251 00:12:46,920 --> 00:12:48,960 Speaker 1: I think it's lessened our pursuit of happiness. 252 00:12:49,040 --> 00:12:52,760 Speaker 5: I think it's our people happy, yeah, with their lives 253 00:12:53,040 --> 00:12:57,120 Speaker 5: and yeah, oh that is absolutely undeniable. Yeah, people are 254 00:12:57,160 --> 00:12:59,560 Speaker 5: less happy. I think it's the fruit of the tree 255 00:12:59,559 --> 00:13:02,160 Speaker 5: of knowledge. Yeah, whenever, bona genesis, whenever. 256 00:13:02,240 --> 00:13:05,720 Speaker 1: I actually have actually haven't debated anybody on this, but 257 00:13:05,800 --> 00:13:07,600 Speaker 1: when I hear people debating on this, they always get 258 00:13:07,640 --> 00:13:10,559 Speaker 1: the productivity and you know how much better email is 259 00:13:10,600 --> 00:13:13,480 Speaker 1: and snail out? Okay, fine, great? 
Are people happier or 260 00:13:13,559 --> 00:13:15,760 Speaker 1: less happy since the internet occurred? Uh? 261 00:13:16,120 --> 00:13:19,120 Speaker 2: Less happy? So what difference does it make? 262 00:13:19,840 --> 00:13:21,760 Speaker 5: Right, You've got to design, what the decide what the 263 00:13:21,760 --> 00:13:25,160 Speaker 5: bottom line is. The interesting slash troubling part of this, 264 00:13:25,200 --> 00:13:26,520 Speaker 5: and then we need to wrap it up is that 265 00:13:26,559 --> 00:13:29,360 Speaker 5: the bottom line for the people in charge of this 266 00:13:29,520 --> 00:13:30,439 Speaker 5: is the bottom line. 267 00:13:30,520 --> 00:13:30,640 Speaker 2: Right. 268 00:13:30,760 --> 00:13:33,719 Speaker 5: They couldn't give a single crap about human happiness or 269 00:13:34,160 --> 00:13:36,480 Speaker 5: children's anxiety or suicide or the rest of it. Some 270 00:13:36,559 --> 00:13:38,959 Speaker 5: of them can on an individual basis, but it's about 271 00:13:38,960 --> 00:13:39,640 Speaker 5: making money. 272 00:13:39,840 --> 00:13:42,800 Speaker 1: Quick reminder that chat bought you're having a romantic talk 273 00:13:42,840 --> 00:13:44,000 Speaker 1: with is not real. 274 00:13:44,840 --> 00:13:45,880 Speaker 2: Thanks Uncle Gavin. 275 00:13:46,360 --> 00:13:48,480 Speaker 1: Interested in your thoughts on any of this text line 276 00:13:48,480 --> 00:13:51,040 Speaker 1: four one five two nine five k FTZ. 277 00:13:54,880 --> 00:13:59,560 Speaker 4: Gladimir Putin, according to President Trump, vowing retaliation after a 278 00:13:59,640 --> 00:14:03,280 Speaker 4: one hour phone call between the leaders, President Trump posting 279 00:14:03,520 --> 00:14:06,880 Speaker 4: Putin did say and very strongly that he will have 280 00:14:06,960 --> 00:14:10,480 Speaker 4: to respond to the recent attack and acknowledging it was 281 00:14:10,720 --> 00:14:14,000 Speaker 4: not a conversation that will lead to immediate peace. 282 00:14:14,920 --> 00:14:17,760 Speaker 1: And then what did you say to him, Donald Trump, 283 00:14:17,840 --> 00:14:20,880 Speaker 1: since the ball is in your court on this entire war, 284 00:14:22,280 --> 00:14:25,760 Speaker 1: did you say, we're putting together a package of sanctions 285 00:14:25,760 --> 00:14:28,120 Speaker 1: that will devastate you. So I think maybe it'd be 286 00:14:28,240 --> 00:14:32,280 Speaker 1: better if you didn't or anything that leads even within 287 00:14:32,360 --> 00:14:35,240 Speaker 1: a million miles of a threat or pushback on the 288 00:14:35,280 --> 00:14:38,040 Speaker 1: side of Ukraine. 289 00:14:37,200 --> 00:14:41,000 Speaker 5: It will not lead to immediate peace. Yeah, he's going 290 00:14:41,040 --> 00:14:44,560 Speaker 5: with there's no interest in piece whatsoever. It's it's these guys. 291 00:14:44,680 --> 00:14:46,400 Speaker 5: Neither one of them will actually want piece. They hate 292 00:14:46,400 --> 00:14:47,520 Speaker 5: each other. So what are you going to do? 293 00:14:47,960 --> 00:14:48,080 Speaker 3: Well? 294 00:14:48,080 --> 00:14:49,720 Speaker 1: Are you gonna be on one side or the other? 295 00:14:49,880 --> 00:14:51,760 Speaker 1: Is one option? I know for a lot of you 296 00:14:51,840 --> 00:14:53,840 Speaker 1: think stay out of it is the option. I don't 297 00:14:53,840 --> 00:14:55,240 Speaker 1: think so. 298 00:14:55,240 --> 00:14:57,600 Speaker 5: So. 
The Wall Street Journal Editorial Board wants to know 299 00:14:57,640 --> 00:15:00,240 Speaker 5: if President Trump, when will he finally take no for 300 00:15:00,320 --> 00:15:06,040 Speaker 5: an answer. Senior Kremlin official Dmitry Medvedev, also a frequent 301 00:15:06,120 --> 00:15:11,080 Speaker 5: mouthpiece for Putin, talking about the so called peace talks quote, 302 00:15:11,080 --> 00:15:14,560 Speaker 5: the negotiations in Istanbul are not aimed at compromise peace 303 00:15:14,680 --> 00:15:18,520 Speaker 5: based on someone else's delusional terms. The goal is our 304 00:15:18,640 --> 00:15:22,040 Speaker 5: swift victory and complete destruction of the neo Nazi regime. 305 00:15:22,720 --> 00:15:24,960 Speaker 2: Right. That's one of the written. 306 00:15:27,200 --> 00:15:31,360 Speaker 1: pieces for the so called ceasefire that have to be met, 307 00:15:31,640 --> 00:15:34,560 Speaker 1: the elimination of the Nazis. Okay, fine, all the Nazis 308 00:15:34,600 --> 00:15:36,480 Speaker 1: are gone, because what the hell does that mean? But 309 00:15:36,720 --> 00:15:39,520 Speaker 1: you have to make Russian the official language in Ukraine 310 00:15:39,560 --> 00:15:42,440 Speaker 1: is one of their demands. For instance, they pretend they 311 00:15:42,480 --> 00:15:45,160 Speaker 1: didn't take any children, so Ukraine wanting kids back. 312 00:15:45,160 --> 00:15:46,760 Speaker 2: We didn't take any kids. What are you talking about? 313 00:15:46,800 --> 00:15:51,520 Speaker 5: So that's not part of the deal, right, right, Ah, 314 00:15:51,640 --> 00:15:54,640 Speaker 5: it's clear that Vlad Putin, It's pretty clear to me anyway. 315 00:15:54,920 --> 00:15:55,280 Speaker 2: And the. 316 00:15:57,080 --> 00:16:03,880 Speaker 5: Mullahs in Iran have decided. Trump is so proud of 317 00:16:03,920 --> 00:16:08,400 Speaker 5: his ability to make deals that we're just gonna keep 318 00:16:08,440 --> 00:16:11,920 Speaker 5: telling him there's a deal out there, but never come 319 00:16:11,960 --> 00:16:15,760 Speaker 5: to one. That's what it seems like to me. I mean, 320 00:16:15,800 --> 00:16:18,680 Speaker 5: because the talks with Iran are going nowhere. There have 321 00:16:18,680 --> 00:16:20,400 Speaker 5: been a couple announcements from the White House that we're 322 00:16:20,440 --> 00:16:25,240 Speaker 5: getting close. We're getting close. They're still saying uranium enrichment 323 00:16:25,360 --> 00:16:28,280 Speaker 5: is the key to our nuclear program. The the Ayatollah 324 00:16:28,440 --> 00:16:31,240 Speaker 5: himself said that in a televised speech, the rude and 325 00:16:31,320 --> 00:16:34,080 Speaker 5: arrogant leaders of America repeatedly demanded, we should not have 326 00:16:34,120 --> 00:16:36,560 Speaker 5: a nuclear program. Who are you to decide whether Iran 327 00:16:36,600 --> 00:16:40,560 Speaker 5: should have enrichment? Well, the answer is that everybody knows 328 00:16:40,560 --> 00:16:42,840 Speaker 5: enrichment's the path to a nuclear bomb. You don't need it 329 00:16:42,880 --> 00:16:48,240 Speaker 5: for conventional, you know, energy programs. But they 330 00:16:48,320 --> 00:16:52,440 Speaker 5: keep sending Steve Witkoff to negotiate. I just I don't 331 00:16:52,440 --> 00:16:53,960 Speaker 5: think there are deals to be made here. 332 00:16:54,520 --> 00:16:58,000 Speaker 1: We may have taken a step toward the greatest whizzing 333 00:16:58,160 --> 00:17:00,359 Speaker 1: contest in world history.
334 00:17:00,960 --> 00:17:03,960 Speaker 5: Oh my, plus science tells you how to fall in love. 335 00:17:04,240 --> 00:17:09,359 Speaker 5: Oh that's nice love and whizzing coming up? You? 336 00:17:10,160 --> 00:17:12,400 Speaker 2: Ah, what are you? Peep? You brought it, don't you 337 00:17:12,480 --> 00:17:14,040 Speaker 2: Ooh to me? You brought it up. 338 00:17:17,320 --> 00:17:21,240 Speaker 7: Armstrong and Geeddy, Elon and I left on a great note. 339 00:17:21,240 --> 00:17:24,600 Speaker 7: We were texting one another, you know, happy texts. You know, Monday, 340 00:17:24,960 --> 00:17:27,640 Speaker 7: and then and then yesterday, you know, twenty four hours 341 00:17:27,720 --> 00:17:30,080 Speaker 7: later he doesn't want eighty and he comes out and 342 00:17:30,080 --> 00:17:32,360 Speaker 7: opposed the bill. And it surprised me. Frankly, I think 343 00:17:32,400 --> 00:17:35,520 Speaker 7: he's flat wrong. I think he's he's way off on this, 344 00:17:35,560 --> 00:17:37,439 Speaker 7: and I've told him as much, and I've said it 345 00:17:37,440 --> 00:17:40,359 Speaker 7: publicly and privately. I'm very consistent in that. But am 346 00:17:40,400 --> 00:17:42,520 Speaker 7: I concerned about the effective this on the midterms? 347 00:17:42,520 --> 00:17:42,760 Speaker 4: I'm not. 348 00:17:42,880 --> 00:17:44,760 Speaker 7: Let me tell you why, because when the big beautiful 349 00:17:44,760 --> 00:17:47,280 Speaker 7: bill is done and signed in a law, every single 350 00:17:47,320 --> 00:17:48,440 Speaker 7: American is going to do better. 351 00:17:48,520 --> 00:17:51,320 Speaker 1: Yeah, we'll see that Speaker Johnson saying, Hey, I'm friends 352 00:17:51,320 --> 00:17:54,160 Speaker 1: with Elon, but I think he's wrong. So it started 353 00:17:54,720 --> 00:17:58,359 Speaker 1: quite tepid, the Elon backing away, and I thought it 354 00:17:58,400 --> 00:17:58,920 Speaker 1: was way. 355 00:17:58,720 --> 00:18:00,800 Speaker 5: Over disagreeing, very gentleman. 356 00:18:00,800 --> 00:18:02,600 Speaker 1: I thought it was way over blown. Last week, I 357 00:18:02,640 --> 00:18:06,520 Speaker 1: the guy was wrong or or I was right, but 358 00:18:06,600 --> 00:18:08,600 Speaker 1: the people who guessed where it was growing were right 359 00:18:08,680 --> 00:18:09,640 Speaker 1: were right. AnyWho. 360 00:18:10,119 --> 00:18:12,399 Speaker 2: So elon the last two days he's amped it up. 361 00:18:12,440 --> 00:18:12,840 Speaker 2: Each day. 362 00:18:12,880 --> 00:18:16,200 Speaker 1: He went from forty eight hours ago from what the 363 00:18:16,560 --> 00:18:20,719 Speaker 1: hellacious abomination or whatever he called it, disgusting abomination. 364 00:18:20,880 --> 00:18:23,800 Speaker 2: Then to yesterday kill the bill. 365 00:18:23,880 --> 00:18:26,840 Speaker 1: Kill the bill all day long, urging Republican senators to 366 00:18:26,960 --> 00:18:32,520 Speaker 1: vote against the bill. Now, Donald Trump, sitting with the 367 00:18:32,640 --> 00:18:35,920 Speaker 1: leader of Germany and the Oval Office, ask about the relationship, 368 00:18:36,000 --> 00:18:37,720 Speaker 1: said this, I've. 369 00:18:37,520 --> 00:18:41,119 Speaker 8: Always liked Elon, and yeah, I can understand why he's upset. 370 00:18:41,359 --> 00:18:44,439 Speaker 8: Remember he was here for a long time. He saw 371 00:18:44,600 --> 00:18:47,879 Speaker 8: a man who was very happy when he stood behind 372 00:18:47,920 --> 00:18:51,800 Speaker 8: the Oval desk, and even with the black guy. 
I said, 373 00:18:51,800 --> 00:18:53,680 Speaker 8: do you want a little makeup, We'll get you little Mecca. 374 00:18:54,320 --> 00:18:57,879 Speaker 8: But he said no, I don't think so, which is interesting. 375 00:18:58,400 --> 00:18:59,960 Speaker 2: And very nice. He wants to be. 376 00:19:00,080 --> 00:19:02,160 Speaker 8: Who he is, so you could make that stay so too. 377 00:19:02,200 --> 00:19:04,840 Speaker 1: I guess he said that's enough. So he said I 378 00:19:05,200 --> 00:19:07,520 Speaker 1: Elon and I had a great relationship. I don't know 379 00:19:07,560 --> 00:19:12,680 Speaker 1: if we will anymore, which is I've been following Trump. 380 00:19:12,760 --> 00:19:14,800 Speaker 1: We've been following Trump for a long time. That's the 381 00:19:14,840 --> 00:19:18,240 Speaker 1: first part of the turn before you go full on 382 00:19:18,680 --> 00:19:19,600 Speaker 1: scorched earth. 383 00:19:20,119 --> 00:19:21,440 Speaker 2: I don't know if we will anymore. 384 00:19:21,680 --> 00:19:25,320 Speaker 1: He hasn't said anything bad about me personally, which we 385 00:19:25,440 --> 00:19:29,840 Speaker 1: all know that is the line you cross, but I'm 386 00:19:29,880 --> 00:19:31,200 Speaker 1: sure that'll be next. 387 00:19:31,240 --> 00:19:32,360 Speaker 2: I'm very disappointed. 388 00:19:32,400 --> 00:19:35,680 Speaker 1: I've helped Elon and not a lot if Elon makes 389 00:19:35,680 --> 00:19:38,760 Speaker 1: any comment, which he will as again, Wall Street Journal 390 00:19:38,800 --> 00:19:41,840 Speaker 1: called these two of the most powerful men on planet Earth. 391 00:19:42,880 --> 00:19:45,240 Speaker 1: They both have the ability. I mean, they've done it 392 00:19:45,280 --> 00:19:47,439 Speaker 1: in the past with a lot of high level people. 393 00:19:47,920 --> 00:19:51,600 Speaker 1: They both have the ability to go farther than anybody 394 00:19:51,680 --> 00:19:54,119 Speaker 1: I've ever known in my life would ever go in 395 00:19:54,200 --> 00:19:55,800 Speaker 1: terms of insulting somebody. 396 00:19:56,160 --> 00:19:59,160 Speaker 5: Well, they are both incredibly powerful, but they are both 397 00:19:59,240 --> 00:20:02,240 Speaker 5: incredibly under disciplined. Yeah, I mean, I can I can 398 00:20:02,320 --> 00:20:04,640 Speaker 5: say plenty of people are the most people of powerful 399 00:20:04,640 --> 00:20:07,560 Speaker 5: people on Earth who would not utter a syllable that 400 00:20:07,600 --> 00:20:09,840 Speaker 5: they had not carefully considered. 401 00:20:10,440 --> 00:20:16,080 Speaker 1: Or would never personally attack someone someone even an opponent, 402 00:20:16,200 --> 00:20:19,000 Speaker 1: let alone of somebody on your side. But we've seen 403 00:20:19,040 --> 00:20:20,560 Speaker 1: Trump do it over and over and over again, and 404 00:20:20,600 --> 00:20:23,240 Speaker 1: Elon is quite undisciplined in this. I think there's a 405 00:20:23,359 --> 00:20:26,800 Speaker 1: chance this blows up into the biggest story of all 406 00:20:27,000 --> 00:20:32,240 Speaker 1: I mean, I could imagine some crazy, crazy stuff coming 407 00:20:32,280 --> 00:20:35,480 Speaker 1: out of posts between Trump and Elon by the end 408 00:20:35,520 --> 00:20:37,080 Speaker 1: of the day or certainly by the end of the week. 409 00:20:37,720 --> 00:20:40,359 Speaker 5: Right right, His rockets. 410 00:20:39,960 --> 00:20:42,120 Speaker 1: Blow up all the time. 
He's not a very good engineer, 411 00:20:42,119 --> 00:20:45,280 Speaker 1: I've been told, you know. And then just just it 412 00:20:45,400 --> 00:20:46,440 Speaker 1: explodes from there. 413 00:20:48,280 --> 00:20:51,879 Speaker 5: The other day, Trump was blasting the head of the 414 00:20:51,920 --> 00:20:56,800 Speaker 5: Federalist Society, which recommended Kavanaugh and Amy, Connie Barrett and 415 00:20:56,880 --> 00:21:01,280 Speaker 5: Neil Gorse, that you've been absolutely wonderful justices because now 416 00:21:01,359 --> 00:21:05,120 Speaker 5: some of their appointees have been upholding the Constitution, telling 417 00:21:05,200 --> 00:21:07,920 Speaker 5: Trump no to some of his plans. That's my point 418 00:21:07,960 --> 00:21:10,879 Speaker 5: of view, anyway, And Trump said, of I can't remember 419 00:21:10,920 --> 00:21:15,199 Speaker 5: the guy's name, but he said, was it derelict or 420 00:21:16,240 --> 00:21:18,000 Speaker 5: scumbag or reprobate? 421 00:21:18,080 --> 00:21:19,720 Speaker 2: It's one of those words. 422 00:21:20,280 --> 00:21:22,760 Speaker 5: He said, he's a reprobate and he probably doesn't even 423 00:21:22,800 --> 00:21:26,119 Speaker 5: love America. Right, You're gonna get the head of the Federalists, 424 00:21:26,200 --> 00:21:28,520 Speaker 5: the co founder of the Federalists. You're gonna get that 425 00:21:29,280 --> 00:21:31,639 Speaker 5: from Trump toward Elon at some point, and Elon is 426 00:21:31,760 --> 00:21:34,480 Speaker 5: not is going to say something sober. Yes, Michael, I'm 427 00:21:34,480 --> 00:21:36,720 Speaker 5: picturing Trump saying, who names their child? X? 428 00:21:36,960 --> 00:21:38,720 Speaker 2: Oh yeah, I could be all kinds. 429 00:21:38,520 --> 00:21:41,840 Speaker 1: Of hell, hiy fair criticism. I heard he has a 430 00:21:41,920 --> 00:21:44,560 Speaker 1: drug problem, whatever the hell? And then Elon will not 431 00:21:44,600 --> 00:21:48,560 Speaker 1: hold back tall. So the George Carlin of me in 432 00:21:48,680 --> 00:21:52,080 Speaker 1: me that just watches the world and is amused, is 433 00:21:52,160 --> 00:21:54,399 Speaker 1: really looking forward to it the I want us to 434 00:21:54,400 --> 00:21:57,240 Speaker 1: be a successful nation. Part of me is not so excited. 435 00:21:59,119 --> 00:22:01,399 Speaker 5: Yeah, I've got a bit of a feeling of sad 436 00:22:01,440 --> 00:22:07,160 Speaker 5: resignation on my shoulders at this point, like to hear 437 00:22:07,160 --> 00:22:10,240 Speaker 5: it described. So I would like to apologize to the 438 00:22:10,240 --> 00:22:16,560 Speaker 5: American people. Okay, there's not a single chance it would 439 00:22:16,560 --> 00:22:19,000 Speaker 5: be idiotic to launch you into the whole science and 440 00:22:19,040 --> 00:22:21,639 Speaker 5: falling in love thing now. We don't even have a 441 00:22:21,680 --> 00:22:23,880 Speaker 5: fraction of the time left that we need. Why don't 442 00:22:23,880 --> 00:22:25,320 Speaker 5: we do it as part of the armstrong and get 443 00:22:25,400 --> 00:22:28,119 Speaker 5: you on demand? I'm sorry. One more thing podcast? 444 00:22:28,160 --> 00:22:30,040 Speaker 2: Cool? That sounds good. Give me the tease. 445 00:22:32,080 --> 00:22:36,919 Speaker 5: If you This is how to fall in love and 446 00:22:37,000 --> 00:22:40,119 Speaker 5: it can be accomplished in an hour. 
Haven't you and 447 00:22:40,160 --> 00:22:43,359 Speaker 5: another person you've hit it off, you like each other, 448 00:22:43,880 --> 00:22:45,840 Speaker 5: you'd kind of like for it to go somewhere. 449 00:22:47,040 --> 00:22:50,040 Speaker 2: I can have you in love in an hour. But doesn't. 450 00:22:50,720 --> 00:22:53,719 Speaker 2: Hasn't the way it's been working been fine for people? No? 451 00:22:53,720 --> 00:22:57,280 Speaker 5: No, no, no, oh my god, you're a reprobate who 452 00:22:57,320 --> 00:23:04,520 Speaker 5: probably doesn't love America. No, it's you know, circumstances can 453 00:23:04,520 --> 00:23:08,960 Speaker 5: intervene and things can happen, and something very very promising 454 00:23:09,000 --> 00:23:11,800 Speaker 5: went sideways and it shouldn't have. Besides, it's the go 455 00:23:11,800 --> 00:23:14,480 Speaker 5: go twenty first century. Who has weeks and months to 456 00:23:14,560 --> 00:23:17,640 Speaker 5: fall in love? This is actually, you know, or making 457 00:23:17,720 --> 00:23:20,800 Speaker 5: light and I'm being very vague. I found it a 458 00:23:21,080 --> 00:23:26,720 Speaker 5: very very interesting dissection. Is an unfortunate term, but that's 459 00:23:26,760 --> 00:23:32,239 Speaker 5: what popped into my head of how true intimacy is 460 00:23:32,320 --> 00:23:37,440 Speaker 5: built and you can like supercharge the process and save 461 00:23:37,520 --> 00:23:38,200 Speaker 5: a lot of time. 462 00:23:39,400 --> 00:23:42,480 Speaker 1: Wow, do you want to save time? Or is the 463 00:23:42,680 --> 00:23:45,520 Speaker 1: gradual process one of the greatest things that ever happens 464 00:23:45,560 --> 00:23:46,679 Speaker 1: in your entire life? 465 00:23:46,800 --> 00:23:50,439 Speaker 5: But and it has its own advantages that we just 466 00:23:50,480 --> 00:23:53,880 Speaker 5: don't sense because we're humans and we're dope that is possible. 467 00:23:54,800 --> 00:23:56,840 Speaker 1: And then you've got the story we started the hour 468 00:23:56,920 --> 00:24:00,000 Speaker 1: with where the government of California has decided to stay 469 00:24:00,200 --> 00:24:03,880 Speaker 1: in And if you're enjoying falling in love with your chatbot, 470 00:24:04,080 --> 00:24:06,399 Speaker 1: every three hours in the state of California will remind 471 00:24:06,440 --> 00:24:09,000 Speaker 1: you on your screen. Remember you're talking to a computer. 472 00:24:09,359 --> 00:24:14,040 Speaker 1: Do not be happy now as you were. The more 473 00:24:14,080 --> 00:24:18,560 Speaker 1: I think about the Elon v. Trump feel, the more. 474 00:24:18,280 --> 00:24:19,280 Speaker 2: It's gonna be. 475 00:24:19,800 --> 00:24:22,520 Speaker 1: I just I could go go back and dig up 476 00:24:22,560 --> 00:24:27,040 Speaker 1: examples of things they've each said to people that are 477 00:24:27,520 --> 00:24:29,159 Speaker 1: just so over the top. But if you get it 478 00:24:29,200 --> 00:24:32,320 Speaker 1: being toward the president from the richest man in the 479 00:24:32,320 --> 00:24:34,320 Speaker 1: world or vice versa, I mean. 480 00:24:34,160 --> 00:24:37,360 Speaker 2: It takes on a whole I mean him saying things about. 481 00:24:37,320 --> 00:24:40,720 Speaker 1: I don't know, you know, what's his name, Carson the doctor, 482 00:24:41,080 --> 00:24:44,920 Speaker 1: or Marco Rubi or whoever's attack you know, Rosie O'Donnell. 
483 00:24:45,080 --> 00:24:47,720 Speaker 1: That's one thing, but the world's richest man who does 484 00:24:47,760 --> 00:24:50,240 Speaker 1: not need to hold back for any reason whatsoever, and 485 00:24:50,280 --> 00:24:55,520 Speaker 1: the sorts of things Elon might say about so. 486 00:24:55,200 --> 00:25:00,520 Speaker 2: So Trump said, well, let's throw this up, you know, 487 00:25:00,840 --> 00:25:01,359 Speaker 2: go ahead. 488 00:25:01,400 --> 00:25:04,200 Speaker 5: But it's it's not it's less what he says than 489 00:25:04,240 --> 00:25:06,880 Speaker 5: what he does. That's the part that's got me feeling 490 00:25:07,240 --> 00:25:10,760 Speaker 5: oh ish, trumping the levers of the power of the 491 00:25:10,840 --> 00:25:15,879 Speaker 5: executive branch to lash back hello at SpaceX or Twitter 492 00:25:16,040 --> 00:25:17,000 Speaker 5: or whatever. 493 00:25:17,119 --> 00:25:18,280 Speaker 2: In a way that may be. 494 00:25:19,920 --> 00:25:23,080 Speaker 5: Unpalatable to the courts and the American people in two 495 00:25:23,119 --> 00:25:27,520 Speaker 5: thirds of the Senate. If you hear me hinting, so. 496 00:25:27,560 --> 00:25:32,119 Speaker 1: This this might be what kicks it off. 497 00:25:32,480 --> 00:25:33,320 Speaker 2: Or Trump said this. 498 00:25:33,840 --> 00:25:36,800 Speaker 8: And you know, Elon's upset because we took the ev 499 00:25:37,000 --> 00:25:39,000 Speaker 8: mandate and you know, which was a lot of money 500 00:25:39,040 --> 00:25:43,120 Speaker 8: for electric vehicles, and you know they're having a hard 501 00:25:43,160 --> 00:25:45,359 Speaker 8: time the electric vehicles, and I want to say, it 502 00:25:45,480 --> 00:25:46,040 Speaker 8: makes the point. 503 00:25:46,280 --> 00:25:49,560 Speaker 1: So he's already claiming that Elon is against the bill 504 00:25:49,760 --> 00:25:52,800 Speaker 1: because it financially damages him. 505 00:25:52,840 --> 00:25:57,080 Speaker 5: Not a principal stance against overspending and suffocating debt. No, 506 00:25:57,320 --> 00:25:59,000 Speaker 5: it is a selfish motivent. 507 00:25:58,680 --> 00:26:02,440 Speaker 1: Which Elon has been talking about four years and talking about, hey, look, 508 00:26:02,880 --> 00:26:07,040 Speaker 1: our interest payments are now greater than what we spend 509 00:26:07,080 --> 00:26:07,840 Speaker 1: on the military. 510 00:26:07,840 --> 00:26:08,480 Speaker 2: Blah blah blah. 511 00:26:08,520 --> 00:26:13,439 Speaker 1: Oh, one hundred percent defendable positions on the bill, and 512 00:26:13,440 --> 00:26:15,800 Speaker 1: Trump claims it's just because it's hurting you financially. That 513 00:26:15,880 --> 00:26:18,800 Speaker 1: would get my hair up, That would make me angry. 514 00:26:20,080 --> 00:26:22,520 Speaker 1: I could believe it's both too. I mean, Elon can 515 00:26:22,640 --> 00:26:26,280 Speaker 1: possibly be pleased at the credits. No, no, but he 516 00:26:26,320 --> 00:26:29,280 Speaker 1: has been talking about the debt and how we it's 517 00:26:29,359 --> 00:26:34,520 Speaker 1: unsustainable and it's going to ruin the country for years. 518 00:26:35,359 --> 00:26:38,159 Speaker 1: So I don't think he's gonna react well to that statement. 519 00:26:38,200 --> 00:26:39,919 Speaker 1: That just happened in the last forty five minutes. 
So 520 00:26:40,160 --> 00:26:42,960 Speaker 1: you know, Elon, I'm sure he's changing the diaper on 521 00:26:42,960 --> 00:26:45,040 Speaker 1: one baby while he teaches another kid how to ride 522 00:26:45,040 --> 00:26:46,560 Speaker 1: a bike, and then he's up in another one with 523 00:26:46,600 --> 00:26:48,639 Speaker 1: algebra homework here on the last week of school. And 524 00:26:48,800 --> 00:26:50,919 Speaker 1: you know he's very busy with his thirteen kids. But 525 00:26:51,000 --> 00:26:52,760 Speaker 1: as soon as he hears what Trump said, he might 526 00:26:52,800 --> 00:26:56,640 Speaker 1: fire back something. And then it's on ladies and gentlemen, truth, 527 00:26:56,760 --> 00:27:00,560 Speaker 1: social versus Twitter. Oh boy, let it fly. We will 528 00:27:00,560 --> 00:27:01,400 Speaker 1: finish strong next. 529 00:27:01,400 --> 00:27:04,840 Speaker 5: Things are getting weird, and they getting weird fast. Are 530 00:27:05,000 --> 00:27:05,760 Speaker 5: strong yet? 531 00:27:07,600 --> 00:27:09,680 Speaker 1: So every day after the show, Joe and I joke 532 00:27:09,720 --> 00:27:11,679 Speaker 1: about how bad the show was today, and maybe we 533 00:27:11,720 --> 00:27:15,560 Speaker 1: can do better tomorrow. So our executive producer, Hansen, said, 534 00:27:15,560 --> 00:27:19,240 Speaker 1: make a song in that theme, like it's seventies soft rock. 535 00:27:19,400 --> 00:27:22,359 Speaker 6: Let's try to put this one behind. 536 00:27:26,320 --> 00:27:30,920 Speaker 2: Just tamp down the shame. AI. 537 00:27:31,400 --> 00:27:35,320 Speaker 5: Obviously, maybe we do better tomorrow. 538 00:27:35,720 --> 00:27:41,600 Speaker 2: We can only hope unless we continue to be so lame. 539 00:27:43,040 --> 00:27:47,720 Speaker 1: Then a seventies guitar solo after a seventies drum fell. 540 00:27:47,800 --> 00:27:53,000 Speaker 2: Please give the drummer some love. Yeah, if you're old enough. AI. 541 00:27:53,280 --> 00:27:55,960 Speaker 1: But I was just talking to our boss, Steve, who's 542 00:27:56,040 --> 00:27:59,879 Speaker 1: roughly our age, and he said he couldn't believe how 543 00:28:00,080 --> 00:28:03,879 Speaker 1: dead on seventies rock music this was. 544 00:28:04,680 --> 00:28:12,240 Speaker 2: Here comes the chorus, Ladies and germs being strong, and. 545 00:28:12,600 --> 00:28:13,720 Speaker 1: Yet show. 546 00:28:17,480 --> 00:28:22,080 Speaker 6: Just tamp down on the shad, tamping down a shame. Now, 547 00:28:23,359 --> 00:28:32,199 Speaker 6: maybe we do better to love unless we contu being so. 548 00:28:34,880 --> 00:28:40,440 Speaker 5: This is uncanny and franky and disturbing from the Elton 549 00:28:40,560 --> 00:28:44,400 Speaker 5: John esque A couple of interesting twists on an obvious 550 00:28:44,440 --> 00:28:47,320 Speaker 5: court Forgison to the timber of his voice is very 551 00:28:47,320 --> 00:28:51,440 Speaker 5: elton esque to the production is exactly right. Not to 552 00:28:51,480 --> 00:28:54,200 Speaker 5: geek out on you, but just the way it's mixed, 553 00:28:54,240 --> 00:28:56,360 Speaker 5: and how loud the drums are and the amount of 554 00:28:56,400 --> 00:28:59,240 Speaker 5: reverb and the lead guitar it's all out of like 555 00:28:59,320 --> 00:29:00,720 Speaker 5: seventies m M production. 556 00:29:01,680 --> 00:29:02,800 Speaker 2: It's disturbing. 
557 00:29:03,160 --> 00:29:05,240 Speaker 1: Man, I feel like it picked up some sticks and 558 00:29:05,320 --> 00:29:08,440 Speaker 1: Peter Frampton and all kinds of different stuff from the second. 559 00:29:08,120 --> 00:29:11,200 Speaker 5: Oh yeah yeah, a little sprinkle of this spread that 560 00:29:11,520 --> 00:29:12,760 Speaker 5: just distilled it down. 561 00:29:12,640 --> 00:29:14,640 Speaker 1: And dos I want to be guitar player? It makes 562 00:29:14,640 --> 00:29:17,720 Speaker 1: me want to cut off my hands, Like what is 563 00:29:17,800 --> 00:29:18,440 Speaker 1: going on here? 564 00:29:21,840 --> 00:29:27,480 Speaker 5: I don't, I don't know. I really I don't think 565 00:29:27,520 --> 00:29:30,400 Speaker 5: I am an old man yelling at clouds. I really don't. 566 00:29:32,640 --> 00:29:36,920 Speaker 5: My animal instincts are danger danger, you know what. I 567 00:29:36,920 --> 00:29:38,880 Speaker 5: don't think I've ever put it that plainly. 568 00:29:45,480 --> 00:29:50,040 Speaker 1: That your animal instincts are yeah, yeah, My instinctive reaction 569 00:29:50,120 --> 00:29:51,120 Speaker 1: to this is this is bad. 570 00:29:51,160 --> 00:29:53,920 Speaker 5: This is a threat. That's as simple as that. I've 571 00:29:53,920 --> 00:29:59,000 Speaker 5: found various ways to uh, to phrase that concern, but 572 00:29:59,000 --> 00:30:00,560 Speaker 5: that's that's that's the. 573 00:30:00,160 --> 00:30:01,080 Speaker 2: Description I can give. 574 00:30:01,480 --> 00:30:03,200 Speaker 5: It's an instinctive revulsion. 575 00:30:03,840 --> 00:30:06,080 Speaker 1: Well, I mentioned that book I'm reading about AI and 576 00:30:06,120 --> 00:30:07,600 Speaker 1: I'll have more of it than I get through the book. 577 00:30:07,600 --> 00:30:10,120 Speaker 1: But he he says we should feel that way, and 578 00:30:10,400 --> 00:30:15,320 Speaker 1: uh and not and not and not not feel that 579 00:30:15,360 --> 00:30:18,240 Speaker 1: way because something big is coming and we gotta think 580 00:30:18,240 --> 00:30:18,600 Speaker 1: about it. 581 00:30:19,600 --> 00:30:20,560 Speaker 2: So you know, you know what. 582 00:30:20,600 --> 00:30:24,960 Speaker 5: I'm reminded though of what finally helped me enjoy sports more. 583 00:30:27,120 --> 00:30:30,520 Speaker 5: Right after clip fifteen, Michael, gimme clip fifteen. Give it 584 00:30:30,560 --> 00:30:35,600 Speaker 5: to me for the first overtime. Here's Pectavid out for 585 00:30:35,720 --> 00:30:39,440 Speaker 5: Deuchen Hopkins out of the point, who sharp Chent Hopkins? 586 00:30:39,480 --> 00:30:47,920 Speaker 2: Wave it? Here's Pectavid. Listen to that cry of Flint 587 00:30:48,000 --> 00:30:50,360 Speaker 2: came one. Oh. 588 00:30:50,840 --> 00:30:52,840 Speaker 5: I've been rooting hard for the Oilers. I'm a I'm 589 00:30:52,920 --> 00:30:55,120 Speaker 5: a last couple of years a big Oilers fan. I 590 00:30:55,160 --> 00:30:58,840 Speaker 5: love playoff hockey so much. Do respect to basketball fans. 591 00:30:59,000 --> 00:31:02,800 Speaker 5: It is if you understand the game, there's no comparison, 592 00:31:03,120 --> 00:31:06,680 Speaker 5: because the decisive play could happen in the first minute, 593 00:31:06,840 --> 00:31:10,880 Speaker 5: the twenty third minute, or the last minute. 
Every minute 594 00:31:11,080 --> 00:31:15,000 Speaker 5: is critically important in a hockey game, as opposed to basketball, 595 00:31:15,040 --> 00:31:17,440 Speaker 5: where there's a three and a half quarter exhibition and 596 00:31:17,480 --> 00:31:19,960 Speaker 5: then they play hard for four minutes. Plus they play 597 00:31:19,960 --> 00:31:21,160 Speaker 5: on ice, which is cool. 598 00:31:21,240 --> 00:31:23,800 Speaker 2: Yes, it's on beice, Michael. It's a good point the 599 00:31:23,920 --> 00:31:25,680 Speaker 2: NBA or the hockey. 600 00:31:27,240 --> 00:31:29,560 Speaker 1: Yeah, so it's got that whole soccer thing there with 601 00:31:29,640 --> 00:31:31,560 Speaker 1: you know, s score is such a big deal. Listen 602 00:31:31,600 --> 00:31:34,080 Speaker 1: to that crowd though. Wow, that was something so obviously 603 00:31:34,120 --> 00:31:36,560 Speaker 1: it was in Canada. They haven't in Canada, you know, 604 00:31:36,640 --> 00:31:38,760 Speaker 1: they hockey means so much of them. They have won 605 00:31:38,800 --> 00:31:39,640 Speaker 1: the Stanley Cup. 606 00:31:39,480 --> 00:31:41,520 Speaker 2: In thirty two years, thirty three years. 607 00:31:41,680 --> 00:31:45,480 Speaker 5: Yeah, the whole country is beside themselves rooting for Edmonton. 608 00:31:46,600 --> 00:31:48,280 Speaker 5: So anyway, playoff hockey. 609 00:31:48,280 --> 00:31:48,440 Speaker 1: Oh. 610 00:31:48,440 --> 00:31:50,240 Speaker 5: What I started to say was the only thing that 611 00:31:50,280 --> 00:31:53,760 Speaker 5: comforts me is I watch mankind slide into the abyss. 612 00:31:54,160 --> 00:31:56,320 Speaker 2: Is that just like watching. 613 00:31:55,960 --> 00:31:59,840 Speaker 5: Sports, I don't actually have any effect on the outcome. Well, 614 00:32:00,000 --> 00:32:02,200 Speaker 5: I mean we have a teeny tiny effect doing this show, 615 00:32:02,240 --> 00:32:06,240 Speaker 5: but not really the great tidal forces that are sweeping 616 00:32:06,640 --> 00:32:12,640 Speaker 5: humankind toward you know whatever AI powered droid nightmare. 617 00:32:13,480 --> 00:32:15,000 Speaker 2: It's gonna happen, whether I like it or not. 618 00:32:15,120 --> 00:32:17,000 Speaker 1: I was thinking about that last night, about you know 619 00:32:17,000 --> 00:32:18,840 Speaker 1: why I'm so fascinated by this and I just can't 620 00:32:18,880 --> 00:32:20,960 Speaker 1: get enough information about A part of is that I 621 00:32:21,000 --> 00:32:24,760 Speaker 1: would like to have some hand in guiding my kids 622 00:32:24,800 --> 00:32:27,160 Speaker 1: to how that what the hell their lives. 623 00:32:26,960 --> 00:32:27,560 Speaker 2: Are going to be like? 624 00:32:27,600 --> 00:32:30,960 Speaker 1: If I can get an inkling that their lives could 625 00:32:31,040 --> 00:32:35,440 Speaker 1: be more different from mine than has ever happened in 626 00:32:35,440 --> 00:32:38,040 Speaker 1: one generation in human history, in fact, that I think 627 00:32:38,080 --> 00:32:39,920 Speaker 1: that might guaranteed to be true. 628 00:32:40,320 --> 00:32:43,520 Speaker 5: I think that's inevitable. Yeah, I think you've nailed it. Yep. 629 00:32:43,880 --> 00:32:45,200 Speaker 2: Wow, think about. 630 00:32:44,960 --> 00:32:48,160 Speaker 5: That well, I think their lives may be more different 631 00:32:48,200 --> 00:32:51,520 Speaker 5: from yours than yours were from the founding fathers. 632 00:32:53,880 --> 00:33:04,200 Speaker 2: Wow. I'm strong, I'm strong, You're ready. 
633 00:33:07,840 --> 00:33:08,200 Speaker 5: Strong. 634 00:33:09,880 --> 00:33:12,400 Speaker 2: Here's your host for final thoughts, Joe Getty. 635 00:33:12,680 --> 00:33:14,640 Speaker 5: Let's get a final thought from everybody on the crew 636 00:33:14,680 --> 00:33:16,080 Speaker 5: to wrap up the show for the day. There he 637 00:33:16,120 --> 00:33:18,680 Speaker 5: is Michael Angelo, pressing the buttons. Michael, what's your final thought? 638 00:33:18,760 --> 00:33:20,280 Speaker 2: All right? Guys, coming soon. 639 00:33:20,920 --> 00:33:24,160 Speaker 5: Elon Musk versus Donald Trump in a UFC fight. 640 00:33:24,880 --> 00:33:27,240 Speaker 3: Dana White will be there to promote. 641 00:33:26,840 --> 00:33:32,600 Speaker 5: It real fight or like or I mean, well he 642 00:33:32,840 --> 00:33:33,080 Speaker 5: liked it. 643 00:33:33,120 --> 00:33:36,000 Speaker 2: Pro rat you wanted Mark Zuckerberg. Yeah, he challenged Mark 644 00:33:36,040 --> 00:33:36,760 Speaker 2: to a real song. 645 00:33:38,680 --> 00:33:41,280 Speaker 5: Got a weight advantage, though, Katie Green are seen Newswoman 646 00:33:41,280 --> 00:33:42,880 Speaker 5: as a final thought, Katie. 647 00:33:42,520 --> 00:33:44,880 Speaker 3: There's a new Katie's corner out at Armstrong getty dot 648 00:33:44,920 --> 00:33:47,520 Speaker 3: com and you can see the photo of Drew's birthday 649 00:33:47,520 --> 00:33:50,200 Speaker 3: dinner last night where I wore a shirt covered in 650 00:33:50,280 --> 00:33:53,920 Speaker 3: his face. Wow, wats of his face? 651 00:33:54,240 --> 00:33:58,320 Speaker 2: You're a big fan her, I'll betty was jack final 652 00:33:58,360 --> 00:33:59,800 Speaker 2: thought for me. I don't want to let that lay 653 00:33:59,800 --> 00:34:01,360 Speaker 2: with repeating it. 654 00:34:01,520 --> 00:34:05,640 Speaker 1: My kids' lives are going to be more different from mine. 655 00:34:05,840 --> 00:34:09,000 Speaker 1: That has happened than any generation in human history, and 656 00:34:09,040 --> 00:34:10,759 Speaker 1: I think that is probably true. 657 00:34:12,360 --> 00:34:18,640 Speaker 5: My final thought is I watched the video that a 658 00:34:18,640 --> 00:34:23,320 Speaker 5: guy made to distribute to his parents and grandparents, hipping 659 00:34:23,400 --> 00:34:26,560 Speaker 5: them to how good AI is and how this isn't real, 660 00:34:26,640 --> 00:34:28,879 Speaker 5: This isn't real, This isn't real, and this isn't real. 661 00:34:29,440 --> 00:34:33,200 Speaker 5: I think it's really great and important and cool, and 662 00:34:33,280 --> 00:34:35,719 Speaker 5: it's at Armstrong egeeddy dot com if you want to 663 00:34:35,760 --> 00:34:36,279 Speaker 5: zap it around. 664 00:34:36,360 --> 00:34:37,960 Speaker 2: Is that the one we've got at Katie's corner. 665 00:34:39,480 --> 00:34:43,200 Speaker 1: Yes maybe yes, yes, Armstrong and Getty wrapping about other 666 00:34:43,320 --> 00:34:44,720 Speaker 1: grueling four hour workday. 667 00:34:45,200 --> 00:34:46,960 Speaker 5: So many people will think, so little time. Go to 668 00:34:47,000 --> 00:34:49,920 Speaker 5: Armstrong e getdy dot com for that fine video. It's 669 00:34:49,960 --> 00:34:51,960 Speaker 5: really quite amazing. Drop us a note mail bag at 670 00:34:52,040 --> 00:34:55,319 Speaker 5: armstrong you geddy dot com. Pick up some swag while 671 00:34:55,320 --> 00:34:57,120 Speaker 5: you're there, a hat or a hoodie. 
672 00:34:57,320 --> 00:34:59,480 Speaker 1: You know, as you heard in the song, so much disappointment, 673 00:34:59,520 --> 00:35:00,120 Speaker 1: so much shame. 674 00:35:00,160 --> 00:35:01,759 Speaker 2: We'll try to do better tomorrow. We'll see it then. 675 00:35:01,840 --> 00:35:07,160 Speaker 2: God bless America. I'm Strong and Getty. This is fabulous. 676 00:35:07,480 --> 00:35:09,920 Speaker 5: Perhaps you also know that hot dog is my favorite 677 00:35:10,000 --> 00:35:13,400 Speaker 5: meat and that was none of that did not come 678 00:35:13,440 --> 00:35:14,560 Speaker 5: from a dog. 679 00:35:14,680 --> 00:35:17,000 Speaker 2: But damn it, let's not play games with this. This 680 00:35:17,080 --> 00:35:19,640 Speaker 2: is the United States of America, for God's sake. Lie 681 00:35:19,760 --> 00:35:22,800 Speaker 2: after lie after lie. Do not listen to the lies. 682 00:35:23,040 --> 00:35:24,440 Speaker 5: This is what will happen to you. 683 00:35:24,719 --> 00:35:29,840 Speaker 2: Necessary. Okay, this is crazy. Yep, that's enough of that. 684 00:35:30,000 --> 00:35:33,840 Speaker 2: I thank you. Have a terrific day, Armstrong and Getty.