1 00:00:01,880 --> 00:00:06,200 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio the George 2 00:00:06,240 --> 00:00:07,680 Speaker 1: Washington Broadcast Center. 3 00:00:07,840 --> 00:00:12,520 Speaker 2: Jack Armstrong, Joe Getty, Armstrong and Getty and he 4 00:00:15,440 --> 00:00:17,320 Speaker 2: Armstrong and Getty. 5 00:00:23,880 --> 00:00:29,040 Speaker 1: Now the one two pitch. Judge swings and misses, and 6 00:00:29,280 --> 00:00:33,599 Speaker 1: Italy has pulled off a stunner, the 7 00:00:33,640 --> 00:00:38,760 Speaker 2: biggest win in Italian baseball history. They shocked the United 8 00:00:38,840 --> 00:00:40,440 Speaker 2: States eight to six. 9 00:00:42,080 --> 00:00:44,720 Speaker 3: Yeah, so that's Aaron Judge, one of the best baseball 10 00:00:44,760 --> 00:00:47,239 Speaker 3: players in the world, and of all time maybe, uh, 11 00:00:47,479 --> 00:00:50,120 Speaker 3: striking out to end the game against a team they 12 00:00:50,120 --> 00:00:54,600 Speaker 3: should have beaten easily. Bryce Harper, who gets paid like 13 00:00:54,640 --> 00:00:58,040 Speaker 3: a quarter of a billion dollars, also had men on 14 00:00:58,120 --> 00:01:00,920 Speaker 3: and a chance to take the lead and flied out. 15 00:01:01,240 --> 00:01:03,080 Speaker 3: Some of the biggest stars. And as the 16 00:01:03,080 --> 00:01:06,200 Speaker 3: New York Post said, the twenty twenty six Olympics 17 00:01:06,200 --> 00:01:09,400 Speaker 3: had plenty of upsets and surprises, but Tuesday night's World 18 00:01:09,400 --> 00:01:12,920 Speaker 3: Baseball Classic may have produced the most shocking international 19 00:01:13,040 --> 00:01:16,800 Speaker 3: upset of the entire year, and Team USA was on 20 00:01:16,840 --> 00:01:18,720 Speaker 3: the wrong side of it. Italy jumped out to an 21 00:01:18,720 --> 00:01:22,480 Speaker 3: eight nothing lead.
Wow, we clawed our way back 22 00:01:22,480 --> 00:01:25,319 Speaker 3: to eight to six, but lost. The manager of the 23 00:01:25,319 --> 00:01:27,760 Speaker 3: team, I don't know his name, but he had said, I 24 00:01:27,840 --> 00:01:30,720 Speaker 3: basically consider us in the middle round before the game, 25 00:01:31,520 --> 00:01:33,240 Speaker 3: because Italy was going to be so easy to beat, 26 00:01:33,360 --> 00:01:33,959 Speaker 3: which is not the... 27 00:01:33,920 --> 00:01:36,360 Speaker 2: The sports gods heard that. You should never say that 28 00:01:36,400 --> 00:01:37,560 Speaker 2: sort of thing. Uh. 29 00:01:38,080 --> 00:01:40,680 Speaker 3: And then they lost, and now Mexico has got to 30 00:01:40,720 --> 00:01:43,479 Speaker 3: beat somebody by like four runs or more or something 31 00:01:43,520 --> 00:01:46,840 Speaker 3: to pull us into the middle round. I'm not exactly 32 00:01:46,880 --> 00:01:49,000 Speaker 3: sure how that works, but we don't even determine our 33 00:01:49,000 --> 00:01:49,960 Speaker 3: own fate at this point. 34 00:01:50,160 --> 00:01:52,600 Speaker 2: That's disappointing. Yeah, it is. 35 00:01:53,080 --> 00:01:57,480 Speaker 1: I don't love the format of the tournament, just 36 00:01:57,520 --> 00:02:01,560 Speaker 1: because of the nature of baseball, as we discussed briefly earlier. 37 00:02:01,880 --> 00:02:04,240 Speaker 1: You can have one team that's clearly superior to the 38 00:02:04,240 --> 00:02:06,040 Speaker 1: other and is going to beat them in the World Series, 39 00:02:06,040 --> 00:02:08,240 Speaker 1: say four to one. But if the first game is 40 00:02:08,320 --> 00:02:12,600 Speaker 1: that one game that the inferior team wins, in this tournament, 41 00:02:12,720 --> 00:02:15,520 Speaker 1: well, the other team is just screwed, because you can have, 42 00:02:16,080 --> 00:02:19,079 Speaker 1: you know, a great pitcher just not have his stuff 43 00:02:19,120 --> 00:02:21,520 Speaker 1: that night.
Nobody knows why. I pitched for years and years, 44 00:02:21,560 --> 00:02:24,119 Speaker 1: and some days you felt like a world beater. 45 00:02:23,960 --> 00:02:25,480 Speaker 2: And you just couldn't do it. 46 00:02:25,680 --> 00:02:27,960 Speaker 1: And some days you go out thinking, oh God, and 47 00:02:28,000 --> 00:02:31,440 Speaker 1: you'd just be masterful. And that's why baseball is so interesting. 48 00:02:31,480 --> 00:02:35,080 Speaker 1: But that's also why they play a lot of games. Anyway, 49 00:02:35,120 --> 00:02:35,840 Speaker 1: what are you gonna do? 50 00:02:36,000 --> 00:02:38,560 Speaker 3: Yeah, well, it's a national humiliation. And I really think 51 00:02:38,600 --> 00:02:41,400 Speaker 3: we got to send the Marines into Italy, take their oil. 52 00:02:42,280 --> 00:02:44,799 Speaker 1: Well, I think probably jail the turncoats when they come 53 00:02:44,800 --> 00:02:45,760 Speaker 1: back to our shores. 54 00:02:45,840 --> 00:02:50,880 Speaker 3: Aaron Judge. Yeah, I hope you enjoy Leavenworth. 55 00:02:50,919 --> 00:02:53,320 Speaker 1: Aaron, gonna have to get him an extra large cell. 56 00:02:53,360 --> 00:02:54,800 Speaker 1: I'm not sure he'd fit in a regular one. 57 00:02:54,880 --> 00:02:56,799 Speaker 3: Speaking of war, and I don't know that I advocate 58 00:02:56,840 --> 00:03:04,200 Speaker 3: a war against Italy. Trump said, this is new, said 59 00:03:04,360 --> 00:03:07,320 Speaker 3: the war will end soon, practically nothing left to target, 60 00:03:07,360 --> 00:03:09,360 Speaker 3: and the war will end when I say it's over. 61 00:03:09,520 --> 00:03:12,120 Speaker 3: So he just told Axios that the war will end 62 00:03:12,160 --> 00:03:14,360 Speaker 3: when I say it's over. I still think it's on 63 00:03:14,400 --> 00:03:16,120 Speaker 3: the table for him to pull the plug on it 64 00:03:16,320 --> 00:03:20,520 Speaker 3: just out of nowhere, like soon, and declare it over.
65 00:03:23,400 --> 00:03:26,360 Speaker 1: And even then, though, they're still threatening the Strait of 66 00:03:26,400 --> 00:03:28,919 Speaker 1: Hormuz. We do have to get that opened up somehow. 67 00:03:29,280 --> 00:03:32,520 Speaker 2: Yeah, and like open so people believe it's open. 68 00:03:32,840 --> 00:03:34,080 Speaker 3: I don't know what it is now, but the other 69 00:03:34,200 --> 00:03:38,120 Speaker 3: day there were a thousand ships lined up waiting to 70 00:03:38,120 --> 00:03:40,240 Speaker 3: go through. That number is probably a lot bigger now, 71 00:03:41,640 --> 00:03:45,840 Speaker 3: and they ain't gonna go through unless they feel like 72 00:03:45,920 --> 00:03:49,840 Speaker 3: they can keep their sailors safe and, probably more importantly, unfortunately, 73 00:03:50,000 --> 00:03:52,480 Speaker 3: their ships and their oil, the billions 74 00:03:52,120 --> 00:03:54,280 Speaker 2: of dollars that are on the line there. 75 00:03:54,800 --> 00:03:57,040 Speaker 1: Although if we fully insured that, it would still be 76 00:03:57,080 --> 00:03:59,280 Speaker 1: a paperwork nightmare and it'd take a while to get 77 00:03:59,280 --> 00:03:59,720 Speaker 1: your funds. 78 00:03:59,760 --> 00:04:03,560 Speaker 3: But the doomsday plane, I mentioned this a little bit ago. 79 00:04:04,040 --> 00:04:07,000 Speaker 3: The doomsday plane has been spotted flying over northern California. 80 00:04:07,480 --> 00:04:10,120 Speaker 3: That's the flying White House that they use if we 81 00:04:10,160 --> 00:04:12,480 Speaker 3: go to full on nuclear war, that Trump can fly 82 00:04:12,600 --> 00:04:16,680 Speaker 3: around super high in this plane and run America. So 83 00:04:16,680 --> 00:04:19,280 Speaker 3: they've been flying that around to be ready as an 84 00:04:19,320 --> 00:04:24,560 Speaker 3: emergency operations hub. Probably standard sort of thing they do anytime.
85 00:04:25,160 --> 00:04:26,760 Speaker 3: In fact, they might fly it around all the time for 86 00:04:26,800 --> 00:04:27,120 Speaker 3: all I know. 87 00:04:27,160 --> 00:04:27,560 Speaker 2: I don't know. 88 00:04:28,080 --> 00:04:30,039 Speaker 1: Yeah, it's an exciting story, but yeah, it might be 89 00:04:30,040 --> 00:04:33,599 Speaker 1: completely meaningless. So hey, I hadn't intended to go back 90 00:04:33,640 --> 00:04:37,840 Speaker 1: to this, but I just found this. I am stunned 91 00:04:37,839 --> 00:04:41,000 Speaker 1: by all this. Michael, play fourteen again. This is Abby 92 00:04:41,120 --> 00:04:48,040 Speaker 1: Phillip on CNN reporting on, again, when you had these geeks, 93 00:04:48,040 --> 00:04:54,960 Speaker 1: a handful of jackasses, who were demonstrating against essentially a 94 00:04:55,080 --> 00:04:57,440 Speaker 1: Muslim being the mayor of New York because they think 95 00:04:57,440 --> 00:05:00,360 Speaker 1: it's a bad thing. And you know, I get where they're 96 00:05:00,360 --> 00:05:03,400 Speaker 1: coming from, these people I would not want to be with, 97 00:05:03,680 --> 00:05:06,679 Speaker 1: but you know, in the stopped clock being right twice 98 00:05:06,680 --> 00:05:10,520 Speaker 1: a day way, they're onto something, kind of. But anyway, 99 00:05:10,800 --> 00:05:16,040 Speaker 1: they were demonstrating, and some ISIS trained and inspired Muslims 100 00:05:16,560 --> 00:05:19,600 Speaker 1: came and tried to blow them up with bombs to 101 00:05:19,720 --> 00:05:23,359 Speaker 1: kill them. Right, exactly. This is what CNN reported. 102 00:05:23,160 --> 00:05:26,920 Speaker 4: Two Republicans say Muslims don't belong here after an attempted 103 00:05:27,000 --> 00:05:31,039 Speaker 4: terror attack against New York's mayor Zohran Mamdani, and the 104 00:05:31,080 --> 00:05:35,440 Speaker 4: House Speaker Mike Johnson says nothing really to condemn those comments.
105 00:05:36,880 --> 00:05:39,679 Speaker 1: Okay, that was incoherent in half a dozen different ways, 106 00:05:39,680 --> 00:05:43,560 Speaker 1: but claimed that it was a terror attack against Mamdani 107 00:05:43,680 --> 00:05:47,880 Speaker 1: when it indeed was his co-religionists firebombing people who 108 00:05:47,920 --> 00:05:49,280 Speaker 1: were condemning Mamdani. 109 00:05:50,240 --> 00:05:51,680 Speaker 2: So this is not shocking. 110 00:05:51,720 --> 00:05:55,599 Speaker 1: You got Murad Awawdeh, president of the New York Immigration Coalition. 111 00:05:56,040 --> 00:06:02,559 Speaker 3: ISIS members attempt to kill peaceful protesters, that's the headline. 112 00:06:02,440 --> 00:06:06,440 Speaker 1: Right, yes, yeah. But so you've got this Murad Awawdeh, 113 00:06:06,600 --> 00:06:10,080 Speaker 1: president of the New York Immigration Coalition. His statement, after all 114 00:06:10,120 --> 00:06:14,039 Speaker 1: this, is blasting the original organizer, the peaceful protesters, for 115 00:06:14,120 --> 00:06:17,679 Speaker 1: their outlandish bigotry. We're talking about a white supremacist who's 116 00:06:17,720 --> 00:06:19,400 Speaker 1: looking for people to respond 117 00:06:19,000 --> 00:06:20,240 Speaker 2: to them in this way, he said. 118 00:06:20,480 --> 00:06:23,719 Speaker 1: Okay, wow. So they incited the terrorism and they deserved it, 119 00:06:23,720 --> 00:06:28,080 Speaker 1: according to Murad Awawdeh. Brad Lander, former New York City 120 00:06:28,080 --> 00:06:32,480 Speaker 1: comptroller who's running against Dan Goldman in a Democratic primary 121 00:06:33,560 --> 00:06:37,360 Speaker 1: for, what, Congress I think, said he was horrified. Now 122 00:06:37,360 --> 00:06:40,120 Speaker 1: pay attention to the words he used. He 123 00:06:40,200 --> 00:06:44,520 Speaker 1: was horrified by the disturbing threat of violence, which he 124 00:06:44,600 --> 00:06:48,040 Speaker 1: instinctively blamed on the anti-Muslim protesters.
Quote, vile displays 125 00:06:48,040 --> 00:06:52,200 Speaker 1: of Islamophobia will never be tolerated in our city. He 126 00:06:52,320 --> 00:06:55,440 Speaker 1: later apologized after learning some basic facts about the case. 127 00:06:55,600 --> 00:06:58,080 Speaker 1: I'm sorry for jumping to conclusions and posting too soon, 128 00:06:58,240 --> 00:07:01,280 Speaker 1: but I'm not sorry for hating Islamophobia as much as 129 00:07:01,279 --> 00:07:05,200 Speaker 1: I hate anti-Semitism. As I've pointed out before, that's funny. 130 00:07:05,240 --> 00:07:09,920 Speaker 1: Islamophobia sounds like an irrational fear; anti-Semitism, the word 131 00:07:09,960 --> 00:07:13,360 Speaker 1: itself, sounds like a political stance. That's funny, isn't it? 132 00:07:13,520 --> 00:07:15,320 Speaker 2: Huh huh, it's not. It's deliberate. 133 00:07:15,800 --> 00:07:20,280 Speaker 1: Manhattan Borough President Brad Hoylman-Sigal's response was a vague 134 00:07:20,360 --> 00:07:23,760 Speaker 1: condemnation of both sides. Today on Manhattan's Upper East Side, 135 00:07:23,800 --> 00:07:27,080 Speaker 1: we saw disturbing acts and displays of hate targeting Muslim 136 00:07:27,120 --> 00:07:30,360 Speaker 1: and Jewish New Yorkers. Every New Yorker should be alarmed, 137 00:07:30,400 --> 00:07:34,840 Speaker 1: blah blah. New York State Senator Liz Krueger, also, of 138 00:07:34,840 --> 00:07:38,360 Speaker 1: course these are all Democrats, said the city has no 139 00:07:38,560 --> 00:07:42,280 Speaker 1: place for anti-Muslim hate or any other kind of prejudice, 140 00:07:43,040 --> 00:07:46,120 Speaker 1: and blasted out of state provocateurs 141 00:07:45,320 --> 00:07:47,320 Speaker 2: sowing fear, division and violence. 142 00:07:49,760 --> 00:07:55,800 Speaker 1: So not the literal bomb-chuckers, but the peaceful demonstrators, jackasses 143 00:07:55,840 --> 00:07:56,480 Speaker 1: though they may be. 144 00:07:58,760 --> 00:07:59,760 Speaker 2: That's amazing.
145 00:08:00,520 --> 00:08:04,160 Speaker 3: This is, I think, a different level in 146 00:08:04,240 --> 00:08:07,880 Speaker 3: terms of lying about the facts of a story than 147 00:08:08,280 --> 00:08:10,080 Speaker 3: even in recent memory. 148 00:08:10,720 --> 00:08:15,440 Speaker 1: In defense of radical Islam, the New York Times wrote, a 149 00:08:15,480 --> 00:08:20,320 Speaker 1: far right influencer attracted the counter protests that turned violent. 150 00:08:21,120 --> 00:08:26,200 Speaker 1: Oh, so again the New York Times saying they attracted 151 00:08:26,200 --> 00:08:28,600 Speaker 1: the counter protest that turned violent, so they deserved it. 152 00:08:29,360 --> 00:08:30,040 Speaker 2: Babba bah. 153 00:08:30,080 --> 00:08:32,920 Speaker 1: The paper's reporting, writes the Free Beacon, was riddled with 154 00:08:33,080 --> 00:08:37,199 Speaker 1: passive voice descriptions of how smoking jars of metal and fuses 155 00:08:37,200 --> 00:08:39,640 Speaker 1: were thrown by an individual who was quote one of 156 00:08:39,720 --> 00:08:43,240 Speaker 1: six people arrested after a clash with anti-Muslim protesters. 157 00:08:43,240 --> 00:08:45,360 Speaker 3: There are a number of people who like the smoking 158 00:08:45,520 --> 00:08:48,880 Speaker 3: jars filled with metal and fuses, as opposed to calling it a bomb. 159 00:08:49,760 --> 00:08:52,520 Speaker 1: Well, right. And they mentioned that the guys who threw 160 00:08:52,559 --> 00:08:55,560 Speaker 1: bombs designed to kill more people than the Boston Marathon 161 00:08:55,640 --> 00:08:59,760 Speaker 1: bombers were some of six people arrested after a clash 162 00:08:59,800 --> 00:09:03,280 Speaker 1: with anti-Muslim protesters, going as far as they can 163 00:09:03,360 --> 00:09:07,000 Speaker 1: to imply that it was the anti-Muslim protesters who were 164 00:09:07,000 --> 00:09:10,280 Speaker 1: the bombers.
In The New York Times, the Times portrayed 165 00:09:10,280 --> 00:09:13,040 Speaker 1: the chaotic scene as quote a strained moment filled with 166 00:09:13,120 --> 00:09:15,160 Speaker 1: tensions both national and hyper local. 167 00:09:15,200 --> 00:09:24,200 Speaker 2: Blah blah blah. Ah, wow. So that, oh, you know what, 168 00:09:24,360 --> 00:09:28,080 Speaker 1: final note. Andrew Stiles, who is freaking hilarious, bitterly hilarious, 169 00:09:28,120 --> 00:09:30,959 Speaker 1: in the Free Beacon. He has a picture of the 170 00:09:31,000 --> 00:09:34,560 Speaker 1: World Trade Center exploding in fire back on nine eleven, 171 00:09:35,520 --> 00:09:39,200 Speaker 1: and his headline, in the style of The New York Times, is Islamophobes 172 00:09:39,320 --> 00:09:44,520 Speaker 1: pounce after Muslim piloted aircraft collide with symbols of capitalist imperialism. 173 00:09:44,840 --> 00:09:47,440 Speaker 2: It's pretty good and not a far stretch from what 174 00:09:47,520 --> 00:09:51,720 Speaker 2: we're hearing. This is unreal. Even the 175 00:09:53,320 --> 00:09:57,200 Speaker 3: guy I don't like, he is a January sixth dude 176 00:09:57,480 --> 00:10:01,560 Speaker 3: who was violent against cops and was pardoned by Trump, 177 00:10:01,600 --> 00:10:05,040 Speaker 3: and he was there to protest against Muslims in office 178 00:10:05,120 --> 00:10:05,640 Speaker 3: in New York. 179 00:10:06,360 --> 00:10:07,640 Speaker 2: But when, when 180 00:10:09,080 --> 00:10:12,160 Speaker 3: the cops showed, he said, I want you to understand 181 00:10:12,280 --> 00:10:13,840 Speaker 3: who was being violent here. 182 00:10:14,440 --> 00:10:16,120 Speaker 2: Who were the violent ones? 183 00:10:16,280 --> 00:10:16,319 Speaker 4: Was 184 00:10:16,360 --> 00:10:19,320 Speaker 3: it us? And you know, nobody's picked up 185 00:10:19,360 --> 00:10:22,479 Speaker 3: on that obvious point.
186 00:10:23,720 --> 00:10:27,800 Speaker 1: Leftist America's love affair with radical Islam is going to 187 00:10:27,840 --> 00:10:31,160 Speaker 1: be the biggest story in the next decade. 188 00:10:31,679 --> 00:10:34,520 Speaker 2: You think? I do, I really do. 189 00:10:34,679 --> 00:10:37,760 Speaker 1: Maybe AI will fix it. Probably so, 190 00:10:37,920 --> 00:10:39,840 Speaker 1: because it's going to grow and grow and grow. The 191 00:10:40,000 --> 00:10:44,079 Speaker 1: radical Muslims, Mamdani and company, are not going to, uh, well, 192 00:10:44,120 --> 00:10:47,400 Speaker 1: they will, as the saying goes, grow hungrier with the eating. 193 00:10:50,559 --> 00:10:54,120 Speaker 3: AI safety under scrutiny after mass shooting. I don't know 194 00:10:54,160 --> 00:10:55,480 Speaker 3: this story. I'm going to have to click on it, 195 00:10:55,520 --> 00:10:57,000 Speaker 3: because I don't know that one. I was going to 196 00:10:57,040 --> 00:10:59,360 Speaker 3: talk a little bit about AI and some 197 00:10:59,400 --> 00:11:00,880 Speaker 2: of the... did AI shoot somebody? 198 00:11:02,360 --> 00:11:04,800 Speaker 3: Oh, or was it a person driven nuts by AI? 199 00:11:05,400 --> 00:11:08,520 Speaker 3: Lawsuit claims OpenAI, that'd be your ChatGPT, was 200 00:11:08,559 --> 00:11:11,959 Speaker 3: aware of the shooter's violent intentions, because apparently the guy 201 00:11:11,960 --> 00:11:14,880 Speaker 3: had been chatting with the chatbot and it didn't alert people. 202 00:11:15,000 --> 00:11:17,480 Speaker 3: Or, I don't know how we will get into that 203 00:11:17,600 --> 00:11:22,400 Speaker 3: world where the chatbots are obligated. It'd be like, what 204 00:11:22,559 --> 00:11:24,640 Speaker 3: is it, the test that I took in California? I am 205 00:11:24,679 --> 00:11:30,679 Speaker 3: a mandatory reporter. So I'm a mandatory reporter.
If I 206 00:11:30,720 --> 00:11:34,280 Speaker 3: see somebody abusing a kid, for instance, I have to 207 00:11:34,320 --> 00:11:36,319 Speaker 3: report it or I'm in trouble. Are we going to 208 00:11:36,400 --> 00:11:41,080 Speaker 3: have mandatory reporter rules for chatbots? You have to report 209 00:11:41,200 --> 00:11:45,040 Speaker 3: to mental health professionals that this person is talking about 210 00:11:45,120 --> 00:11:47,959 Speaker 3: killing themselves or hurting others. 211 00:11:49,120 --> 00:11:52,120 Speaker 2: That'd be interesting. Yeah, and that could go way, way, 212 00:11:52,160 --> 00:11:55,199 Speaker 2: way too far. Oh, heck yeah. 213 00:11:55,360 --> 00:11:58,840 Speaker 1: I mean, if I say the Italians killed our guys yesterday, 214 00:12:00,040 --> 00:12:02,400 Speaker 1: well, they'd think I'm planning some sort of genocide or something. 215 00:12:04,040 --> 00:12:06,920 Speaker 3: Well, and you can talk to a therapist about, not 216 00:12:06,960 --> 00:12:09,920 Speaker 3: wanting to hurt other people probably, but wanting to hurt yourself, 217 00:12:09,960 --> 00:12:12,360 Speaker 3: and they don't necessarily report it there. It's up to 218 00:12:12,400 --> 00:12:15,280 Speaker 3: them to determine whether or not that's a threat. But chatbots 219 00:12:15,320 --> 00:12:19,200 Speaker 3: would not be able to pick up the distinctions. Hm, yeah, 220 00:12:19,240 --> 00:12:21,840 Speaker 3: that'll be an interesting case, one of the many, many 221 00:12:21,840 --> 00:12:24,360 Speaker 3: interesting cases involving AI. 222 00:12:24,640 --> 00:12:30,320 Speaker 2: Okay, we got more on the way. Stay here. Do 223 00:12:30,440 --> 00:12:32,120 Speaker 2: you have to take him out? Does he have a 224 00:12:32,160 --> 00:12:34,640 Speaker 2: target on his back? You mean the new Supreme Leader? 225 00:12:35,160 --> 00:12:36,920 Speaker 2: You mean the son? How can there be an Iran?
226 00:12:37,040 --> 00:12:39,880 Speaker 1: Well, I don't want to say that, but you know, 227 00:12:40,000 --> 00:12:43,400 Speaker 1: I was disappointed, because we think it's going to lead 228 00:12:43,440 --> 00:12:45,600 Speaker 1: to just more of the same problem for the country. 229 00:12:45,640 --> 00:12:47,360 Speaker 2: So I was disappointed to see their 230 00:12:47,280 --> 00:12:52,720 Speaker 3: choice. Much less threatening and belligerent statement from the president 231 00:12:52,840 --> 00:12:54,839 Speaker 3: on the, does the new Supreme Leader have a target 232 00:12:54,880 --> 00:12:56,800 Speaker 3: on his back? Well, I don't want to say that, 233 00:12:56,880 --> 00:12:59,520 Speaker 3: but I was disappointed in the choice. So there you go. 234 00:13:00,000 --> 00:13:04,760 Speaker 3: Breaking news, non-Trumpian: the International Energy something or other, 235 00:13:04,840 --> 00:13:07,000 Speaker 3: I didn't even know they had reserve oil, but they 236 00:13:07,000 --> 00:13:09,600 Speaker 3: do, and they're going to release four hundred million gallons 237 00:13:09,600 --> 00:13:12,960 Speaker 3: of it. Whoa. So in an attempt to keep oil 238 00:13:12,960 --> 00:13:16,080 Speaker 3: prices down and take away Iran's leverage, which is really what 239 00:13:16,120 --> 00:13:16,600 Speaker 3: they're doing. 240 00:13:17,200 --> 00:13:18,480 Speaker 2: Right? That's cool. 241 00:13:19,080 --> 00:13:21,319 Speaker 3: An AI story for you that I find interesting. 242 00:13:21,440 --> 00:13:23,880 Speaker 3: Not, well, it is surprising to me. But there's 243 00:13:23,880 --> 00:13:27,000 Speaker 3: been a lot of technological advancements that were supposed to 244 00:13:27,080 --> 00:13:29,920 Speaker 3: ease our workload and lead to less paper and blah 245 00:13:29,960 --> 00:13:33,680 Speaker 3: blah blah, and it hasn't necessarily done that.
I was talking 246 00:13:33,679 --> 00:13:40,079 Speaker 3: to somebody the other day about pre-email, pre-text days, 247 00:13:40,160 --> 00:13:43,960 Speaker 3: when you went home from work and only in extreme 248 00:13:44,040 --> 00:13:48,600 Speaker 3: conditions would your boss call you at home, you know, 249 00:13:48,640 --> 00:13:50,520 Speaker 3: between you getting home at night and the next morning. 250 00:13:50,880 --> 00:13:53,320 Speaker 3: Oh yeah. But now you get your texts and emails 251 00:13:53,320 --> 00:13:56,439 Speaker 3: from people round the clock with the expectation of being 252 00:13:56,480 --> 00:13:57,800 Speaker 3: able to reply to them. 253 00:13:58,320 --> 00:14:01,800 Speaker 2: Yeah, that's funny. I hadn't thought of that in years. 254 00:14:01,800 --> 00:14:05,000 Speaker 1: But your boss calls you at seven o'clock, you know, 255 00:14:05,080 --> 00:14:06,880 Speaker 1: your wife answers, honey, it's your boss. 256 00:14:06,920 --> 00:14:11,000 Speaker 2: You'd think, what the hell? Right, no kidding. I didn't 257 00:14:11,040 --> 00:14:14,640 Speaker 2: know you knew my home number. But this from the 258 00:14:14,640 --> 00:14:15,440 Speaker 2: Wall Street Journal. 259 00:14:15,480 --> 00:14:21,080 Speaker 3: AI isn't lightening workloads, it's making them more intense. One 260 00:14:21,120 --> 00:14:24,240 Speaker 3: of the great hopes for artificial intelligence, at least among workers, 261 00:14:24,280 --> 00:14:27,120 Speaker 3: is that it will ease workloads, freeing people up for more 262 00:14:27,160 --> 00:14:28,560 Speaker 3: high level creative pursuits. 263 00:14:29,000 --> 00:14:31,040 Speaker 2: So far, the opposite is happening. 264 00:14:31,120 --> 00:14:35,560 Speaker 3: New data show, in fact, AI is increasing the speed, density, 265 00:14:35,600 --> 00:14:38,400 Speaker 3: and complexity of work rather than reducing it.
According to 266 00:14:38,400 --> 00:14:41,200 Speaker 3: an analysis of one hundred and sixty 267 00:14:41,240 --> 00:14:45,680 Speaker 3: four thousand workers' digital work activity. That's a lot of people. Yeah, 268 00:14:45,840 --> 00:14:49,040 Speaker 3: that's a big study. The data, from workforce analytics and 269 00:14:49,080 --> 00:14:52,440 Speaker 3: productivity tracking software, covers more than, blah blah, four 270 00:14:52,520 --> 00:14:56,600 Speaker 3: hundred and forty four million hours, okay, across eleven hundred employers. 271 00:14:56,880 --> 00:14:59,760 Speaker 3: So this is one of the biggest studies of AI to date. 272 00:15:01,920 --> 00:15:05,320 Speaker 3: Examining users' digital activity one hundred and eighty days before 273 00:15:05,760 --> 00:15:09,120 Speaker 3: and one hundred and eighty days after they began using 274 00:15:09,160 --> 00:15:12,680 Speaker 3: tools on the job, AI tools, they found that AI 275 00:15:12,800 --> 00:15:16,880 Speaker 3: intensified activity across nearly every category. The time they spent 276 00:15:16,960 --> 00:15:22,240 Speaker 3: on email, messaging, and chat more than doubled, while their 277 00:15:22,320 --> 00:15:24,880 Speaker 3: use of business management tools such as human resources or 278 00:15:24,920 --> 00:15:27,760 Speaker 3: accounting software rose ninety four percent, so it almost doubled. 279 00:15:29,800 --> 00:15:30,359 Speaker 2: Wow. 280 00:15:30,760 --> 00:15:33,040 Speaker 3: Wow, I don't know how this could possibly be true, 281 00:15:33,040 --> 00:15:34,720 Speaker 3: but I assume it is. Yeah. 282 00:15:34,760 --> 00:15:36,800 Speaker 1: Gosh, our jobs are so weird, it's tough for 283 00:15:36,800 --> 00:15:39,920 Speaker 1: me to relate to that sort of thing.
I have 284 00:15:40,000 --> 00:15:42,640 Speaker 1: no reason to doubt the study. Meanwhile, the amount of 285 00:15:42,680 --> 00:15:48,120 Speaker 1: time AI users devoted to focused, uninterrupted work, the kind 286 00:15:48,120 --> 00:15:51,280 Speaker 1: of concentration often required to figure out complex 287 00:15:51,360 --> 00:15:52,680 Speaker 1: problems, you know, the sort of thing you were 288 00:15:52,680 --> 00:15:56,440 Speaker 1: hoping you'd have more time to do, right, dropped nine 289 00:15:56,520 --> 00:15:59,920 Speaker 1: percent compared to the one hundred and eighty days where 290 00:15:59,920 --> 00:16:06,560 Speaker 1: you didn't have AI. It's not that AI doesn't create efficiency, 291 00:16:06,600 --> 00:16:09,080 Speaker 1: says this expert. It's that the capacity it frees up 292 00:16:09,120 --> 00:16:13,320 Speaker 1: immediately gets repurposed into doing other work, and that's where 293 00:16:13,320 --> 00:16:18,520 Speaker 1: the creep is likely to happen. Okay, wow. Okay. So 294 00:16:18,600 --> 00:16:23,120 Speaker 1: the pace at which it brings you solutions and turns 295 00:16:23,160 --> 00:16:27,360 Speaker 1: out work or analysis or whatever, that just creates much 296 00:16:27,360 --> 00:16:28,480 Speaker 1: more work for you to do. 297 00:16:28,720 --> 00:16:30,960 Speaker 3: I don't understand it, but you got to be groaning 298 00:16:30,960 --> 00:16:32,840 Speaker 3: if you're listening to this, and if it's true, the 299 00:16:32,960 --> 00:16:36,840 Speaker 3: idea that when AI comes to your workplace, the 300 00:16:36,920 --> 00:16:41,200 Speaker 3: time spent on emailing and chatting and messaging will double. 301 00:16:42,960 --> 00:16:46,800 Speaker 1: On the other hand, that would make the anti doomers right. 302 00:16:47,840 --> 00:16:51,040 Speaker 1: AI will not eliminate jobs; it will create so much 303 00:16:51,080 --> 00:16:54,440 Speaker 1: more work.
Everybody will, you know, get paid whatever 304 00:16:54,440 --> 00:16:56,320 Speaker 1: they want to charge, you know, demand. 305 00:16:56,320 --> 00:16:58,920 Speaker 3: I don't know about that part. But you keep your job. 306 00:16:59,080 --> 00:17:01,920 Speaker 3: Maybe work so hard it makes you insane. 307 00:17:01,760 --> 00:17:05,560 Speaker 2: Till it kills you. Yes, the robots won't kill you. 308 00:17:05,640 --> 00:17:07,880 Speaker 2: The work will. Oh my god. We have no idea 309 00:17:07,920 --> 00:17:10,120 Speaker 2: where this is going. No, we don't. If you miss 310 00:17:10,160 --> 00:17:12,080 Speaker 2: a segment, get the podcast, Armstrong and Getty on 311 00:17:12,160 --> 00:17:15,840 Speaker 2: demand. Armstrong and Getty. 312 00:17:17,359 --> 00:17:20,800 Speaker 5: One of those driverless taxis coming dangerously close to train 313 00:17:20,880 --> 00:17:24,600 Speaker 5: tracks in Texas, a driver capturing the empty Waymo vehicle 314 00:17:24,840 --> 00:17:27,560 Speaker 5: crossing the lowered gates at a rail crossing in Austin, 315 00:17:27,840 --> 00:17:30,439 Speaker 5: a train passing by just inches away there. Waymo reacting 316 00:17:30,480 --> 00:17:32,600 Speaker 5: to the close call, saying in a statement they are 317 00:17:32,640 --> 00:17:36,959 Speaker 5: temporarily restricting Waymo cars from similar rail crossings while they 318 00:17:36,960 --> 00:17:37,959 Speaker 5: conduct a full review. 319 00:17:40,040 --> 00:17:41,639 Speaker 3: Yeah, I'd hate to be sitting in a Waymo 320 00:17:41,840 --> 00:17:43,760 Speaker 3: and it starts driving onto the train tracks with the 321 00:17:43,840 --> 00:17:45,160 Speaker 3: train coming, and think, why? 322 00:17:45,080 --> 00:17:49,919 Speaker 2: You'd hate that? You'd be saying, what are 323 00:17:49,920 --> 00:17:50,480 Speaker 2: you doing here?
324 00:17:51,960 --> 00:17:54,840 Speaker 3: Got a little more on that AI workplace story 325 00:17:54,840 --> 00:17:58,080 Speaker 3: that's really interesting. But the New York Post is calling 326 00:17:58,080 --> 00:18:03,119 Speaker 3: out soccer player activist Megan Rapinoe. You remember her. 327 00:18:04,280 --> 00:18:06,840 Speaker 3: She was one of the best soccer players ever 328 00:18:06,840 --> 00:18:09,480 Speaker 3: and she was on the US women's team that competed 329 00:18:09,480 --> 00:18:12,439 Speaker 3: at the highest level. But she kneeled for the national anthem, 330 00:18:12,520 --> 00:18:14,840 Speaker 3: and she testified before Congress about how women are being 331 00:18:14,920 --> 00:18:17,280 Speaker 3: ripped off, and all these different things she's been involved in, 332 00:18:17,320 --> 00:18:19,280 Speaker 3: all these different things. She's a big George Floyd person, 333 00:18:19,320 --> 00:18:23,119 Speaker 3: all that. Hasn't said a word about Iran's female soccer 334 00:18:23,160 --> 00:18:27,080 Speaker 3: team being sent home to be raped, tortured and murdered, 335 00:18:27,160 --> 00:18:29,440 Speaker 3: quite possibly, by their own government. 336 00:18:29,560 --> 00:18:32,840 Speaker 2: Why now? I don't know. I talked about 337 00:18:32,880 --> 00:18:33,920 Speaker 2: this yesterday. I don't know how I 338 00:18:33,840 --> 00:18:36,720 Speaker 3: feel about people when they don't get involved in an issue. 339 00:18:36,760 --> 00:18:39,600 Speaker 3: But man, she's been involved in so many political issues, 340 00:18:39,920 --> 00:18:43,080 Speaker 3: and you're like the biggest voice in women's soccer 341 00:18:43,119 --> 00:18:44,520 Speaker 3: on planet Earth. 342 00:18:45,359 --> 00:18:50,240 Speaker 1: Certainly when politics intrudes.
Yeah. And to not say anything 343 00:18:50,240 --> 00:18:54,520 Speaker 1: about this, I'm telling you, these people are so steeped 344 00:18:54,640 --> 00:18:58,840 Speaker 1: in self-hatred, in America-hatred, they can't criticize anyone 345 00:18:59,080 --> 00:19:02,000 Speaker 1: who's the enemy of the United States. 346 00:19:01,760 --> 00:19:06,200 Speaker 2: It's that crazy. She should be called out. Good 347 00:19:06,240 --> 00:19:09,159 Speaker 2: for the Post. Anyway, so we were talking about this. 348 00:19:09,359 --> 00:19:16,000 Speaker 3: It's the biggest study yet, by far, of the impact 349 00:19:16,160 --> 00:19:22,120 Speaker 3: of AI on the workplace. Hundreds of thousands of employees, 350 00:19:22,960 --> 00:19:28,120 Speaker 3: gazillions of hours, and what they determined is it doesn't 351 00:19:28,320 --> 00:19:29,960 Speaker 3: put you in a position where you can get all 352 00:19:30,000 --> 00:19:30,560 Speaker 3: your work. 353 00:19:30,400 --> 00:19:32,000 Speaker 2: Done in like an hour, and so the rest of 354 00:19:32,040 --> 00:19:32,480 Speaker 2: the time you. 355 00:19:32,440 --> 00:19:34,720 Speaker 3: Get to, you know, write poetry and play your guitar 356 00:19:34,840 --> 00:19:37,760 Speaker 3: and, ah, my fondest wish, get into glass blowing and 357 00:19:37,800 --> 00:19:39,960 Speaker 3: candle making or whatever it is you're wanting to do. 358 00:19:41,160 --> 00:19:44,800 Speaker 3: You just become way, way more productive and actually end 359 00:19:44,840 --> 00:19:47,160 Speaker 3: up more stressed out and working harder, which actually kind 360 00:19:47,160 --> 00:19:49,240 Speaker 3: of makes sense to me, because, like, I'm thinking about, 361 00:19:49,520 --> 00:19:52,399 Speaker 3: I don't know many worlds of business other than radio, 362 00:19:52,840 --> 00:19:54,399 Speaker 3: because this is all I've done my whole life.
But 363 00:19:54,720 --> 00:19:58,000 Speaker 3: like, salespeople write up sales proposals. So maybe you do, 364 00:19:58,160 --> 00:20:00,800 Speaker 3: I don't know, three a day on your own. And 365 00:20:00,840 --> 00:20:02,840 Speaker 3: if AI comes along and you can do three in 366 00:20:02,840 --> 00:20:04,679 Speaker 3: half an hour, the boss isn't going to say, well, 367 00:20:04,680 --> 00:20:06,200 Speaker 3: go ahead and take the other seven and a half 368 00:20:06,240 --> 00:20:09,800 Speaker 3: hours off and go do some glass blowing. The boss 369 00:20:09,840 --> 00:20:11,200 Speaker 3: is gonna say, well, then I want to have a 370 00:20:11,280 --> 00:20:15,920 Speaker 3: thousand proposals a day instead of three, since you can do them faster, or. 371 00:20:16,040 --> 00:20:19,400 Speaker 1: Because you're quadrupling, quintupling, whatever it is, your output, 372 00:20:19,600 --> 00:20:22,240 Speaker 1: the human follow up that needs to be done has 373 00:20:22,359 --> 00:20:24,000 Speaker 1: just increased by many times. 374 00:20:25,080 --> 00:20:26,639 Speaker 2: Right, Yeah, that's a good way to look at it too. 375 00:20:26,680 --> 00:20:29,760 Speaker 3: Anyway, so back to this Wall Street Journal article about this, 376 00:20:29,800 --> 00:20:32,000 Speaker 3: because this is what they found happened. The amount of 377 00:20:32,080 --> 00:20:36,119 Speaker 3: email and chat and texts and everything like that 378 00:20:36,240 --> 00:20:40,520 Speaker 3: doubled for people that started using AI, as opposed to 379 00:20:41,320 --> 00:20:45,280 Speaker 3: being cut down. Such habits, says the Wall Street Journal, 380 00:20:45,320 --> 00:20:48,920 Speaker 3: aren't what AI evangelists had predicted. A number of tech 381 00:20:48,960 --> 00:20:51,240 Speaker 3: and business leaders, from Bill Gates to JP 382 00:20:51,320 --> 00:20:55,040 Speaker 3: Morgan Chase's Jamie Dimon, have suggested that AI could ultimately lead 383 00:20:55,080 --> 00:20:57,800 Speaker 3: people to work less, not more, and result in a 384 00:20:57,840 --> 00:21:01,520 Speaker 3: shorter workweek. Elon Musk has said that within twenty years, 385 00:21:01,680 --> 00:21:05,159 Speaker 3: advancements in AI and robots could even make work optional. 386 00:21:06,240 --> 00:21:11,200 Speaker 1: I think that's a fanciful idea. But the cautionary note 387 00:21:11,240 --> 00:21:15,200 Speaker 1: is, this is early days. Yeah, so these trends could change. 388 00:21:15,200 --> 00:21:18,199 Speaker 1: But again, this is creating more work, which, to me, means it's 389 00:21:18,200 --> 00:21:20,120 Speaker 1: not going to eliminate employment. 390 00:21:20,240 --> 00:21:20,920 Speaker 2: It is early days. 391 00:21:20,920 --> 00:21:23,840 Speaker 3: But this is the first actual data in practice of it, 392 00:21:23,920 --> 00:21:26,159 Speaker 3: not just theory. This is the first time they've actually 393 00:21:26,160 --> 00:21:29,760 Speaker 3: tested it as opposed to just theorizing about it on podcasts. 394 00:21:30,400 --> 00:21:34,440 Speaker 3: Evidence so far suggests that many AI adopters aren't 395 00:21:34,520 --> 00:21:38,000 Speaker 3: using the technology's efficiencies to give themselves a break. Workers 396 00:21:38,000 --> 00:21:40,480 Speaker 3: often use the time savings to do more work rather 397 00:21:40,520 --> 00:21:44,199 Speaker 3: than less, because AI makes additional tasks feel easy and accessible, 398 00:21:44,400 --> 00:21:45,920 Speaker 3: creating a sense of momentum. 399 00:21:46,119 --> 00:21:47,040 Speaker 2: That's kind of interesting. 400 00:21:48,119 --> 00:21:50,919 Speaker 3: The analysis backs up the findings of an eight month study. 401 00:21:50,960 --> 00:21:53,840 Speaker 3: So this backs up the findings of a preliminary, smaller study.
402 00:21:54,080 --> 00:21:55,960 Speaker 3: So you had the smaller study saying it looks like 403 00:21:56,000 --> 00:21:58,200 Speaker 3: people end up working more when you start using AI, 404 00:21:58,280 --> 00:22:00,280 Speaker 3: and they did an even bigger study that confirmed 405 00:22:00,320 --> 00:22:04,480 Speaker 3: those results about how AI is shaping work habits at 406 00:22:04,480 --> 00:22:07,320 Speaker 3: a tech company. The research's takeaway so far, 407 00:22:07,359 --> 00:22:10,800 Speaker 3: it said, is the tools didn't reduce work, but intensified it. 408 00:22:11,119 --> 00:22:14,000 Speaker 3: The employees worked at faster paces, took on broader scopes 409 00:22:14,040 --> 00:22:17,080 Speaker 3: of tasks, and ended up working more hours, the initial 410 00:22:17,119 --> 00:22:21,879 Speaker 3: findings reported. And this is my favorite part, and this 411 00:22:21,960 --> 00:22:27,640 Speaker 3: is what I wanted to get to. Over time, said 412 00:22:27,680 --> 00:22:32,800 Speaker 3: one analyst, this can lead to cognitive overload, burnout, poorer 413 00:22:32,800 --> 00:22:36,440 Speaker 3: decision making, and declining work quality, even if workers 414 00:22:36,640 --> 00:22:40,240 Speaker 3: appear more effective. Yeah, can lead to wanting to jump 415 00:22:40,320 --> 00:22:44,520 Speaker 3: off the eighth story balcony because everything's just going so fast. 416 00:22:44,520 --> 00:22:46,760 Speaker 3: You're working so much, and you feel insane. 417 00:22:47,640 --> 00:22:54,600 Speaker 1: Yeah, this is all so confusing and overwhelming and rife 418 00:22:54,760 --> 00:22:55,680 Speaker 1: with uncertainty.
419 00:22:55,960 --> 00:22:59,600 Speaker 3: Now, so I had been looking at the AI 420 00:22:59,720 --> 00:23:04,760 Speaker 3: takeover, where AI starts doing the job instead of humans, 421 00:23:05,320 --> 00:23:07,080 Speaker 3: and then we have to come up with the 422 00:23:07,680 --> 00:23:09,920 Speaker 3: universal basic income thing, and then you just stay home, you 423 00:23:09,960 --> 00:23:14,000 Speaker 3: don't work. I get Elon's view of it more than 424 00:23:14,040 --> 00:23:16,399 Speaker 3: I do the "this will make us more productive, so 425 00:23:16,480 --> 00:23:19,399 Speaker 3: we'll work less." That doesn't make any sense on its face. 426 00:23:19,800 --> 00:23:23,640 Speaker 3: When have we ever, like, come up with something 427 00:23:23,680 --> 00:23:26,800 Speaker 3: that makes us more efficient, so now you don't work 428 00:23:26,840 --> 00:23:30,480 Speaker 3: as much? No, when has that ever happened? Why would that happen? 429 00:23:30,760 --> 00:23:33,240 Speaker 3: I'm paying you for forty hours a week. The fact that 430 00:23:33,280 --> 00:23:35,840 Speaker 3: you can now do your old work in half the 431 00:23:35,880 --> 00:23:38,280 Speaker 3: time frees you up to do more work. 432 00:23:38,359 --> 00:23:39,119 Speaker 2: Not go home. 433 00:23:40,000 --> 00:23:42,960 Speaker 1: Well, right, if I give you a shovel that doesn't 434 00:23:43,000 --> 00:23:44,880 Speaker 1: have holes in it so you can dig a ditch 435 00:23:44,960 --> 00:23:47,800 Speaker 1: more quickly, that doesn't mean I'm going to settle for 436 00:23:47,880 --> 00:23:50,439 Speaker 1: the one, you know, slow ditch that I used to 437 00:23:50,440 --> 00:23:50,840 Speaker 1: settle for. 438 00:23:51,000 --> 00:23:55,640 Speaker 2: You now dig two ditches, obviously. Yeah. 439 00:23:57,200 --> 00:24:01,840 Speaker 3: Yeah, so the in-between of what we got now 440 00:24:01,880 --> 00:24:04,640 Speaker 3: without AI, and then AI doing it all for us?
441 00:24:05,760 --> 00:24:08,800 Speaker 2: Is just going to be people working their asses off? 442 00:24:10,000 --> 00:24:15,640 Speaker 2: Or is it? Yeah? I don't know. Certainly sounds believable 443 00:24:15,640 --> 00:24:15,840 Speaker 2: to me. 444 00:24:16,720 --> 00:24:20,160 Speaker 1: I wish I had more specifics. I'd like to hear 445 00:24:20,160 --> 00:24:22,400 Speaker 1: about specific jobs and tasks. 446 00:24:26,000 --> 00:24:27,200 Speaker 2: You know, you. 447 00:24:27,160 --> 00:24:31,320 Speaker 1: Gave the example of you can prepare three sales presentations 448 00:24:31,320 --> 00:24:34,920 Speaker 1: in a day, and then you have three sales presentations 449 00:24:34,920 --> 00:24:37,440 Speaker 1: worth of follow up work to do, or whatever goes 450 00:24:37,480 --> 00:24:39,639 Speaker 1: wrong with that. That I can understand. I can relate 451 00:24:39,680 --> 00:24:42,399 Speaker 1: to that. So if all of a sudden you can 452 00:24:42,400 --> 00:24:45,520 Speaker 1: do fifteen, then I get why that's a little overwhelming, 453 00:24:45,680 --> 00:24:49,840 Speaker 1: and you would hire more people to do follow ups, 454 00:24:49,880 --> 00:24:54,440 Speaker 1: I guess. But at some point you've got, I mean, 455 00:24:54,720 --> 00:24:57,560 Speaker 1: surely you would design the AI, or enhance the AI, 456 00:24:57,760 --> 00:24:59,560 Speaker 1: tweak it, whatever, so it could do the follow up 457 00:24:59,560 --> 00:25:03,840 Speaker 1: as well, as well as a human, in addition 458 00:25:03,880 --> 00:25:05,520 Speaker 1: to what it was doing. But then you've got an 459 00:25:05,520 --> 00:25:10,359 Speaker 1: AI on the other end reading the proposal and processing 460 00:25:10,400 --> 00:25:14,520 Speaker 1: it, and then having its AI cousin, I was telling 461 00:25:14,560 --> 00:25:16,960 Speaker 1: my kids the other day, decide whether to buy, 462 00:25:16,960 --> 00:25:18,040 Speaker 1: and I don't know.
463 00:25:18,720 --> 00:25:20,399 Speaker 2: My mind is blown. I got no time for this, 464 00:25:20,520 --> 00:25:22,080 Speaker 2: I'm gonna go watch the squirrels cavort. 465 00:25:22,400 --> 00:25:26,719 Speaker 3: My oldest doesn't care about really anything 466 00:25:27,160 --> 00:25:31,920 Speaker 3: other than, like, music and skateboarding. But 467 00:25:31,960 --> 00:25:34,359 Speaker 3: my youngest is terrified of AI and how it's going 468 00:25:34,400 --> 00:25:36,200 Speaker 3: to ruin the world and what is his future going 469 00:25:36,240 --> 00:25:36,399 Speaker 3: to be. 470 00:25:37,080 --> 00:25:39,439 Speaker 2: He's very, very... I'm sitting with him. He's fourteen. 471 00:25:40,480 --> 00:25:42,919 Speaker 3: Last time, he's just shaking his head and 472 00:25:42,960 --> 00:25:45,000 Speaker 3: I said, what's wrong? He said, I've got to get 473 00:25:45,000 --> 00:25:53,600 Speaker 3: my life together. What? You really don't. I don't 474 00:25:53,640 --> 00:25:56,840 Speaker 3: know what specifically he was thinking about with I've got 475 00:25:56,840 --> 00:25:57,840 Speaker 3: to get my life together. 476 00:25:58,119 --> 00:25:58,280 Speaker 4: Now. 477 00:25:58,320 --> 00:26:00,159 Speaker 3: You know, if I'm sitting with a forty year old, you 478 00:26:00,200 --> 00:26:04,360 Speaker 3: can imagine all kinds of scenarios. A fourteen year old, 479 00:26:04,720 --> 00:26:08,120 Speaker 3: I'm not sure what. But anyway, I told 480 00:26:08,160 --> 00:26:09,840 Speaker 3: him I might hit the sweet spot of AI. 481 00:26:10,000 --> 00:26:10,960 Speaker 2: Joe and I, and other people 482 00:26:11,000 --> 00:26:15,400 Speaker 3: our age. It comes along, and you know we're 483 00:26:15,440 --> 00:26:17,800 Speaker 3: ready to be done working anyway, and the universal 484 00:26:17,800 --> 00:26:21,119 Speaker 3: basic income hits, and you know the salad days are 485 00:26:21,240 --> 00:26:25,840 Speaker 3: upon me, and I've already done the go out there 486 00:26:25,880 --> 00:26:28,680 Speaker 3: and have meaning in life thing for. 487 00:26:28,760 --> 00:26:32,439 Speaker 2: Long enough, so coast on until the finish line. 488 00:26:32,880 --> 00:26:35,120 Speaker 1: Yeah, it's true, and I hate to pile on with 489 00:26:35,119 --> 00:26:38,600 Speaker 1: the old-man-yelling-at-clouds theme, but there are times. 490 00:26:38,680 --> 00:26:41,440 Speaker 1: I mean, like, for instance, Jack, you probably haven't gotten 491 00:26:41,480 --> 00:26:44,040 Speaker 1: a chance to read it yet, but I was power 492 00:26:44,240 --> 00:26:47,080 Speaker 1: reading a piece that our friend Tim Sandefur sent us 493 00:26:47,160 --> 00:26:50,360 Speaker 1: about the nature of the music industry right now, and 494 00:26:50,960 --> 00:26:54,560 Speaker 1: about how sync music, which is like old production music 495 00:26:54,560 --> 00:26:58,000 Speaker 1: in radio, the music you'd hear under a grocery store 496 00:26:58,080 --> 00:27:00,240 Speaker 1: ad or whatever, upbeat and cheery, and whatever, 497 00:27:00,560 --> 00:27:01,720 Speaker 1: you know, pears are. 498 00:27:01,720 --> 00:27:05,240 Speaker 2: Three for a dozen or whatever. That's a good price 499 00:27:05,280 --> 00:27:05,760 Speaker 2: for pears. 500 00:27:06,040 --> 00:27:10,120 Speaker 1: Oh please, come on, get a couple of dozen. Anyway, ah, 501 00:27:10,359 --> 00:27:13,560 Speaker 1: but now that's what pop music is.
502 00:27:14,400 --> 00:27:17,720 Speaker 1: People are not writing music to be listened to and 503 00:27:17,920 --> 00:27:20,919 Speaker 1: enjoyed as an art form, because there's no money in 504 00:27:20,960 --> 00:27:23,359 Speaker 1: it unless you're one of the super giants. It's all 505 00:27:23,520 --> 00:27:29,200 Speaker 1: music for TikTok videos, for TV shows, for YouTube videos. 506 00:27:29,800 --> 00:27:35,720 Speaker 1: And weirdly, pop music is now imitating the background music 507 00:27:35,800 --> 00:27:39,840 Speaker 1: from TikTok videos. So which one is which is now 508 00:27:39,880 --> 00:27:44,560 Speaker 1: completely unclear. And I'm sitting here thinking, it's like I've 509 00:27:44,560 --> 00:27:49,040 Speaker 1: been transported to a foreign country. The culture's weird. I 510 00:27:49,080 --> 00:27:51,879 Speaker 1: don't like the food, and I want to go home. 511 00:27:52,840 --> 00:27:54,320 Speaker 1: I don't want to stay here. 512 00:27:58,880 --> 00:27:59,360 Speaker 2: I don't know. 513 00:27:59,600 --> 00:28:05,199 Speaker 1: I remember, famously, a musicologist who was talking about 514 00:28:05,880 --> 00:28:09,080 Speaker 1: the Sergeant Pepper's album, the Beatles album. He said that 515 00:28:09,240 --> 00:28:14,080 Speaker 1: was the moment where clearly pop music went from ritual 516 00:28:14,160 --> 00:28:18,840 Speaker 1: dance music to music for listening. And I thought, that's 517 00:28:18,920 --> 00:28:21,800 Speaker 1: a really interesting observation. And you know, you can cite other 518 00:28:21,960 --> 00:28:24,200 Speaker 1: works of art, blah blah blah, but it's mostly true. 519 00:28:24,960 --> 00:28:33,960 Speaker 1: And now we're in an era where it's becoming background 520 00:28:34,080 --> 00:28:41,280 Speaker 1: music for just mildly entertaining videos of people doing things.
521 00:28:41,400 --> 00:28:43,959 Speaker 1: That's what pop music is for people that make it. 522 00:28:44,040 --> 00:28:46,320 Speaker 1: Because people still want to listen to good music, right? 523 00:28:47,240 --> 00:28:50,480 Speaker 1: Some do. But again, pop music is now imitating the 524 00:28:50,520 --> 00:28:53,880 Speaker 1: background music for TikTok videos, or it's impossible to tell 525 00:28:53,920 --> 00:28:56,760 Speaker 1: the difference which is which. And it was talking about 526 00:28:56,760 --> 00:28:59,600 Speaker 1: how they're, for some reason, twenty one seconds, I don't 527 00:28:59,640 --> 00:29:02,400 Speaker 1: know why, but they're looking for that roughly twenty one 528 00:29:02,520 --> 00:29:07,480 Speaker 1: second grabber piece of music. It's like the hook or whatever, 529 00:29:07,560 --> 00:29:09,840 Speaker 1: but that's gonna be the part that's going to be 530 00:29:09,920 --> 00:29:12,800 Speaker 1: in the videos. And so anybody who's making pop music 531 00:29:12,840 --> 00:29:15,160 Speaker 1: wants to be sure that that comes like super early 532 00:29:15,200 --> 00:29:17,440 Speaker 1: in the song, so that somebody will start using that 533 00:29:17,520 --> 00:29:21,320 Speaker 1: song for whatever, you know, a reality show or a 534 00:29:21,360 --> 00:29:24,719 Speaker 1: TikTok video or whatever. And it's just, you know what, 535 00:29:24,760 --> 00:29:28,200 Speaker 1: we need to change definitions. If you like music as art, 536 00:29:28,560 --> 00:29:33,600 Speaker 1: don't look there anymore, right? Don't look for diamonds in 537 00:29:33,640 --> 00:29:34,640 Speaker 1: a gold mine. 538 00:29:34,760 --> 00:29:36,040 Speaker 2: You're wasting your time. 539 00:29:37,880 --> 00:29:40,959 Speaker 3: Yeah, I feel like, because I just love that Olivia 540 00:29:41,040 --> 00:29:43,720 Speaker 3: Dean album that won the Grammy for Best New Artist.
Freaking 541 00:29:43,800 --> 00:29:47,880 Speaker 3: listen to at least one song every single day or more. Yeah, God, 542 00:29:47,960 --> 00:29:51,720 Speaker 3: she's an R and B genius. But anyway, that's real 543 00:29:51,840 --> 00:29:53,280 Speaker 3: music with a real singer. 544 00:29:53,520 --> 00:29:54,160 Speaker 2: New on the scene. 545 00:29:54,240 --> 00:29:57,280 Speaker 3: I just, I can't imagine anything replacing that for me. 546 00:29:58,600 --> 00:30:00,840 Speaker 3: Maybe other people want to hear AI-generated twenty 547 00:30:00,840 --> 00:30:02,280 Speaker 3: one second TikTok songs. I don't know. 548 00:30:02,760 --> 00:30:04,640 Speaker 2: Yeah, it'll have a different role 549 00:30:06,680 --> 00:30:10,840 Speaker 1: societally and economically, but that's always been the case with music. 550 00:30:10,880 --> 00:30:15,520 Speaker 1: It always changes. Again, a great book to read on 551 00:30:15,560 --> 00:30:18,600 Speaker 1: that topic, and he ought to update it: David Byrne 552 00:30:18,600 --> 00:30:21,920 Speaker 1: of Talking Heads' famous book is How Music Works. Really interesting, 553 00:30:21,960 --> 00:30:23,320 Speaker 1: thought provoking, gets. 554 00:30:23,160 --> 00:30:25,880 Speaker 2: A little bogged down toward, you know, the second half. 555 00:30:26,000 --> 00:30:30,840 Speaker 3: So, so we haven't nailed down whether AI is going 556 00:30:30,880 --> 00:30:32,800 Speaker 3: to make the workplace better or worse. 557 00:30:32,880 --> 00:30:34,600 Speaker 2: At least according to this study, it's going to be 558 00:30:34,640 --> 00:30:36,480 Speaker 2: worse. Great. 559 00:30:38,920 --> 00:30:40,760 Speaker 3: Before the ultimate time comes where you don't have to 560 00:30:40,760 --> 00:30:42,400 Speaker 3: work at all. 561 00:30:42,480 --> 00:30:46,320 Speaker 2: But that's twenty years out, according to Elon. All right, 562 00:30:47,000 --> 00:30:49,280 Speaker 2: I have no opinion on any of this.
I'll prepare 563 00:30:49,320 --> 00:30:50,880 Speaker 2: my kids accordingly. 564 00:30:51,520 --> 00:30:55,560 Speaker 3: Yeah, good luck. We'll finish strong next. 565 00:30:56,480 --> 00:31:01,240 Speaker 2: It's hit hard to right field, and watch it fly. 566 00:31:01,520 --> 00:31:03,920 Speaker 2: Cat Leonis hit the third home run of the game. 567 00:31:05,560 --> 00:31:07,800 Speaker 1: They're gonna need a fresh bag of beans over there 568 00:31:07,800 --> 00:31:09,080 Speaker 1: in that dugout. 569 00:31:09,560 --> 00:31:11,640 Speaker 2: So that's their home run celebration. 570 00:31:11,800 --> 00:31:14,000 Speaker 3: They run over to the espresso machine and they all 571 00:31:14,040 --> 00:31:17,520 Speaker 3: take a shot of espresso after a home run, then 572 00:31:17,560 --> 00:31:21,280 Speaker 3: smoke a cigarette, and they had several of them when 573 00:31:21,360 --> 00:31:27,200 Speaker 3: Italy beat the United States eight to six yesterday. That's kind 574 00:31:27,240 --> 00:31:29,080 Speaker 3: of a funny home run celebration. But here's something I 575 00:31:29,080 --> 00:31:31,800 Speaker 3: didn't know. I might have actually possibly been rooting for 576 00:31:31,840 --> 00:31:35,800 Speaker 3: Italy if I'd have known this at the time. They've 577 00:31:35,840 --> 00:31:38,840 Speaker 3: got twenty one players born in the United States on 578 00:31:38,880 --> 00:31:41,760 Speaker 3: their team. You just need to have some Italian heritage, 579 00:31:41,760 --> 00:31:43,840 Speaker 3: and I don't think they're very strict on that. It's 580 00:31:43,960 --> 00:31:47,560 Speaker 3: guys who couldn't make the US team who would love 581 00:31:47,600 --> 00:31:50,440 Speaker 3: to be on the US team, because they got beaten 582 00:31:50,480 --> 00:31:51,600 Speaker 3: out by people that are 583 00:31:51,640 --> 00:31:52,440 Speaker 2: better than them.
584 00:31:52,880 --> 00:31:55,920 Speaker 3: You think they were motivated to beat the All Star team? 585 00:31:56,560 --> 00:31:58,840 Speaker 3: There's a bunch of Major League Baseball players who would 586 00:31:58,880 --> 00:32:00,560 Speaker 3: love to play for the US team. 587 00:32:00,720 --> 00:32:02,040 Speaker 2: Oh yeah, yeah, I mean. 588 00:32:01,920 --> 00:32:04,800 Speaker 3: It would be very exciting, kicking the ass of 589 00:32:05,360 --> 00:32:09,080 Speaker 3: Judge and Bryce Harper and all those different 590 00:32:08,920 --> 00:32:10,640 Speaker 2: US guys. Oh absolutely, yeah. 591 00:32:10,720 --> 00:32:15,720 Speaker 1: I'm looking at the names, Jack Caglian and Sam Antonacci. Yeah, 592 00:32:15,760 --> 00:32:18,959 Speaker 1: these are people who, like, their grandmother grew up 593 00:32:19,000 --> 00:32:21,640 Speaker 1: in Sicily or whatever. The Italians are like, good enough, 594 00:32:21,640 --> 00:32:23,440 Speaker 1: come play ball, you're a major leaguer. 595 00:32:23,760 --> 00:32:26,040 Speaker 3: Hey, I mean, I'd rather the United States win 596 00:32:26,080 --> 00:32:28,960 Speaker 3: than Italy win. But I could see rooting for the, 597 00:32:29,520 --> 00:32:30,960 Speaker 3: oh, I'm not good enough to be on the team? 598 00:32:31,160 --> 00:32:35,880 Speaker 3: Take a suck of it. Yeah yeah. But so Italy, 599 00:32:36,000 --> 00:32:37,720 Speaker 3: I got it, I had it wrong, or I was misled. 600 00:32:37,960 --> 00:32:41,240 Speaker 3: Italy has to win today to pull the United States in. 601 00:32:41,400 --> 00:32:44,480 Speaker 3: Italy has to beat Mexico. Italy will beat Mexico unless 602 00:32:44,520 --> 00:32:46,080 Speaker 3: they just have a bad day, because they've 603 00:32:45,880 --> 00:32:50,080 Speaker 2: been trouncing everybody. Oh really? Yeah, yeah. I've 604 00:32:49,960 --> 00:32:57,600 Speaker 1: got a baseball related final thought, but it can wait.
605 00:33:07,000 --> 00:33:08,880 Speaker 2: Here's your host for final thoughts, Joe Getty. 606 00:33:09,120 --> 00:33:10,920 Speaker 1: Let's get a final thought from everybody on the crew 607 00:33:10,960 --> 00:33:12,240 Speaker 1: to wrap up the show for the day. There he 608 00:33:12,320 --> 00:33:14,400 Speaker 1: is pressing the buttons, Michaelangelo in the control room. 609 00:33:14,400 --> 00:33:17,680 Speaker 2: Michael, what's your final thought? You see how good Italy's team is, Jack? 610 00:33:17,800 --> 00:33:22,800 Speaker 2: How about an espresso machine in your studio? Ah. Yeah, 611 00:33:22,920 --> 00:33:23,360 Speaker 2: I love it. 612 00:33:23,720 --> 00:33:28,040 Speaker 1: I love a nice espresso in the afternoon myself. Katie 613 00:33:28,040 --> 00:33:30,880 Speaker 1: Green, our esteemed newswoman, has a final thought. Katie. 614 00:33:30,960 --> 00:33:33,120 Speaker 3: As I sit here with my iced water, I am 615 00:33:33,200 --> 00:33:37,640 Speaker 3: still completely baffled by Jack's preference for drinking warm water. 616 00:33:37,920 --> 00:33:41,520 Speaker 2: Yeah, I've always gone for room temperature water. Sick. 617 00:33:41,560 --> 00:33:44,040 Speaker 2: It's perverse. Jack, do you have a final thought, you weirdo? 618 00:33:44,400 --> 00:33:49,040 Speaker 3: Italy beat Brazil eight nothing, then Great Britain seven four, wow, 619 00:33:49,160 --> 00:33:51,160 Speaker 3: and then beat the United States. So I think they'll 620 00:33:51,160 --> 00:33:54,240 Speaker 3: probably beat Mexico today and then maybe a rematch, Italy 621 00:33:54,320 --> 00:33:54,840 Speaker 3: United States. 622 00:33:54,840 --> 00:33:58,280 Speaker 2: That'd be exciting. So my final note is baseball related. 623 00:33:58,360 --> 00:33:59,760 Speaker 2: It's to Apple, the Apple Corporation.
624 00:34:00,040 --> 00:34:02,080 Speaker 1: Well, what if I was a major league ballplayer, as 625 00:34:02,080 --> 00:34:04,760 Speaker 1: a hitter, and I went oh for three hundred and 626 00:34:04,800 --> 00:34:08,919 Speaker 1: seventy five, like, twelve years in a row. What would 627 00:34:09,000 --> 00:34:11,800 Speaker 1: you tell me? You would tell me, you gotta stop trying. 628 00:34:11,840 --> 00:34:15,920 Speaker 1: You're terrible at that. That's how you are with punctuating 629 00:34:16,040 --> 00:34:21,600 Speaker 1: my voice texts. You are utterly incompetent. It's hilariously wrong 630 00:34:21,719 --> 00:34:26,440 Speaker 1: and wrong headed. It's never good. You have no idea 631 00:34:26,440 --> 00:34:30,000 Speaker 1: what you're doing. Just stop trying. How is it not better? 632 00:34:31,440 --> 00:34:33,239 Speaker 1: And I swear it's gotten worse over the years, as 633 00:34:33,280 --> 00:34:37,040 Speaker 1: opposed to better. Every time I pause to pick the 634 00:34:37,120 --> 00:34:39,080 Speaker 1: right word, you don't have to put a comma in. 635 00:34:40,640 --> 00:34:42,800 Speaker 2: Armstrong and Getty, wrapping up another grueling 636 00:34:42,880 --> 00:34:43,800 Speaker 2: four hour workday. 637 00:34:44,000 --> 00:34:46,360 Speaker 1: So many people to thank, so little time. Go to ArmstrongAndGetty 638 00:34:46,360 --> 00:34:50,400 Speaker 1: dot com. Many pleasures await you there, including the hotlinks, 639 00:34:50,960 --> 00:34:54,320 Speaker 1: the A&G Swag Store, Katie's Corner. 640 00:34:54,320 --> 00:34:56,920 Speaker 2: You can drop us a note. See you tomorrow. God 641 00:34:57,239 --> 00:35:03,200 Speaker 2: bless America. Armstrong and Getty. I'm gonna make a 642 00:35:03,440 --> 00:35:04,839 Speaker 2: very obvious point. Number one, 643 00:35:04,960 --> 00:35:07,120 Speaker 1: I gotta see if Judy's willing to rent me out 644 00:35:07,120 --> 00:35:08,319 Speaker 1: as a gigolo, because I.
645 00:35:08,280 --> 00:35:13,919 Speaker 2: Am a natural born lover man. Never say that again, 646 00:35:14,800 --> 00:35:17,960 Speaker 2: hear me? Plainly. What I'm telling you is I'm an idiot. Yeah, 647 00:35:17,960 --> 00:35:18,800 Speaker 2: he's a damn idiot. 648 00:35:18,800 --> 00:35:22,680 Speaker 5: Oh yes, it's a pathetic spectacle of a man who's 649 00:35:22,760 --> 00:35:23,640 Speaker 5: run out of road. 650 00:35:24,760 --> 00:35:27,439 Speaker 2: Okay, let high know. Hi, good night everybody. I still 651 00:35:27,480 --> 00:35:30,040 Speaker 2: have this stuff, Baby Armstrong and Getty