1 00:00:01,920 --> 00:00:06,200 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George
2 00:00:06,280 --> 00:00:10,320 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty.
3 00:00:10,240 --> 00:00:17,320 Speaker 2: Armstrong and Getty, and now, he's Armstrong and Getty.
4 00:00:23,720 --> 00:00:26,880 Speaker 3: Thirty-six air traffic control facilities have reported staffing problems.
5 00:00:27,120 --> 00:00:30,280 Speaker 3: That includes the towers at Chicago O'Hare, Dallas-Fort Worth,
6 00:00:30,440 --> 00:00:33,680 Speaker 3: Phoenix Sky Harbor, huge hubs for American Airlines. You can
7 00:00:33,720 --> 00:00:36,040 Speaker 3: also add the towers at San Francisco and LAX to
8 00:00:36,080 --> 00:00:38,239 Speaker 3: the list. This is all on the heels of the
9 00:00:38,560 --> 00:00:42,480 Speaker 3: worst weekend for air traffic control staffing since the shutdown began.
10 00:00:43,520 --> 00:00:46,400 Speaker 1: Yeah, we're at day forty-two of the shutdown. It's going
11 00:00:46,440 --> 00:00:49,120 Speaker 1: to end, but it hasn't officially ended. So we're
12 00:00:49,120 --> 00:00:52,840 Speaker 1: in day forty-two, and the flight problem is still big.
13 00:00:52,920 --> 00:00:57,720 Speaker 1: Three thousand cancellations on Sunday, twenty-five hundred yesterday, and
14 00:00:57,840 --> 00:01:00,440 Speaker 1: off to a bad start today. And then how long
15 00:01:00,480 --> 00:01:02,800 Speaker 1: this is gonna last? I still don't quite get the...
16 00:01:03,600 --> 00:01:09,080 Speaker 1: lots of air traffic controllers retired. Yeah, you had a
17 00:01:09,080 --> 00:01:12,840 Speaker 1: life plan and then you altered it because of
18 00:01:12,880 --> 00:01:13,520 Speaker 1: the shutdown.
19 00:01:13,680 --> 00:01:16,280 Speaker 2: You retired early? What? I don't get it. I mean,
20 00:01:16,360 --> 00:01:18,120 Speaker 2: even a year early because of this?
21 00:01:18,760 --> 00:01:21,360 Speaker 4: You know, messed up.
That'd be odd, but it happened.
22 00:01:21,440 --> 00:01:22,520 Speaker 4: I guess, apparently.
23 00:01:22,720 --> 00:01:25,319 Speaker 1: Anyway, I'm looking it up, and there are
24 00:01:25,360 --> 00:01:28,000 Speaker 1: twelve hundred cancellations so far today. I don't know when
25 00:01:28,000 --> 00:01:31,520 Speaker 1: you're gonna be listening to this, but this is gonna
26 00:01:31,520 --> 00:01:33,200 Speaker 1: be a problem for a while. Sure hope they get
27 00:01:33,200 --> 00:01:36,559 Speaker 1: it straightened out before the biggest traveling week of the year,
28 00:01:36,920 --> 00:01:40,959 Speaker 1: Thanksgiving obviously, which is coming up kind of soon. Yeah, yeah,
29 00:01:42,120 --> 00:01:45,560 Speaker 1: another big story around sports.
30 00:01:45,560 --> 00:01:47,160 Speaker 2: I didn't realize the Winter Olympics are.
31 00:01:47,040 --> 00:01:52,440 Speaker 1: Twelve weeks away. We're getting close. Yeah. And they're somewhere,
32 00:01:52,440 --> 00:01:54,400 Speaker 1: somewhere snowy. Is it in France?
33 00:01:54,680 --> 00:01:55,800 Speaker 2: Where are the Winter Olympics?
34 00:01:55,960 --> 00:01:58,440 Speaker 1: It'd be a good idea to look that up for us. Will you, Katie?
35 00:01:58,440 --> 00:02:01,120 Speaker 1: Where are the dang Winter Olympics? I bet Italy, I think,
36 00:02:01,320 --> 00:02:03,080 Speaker 1: or was that last time? That doesn't matter.
37 00:02:03,640 --> 00:02:05,520 Speaker 2: Someplace snowy'd be a good bet, innit?
38 00:02:05,880 --> 00:02:13,280 Speaker 1: Yeah, probably not Jamaica. Milano Cortina. I win, congratulations. So
39 00:02:13,360 --> 00:02:15,840 Speaker 1: the Winter Olympics coming up in Italy. They ain't gonna
40 00:02:15,840 --> 00:02:17,799 Speaker 1: do the whole trans thing this time around. Here's a
41 00:02:17,800 --> 00:02:18,360 Speaker 1: little bit of that.
42 00:02:18,880 --> 00:02:21,080 Speaker 3: This sounds like a really big change potentially, Nick.
43 00:02:21,120 --> 00:02:23,520 Speaker 5: This would be big, and it would be a huge
44 00:02:23,560 --> 00:02:26,840 Speaker 5: policy shift for the IOC, which until now has
45 00:02:26,919 --> 00:02:30,880 Speaker 5: let each sport's international governing body set their own policies
46 00:02:30,919 --> 00:02:34,359 Speaker 5: centered on transgender inclusion. It would also mark a big
47 00:02:34,440 --> 00:02:38,120 Speaker 5: change from the IOC's twenty twenty-one framework, which said
48 00:02:38,160 --> 00:02:41,160 Speaker 5: there should not be a presumption of advantage when it
49 00:02:41,160 --> 00:02:42,520 Speaker 5: comes to trans athletes.
50 00:02:43,160 --> 00:02:47,120 Speaker 1: So they were running scared last Olympics, like lots of
51 00:02:47,120 --> 00:02:49,560 Speaker 1: people were. Didn't want to be on the wrong side
52 00:02:49,560 --> 00:02:51,519 Speaker 1: of this, and said some things that they knew were
53 00:02:51,680 --> 00:02:54,600 Speaker 1: wrong and crazy, like you just heard, and now they've
54 00:02:54,680 --> 00:02:55,320 Speaker 1: changed their mind.
55 00:02:55,360 --> 00:02:58,480 Speaker 2: Here's one of the spokespeople for the Olympics.
56 00:02:59,520 --> 00:03:02,640 Speaker 6: What I would like for the IOC
57 00:03:02,880 --> 00:03:05,320 Speaker 6: to do is to bring everyone together to try and
58 00:03:05,360 --> 00:03:08,760 Speaker 6: find a consensus amongst all of us that we can
59 00:03:08,800 --> 00:03:12,040 Speaker 6: all get behind and that we can implement, and above
60 00:03:12,120 --> 00:03:15,640 Speaker 6: anything and everything else, it's fair and protects the female categories.
61 00:03:15,680 --> 00:03:19,240 Speaker 1: They're all about protecting the female category.
She's new,
62 00:03:19,280 --> 00:03:21,200 Speaker 1: and she got elected on the idea of I'm going
63 00:03:21,240 --> 00:03:24,280 Speaker 1: to protect women's sports, which is code for we ain't gonna
64 00:03:24,440 --> 00:03:25,560 Speaker 1: let dudes
65 00:03:26,680 --> 00:03:28,760 Speaker 2: participate in our women's sports.
66 00:03:28,919 --> 00:03:30,680 Speaker 1: Well, of course you're not going to let men in
67 00:03:30,720 --> 00:03:34,120 Speaker 1: women's sports. Then it's not women's sports, says virtually all
68 00:03:34,160 --> 00:03:37,280 Speaker 1: of humanity that got bullied into silence. You know, in
69 00:03:37,480 --> 00:03:40,560 Speaker 1: most cases, all these conversations that are going on are
70 00:03:40,840 --> 00:03:43,800 Speaker 1: how do we handle this without being in trouble politically,
71 00:03:43,880 --> 00:03:46,080 Speaker 1: as opposed to what's the right thing to do. You
72 00:03:46,160 --> 00:03:48,720 Speaker 1: might have a couple of true believers, you know, at
73 00:03:49,840 --> 00:03:52,640 Speaker 1: your high school in California or something like that, but
74 00:03:52,880 --> 00:03:55,880 Speaker 1: mostly it's how do we handle this without causing a firestorm.
75 00:03:56,040 --> 00:03:59,000 Speaker 1: And here's the reasoning behind it all.
76 00:04:00,440 --> 00:04:03,680 Speaker 5: Will issue the ban sometime early next year, citing a
77 00:04:03,720 --> 00:04:07,400 Speaker 5: new scientific review that found evidence men have a permanent
78 00:04:07,440 --> 00:04:12,320 Speaker 5: physical advantage over women athletes even after hormone therapy.
However,
79 00:04:12,360 --> 00:04:14,960 Speaker 5: the Guardian newspaper says the ban could still be a
80 00:04:15,040 --> 00:04:18,000 Speaker 5: year out, and that the IOC is facing pushback to
81 00:04:18,040 --> 00:04:21,680 Speaker 5: a possible ban on athletes who were reported female at birth
82 00:04:21,920 --> 00:04:25,360 Speaker 5: but have male chromosomes and the same testosterone level as men,
83 00:04:25,760 --> 00:04:29,800 Speaker 5: also known as differences in sexual development. That would include
84 00:04:29,839 --> 00:04:34,680 Speaker 5: athletes like South Africa's Caster Semenya, who won gold at
85 00:04:34,680 --> 00:04:37,599 Speaker 5: the London and Rio Games before track and field's governing
86 00:04:37,640 --> 00:04:41,640 Speaker 5: body, World Athletics, banned DSD athletes from competing as women
87 00:04:41,920 --> 00:04:43,000 Speaker 5: in twenty twenty-
88 00:04:42,760 --> 00:04:49,880 Speaker 1: three. DSD, different sexual characteristics or whatever. Yeah, yeah, yeah,
89 00:04:50,360 --> 00:04:52,360 Speaker 1: I like this new study that came out that said
90 00:04:52,440 --> 00:04:53,880 Speaker 1: men have an advantage over women.
91 00:04:54,120 --> 00:04:56,280 Speaker 2: That's a good scientific study. I wonder who did that.
92 00:04:56,520 --> 00:04:57,680 Speaker 2: How much did they spend?
93 00:04:58,120 --> 00:05:01,320 Speaker 4: Tell me about the methodology. It's hilarious that they had to
94 00:05:01,400 --> 00:05:05,760 Speaker 4: hide behind a new study, good Lord, instead of ancient wisdom.
95 00:05:05,920 --> 00:05:06,880 Speaker 2: You know, it strikes me.
96 00:05:08,279 --> 00:05:11,040 Speaker 1: What you've learned since you were a little kid on the playground,
97 00:05:11,960 --> 00:05:15,120 Speaker 1: going forward, is the new scientific study.
98 00:05:15,279 --> 00:05:18,760 Speaker 4: So I was reading about the women's soccer league.
They've
99 00:05:18,760 --> 00:05:21,080 Speaker 4: had a controversy lately, and it's all centered around this
100 00:05:21,160 --> 00:05:25,479 Speaker 4: one player who is much bigger and more muscular than all
101 00:05:25,640 --> 00:05:28,479 Speaker 4: of the girls on the field. Looks like a dude,
102 00:05:28,640 --> 00:05:30,919 Speaker 4: plays like a dude, built like a dude, et cetera.
103 00:05:31,320 --> 00:05:35,239 Speaker 4: It's a similar case to Caster Semenya, who they mentioned,
104 00:05:35,320 --> 00:05:36,760 Speaker 4: who has internal testicles.
105 00:05:37,560 --> 00:05:38,280 Speaker 2: It was one of the...
106 00:05:40,320 --> 00:05:44,200 Speaker 1: That'd be awesome. Oh, I like having testicles. I just
107 00:05:44,200 --> 00:05:45,039 Speaker 1: don't want them in the way.
108 00:05:45,480 --> 00:05:46,799 Speaker 2: I see. Thanks for clarifying.
109 00:05:46,839 --> 00:05:50,360 Speaker 4: So, but that's one of the extremely rare cases where
110 00:05:50,480 --> 00:05:52,719 Speaker 4: sex assigned at birth is a phrase that makes any
111 00:05:52,760 --> 00:05:57,080 Speaker 4: sense, because these are people with both sets of genitals.
112 00:05:57,120 --> 00:06:02,200 Speaker 4: Meaning... no, of course not, no. So she has both
113 00:06:02,320 --> 00:06:05,480 Speaker 4: ovaries and testicles and has much, much higher testosterone,
114 00:06:05,640 --> 00:06:09,560 Speaker 4: kind of like that Turkish fella who was whipping up
115 00:06:09,560 --> 00:06:12,279 Speaker 4: on the girls in the boxing last time around too.
116 00:06:12,960 --> 00:06:14,720 Speaker 2: But you know, and it's funny.
117 00:06:14,760 --> 00:06:17,240 Speaker 1: So what do you, the left, what do you think
118 00:06:17,279 --> 00:06:19,799 Speaker 1: we should do with people who have testicles and ovaries?
119 00:06:20,080 --> 00:06:22,359 Speaker 4: Oh, it's a shame, but they can't compete in women's sports.
120 00:06:22,640 --> 00:06:23,919 Speaker 4: And that's the point I was about to make. The
121 00:06:24,000 --> 00:06:27,599 Speaker 4: left always... it's funny. They just are crazy about individual rights,
122 00:06:27,920 --> 00:06:31,240 Speaker 4: except when it comes to individual rights that conservatives like;
123 00:06:31,320 --> 00:06:32,480 Speaker 4: then they have no interest in them.
124 00:06:32,480 --> 00:06:33,640 Speaker 2: It's all about the community.
125 00:06:34,480 --> 00:06:36,880 Speaker 4: But yeah, I feel terrible for those people if they
126 00:06:36,880 --> 00:06:38,600 Speaker 4: want to be athletes, but they can either compete as
127 00:06:38,640 --> 00:06:41,320 Speaker 4: men or in an open category. You can't beat the crap
128 00:06:41,320 --> 00:06:45,120 Speaker 4: out of women, because you are, functionally, athletically speaking, a man.
129 00:06:45,240 --> 00:06:49,080 Speaker 2: According to a new study, there's an advantage to being a man.
130 00:06:49,800 --> 00:06:52,400 Speaker 4: Oh, speaking of this sort of thing, I just finished
131 00:06:52,400 --> 00:06:55,880 Speaker 4: reading the piece by Colin Wright, who's a terrific writer.
132 00:06:56,000 --> 00:06:57,800 Speaker 2: He writes about this sort of thing.
133 00:06:58,600 --> 00:07:01,599 Speaker 4: He was an academic scientist at Penn State in twenty twenty,
134 00:07:01,960 --> 00:07:07,520 Speaker 4: and there was that crazy explosion in adolescent transgenderism among
135 00:07:07,600 --> 00:07:12,560 Speaker 4: young girls, and he commented two words: social contagion. Within hours,
136 00:07:12,600 --> 00:07:15,960 Speaker 4: his colleagues denounced him as a transphobic bigot, and the
137 00:07:16,000 --> 00:07:18,760 Speaker 4: online mob came for him and crushed his
138 00:07:18,880 --> 00:07:24,360 Speaker 4: academic career.
And he talks about how he was referring
139 00:07:24,440 --> 00:07:27,960 Speaker 4: to research published by a scientist who had coined the
140 00:07:28,040 --> 00:07:30,880 Speaker 4: term rapid onset gender dysphoria in twenty eighteen, the
141 00:07:30,880 --> 00:07:34,480 Speaker 4: peer-reviewed paper, blah blah blah, and that the pattern
142 00:07:34,560 --> 00:07:38,120 Speaker 4: was clearly explained by social contagion, the spread of ideas
143 00:07:38,200 --> 00:07:40,880 Speaker 4: or behaviors through peer influence. And there are other examples,
144 00:07:40,880 --> 00:07:44,200 Speaker 4: whether it's cutting or anorexia or whatever. Teen girls are
145 00:07:44,240 --> 00:07:46,280 Speaker 4: just incredibly prone to that sort of thing.
146 00:07:46,600 --> 00:07:47,760 Speaker 2: But then he gets to...
147 00:07:49,760 --> 00:07:54,960 Speaker 4: the fact that the left-wing dogma is that gender identity
148 00:07:55,160 --> 00:08:01,200 Speaker 4: is innate and immutable, people are born as transgender. It's
149 00:08:01,280 --> 00:08:03,960 Speaker 4: not... they're not convinced by a trend or whatever, it's
150 00:08:04,080 --> 00:08:08,120 Speaker 4: they're born that way. And that claim underpins the medical practice and
151 00:08:08,160 --> 00:08:13,480 Speaker 4: the legal strategy: puberty blockers, cross-sex hormones, mutilations of kids, minors,
152 00:08:14,400 --> 00:08:16,760 Speaker 4: and the rest of it, and the civil rights argument.
153 00:08:17,600 --> 00:08:22,200 Speaker 4: So then he makes the point that the dominant
154 00:08:22,640 --> 00:08:25,320 Speaker 4: counterargument to the social contagion theory is
155 00:08:25,320 --> 00:08:27,840 Speaker 4: that the sharp rise in transgender identification over the past
156 00:08:27,880 --> 00:08:32,760 Speaker 4: decade simply reflects liberation, right? People are more comfortable expressing
157 00:08:32,800 --> 00:08:39,160 Speaker 4: their authentic selves.
That has been the argument. As transgender
158 00:08:39,200 --> 00:08:42,280 Speaker 4: activist and biologist Julia Serano put it in twenty seventeen,
159 00:08:42,320 --> 00:08:45,000 Speaker 4: there really wasn't a rise in left-handedness so much
160 00:08:45,040 --> 00:08:48,760 Speaker 4: as there was a rise in left-handed acceptance. That's
161 00:08:48,800 --> 00:08:52,400 Speaker 4: an interesting premise, isn't it? People were free to be
162 00:08:52,520 --> 00:08:55,880 Speaker 4: who the F they were, as John Oliver put it
163 00:08:55,920 --> 00:09:00,240 Speaker 4: on his Last Week Tonight. But then Colin points out,
164 00:09:00,640 --> 00:09:03,640 Speaker 4: if transgender identity were an innate trait like left-handedness,
165 00:09:03,640 --> 00:09:07,160 Speaker 4: we would expect identification rates to rise at first when
166 00:09:07,160 --> 00:09:11,360 Speaker 4: it became socially acceptable, then plateau and remain stable at
167 00:09:11,400 --> 00:09:15,160 Speaker 4: a fixed level. If the phenomenon were instead driven by
168 00:09:15,240 --> 00:09:18,959 Speaker 4: social contagion, we might expect a boom-and-bust pattern,
169 00:09:19,160 --> 00:09:22,400 Speaker 4: a spike followed by a rapid decline once the social
170 00:09:22,480 --> 00:09:27,640 Speaker 4: forces driving it were weakened. And indeed that has become
171 00:09:27,800 --> 00:09:33,800 Speaker 4: incredibly clear. Transgender identification has fallen fifty percent in two
172 00:09:33,960 --> 00:09:39,240 Speaker 4: years among college students and adolescents, per a couple of different studies.
173 00:09:40,400 --> 00:09:45,000 Speaker 4: It was clearly a social contagion, and people who got
174 00:09:45,000 --> 00:09:47,880 Speaker 4: their careers ruined for saying it... good Lord, were
175 00:09:47,920 --> 00:09:50,760 Speaker 4: you a victim of an angry, wrong mob.
176 00:09:51,760 --> 00:09:54,120 Speaker 1: I'm glad left-handedness didn't catch on as a
177 00:09:54,160 --> 00:09:56,520 Speaker 1: social contagion. I'd have a hard time pretending I was
178 00:09:56,600 --> 00:09:59,960 Speaker 1: left-handed: eating, tying my shoes, whatever I'm doing.
179 00:10:00,360 --> 00:10:03,520 Speaker 4: Yeah. And then he points out that the overwhelming majority
180 00:10:03,520 --> 00:10:06,720 Speaker 4: of those driving the trans mania fall into the non-
181 00:10:06,840 --> 00:10:11,079 Speaker 4: binary category, adopting identities which are said to be neither,
182 00:10:11,280 --> 00:10:15,160 Speaker 4: both, or somewhere in between: demi boy, gender fluid...
183 00:10:14,840 --> 00:10:15,640 Speaker 2: Two-spirit.
184 00:10:15,880 --> 00:10:19,840 Speaker 4: These are social identities, not biological ones. Unlike right-handedness
185 00:10:19,880 --> 00:10:23,840 Speaker 4: or left-handedness, non-binary identities have no
186 00:10:24,320 --> 00:10:28,360 Speaker 4: anatomical or physiological reference. They're conceptual, political, and responsive to
187 00:10:28,360 --> 00:10:30,960 Speaker 4: cultural trends, hallmarks of a social contagion.
188 00:10:31,360 --> 00:10:33,599 Speaker 2: Case closed. Bam. Next case.
189 00:10:35,320 --> 00:10:35,480 Speaker 6: Man.
190 00:10:35,520 --> 00:10:38,600 Speaker 1: I'm looking at the dust-up that happened in Berkeley.
191 00:10:38,640 --> 00:10:40,320 Speaker 1: I wish I had gone last night. I was thinking
192 00:10:40,320 --> 00:10:43,960 Speaker 1: about going. It's only, whatever it is, forty-five miles
193 00:10:44,000 --> 00:10:46,040 Speaker 1: from my house, but I had some kid stuff going on.
194 00:10:47,040 --> 00:10:50,480 Speaker 1: But it got pretty spicy there outside the Turning Point
195 00:10:50,480 --> 00:10:51,920 Speaker 1: event in Berkeley last night.
196 00:10:52,240 --> 00:10:54,480 Speaker 2: Why, they needed you there fighting Antifa.
197 00:10:54,480 --> 00:10:57,040 Speaker 1: And what did you do? You stayed at home? Who
198 00:10:57,040 --> 00:10:59,240 Speaker 1: are these numbnuts that show up to fight this stuff?
199 00:10:59,320 --> 00:11:02,480 Speaker 1: Just let them, let them gather and speak. What's the
200 00:11:02,520 --> 00:11:06,120 Speaker 1: skin off your nose? They're actually convinced that they're fascists.
201 00:11:06,800 --> 00:11:09,480 Speaker 1: They believe it, and I have to remind myself of
202 00:11:09,520 --> 00:11:11,760 Speaker 1: that semi-regularly. And it's true on the right, but
203 00:11:11,960 --> 00:11:14,960 Speaker 1: especially on the left, there is a significant group of
204 00:11:15,000 --> 00:11:20,280 Speaker 1: people that believes the most lunatic rantings of activists.
205 00:11:20,480 --> 00:11:21,640 Speaker 2: They believe it.
206 00:11:21,760 --> 00:11:24,280 Speaker 1: They're willing to get into a fistfight over somebody speaking
207 00:11:24,280 --> 00:11:25,319 Speaker 1: in an auditorium.
208 00:11:26,040 --> 00:11:27,600 Speaker 2: Yeah, you, you're the fascist.
209 00:11:27,640 --> 00:11:31,240 Speaker 4: I mean, clearly. Let's see, you dress up in a uniform,
210 00:11:31,320 --> 00:11:33,760 Speaker 4: you go to the opposition's events, and you beat people
211 00:11:33,880 --> 00:11:36,960 Speaker 4: up and call them fascists.
212 00:11:36,480 --> 00:11:39,280 Speaker 2: Right, the people inside having the speaking engagement.
213 00:11:39,880 --> 00:11:42,640 Speaker 4: It's the lack of appreciating the irony that offends me
214 00:11:42,679 --> 00:11:43,120 Speaker 4: the most.
215 00:11:44,760 --> 00:11:46,800 Speaker 1: I want to stand up for the Armstrong and Getty
216 00:11:46,800 --> 00:11:49,880 Speaker 1: pickleball paddle when we talk about the store.
217 00:11:49,640 --> 00:11:52,120 Speaker 2: When we come back, okay, because there's time
218 00:11:51,920 --> 00:11:53,600 Speaker 1: to buy stuff in the Armstrong and Getty store if
219 00:11:53,640 --> 00:11:55,000 Speaker 1: you want to get it in time for Christmas.
220 00:11:55,280 --> 00:11:59,160 Speaker 4: Speaking of rhetoric, the phrase cheap Chinese POS has been
221 00:11:59,200 --> 00:12:00,160 Speaker 4: thrown around, Jack.
222 00:12:01,480 --> 00:12:03,839 Speaker 1: Not by me. Having now just been handed one, I held it
223 00:12:03,960 --> 00:12:06,520 Speaker 1: in my hand, so I'm going to stand up for it.
224 00:12:06,720 --> 00:12:08,480 Speaker 1: And lots of other news of the day, so stay tuned.
225 00:12:13,040 --> 00:12:15,400 Speaker 5: When asked during the Fox broadcast of the Washington
226 00:12:15,440 --> 00:12:19,840 Speaker 5: Commanders game who his favorite team was, President Trump said, quote,
227 00:12:19,880 --> 00:12:21,800 Speaker 5: so I love the Jets and I love the Giants.
228 00:12:21,880 --> 00:12:25,320 Speaker 2: Wow. You know, you can just say you don't like football.
229 00:12:28,520 --> 00:12:32,160 Speaker 1: So what's going on with the NFL? We're talking about
230 00:12:32,400 --> 00:12:35,800 Speaker 1: Broncos-Raiders Sunday night. Was it, was it ten to seven?
231 00:12:36,200 --> 00:12:39,120 Speaker 1: And then last night you had the Eagles and Packers.
232 00:12:39,160 --> 00:12:42,680 Speaker 1: The Eagles won ten to seven. The first touchdown was
233 00:12:42,679 --> 00:12:45,440 Speaker 1: scored in the fourth quarter. Did they change the rules
234 00:12:45,440 --> 00:12:48,719 Speaker 1: in the NFL or something? That's not the way the
235 00:12:48,840 --> 00:12:51,920 Speaker 1: NFL has been for the last, I don't know, decade.
236 00:12:52,120 --> 00:12:55,600 Speaker 4: Is the field three hundred yards long now and they forgot to announce
237 00:12:55,640 --> 00:12:56,479 Speaker 4: it? It's crazy.
238 00:12:56,440 --> 00:12:57,880 Speaker 2: Do you think there's betting going on?
239 00:12:58,600 --> 00:13:00,840 Speaker 1: No, but it's just, it's just weird. It's the
240 00:13:00,840 --> 00:13:03,200 Speaker 1: football of my youth. That's what the NFL
241 00:13:02,960 --> 00:13:05,319 Speaker 2: used to be. Three yards and a cloud of dust.
242 00:13:05,480 --> 00:13:10,160 Speaker 1: Yeah. Anyway, first touchdown in the fourth quarter. Yeah, the,
243 00:13:10,520 --> 00:13:13,120 Speaker 1: the networks can't be happy about that.
244 00:13:13,960 --> 00:13:16,600 Speaker 4: I do love punting. That was a great game, said
245 00:13:16,679 --> 00:13:21,320 Speaker 4: no one ever. The Armstrong and Getty store is open.
246 00:13:21,679 --> 00:13:25,000 Speaker 1: So some of you buy stuff for your family or
247 00:13:25,040 --> 00:13:28,320 Speaker 1: friends for Christmas, if they're fans of this here radio program.
248 00:13:28,480 --> 00:13:31,360 Speaker 2: Or it's just for yourself. I feel like it's one for me.
249 00:13:31,679 --> 00:13:33,760 Speaker 1: I feel like it's one of those great, excuse me,
250 00:13:34,000 --> 00:13:39,200 Speaker 1: easy gifts: checked it off, didn't cost you much, straight
251 00:13:39,200 --> 00:13:40,000 Speaker 1: from the heart.
252 00:13:40,080 --> 00:13:41,760 Speaker 2: Seems like you did something nice.
253 00:13:42,080 --> 00:13:45,240 Speaker 4: It really... there will be laughter, tears, hugs. It
254 00:13:45,320 --> 00:13:47,160 Speaker 4: will be the belle of the ball.
255 00:13:47,200 --> 00:13:48,880 Speaker 1: You go to the Armstrong and Getty store, and Hanson,
256 00:13:48,880 --> 00:13:51,680 Speaker 1: who runs the damn thing, is really pushing people to
257 00:13:51,679 --> 00:13:54,320 Speaker 1: do it soon to make sure you get everything in
258 00:13:54,400 --> 00:13:57,560 Speaker 1: time for Christmas. There was some talk about the Armstrong
259 00:13:57,559 --> 00:13:59,559 Speaker 1: and Getty pickleball paddles.
I thought it was pretty funny
260 00:13:59,600 --> 00:14:01,680 Speaker 1: that we had them, and then there were some complaints
261 00:14:01,679 --> 00:14:04,480 Speaker 1: about them, and Hanson brought one in, and I don't
262 00:14:04,520 --> 00:14:08,000 Speaker 1: have any idea what a really high-quality pickleball paddle is
263 00:14:08,040 --> 00:14:10,320 Speaker 1: like. I have them. I bought mine at Big 5,
264 00:14:10,400 --> 00:14:12,720 Speaker 1: I think. But they're just like this. There's nothing wrong
265 00:14:12,760 --> 00:14:15,080 Speaker 1: with this. This is perfectly fine. Unless you, like, get
266 00:14:15,200 --> 00:14:18,160 Speaker 1: serious, serious about pickleball, this is gonna be absolutely fine.
267 00:14:18,240 --> 00:14:20,320 Speaker 1: What is wrong with this? This doesn't feel like
268 00:14:20,400 --> 00:14:23,360 Speaker 1: cheap Chinese crap. It feels like something quality to me.
269 00:14:24,600 --> 00:14:26,360 Speaker 1: I like everything about it. And it comes with this
270 00:14:26,360 --> 00:14:29,000 Speaker 1: cool carrying case. Although they're sold out, so I
271 00:14:29,040 --> 00:14:30,520 Speaker 1: don't know why I'm even talking about it. We're sold
272 00:14:30,520 --> 00:14:32,920 Speaker 1: out of the pickleball paddles at Armstrong and Getty. You could get a
273 00:14:32,920 --> 00:14:37,040 Speaker 1: pickleball, but not anymore. They're gonna be back. But anyway,
274 00:14:37,080 --> 00:14:38,400 Speaker 1: go to the store and see what we got.
275 00:14:38,520 --> 00:14:40,560 Speaker 4: Yeah, we put the lash to the Chinese slaves and
276 00:14:40,640 --> 00:14:42,000 Speaker 4: they made several more of them.
277 00:14:42,120 --> 00:14:44,320 Speaker 1: What if we had adult items, like the sort of
278 00:14:44,320 --> 00:14:47,000 Speaker 1: thing people were throwing on WNBA courts at one point,
279 00:14:47,200 --> 00:14:50,479 Speaker 1: like marital aids?
We'll put on it, for novelty purposes
280 00:14:50,520 --> 00:14:52,680 Speaker 1: only. Take A and G to bed.
281 00:14:52,920 --> 00:14:53,960 Speaker 2: Yes, I love that idea.
282 00:14:53,960 --> 00:14:55,600 Speaker 1: Anyway, and you know, we need to branch out, we
283 00:14:55,640 --> 00:14:58,280 Speaker 1: need to have more things. They're probably not as popular
284 00:14:58,320 --> 00:15:01,880 Speaker 1: as the hoodies, including the Conscience of the Nation hoodie,
285 00:15:02,960 --> 00:15:07,720 Speaker 1: the stars of the Lazy Should Hurt, Stupid Should Hurt line, including
286 00:15:07,760 --> 00:15:13,120 Speaker 1: the F-You-lican Party one, which I wear proudly, or stocking stuffers:
287 00:15:13,120 --> 00:15:18,440 Speaker 1: A and G coasters, decals, uh, coffee mugs, stainless steel water bottles.
288 00:15:18,480 --> 00:15:24,200 Speaker 4: Or the Get Your Words Straight, Jack note. Look, that's
289 00:15:24,240 --> 00:15:26,960 Speaker 4: all at Armstrong and Getty dot com. Anyway, that's enough of that.
290 00:15:26,960 --> 00:15:29,360 Speaker 1: I'm trying to figure out what to do for my
291 00:15:29,560 --> 00:15:32,640 Speaker 1: kids this year. They're definitely the age, they're almost fourteen,
292 00:15:32,720 --> 00:15:35,480 Speaker 1: almost sixteen, they're definitely past the age of more cheap
293 00:15:35,560 --> 00:15:39,080 Speaker 1: Chinese crap. There's just no reason. So either an experience
294 00:15:39,280 --> 00:15:42,880 Speaker 1: or, like, accumulate the money into one actually
295 00:15:42,560 --> 00:15:44,680 Speaker 2: worth-something gift, I think, this year.
296 00:15:45,240 --> 00:15:49,360 Speaker 1: God, when they're younger, it's just endless piles of cheap
297 00:15:49,480 --> 00:15:53,360 Speaker 1: Chinese crap is what they get when they're little kids.
298 00:15:53,360 --> 00:15:55,560 Speaker 1: And they're delighted, and that's fine, and it is cheap,
299 00:15:56,200 --> 00:15:58,520 Speaker 1: but it, well, it was always
300 00:15:58,280 --> 00:16:01,920 Speaker 4: interesting to me that we would give our kids
301 00:16:02,240 --> 00:16:05,480 Speaker 4: the same beloved toy that we had, but you'd take
302 00:16:05,520 --> 00:16:07,880 Speaker 4: it out of the package, and instead of feeling like
303 00:16:08,240 --> 00:16:09,960 Speaker 4: it could last, you know, one hundred years...
304 00:16:10,040 --> 00:16:11,480 Speaker 2: Yeah, it's cheap Chinese crap.
305 00:16:11,640 --> 00:16:16,160 Speaker 1: Yeah, definitely. China, definitely. The modern version of something you
306 00:16:16,160 --> 00:16:18,239 Speaker 1: played with as a kid that is just so flimsy
307 00:16:18,280 --> 00:16:21,320 Speaker 1: and light and poorly made. Exactly.
308 00:16:21,480 --> 00:16:21,680 Speaker 7: Yeah.
309 00:16:21,680 --> 00:16:23,360 Speaker 4: Oh, speaking of China, we ought to get to that.
310 00:16:23,800 --> 00:16:26,880 Speaker 4: China's really rushing to catch up and pass us on AI.
311 00:16:27,200 --> 00:16:30,000 Speaker 4: Couple of pretty compelling AI stories in the news
312 00:16:29,920 --> 00:16:34,640 Speaker 1: today, definitely, that I can't wait to talk about,
313 00:16:34,920 --> 00:16:38,600 Speaker 1: among other things that we got coming up. And I
314 00:16:38,640 --> 00:16:40,040 Speaker 1: do want to get some of the audio from the
315 00:16:40,040 --> 00:16:44,480 Speaker 1: Turning Point event in Berkeley. You know, the Charlie Kirk
316 00:16:44,560 --> 00:16:47,200 Speaker 1: crowd trying to get together, and the Antifa types outside
317 00:16:47,320 --> 00:16:49,760 Speaker 1: not wanting them to get together and speak for some reason,
318 00:16:49,800 --> 00:16:53,880 Speaker 1: and fighting cops and setting off smoke bombs, and I
319 00:16:54,440 --> 00:16:56,440 Speaker 1: just don't get it. I don't get you people.
320 00:16:56,840 --> 00:17:02,760 Speaker 4: Which side is trying to shut down free speech?
The
321 00:17:03,000 --> 00:17:05,320 Speaker 4: answer to that question tells you who's the good
322 00:17:05,320 --> 00:17:05,920 Speaker 4: guys and
323 00:17:05,880 --> 00:17:07,960 Speaker 2: who's the bad guys. That's all you need to know.
324 00:17:09,880 --> 00:17:11,720 Speaker 1: That seems so obvious to me. I can't believe it
325 00:17:11,800 --> 00:17:13,280 Speaker 1: even needs to be said, but it needs to be
326 00:17:13,320 --> 00:17:14,800 Speaker 1: said for a lot of the media,
327 00:17:14,840 --> 00:17:15,320 Speaker 2: apparently.
328 00:17:15,600 --> 00:17:20,880 Speaker 5: More on the way. Stay here. Armstrong and Getty. Hey,
329 00:17:20,720 --> 00:17:22,040 Speaker 7: Optimus, what are you doing there?
330 00:17:23,000 --> 00:17:24,400 Speaker 2: Just chilling, ready to help.
331 00:17:25,160 --> 00:17:29,560 Speaker 7: Hey, Optimus, you know where I can get a Coke? Sorry,
332 00:17:29,800 --> 00:17:30,360 Speaker 7: I don't
333 00:17:32,720 --> 00:17:35,000 Speaker 1: have real-time info, but I can take
334 00:17:35,000 --> 00:17:36,399 Speaker 1: you to the kitchen if you want to check for
335 00:17:36,440 --> 00:17:37,040 Speaker 1: a Coke there.
336 00:17:37,280 --> 00:17:39,879 Speaker 7: Oh yeah, that'd be great. Go, yes, let's do that.
337 00:17:42,040 --> 00:17:43,200 Speaker 2: And then it just stands there.
338 00:17:45,320 --> 00:17:54,159 Speaker 7: Let's go. Awesome. Head to the kitchen. Okay, okay, go.
339 00:17:54,680 --> 00:17:55,879 Speaker 7: I think it's, I think we need to give it a
340 00:17:55,920 --> 00:17:56,680 Speaker 7: little bit more.
341 00:17:58,560 --> 00:17:58,879 Speaker 2: Okay.
342 00:17:58,920 --> 00:18:02,199 Speaker 1: So that's the voice, obviously, of Elon Musk right there, who
343 00:18:02,280 --> 00:18:03,879 Speaker 1: said we need to give it more room. They were
344 00:18:03,880 --> 00:18:06,040 Speaker 1: standing too close, I guess. You can take that down, Michael.
345 00:18:06,200 --> 00:18:09,239 Speaker 1: They're standing next to Optimus as they were all going 346 00:18:09,280 --> 00:18:11,719 Speaker 1: to go to the kitchen to get a Coke, and 347 00:18:11,880 --> 00:18:15,000 Speaker 1: Optimus, who was just standing there looking at him, 348 00:18:15,440 --> 00:18:17,119 Speaker 1: and Elon said, I think we need to back up 349 00:18:17,119 --> 00:18:17,439 Speaker 1: a little bit. 350 00:18:17,480 --> 00:18:17,800 Speaker 2: Anyway. 351 00:18:17,920 --> 00:18:21,119 Speaker 1: My takeaway from that video was we ain't even close yet. 352 00:18:22,119 --> 00:18:25,400 Speaker 1: We're not even close to robots taking over yet. Now 353 00:18:26,320 --> 00:18:30,560 Speaker 1: it's moving pretty fast. Maybe it'll be exponentially better in 354 00:18:30,640 --> 00:18:33,680 Speaker 1: a year. I'm sure it will be. But the fact 355 00:18:33,720 --> 00:18:38,680 Speaker 1: that Elon has got a trillion-dollar incentive package 356 00:18:38,720 --> 00:18:42,320 Speaker 1: now from Tesla, and he's focusing mostly on Optimus the 357 00:18:42,640 --> 00:18:44,879 Speaker 1: AI robot more than the electric cars. 358 00:18:46,680 --> 00:18:47,040 Speaker 7: I don't know. 359 00:18:47,080 --> 00:18:49,320 Speaker 1: It seems like we're a long way away. Where do 360 00:18:49,359 --> 00:18:54,280 Speaker 1: I get a Coke? I don't know where to get it. Stops, 361 00:18:54,320 --> 00:18:59,320 Speaker 1: gets hung up, sort of glitches, has no idea, and then 362 00:18:59,359 --> 00:19:02,240 Speaker 1: it just stands there. I thought it would be further 363 00:19:02,280 --> 00:19:02,800 Speaker 1: along than. 364 00:19:02,720 --> 00:19:05,840 Speaker 4: That, didn't you? Wait a minute. I just googled where 365 00:19:05,840 --> 00:19:09,959 Speaker 4: do people keep Cokes? It suggested the refrigerator, which is 366 00:19:10,000 --> 00:19:13,240 Speaker 4: often in a human kitchen. Let's go to the kitchen.
367 00:19:15,880 --> 00:19:18,040 Speaker 4: All right, let's go. You go first. 368 00:19:18,520 --> 00:19:20,399 Speaker 1: I'm not trying to come off as a guy who 369 00:19:20,520 --> 00:19:23,280 Speaker 1: mocks technology thinking it'll never happen, because I'm sure it will 370 00:19:23,280 --> 00:19:26,159 Speaker 1: be a thing eventually. But it's not as close as 371 00:19:26,200 --> 00:19:28,520 Speaker 1: I thought. But didn't you think that Optimus robot would 372 00:19:28,560 --> 00:19:29,359 Speaker 1: be more impressive than that? 373 00:19:32,040 --> 00:19:32,680 Speaker 2: Yeah? 374 00:19:32,800 --> 00:19:35,840 Speaker 4: Yeah, certainly before you trotted it out to do what 375 00:19:35,920 --> 00:19:39,159 Speaker 4: they just did, right right, right, right right. You know, 376 00:19:39,200 --> 00:19:42,159 Speaker 4: I'm reminded of Elon trotting out the Cybertruck for 377 00:19:42,200 --> 00:19:45,200 Speaker 4: the first time and saying, and the windows cannot be shattered. Boom, 378 00:19:45,200 --> 00:19:46,760 Speaker 4: he shatters the window here. 379 00:19:50,560 --> 00:19:52,760 Speaker 2: So this article we got a couple of AI stories 380 00:19:52,760 --> 00:19:53,000 Speaker 2: for you. 381 00:19:53,040 --> 00:19:55,480 Speaker 1: This article in the Wall Street Journal today about China's 382 00:19:56,000 --> 00:19:59,359 Speaker 1: push to catch up with and surpass the United States is 383 00:20:00,080 --> 00:20:05,840 Speaker 1: flipping troubling. For instance, this paragraph: the escalating AI race is 384 00:20:05,920 --> 00:20:09,959 Speaker 1: drawing comparisons with the Cold War and the great scientific 385 00:20:09,960 --> 00:20:12,440 Speaker 1: and technological clashes that characterized it. 386 00:20:12,440 --> 00:20:15,280 Speaker 2: It is likely to be at least as consequential.
387 00:20:16,040 --> 00:20:19,439 Speaker 1: The AI race between US and China is going to 388 00:20:19,480 --> 00:20:22,280 Speaker 1: be at least as consequential as the Cold War between 389 00:20:22,359 --> 00:20:24,240 Speaker 1: US and the Soviet Union, if you're. 390 00:20:24,160 --> 00:20:26,639 Speaker 2: Old enough to have lived through that. Holy crap. 391 00:20:29,880 --> 00:20:33,520 Speaker 1: China realized that AI was going to 392 00:20:33,520 --> 00:20:36,760 Speaker 1: be the next big thing, maybe the next big, only 393 00:20:36,800 --> 00:20:40,639 Speaker 1: thing on planet Earth, and it was way behind OpenAI, Google, 394 00:20:40,760 --> 00:20:43,080 Speaker 1: all the American companies that were doing so well, and 395 00:20:43,119 --> 00:20:45,640 Speaker 1: then decided we got to do something. And they've done 396 00:20:45,680 --> 00:20:48,160 Speaker 1: a whole-of-nation effort to try to catch up 397 00:20:48,440 --> 00:20:51,840 Speaker 1: and poured a ton of money into it and relaxed 398 00:20:52,000 --> 00:20:58,800 Speaker 1: all kinds of regulations, which is highly troubling and. 399 00:20:59,000 --> 00:21:00,879 Speaker 2: Sudden silent concerns. 400 00:21:02,160 --> 00:21:05,040 Speaker 4: You know, just hey, quit talking about safety and what's 401 00:21:05,040 --> 00:21:06,000 Speaker 4: best for humanity. 402 00:21:06,080 --> 00:21:08,879 Speaker 2: We don't have time. As I mentioned, my favorite quote in 403 00:21:08,920 --> 00:21:10,120 Speaker 2: the article is from JD. 404 00:21:10,280 --> 00:21:14,440 Speaker 4: Vance, and he argued this in February: the AI future 405 00:21:14,520 --> 00:21:15,280 Speaker 4: is not going to. 406 00:21:15,240 --> 00:21:20,240 Speaker 1: be won by hand-wringing about safety. Well, he's right, 407 00:21:20,440 --> 00:21:24,439 Speaker 1: I understand what he's saying.
What he's wanting to say 408 00:21:24,920 --> 00:21:28,760 Speaker 1: is China and Russia, mostly China, because China's got the 409 00:21:28,800 --> 00:21:30,760 Speaker 1: money to put into this. China is going to do 410 00:21:30,800 --> 00:21:32,720 Speaker 1: whatever the hell they want. And if they beat us 411 00:21:32,760 --> 00:21:35,600 Speaker 1: to the punch on this, it ain't gonna make any difference 412 00:21:36,160 --> 00:21:38,640 Speaker 1: that we tried to be ethical and safe about it, 413 00:21:38,640 --> 00:21:39,920 Speaker 1: it ain't gonna make any difference. 414 00:21:40,840 --> 00:21:42,960 Speaker 4: And I'm certainly the wrong guy to ask this question, 415 00:21:43,000 --> 00:21:46,560 Speaker 4: but I find myself wondering, can their AI essentially crush 416 00:21:46,640 --> 00:21:49,760 Speaker 4: our AI if it gets to, you know, whatever critical 417 00:21:49,800 --> 00:21:54,240 Speaker 4: stage first? It can mess with our efforts and our 418 00:21:54,280 --> 00:21:57,879 Speaker 4: programs and databases and the rest of it to the 419 00:21:57,920 --> 00:22:01,640 Speaker 4: point that it blows ours up. Yeah, there could 420 00:22:01,680 --> 00:22:04,840 Speaker 4: be some sort of like on-purpose effort like that. 421 00:22:04,880 --> 00:22:07,280 Speaker 4: But I take in a ton of AI information, reading 422 00:22:07,280 --> 00:22:09,119 Speaker 4: and listening to podcasts with the smartest people in the 423 00:22:09,160 --> 00:22:12,360 Speaker 4: world talking about this. The more likely concern is without 424 00:22:12,440 --> 00:22:16,120 Speaker 4: any attempt whatsoever to ethically control it, it just gets 425 00:22:16,200 --> 00:22:18,879 Speaker 4: loose on its own and gets into computers and travels 426 00:22:18,920 --> 00:22:20,440 Speaker 4: around the world and just kind.
427 00:22:20,240 --> 00:22:22,520 Speaker 1: Of does its own thing, and then the genie is 428 00:22:22,520 --> 00:22:26,720 Speaker 1: out of the bottle, which is pretty much inevitable. How 429 00:22:26,720 --> 00:22:29,760 Speaker 1: do we prevent that though, I mean, even if we're first? Oh, 430 00:22:29,800 --> 00:22:35,080 Speaker 1: we can't. Okay, never mind. Look forward to having your organs harvested. 431 00:22:36,440 --> 00:22:37,919 Speaker 4: I mean, because even if we beat them to the 432 00:22:37,920 --> 00:22:40,320 Speaker 4: punch by five years, when they catch up five years later, 433 00:22:40,440 --> 00:22:43,440 Speaker 4: unless our AI can trump their AI, they will unleash it. 434 00:22:43,480 --> 00:22:45,120 Speaker 2: And the hell you're speaking of. 435 00:22:45,080 --> 00:22:47,280 Speaker 1: something, I suppose. My big... first of all, that paragraph 436 00:22:47,320 --> 00:22:50,080 Speaker 1: about the Cold War, I find just like bone chilling. 437 00:22:51,600 --> 00:22:58,760 Speaker 1: I don't feel like the population is taking this 438 00:22:59,040 --> 00:23:04,000 Speaker 1: like the challenge that it is, the way the Cold 439 00:23:04,040 --> 00:23:07,000 Speaker 1: War was. I mean, my dad grew up hiding underneath 440 00:23:07,040 --> 00:23:11,439 Speaker 1: his desk in rural Iowa in case the Russians dropped 441 00:23:11,440 --> 00:23:14,479 Speaker 1: the bomb, but it was on their radar that we 442 00:23:14,480 --> 00:23:18,119 Speaker 1: were in a, you know, fight to the death with 443 00:23:18,240 --> 00:23:21,320 Speaker 1: a foe that was close enough to our equal to 444 00:23:21,359 --> 00:23:23,160 Speaker 1: have to worry about it. I don't feel like people 445 00:23:23,160 --> 00:23:25,560 Speaker 1: feel that way about China and AI. The average person 446 00:23:25,600 --> 00:23:29,119 Speaker 1: doesn't have any idea any of this is happening. No, no, 447 00:23:30,080 --> 00:23:31,400 Speaker 1: which is troubling.
448 00:23:31,400 --> 00:23:37,800 Speaker 2: I think. My... I guess we're better off that way. 449 00:23:38,000 --> 00:23:39,640 Speaker 4: I mean, because if we spend all of our time 450 00:23:39,760 --> 00:23:43,920 Speaker 4: terrified of our AI overlords, that's no way 451 00:23:43,960 --> 00:23:46,560 Speaker 4: to live. Instead, you'll be going about your business. One 452 00:23:46,600 --> 00:23:49,280 Speaker 4: day you'll turn around, there's a robot behind you. You'll think, Wow, 453 00:23:49,320 --> 00:23:51,760 Speaker 4: that's weird. Then it'll sever your head. I mean, just 454 00:23:51,840 --> 00:23:54,800 Speaker 4: like that, and you won't have suffered the fear. 455 00:23:54,920 --> 00:23:57,399 Speaker 2: Hard to imagine, with no head. But what are you 456 00:23:57,440 --> 00:23:59,480 Speaker 2: gonna do? Yeah, I suppose. Worrying about it? And not 457 00:23:59,560 --> 00:24:00,639 Speaker 2: much you can do about it. 458 00:24:01,320 --> 00:24:03,760 Speaker 1: I was gonna say, as a guy who cares about 459 00:24:03,800 --> 00:24:06,000 Speaker 1: his money, I worry about the economy and what's going 460 00:24:06,040 --> 00:24:08,000 Speaker 1: to happen and whether or not this is all a 461 00:24:08,040 --> 00:24:10,280 Speaker 1: bubble and it's going to completely collapse. And it is 462 00:24:10,760 --> 00:24:14,240 Speaker 1: chip companies trading money with AI companies back and forth 463 00:24:14,240 --> 00:24:16,119 Speaker 1: and investing in each other, and it could bust and it 464 00:24:16,119 --> 00:24:19,680 Speaker 1: doesn't turn out to be what they said. But well, 465 00:24:19,680 --> 00:24:21,640 Speaker 1: that's one of the other lead stories that I've got 466 00:24:21,680 --> 00:24:22,720 Speaker 1: for today, the. 467 00:24:24,840 --> 00:24:25,280 Speaker 2: Right here.
468 00:24:26,680 --> 00:24:32,120 Speaker 1: Yann LeCun is Meta's chief AI scientist, the top guy 469 00:24:32,200 --> 00:24:37,119 Speaker 1: working on AI for Zuckerberg, who has spent tens of 470 00:24:37,200 --> 00:24:39,320 Speaker 1: billions of dollars. I think he spent one hundred billion 471 00:24:39,320 --> 00:24:42,359 Speaker 1: dollars on this project. His lead scientist is leaving and 472 00:24:42,400 --> 00:24:47,520 Speaker 1: starting his own company. All of these people, including the Chinese, 473 00:24:47,800 --> 00:24:49,240 Speaker 1: can't all be wrong. 474 00:24:49,080 --> 00:24:49,480 Speaker 2: Can they? 475 00:24:49,840 --> 00:24:51,919 Speaker 1: That it turns into a dot com bubble where it's like, oh, 476 00:24:51,960 --> 00:24:54,639 Speaker 1: I guess AI is not going to be profitable or do anything, 477 00:24:54,680 --> 00:24:57,000 Speaker 1: so never mind? Gosh, I wouldn't think so. You wouldn't 478 00:24:57,000 --> 00:25:00,480 Speaker 1: think Elon and Zuckerberg and China and everybody else 479 00:25:00,480 --> 00:25:03,560 Speaker 1: could be wrong about this. So that's what leads me 480 00:25:03,560 --> 00:25:05,120 Speaker 1: to believe that this is going to be a thing 481 00:25:05,280 --> 00:25:08,520 Speaker 1: unfolding in front of our eyes at some point. 482 00:25:08,640 --> 00:25:10,640 Speaker 2: And then looming behind us and severing our heads. 483 00:25:10,680 --> 00:25:14,240 Speaker 1: So you say that all the time, which is funny. 484 00:25:14,720 --> 00:25:18,080 Speaker 1: But do you have a real-world sense 485 00:25:18,200 --> 00:25:20,000 Speaker 1: of bad things that AI could do? 486 00:25:22,200 --> 00:25:25,199 Speaker 4: Well, it goes back to the commonly spoken theme of 487 00:25:25,280 --> 00:25:29,600 Speaker 4: AI decides the only thing impeding it is human beings, 488 00:25:29,840 --> 00:25:34,400 Speaker 4: or the only thing impeding the planet being at peak health.
489 00:25:34,520 --> 00:25:35,359 Speaker 2: is human beings. 490 00:25:36,080 --> 00:25:38,920 Speaker 4: I mean, those are the two classic reasons why they would 491 00:25:38,920 --> 00:25:39,680 Speaker 4: sever our heads. 492 00:25:40,920 --> 00:25:44,320 Speaker 1: And then even if they don't do that, what if 493 00:25:44,320 --> 00:25:46,800 Speaker 1: it wipes out seventy five percent of jobs? 494 00:25:47,640 --> 00:25:51,320 Speaker 4: Well right, right, yeah, and then political turmoil and revolution 495 00:25:51,440 --> 00:25:55,480 Speaker 4: in the streets, et cetera, et cetera. Robots, great, oh yeah, 496 00:25:55,520 --> 00:25:58,800 Speaker 4: no kidding. Wow. So a couple more quick AI notes. 497 00:25:58,840 --> 00:26:01,520 Speaker 4: I thought this was really interesting, the Wall Street Journal 498 00:26:01,520 --> 00:26:06,920 Speaker 4: reporting that Anthropic, which is the company behind Claude, expects 499 00:26:06,920 --> 00:26:09,680 Speaker 4: to break even for the first time in twenty twenty eight. 500 00:26:11,440 --> 00:26:17,040 Speaker 4: By contrast, OpenAI, the ChatGPT folks, they're 501 00:26:17,080 --> 00:26:21,320 Speaker 4: forecasting their operating losses that year, twenty twenty eight, will 502 00:26:21,320 --> 00:26:25,920 Speaker 4: be about seventy four billion dollars. They will lose seventy 503 00:26:25,960 --> 00:26:29,000 Speaker 4: four billion dollars in twenty twenty eight, or roughly three 504 00:26:29,080 --> 00:26:32,480 Speaker 4: quarters of revenue, thanks to ballooning spending on computing costs. 505 00:26:32,920 --> 00:26:36,560 Speaker 4: They're gonna burn through roughly 506 00:26:36,600 --> 00:26:39,600 Speaker 4: fourteen times as much cash as Anthropic before turning a 507 00:26:39,600 --> 00:26:43,840 Speaker 4: profit in twenty thirty. But certainly don't take my word 508 00:26:43,880 --> 00:26:46,680 Speaker 4: for it, that's per the Wall Street Journal, and invest carefully.
509 00:26:46,920 --> 00:26:49,399 Speaker 1: Yeah, and then you've got this story of Amazon that 510 00:26:49,640 --> 00:26:51,960 Speaker 1: never made any money and was losing money like crazy, 511 00:26:52,000 --> 00:26:53,760 Speaker 1: and I remember all the jokes about it never turning 512 00:26:53,760 --> 00:26:55,760 Speaker 1: a profit and everything like that, and obviously it came to 513 00:26:55,840 --> 00:27:01,040 Speaker 1: dominate the landscape in so many different ways eventually. 514 00:27:01,560 --> 00:27:04,280 Speaker 2: And then, I'm sorry, Michael, what did you say to us? 515 00:27:04,720 --> 00:27:04,880 Speaker 6: Oh? 516 00:27:05,000 --> 00:27:07,280 Speaker 2: Prize Picks? In just a second? Coming up? 517 00:27:07,320 --> 00:27:10,760 Speaker 4: And one more AI note, from a website I had 518 00:27:10,800 --> 00:27:13,320 Speaker 4: never heard of, sent to us by alert listener Hillbilly: 519 00:27:13,640 --> 00:27:18,680 Speaker 4: SavingCountryMusic dot com. The headline is AI Songs Top 520 00:27:18,720 --> 00:27:21,920 Speaker 4: Billboard Chart: Why We Need Transparency Now. 521 00:27:22,840 --> 00:27:25,600 Speaker 2: Okay, I want to hear that. Well, that shits nizzle. 522 00:27:25,600 --> 00:27:28,280 Speaker 2: Are you kidding me? I'll tell you what. 523 00:27:29,280 --> 00:27:32,840 Speaker 1: Cutting off my head, taking everybody's jobs and ruining country music? Well, 524 00:27:32,840 --> 00:27:33,480 Speaker 1: this is no good. 525 00:27:34,840 --> 00:27:37,439 Speaker 2: Unplug it. So a word from our friends at Prize 526 00:27:37,440 --> 00:27:38,040 Speaker 2: Picks. It is. 527 00:27:38,000 --> 00:27:40,320 Speaker 4: the easiest, most fun way to get into fantasy sports 528 00:27:40,320 --> 00:27:42,240 Speaker 4: around. You just pick more or less on at least 529 00:27:42,440 --> 00:27:45,080 Speaker 4: two player stats.
You think your favorite basketball player is 530 00:27:45,119 --> 00:27:48,720 Speaker 4: going to go off against the weak defense of wherever they're playing, 531 00:27:48,760 --> 00:27:51,680 Speaker 4: pick more, or vice versa. And you can even combine 532 00:27:51,680 --> 00:27:55,080 Speaker 4: like a football player with a basketball player or multiple 533 00:27:55,080 --> 00:27:56,600 Speaker 4: players on your lineup, mix. 534 00:27:56,520 --> 00:27:57,119 Speaker 2: up the sports. 535 00:27:57,200 --> 00:27:59,160 Speaker 1: I guess for the NFL, now you go with less 536 00:27:59,200 --> 00:28:01,879 Speaker 1: on everything with these ten-to-seven games? What the 537 00:28:01,920 --> 00:28:02,880 Speaker 1: heck is going on there? 538 00:28:03,000 --> 00:28:03,359 Speaker 2: Anyway? 539 00:28:03,359 --> 00:28:06,560 Speaker 1: If you've figured out a trend or a hot player, 540 00:28:06,920 --> 00:28:08,840 Speaker 1: or somebody who's past their prime and all that sort 541 00:28:08,840 --> 00:28:11,119 Speaker 1: of stuff, you can take your opinion and turn it 542 00:28:11,160 --> 00:28:13,640 Speaker 1: into cash with Prize Picks. Download the Prize Picks app 543 00:28:13,680 --> 00:28:15,680 Speaker 1: today and use the code Armstrong to get fifty dollars 544 00:28:15,680 --> 00:28:17,760 Speaker 1: in lineups after you play your first five dollars lineup. 545 00:28:17,880 --> 00:28:20,040 Speaker 1: That code is Armstrong to get fifty dollars in lineups 546 00:28:20,040 --> 00:28:22,920 Speaker 1: after you play your first five dollars lineup. Prize Picks. 547 00:28:22,960 --> 00:28:24,040 Speaker 2: It is good to be right. 548 00:28:24,440 --> 00:28:27,119 Speaker 4: Yeah. If you want flexibility, you can play the flex 549 00:28:27,160 --> 00:28:28,879 Speaker 4: play, where you can get paid even if one of 550 00:28:28,920 --> 00:28:31,639 Speaker 4: your picks misses. Once again, the code is Armstrong.
That 551 00:28:31,720 --> 00:28:34,480 Speaker 4: Prize Picks app, get fifty bucks in lineups after you 552 00:28:34,560 --> 00:28:38,840 Speaker 4: play five. Prize Picks. All right, so I just opened 553 00:28:38,880 --> 00:28:42,320 Speaker 4: this up again. Thank you, Hillbilly, for sending this along. 554 00:28:43,000 --> 00:28:45,840 Speaker 4: There's an alarmingly low sense of urgency, they write, about 555 00:28:45,840 --> 00:28:49,280 Speaker 4: a rapidly developing dilemma that threatens to absolutely eviscerate everything 556 00:28:49,280 --> 00:28:51,240 Speaker 4: we know and love about music in a matter of months. 557 00:28:51,360 --> 00:28:53,400 Speaker 4: We're talking about AI, of course, but it feels almost 558 00:28:53,400 --> 00:28:55,479 Speaker 4: embarrassing and trite at this point to even bring it up 559 00:28:55,480 --> 00:28:58,120 Speaker 4: in such a breathless context, in part because we all 560 00:28:58,160 --> 00:29:00,640 Speaker 4: have an inherent sense of how catastrophic it's going to 561 00:29:00,640 --> 00:29:03,920 Speaker 4: be for the human creators and how inevitable its impacts ultimately 562 00:29:03,960 --> 00:29:04,520 Speaker 2: are, you know. 563 00:29:04,640 --> 00:29:07,120 Speaker 4: Hillbilly mentioned that the guy who wrote this is 564 00:29:07,120 --> 00:29:11,600 Speaker 4: a terrific writer, and he is. Wow, that's some good writing. Anyway, 565 00:29:12,640 --> 00:29:13,360 Speaker 4: what's his name? 566 00:29:13,880 --> 00:29:14,280 Speaker 2: I don't know. 567 00:29:14,800 --> 00:29:18,920 Speaker 4: But there's the picture of AI-generated artist Breaking Rust. 568 00:29:20,200 --> 00:29:23,480 Speaker 4: It's a little too perfect, country-looking bearded guy in 569 00:29:23,520 --> 00:29:24,440 Speaker 4: a cowboy 570 00:29:24,160 --> 00:29:28,800 Speaker 1: hat and, right, and all, uh, handsome, rugged yet sensitive. 571 00:29:29,760 --> 00:29:30,760 Speaker 2: Yes, how do you know?
572 00:29:30,840 --> 00:29:33,440 Speaker 4: You must have seen this picture, Papa? But do we 573 00:29:33,480 --> 00:29:37,520 Speaker 4: expect Congress to address this existential crisis facing human creators? 574 00:29:38,320 --> 00:29:40,520 Speaker 4: They're saying we should do something about it, attempt to 575 00:29:40,560 --> 00:29:43,960 Speaker 4: install some guardrails and guideposts, and expend at least a 576 00:29:44,000 --> 00:29:46,200 Speaker 4: modicum of effort to at least make sure the public 577 00:29:46,280 --> 00:29:48,719 Speaker 4: is aware of, wait, what is AI and what is not. 578 00:29:48,960 --> 00:29:51,560 Speaker 1: Yeah, there's some effort by lots of people that you 579 00:29:51,640 --> 00:29:55,080 Speaker 1: have to declare something an AI creation. Do you think 580 00:29:55,080 --> 00:29:58,720 Speaker 1: that makes any difference? You dig a song, oh, 581 00:29:58,720 --> 00:30:01,160 Speaker 1: it's AI, well, never mind that. I don't know, either 582 00:30:01,280 --> 00:30:02,920 Speaker 1: I like it or I don't like it. Do you think 583 00:30:02,920 --> 00:30:04,800 Speaker 1: I'd like it more if it turns out it's a human? 584 00:30:05,960 --> 00:30:09,080 Speaker 1: Maybe. Maybe I find out it's some thirty four year 585 00:30:09,080 --> 00:30:12,800 Speaker 1: old former drug addict who was in jail, you know. I'm 586 00:30:12,800 --> 00:30:15,640 Speaker 1: thinking of a, what's the guy, Jelly Roll type of 587 00:30:15,640 --> 00:30:18,360 Speaker 1: story or something like that. 588 00:30:17,000 --> 00:30:18,040 Speaker 2: That hooks you. 589 00:30:18,800 --> 00:30:23,360 Speaker 4: Mm. Yeah, because the sentiment in the song seems much 590 00:30:23,400 --> 00:30:26,560 Speaker 4: more real. You've asked the key question. That's a super 591 00:30:26,560 --> 00:30:30,560 Speaker 4: interesting question. Will people still enjoy it?
This guy's advocating 592 00:30:30,920 --> 00:30:33,360 Speaker 4: any piece of music made by AI, or even partially 593 00:30:33,440 --> 00:30:36,320 Speaker 4: made by AI, must be disclosed as such to the public, period. 594 00:30:36,720 --> 00:30:39,760 Speaker 4: So like, if you use AI to clean up the bassline, 595 00:30:39,800 --> 00:30:41,880 Speaker 4: I don't know, because your bass player's drunk or something. 596 00:30:45,440 --> 00:30:45,960 Speaker 2: I don't know. 597 00:30:46,880 --> 00:30:52,280 Speaker 4: Because evidently this Breaking Rust song Walk My Walk topped 598 00:30:52,360 --> 00:30:55,360 Speaker 4: the Billboard Country Digital Song Sales Chart. 599 00:30:55,960 --> 00:30:58,720 Speaker 2: An AI track was the number one song in country. I'm 600 00:30:58,720 --> 00:31:02,160 Speaker 2: gonna listen to that during the break, please do. 601 00:31:02,360 --> 00:31:04,720 Speaker 1: We can't play it for copyright reasons, but I'm going 602 00:31:04,760 --> 00:31:05,200 Speaker 1: to listen to it. 603 00:31:05,200 --> 00:31:09,800 Speaker 2: During the break, see what I think? Yeah? Uh, God, we've headed 604 00:31:09,800 --> 00:31:12,160 Speaker 2: into a weird world? Yeah yeah, I don't know. 605 00:31:12,240 --> 00:31:13,200 Speaker 2: And everybody's guessing. 606 00:31:13,480 --> 00:31:15,360 Speaker 1: But the people with lots of money, like the 607 00:31:15,440 --> 00:31:17,960 Speaker 1: richest people on Earth, are guessing that it's going to 608 00:31:18,000 --> 00:31:18,920 Speaker 1: be a big deal. 609 00:31:18,720 --> 00:31:19,680 Speaker 2: And going to be profitable. 610 00:31:21,040 --> 00:31:22,720 Speaker 1: I don't know if there are any super wealthy people 611 00:31:22,720 --> 00:31:26,200 Speaker 1: saying, Nah, this is overhyped. I'll be in the 612 00:31:26,240 --> 00:31:29,480 Speaker 1: woods if you need me. Man. Well, good luck. The 613 00:31:29,560 --> 00:31:33,760 Speaker 1: AI robot currently can't find a Coke.
But it'll be 614 00:31:33,760 --> 00:31:35,480 Speaker 1: able to find you in the woods and chop off 615 00:31:35,480 --> 00:31:38,920 Speaker 1: your head for whatever reason. Look, honey, look at that squirrel. 616 00:31:39,640 --> 00:31:42,680 Speaker 1: Zeros and ones flash in his eyes. It sends down 617 00:31:42,720 --> 00:31:43,040 Speaker 1: about me. 618 00:31:43,240 --> 00:31:44,600 Speaker 2: Yeah. 619 00:31:45,120 --> 00:31:50,960 Speaker 1: Likely. Any thoughts on this, text line four one KFTC. 620 00:31:54,520 --> 00:31:56,680 Speaker 3: Listen to this: Gen Z is posting TikToks of a 621 00:31:56,720 --> 00:31:59,160 Speaker 3: new challenge where they do nothing but sit in silence 622 00:31:59,280 --> 00:31:59,960 Speaker 3: for as long as they 623 00:31:59,880 --> 00:32:04,560 Speaker 2: can stand, whereas your grandparents call that life. 624 00:32:06,800 --> 00:32:09,760 Speaker 1: Speaking of tech. So before we went to break, Joe 625 00:32:10,000 --> 00:32:13,320 Speaker 1: mentioned this country song that's on the top of some 626 00:32:13,520 --> 00:32:16,680 Speaker 1: chart and it's AI, and so we both took a 627 00:32:16,720 --> 00:32:18,640 Speaker 1: listen to it. We can't play it because it'd be 628 00:32:18,760 --> 00:32:23,480 Speaker 1: a violation of something. But so before I tell you 629 00:32:23,520 --> 00:32:25,240 Speaker 1: what I thought of this AI song that's at the 630 00:32:25,240 --> 00:32:26,960 Speaker 1: top of the charts, is it all AI? 631 00:32:27,120 --> 00:32:29,760 Speaker 2: Is that what you're saying? It's entirely an AI song, I 632 00:32:29,840 --> 00:32:30,200 Speaker 2: believe so. 633 00:32:30,320 --> 00:32:31,920 Speaker 7: Yeah. 634 00:32:31,960 --> 00:32:33,840 Speaker 1: So the dude on the cover is a made-up picture. 635 00:32:34,120 --> 00:32:37,720 Speaker 1: The voice is AI, the instruments, the writing, all of 636 00:32:37,640 --> 00:32:37,960 Speaker 2: it is that.
637 00:32:39,200 --> 00:32:41,719 Speaker 4: Yeah, I don't know, there's probably somebody who 638 00:32:41,720 --> 00:32:44,880 Speaker 4: wrote the lyrics, but maybe not. 639 00:32:45,200 --> 00:32:46,760 Speaker 2: I don't know. Having listened to it. 640 00:32:46,800 --> 00:32:52,320 Speaker 4: That is highly troubling. Yeah, it seconded me. That is 641 00:32:52,800 --> 00:32:56,200 Speaker 4: way too good. That is... why would anybody try 642 00:32:56,240 --> 00:32:58,320 Speaker 4: at this point? Well, why would anybody try to become 643 00:32:58,360 --> 00:33:00,840 Speaker 4: famous and make money at it? If you want to 644 00:33:01,280 --> 00:33:04,000 Speaker 4: make music, I do it every day at home. I 645 00:33:04,000 --> 00:33:05,920 Speaker 4: play the piano in my bedroom in my underwear. 646 00:33:06,040 --> 00:33:08,560 Speaker 1: I do that all the time. But I leave that 647 00:33:08,640 --> 00:33:11,200 Speaker 1: last part out. AI isn't going to replace that. 648 00:33:11,280 --> 00:33:14,360 Speaker 1: But if AI is gonna make it onto the charts, I 649 00:33:14,360 --> 00:33:16,040 Speaker 1: don't know if there's any point in that anymore. 650 00:33:16,600 --> 00:33:21,000 Speaker 4: Yeah, I'm already very, very cynical about pop music. 651 00:33:21,280 --> 00:33:24,239 Speaker 4: And it occurred to me that a lot of us 652 00:33:24,240 --> 00:33:27,320 Speaker 4: of a certain age, we had the unbelievable experience, that 653 00:33:27,360 --> 00:33:31,120 Speaker 4: we took for granted, that pop music, which was entirely 654 00:33:31,160 --> 00:33:35,240 Speaker 4: a commodity. I mean, anybody who was actually a creative 655 00:33:35,320 --> 00:33:38,760 Speaker 4: artist was exploited and thrown away by the money guys. 656 00:33:39,520 --> 00:33:41,520 Speaker 2: It was just again a corporate commodity.
657 00:33:41,760 --> 00:33:44,520 Speaker 4: And then there was a brief and wonderful period of 658 00:33:44,560 --> 00:33:48,200 Speaker 4: I don't know, ten to twenty five years where the 659 00:33:48,320 --> 00:33:51,760 Speaker 4: art was dominated by actual creative artists, at least to 660 00:33:51,840 --> 00:33:55,800 Speaker 4: a significant extent. It still was corporate, but there was 661 00:33:55,840 --> 00:33:58,960 Speaker 4: a hell of a lot of creativity. And now, and 662 00:33:58,960 --> 00:34:01,040 Speaker 4: I'm not saying there's no creativity left, but pop 663 00:34:01,120 --> 00:34:04,160 Speaker 4: music is so corporate. There's so much money to be made. 664 00:34:04,360 --> 00:34:08,920 Speaker 4: The formulaic, they might as well be AI, song 665 00:34:09,080 --> 00:34:13,160 Speaker 4: factories are so efficient, the underwear models lip syncing to 666 00:34:13,200 --> 00:34:16,160 Speaker 4: the music are so good looking, and the rest of it, 667 00:34:16,160 --> 00:34:19,000 Speaker 4: it's easy to be very, very cynical about it. Having 668 00:34:19,120 --> 00:34:22,640 Speaker 4: said that, the lyrics of this song in particular are 669 00:34:23,040 --> 00:34:28,040 Speaker 4: a person who has had some very painful times in 670 00:34:28,080 --> 00:34:31,399 Speaker 4: their life pouring out their soul. And the fact that 671 00:34:31,400 --> 00:34:34,279 Speaker 4: that is cranked out by a computer because they know 672 00:34:34,400 --> 00:34:36,360 Speaker 4: you like that sort of thing makes. 673 00:34:36,120 --> 00:34:40,400 Speaker 2: me want to vomit. That's a good point. The fact that. 674 00:34:42,360 --> 00:34:46,000 Speaker 1: A chatbot picks up, oh okay, people have angst 675 00:34:46,120 --> 00:34:47,719 Speaker 1: and pain and that sort of thing. 676 00:34:47,760 --> 00:34:48,520 Speaker 2: Oh, write about that.
677 00:34:49,400 --> 00:34:52,360 Speaker 1: Yeah, when it comes from somebody who's had that same feeling, 678 00:34:52,360 --> 00:34:54,880 Speaker 1: and we have that, we have that in common as 679 00:34:54,920 --> 00:34:56,799 Speaker 1: a human being. Oh, you felt that. I'm feeling that 680 00:34:56,880 --> 00:34:59,120 Speaker 1: right now. Thanks for writing about it. When it turns 681 00:34:59,160 --> 00:35:01,840 Speaker 1: out it's a computer, completely... you know. 682 00:35:02,880 --> 00:35:03,120 Speaker 2: Yeah. 683 00:35:03,120 --> 00:35:04,759 Speaker 4: One of my, one of my favorite songs by one 684 00:35:04,800 --> 00:35:08,640 Speaker 4: of my favorite bands, the writer and singer happens to 685 00:35:08,680 --> 00:35:12,960 Speaker 4: be a gal. She's talking about, you know, being, you know, 686 00:35:13,040 --> 00:35:15,480 Speaker 4: self-destructive, in love and drinking way too much. And 687 00:35:15,600 --> 00:35:17,640 Speaker 4: the line is, maybe I'll find my maker on the 688 00:35:17,680 --> 00:35:20,680 Speaker 4: bedroom floor, which is a hell of a line. 689 00:35:21,880 --> 00:35:23,319 Speaker 2: Maybe I'll meet my maker, I think it is. 690 00:35:23,360 --> 00:35:27,200 Speaker 4: Anyway, uh, to hear that somebody just cranked that out 691 00:35:27,239 --> 00:35:30,280 Speaker 4: because the computer algorithm said that would be compelling. 692 00:35:30,600 --> 00:35:33,360 Speaker 2: I don't know, just ugh, my skin. 693 00:35:33,239 --> 00:35:36,719 Speaker 4: Is crawling, my guts are churning. Maybe I ate something 694 00:35:36,760 --> 00:35:39,880 Speaker 4: bad for dinner last night. But yeah, that's that's awful. 695 00:35:39,920 --> 00:35:41,920 Speaker 4: It's not good, it's not funny, it's not amusing. 696 00:35:42,080 --> 00:35:45,440 Speaker 1: So Google hired some AI guru to come over.
They 697 00:35:45,880 --> 00:35:50,560 Speaker 1: spent two point seven billion dollars to buy Character dot AI, 698 00:35:50,960 --> 00:35:55,680 Speaker 1: and then this guy had a whole bunch of posts 699 00:35:55,680 --> 00:35:58,439 Speaker 1: about how he doesn't believe the whole trans thing is real, 700 00:35:58,600 --> 00:36:01,200 Speaker 1: and so Google tried to shut him down, having just 701 00:36:01,239 --> 00:36:03,839 Speaker 1: spent three billion dollars on his company, and that became a thing. 702 00:36:04,040 --> 00:36:08,040 Speaker 1: So they got a woke problem within the Google AI stuff. 703 00:36:08,040 --> 00:36:09,919 Speaker 1: So that would be something China is not worried about. 704 00:36:09,920 --> 00:36:10,000 Speaker 6: That. 705 00:36:10,200 --> 00:36:13,040 Speaker 1: I guarantee you China's attempt to be the dominant AI 706 00:36:13,120 --> 00:36:15,520 Speaker 1: force on Earth is not worried about the politics of 707 00:36:15,520 --> 00:36:16,560 Speaker 1: the individual employees. 708 00:36:17,080 --> 00:36:19,839 Speaker 4: Right. Speaking of wars, man, is the Democratic Party at 709 00:36:19,880 --> 00:36:22,439 Speaker 4: war with itself over the end of the shutdown? Man, 710 00:36:22,520 --> 00:36:24,280 Speaker 4: some strong stuff being said. 711 00:36:24,880 --> 00:36:27,680 Speaker 5: Teach you that coming up. Armstrong and Getty