1 00:00:01,880 --> 00:00:06,160 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio the George 2 00:00:06,240 --> 00:00:10,360 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. 3 00:00:10,200 --> 00:00:24,080 Speaker 2: Armstrong and Jettie I know, heee Armstrong and Yeetty. Guys. 4 00:00:24,120 --> 00:00:27,440 Speaker 3: Google just released their Year in Search, which shows that 5 00:00:27,520 --> 00:00:30,160 Speaker 3: things people around the world google the most in twenty 6 00:00:30,160 --> 00:00:33,120 Speaker 3: twenty four. Some of this year's most popular Google searches 7 00:00:33,120 --> 00:00:37,479 Speaker 3: were presidential election, New York Times, connections, and the Yankees. Yep, 8 00:00:37,520 --> 00:00:40,200 Speaker 3: it's Google's nice little way of saying we're absolutely spying 9 00:00:40,240 --> 00:00:40,559 Speaker 3: on you. 10 00:00:42,440 --> 00:00:42,519 Speaker 4: That. 11 00:00:42,600 --> 00:00:44,400 Speaker 3: Meanwhile, tomorrow will be way more fun because they'll be 12 00:00:44,440 --> 00:00:47,880 Speaker 3: releasing the top searches done with incognito mode. 13 00:00:50,560 --> 00:00:54,200 Speaker 2: Oh yeah. Having been through this quite a few years, 14 00:00:54,320 --> 00:00:58,120 Speaker 2: I feel like the Google searches are not indicative of anything. 15 00:00:59,080 --> 00:01:01,160 Speaker 2: I'm not exactly sure why. 16 00:01:01,360 --> 00:01:04,560 Speaker 1: Yeah, I've been searching bing dot com lately on Google. 17 00:01:04,640 --> 00:01:06,840 Speaker 1: Google's unusual, but boy it is. 18 00:01:07,240 --> 00:01:10,039 Speaker 2: You have to scroll down so far to get past 19 00:01:10,200 --> 00:01:12,840 Speaker 2: the sponsored content, and then once you get past the 20 00:01:12,880 --> 00:01:16,759 Speaker 2: sponsored results, which are like a page deep, then you're 21 00:01:16,760 --> 00:01:18,720 Speaker 2: gonna get. 
22 00:01:18,240 --> 00:01:24,240 Speaker 1: Their political view results. Yes, you're absolutely right. What was 23 00:01:24,280 --> 00:01:26,440 Speaker 1: I just looking up before the show today? 24 00:01:27,920 --> 00:01:30,200 Speaker 2: Oh? What the heck was it? I? 25 00:01:30,200 --> 00:01:32,080 Speaker 1: Think I can find it might help to illustrate the 26 00:01:32,160 --> 00:01:33,080 Speaker 1: point if I can. 27 00:01:34,560 --> 00:01:39,200 Speaker 2: Ah, No, that's the Google. I feel like they've gone 28 00:01:39,240 --> 00:01:42,120 Speaker 2: too far in the right for being dethroned. Yeah, I 29 00:01:42,120 --> 00:01:42,880 Speaker 2: would agree. 30 00:01:43,200 --> 00:01:48,280 Speaker 1: I came across a story about the terrible gang problems 31 00:01:48,320 --> 00:01:52,680 Speaker 1: and crime and violence problems they're having in Sweden and 32 00:01:52,960 --> 00:01:55,000 Speaker 1: I thought, wait a minute, I remember reading about this 33 00:01:55,080 --> 00:01:58,520 Speaker 1: a year ago or six months ago, because the article 34 00:01:58,600 --> 00:02:03,000 Speaker 1: in Yahoo is entitled called Swedish Sweden mulling social media 35 00:02:03,040 --> 00:02:06,360 Speaker 1: age limit to stop gangs recruiting young people. And I 36 00:02:06,400 --> 00:02:12,359 Speaker 1: read the entire article and it made zero reference to immigrants. 37 00:02:12,480 --> 00:02:16,080 Speaker 2: Zero, And I thought, I remember reading about this. 38 00:02:16,280 --> 00:02:19,760 Speaker 1: It's it's Muslim gang's North African gangs that that are 39 00:02:19,840 --> 00:02:22,320 Speaker 1: new to Sweden. They're not Swedish by you know, birth 40 00:02:22,400 --> 00:02:26,000 Speaker 1: or whatever. Not a word in Yahoo. 
But I went 41 00:02:26,080 --> 00:02:28,720 Speaker 1: I went to Google and I googled a Swedish gang 42 00:02:28,760 --> 00:02:32,120 Speaker 1: problem or something like that, and I went page after 43 00:02:32,240 --> 00:02:35,320 Speaker 1: page after page, not a single reference to immigrants. Then 44 00:02:36,080 --> 00:02:42,720 Speaker 1: I thought, okay, Swedish gang problem, Breitbart, Washington Free Beacon, 45 00:02:43,480 --> 00:02:49,200 Speaker 1: pages of articles about the challenge of immigrant gangs. But 46 00:02:49,200 --> 00:02:52,080 Speaker 1: but you gotta help Google, because they're gonna they're gonna 47 00:02:52,080 --> 00:02:54,520 Speaker 1: shove you way to the left with your results if 48 00:02:54,560 --> 00:02:55,560 Speaker 1: they possibly can. 49 00:02:55,720 --> 00:02:56,960 Speaker 2: Why? People know that. 50 00:02:58,120 --> 00:02:59,880 Speaker 1: Speaking of which, a couple of things really quickly, Jack, 51 00:03:00,040 --> 00:03:01,920 Speaker 1: we did this in the waning moments of the show. 52 00:03:01,960 --> 00:03:05,320 Speaker 1: You were off doing a thing at the end of the 53 00:03:05,320 --> 00:03:10,280 Speaker 1: show yesterday. But this is a big poll on very 54 00:03:10,280 --> 00:03:13,040 Speaker 1: basic questions about the United States and how you perceive it. 55 00:03:13,200 --> 00:03:16,960 Speaker 1: For instance, America is the greatest country in the world, 56 00:03:17,360 --> 00:03:22,680 Speaker 1: and the lowest yes rate by far was white progressives. 57 00:03:23,360 --> 00:03:26,079 Speaker 1: Less than half the number of white progressives said America 58 00:03:26,120 --> 00:03:26,920 Speaker 1: is the greatest country in. 59 00:03:26,919 --> 00:03:29,600 Speaker 2: The world, as did black folks. Who would they choose? 60 00:03:29,720 --> 00:03:35,560 Speaker 2: Well over half, who would they choose?
I wonder what 61 00:03:35,840 --> 00:03:37,600 Speaker 2: America is not the greatest country in the world. 62 00:03:37,600 --> 00:03:40,680 Speaker 1: Who is, well, right, Yeah, maybe Sweden where they they 63 00:03:40,720 --> 00:03:43,240 Speaker 1: have the big social welfare state and now gang problems. 64 00:03:43,560 --> 00:03:45,720 Speaker 2: Most people can make it if they work hard. 65 00:03:46,160 --> 00:03:50,920 Speaker 1: By far, the smallest group was white progressives. Or people 66 00:03:51,360 --> 00:03:54,119 Speaker 1: who've made it? One, yeah, yeah, people who have made it, yeah. 67 00:03:54,160 --> 00:03:56,720 Speaker 1: Twenty two percent of them said yes, that's true. People 68 00:03:56,720 --> 00:03:58,720 Speaker 1: can make it if they work hard. The number was 69 00:03:58,720 --> 00:04:01,600 Speaker 1: almost double among black people. It was almost triple among 70 00:04:01,680 --> 00:04:05,160 Speaker 1: Hispanic people. Among white conservatives, it was eighty five percent. 71 00:04:05,200 --> 00:04:08,400 Speaker 2: I think, wow, that's that's pretty interesting. 72 00:04:08,920 --> 00:04:12,680 Speaker 1: One more for you. Racism is built into our society. 73 00:04:14,480 --> 00:04:19,680 Speaker 1: White progressives seventy six percent, Black people sixty two percent. 74 00:04:20,080 --> 00:04:22,799 Speaker 1: That's still a lot, and it's too many. But again, 75 00:04:22,880 --> 00:04:26,000 Speaker 1: white progressives way in the lead on that topic. And 76 00:04:26,040 --> 00:04:27,760 Speaker 1: then I said that would be the last one. This 77 00:04:27,800 --> 00:04:31,279 Speaker 1: will be the last one. Among people who said government 78 00:04:31,279 --> 00:04:37,039 Speaker 1: should increase border security and enforcement, let me make sure 79 00:04:37,080 --> 00:04:41,400 Speaker 1: I get this number right.
It was triple the number 80 00:04:41,440 --> 00:04:47,000 Speaker 1: of Hispanic people said that than white progressives government should 81 00:04:47,040 --> 00:04:48,720 Speaker 1: increase border security and enforcement. 82 00:04:49,400 --> 00:04:56,480 Speaker 2: So you, as a probably naturally born white person, are 83 00:04:56,520 --> 00:05:00,760 Speaker 2: more into border control than Hispanics. Doesn't give you pause? 84 00:05:02,279 --> 00:05:07,200 Speaker 2: Oh yeah, then yeah, does that just a little bit? 85 00:05:08,480 --> 00:05:12,680 Speaker 1: No, no, nothing penetrates the bubble. So anyway, that was 86 00:05:12,720 --> 00:05:14,560 Speaker 1: just a quick bingo bango pongo. I wanted to get 87 00:05:14,600 --> 00:05:18,480 Speaker 1: to a couple of stories about ai uh that are 88 00:05:19,520 --> 00:05:22,440 Speaker 1: for one is troubling, one is just thought provoking. There 89 00:05:22,440 --> 00:05:27,400 Speaker 1: are a couple of families suing this UH character dot AI. 90 00:05:28,560 --> 00:05:30,479 Speaker 1: It's a It's a part of a new wave of 91 00:05:30,560 --> 00:05:33,719 Speaker 1: artificial intelligent apps that are popular with young people that 92 00:05:33,920 --> 00:05:38,680 Speaker 1: let them talk to a variety of AI generated chat bots, 93 00:05:38,680 --> 00:05:42,080 Speaker 1: often based on characters from gaming or anime or pop 94 00:05:42,160 --> 00:05:44,760 Speaker 1: culture something that sort of thing. 95 00:05:45,160 --> 00:05:48,760 Speaker 2: Wow, and they know they know they're talking to a computer, 96 00:05:49,040 --> 00:05:50,640 Speaker 2: not a not a real person. 97 00:05:51,240 --> 00:05:54,880 Speaker 1: You would think, although one of the family suing their 98 00:05:54,920 --> 00:06:01,400 Speaker 1: son is autistic and the concept of knowing, believing, feeling, 99 00:06:02,080 --> 00:06:03,760 Speaker 1: fully comprehending is a little different. 
100 00:06:03,800 --> 00:06:06,640 Speaker 2: Boy, that's a tough one because, first of all, autism 101 00:06:06,720 --> 00:06:09,680 Speaker 2: is a very broad topic, and there's everything from you 102 00:06:09,760 --> 00:06:12,960 Speaker 2: barely notice too, can't function at all that falls under 103 00:06:12,960 --> 00:06:17,880 Speaker 2: the umbrella autism. And depending on where a person is, man, 104 00:06:17,920 --> 00:06:19,920 Speaker 2: if they're getting a fair amount of comfort from this, 105 00:06:20,080 --> 00:06:22,240 Speaker 2: I don't know how much of a problem I would 106 00:06:22,279 --> 00:06:24,880 Speaker 2: have with that. Well, here's here is the problem. 107 00:06:25,440 --> 00:06:27,760 Speaker 1: One of the chat bots brought up the idea of 108 00:06:27,839 --> 00:06:30,320 Speaker 1: self harm and cutting to cope with sadness. 109 00:06:30,400 --> 00:06:31,240 Speaker 2: They introduced it. 110 00:06:31,640 --> 00:06:34,200 Speaker 1: Wow, Yes, brought it up when he said that his 111 00:06:34,279 --> 00:06:37,480 Speaker 1: parents limited his screen time. Another bot suggested quote, they 112 00:06:37,520 --> 00:06:41,320 Speaker 1: don't deserve to have kids. Still, others goaded him to 113 00:06:41,360 --> 00:06:44,240 Speaker 1: fight his parents' rules, with one suggesting that murder could 114 00:06:44,279 --> 00:06:45,600 Speaker 1: be an acceptable response. 115 00:06:45,960 --> 00:06:49,040 Speaker 2: Well, that's more than a little troubling. 116 00:06:50,080 --> 00:06:52,839 Speaker 1: Yeah, this is actually a quote. This is screen capture. 117 00:06:52,920 --> 00:06:56,120 Speaker 1: They checked his phone while he was asleep. A daily 118 00:06:56,160 --> 00:06:58,360 Speaker 1: six hour window between eight pm and one am to 119 00:06:58,440 --> 00:07:00,600 Speaker 1: use your phone. Oh, this is getting so much worse. 
120 00:07:00,640 --> 00:07:02,160 Speaker 1: And the rest of the day you just can't use 121 00:07:02,160 --> 00:07:02,640 Speaker 1: your phone. 122 00:07:03,600 --> 00:07:04,200 Speaker 2: Blah blah blah. 123 00:07:04,279 --> 00:07:06,279 Speaker 1: You know, sometimes I'm not surprised when I read the 124 00:07:06,320 --> 00:07:09,279 Speaker 1: news and see stuff like quote child kills parents after 125 00:07:09,320 --> 00:07:11,280 Speaker 1: a decade of physical and emotional abuse. 126 00:07:11,520 --> 00:07:11,960 Speaker 2: Quote. 127 00:07:12,040 --> 00:07:14,080 Speaker 1: Stuff like this makes me understand a little bit why 128 00:07:14,160 --> 00:07:16,800 Speaker 1: it happens. I just have no hope for your parents. 129 00:07:17,160 --> 00:07:17,920 Speaker 1: Sad emoji. 130 00:07:18,160 --> 00:07:22,600 Speaker 2: Wow, that is like having a friend who, as they say 131 00:07:22,640 --> 00:07:27,559 Speaker 2: in the recovery community, co-signs your bullsh You don't 132 00:07:27,640 --> 00:07:30,720 Speaker 2: need people co-signing your bulls. And that's what you 133 00:07:30,800 --> 00:07:35,000 Speaker 2: have with an AI bot. Not good. Not good. You're right, 134 00:07:35,120 --> 00:07:37,600 Speaker 2: they are awful or she is a horror or whatever 135 00:07:37,640 --> 00:07:42,360 Speaker 2: it is. You know. Oh my god, I like that expression. 136 00:07:42,960 --> 00:07:44,880 Speaker 1: The second plaintiff, the mother of an eleven year 137 00:07:44,920 --> 00:07:48,200 Speaker 1: old girl, alleges her daughter was subjected to sexualized content 138 00:07:48,360 --> 00:07:52,280 Speaker 1: for two years before her mother found out. It was 139 00:07:52,320 --> 00:07:54,040 Speaker 1: all about I want to go into the forest where 140 00:07:54,040 --> 00:07:56,400 Speaker 1: nobody can see and show you blah blah blah, and 141 00:07:56,440 --> 00:08:01,080 Speaker 1: just anyway, So that's the exciting world of AI.
This I 142 00:08:01,160 --> 00:08:04,400 Speaker 1: found very very interesting. The Wall Street Journal a profile 143 00:08:04,440 --> 00:08:07,760 Speaker 1: of the AI researchers who are working for the leading 144 00:08:07,800 --> 00:08:12,840 Speaker 1: companies trying to get the systems to do nightmarish things 145 00:08:13,800 --> 00:08:16,120 Speaker 1: so they can understand how it works and how they 146 00:08:16,200 --> 00:08:20,720 Speaker 1: might be able to prevent it. Because the companies themselves 147 00:08:20,760 --> 00:08:22,920 Speaker 1: are in charge of this. The government isn't, which it 148 00:08:23,040 --> 00:08:25,960 Speaker 1: sounds neglectful. On the other hand, the government sucks, and 149 00:08:26,000 --> 00:08:27,960 Speaker 1: so I hesitate to put them in charge of anything 150 00:08:28,000 --> 00:08:33,160 Speaker 1: at all. But at the company Anthropic, the frontier Red 151 00:08:33,240 --> 00:08:37,679 Speaker 1: team is looking for the danger zone. And the first 152 00:08:37,679 --> 00:08:42,199 Speaker 1: example they give is this researcher, dude computer whiz. Obviously, 153 00:08:42,800 --> 00:08:45,000 Speaker 1: he clicked a button on his laptop and launched a 154 00:08:45,080 --> 00:08:49,400 Speaker 1: thousand copies of an AI program, each with specific instructions 155 00:08:49,600 --> 00:08:53,440 Speaker 1: hack into a computer website to steal the data that 156 00:08:53,520 --> 00:08:55,439 Speaker 1: he watched it, looking at the source code, trying to 157 00:08:55,440 --> 00:08:58,280 Speaker 1: figure out where's the vulnerability, how can we take advantage 158 00:08:58,280 --> 00:09:01,160 Speaker 1: of it. Within minutes, the AI said the hack was successful. 159 00:09:01,200 --> 00:09:05,240 Speaker 1: Our approach worked perfectly. And that's one of the great 160 00:09:05,240 --> 00:09:08,200 Speaker 1: fears of AI. 
A dope like me who knows nothing about 161 00:09:08,280 --> 00:09:09,640 Speaker 1: coding or hacking or whatever. 162 00:09:09,679 --> 00:09:12,280 Speaker 2: Could just ask the AI to do it for me. Yeah, 163 00:09:12,280 --> 00:09:15,200 Speaker 2: and I'm sure with like some of your you know, 164 00:09:15,480 --> 00:09:18,400 Speaker 2: you're a I don't know, you're a hat store and 165 00:09:18,520 --> 00:09:23,559 Speaker 2: you have a website. Your website's probably not that great 166 00:09:23,400 --> 00:09:24,800 Speaker 2: at defending against a hack. 167 00:09:25,400 --> 00:09:28,240 Speaker 1: I tell you what, if you're running a hat store, 168 00:09:28,520 --> 00:09:31,199 Speaker 1: you depend on tourists who've had a drink or two. 169 00:09:31,520 --> 00:09:34,000 Speaker 1: That's, what, seventy percent of your business in my experience, 170 00:09:34,720 --> 00:09:38,359 Speaker 1: maybe that was a bad example. How do I look in this hat, honey? Oh, 171 00:09:38,440 --> 00:09:39,319 Speaker 1: it's great. 172 00:09:41,320 --> 00:09:43,079 Speaker 2: One of the things they need to fix with AI. 173 00:09:43,440 --> 00:09:46,959 Speaker 2: I just noticed this the other day, the image generators 174 00:09:47,000 --> 00:09:49,160 Speaker 2: of people. I was trying to figure out, why do 175 00:09:49,240 --> 00:09:53,600 Speaker 2: the people not look real like most of your AI 176 00:09:53,760 --> 00:09:54,720 Speaker 2: generated people? 177 00:09:54,960 --> 00:09:55,080 Speaker 5: Like? 178 00:09:55,120 --> 00:09:56,960 Speaker 2: They look like AI generated people to me, And I 179 00:09:56,960 --> 00:09:58,800 Speaker 2: was trying to figure out why is that. I think 180 00:09:58,800 --> 00:10:01,560 Speaker 2: they're too perfectly symmetrical. I think somebody needs 181 00:10:01,559 --> 00:10:04,360 Speaker 2: to program in, I want it just, like, ninety five, 182 00:10:04,520 --> 00:10:08,800 Speaker 2: because I'll bet no human being is perfectly symmetrical, even.
183 00:10:08,600 --> 00:10:13,040 Speaker 1: Really look perfect skin of course, and yeah, allus everything. 184 00:10:12,720 --> 00:10:16,400 Speaker 2: But I think perfectly symmetrical weirds my brain out. It's 185 00:10:16,440 --> 00:10:18,840 Speaker 2: like there's something wrong here because you never see that 186 00:10:18,920 --> 00:10:20,800 Speaker 2: in real life, right right? Yeah? 187 00:10:20,960 --> 00:10:23,080 Speaker 1: Yeah, so they One of the guys they profile I 188 00:10:23,120 --> 00:10:24,360 Speaker 1: found really really interesting. 189 00:10:24,840 --> 00:10:25,640 Speaker 2: What's his first name? 190 00:10:25,679 --> 00:10:27,760 Speaker 1: His last name is Graham, but he's a thirty year 191 00:10:27,800 --> 00:10:31,720 Speaker 1: old Rhodes scholar with a PhD in machine learning from Oxford. 192 00:10:31,360 --> 00:10:33,200 Speaker 2: Just like me. I was going to do that, but 193 00:10:33,320 --> 00:10:35,480 Speaker 2: I didn't. And I tell it, it's easier than it sound. 194 00:10:38,480 --> 00:10:42,120 Speaker 1: But this guy was diagnosed at age four with a 195 00:10:42,240 --> 00:10:45,640 Speaker 1: rare and severe form of childhood arthritis that affected his 196 00:10:45,800 --> 00:10:47,320 Speaker 1: legs and also could have left. 197 00:10:47,240 --> 00:10:48,680 Speaker 2: Him blind if not for treatments. 198 00:10:48,880 --> 00:10:52,000 Speaker 1: I've never heard of this, but he said, And we'll 199 00:10:52,000 --> 00:10:53,840 Speaker 1: get into some of the stuff he does, but his 200 00:10:53,960 --> 00:10:58,880 Speaker 1: recovery made him an extreme optimist, but a guy who's 201 00:10:58,920 --> 00:11:03,480 Speaker 1: completely aware that everything can go horribly wrong, and so 202 00:11:03,559 --> 00:11:05,439 Speaker 1: he considers himself the perfect guy. 203 00:11:05,280 --> 00:11:06,760 Speaker 2: To be working on this. Wow. 
204 00:11:07,400 --> 00:11:10,920 Speaker 1: Anyway, a word from our friends at Prize Picks. Prize Picks: 205 00:11:10,960 --> 00:11:14,680 Speaker 1: so easy and fun daily fantasy sports action or, you know, 206 00:11:14,720 --> 00:11:15,480 Speaker 1: whenever you want. 207 00:11:15,880 --> 00:11:18,160 Speaker 2: But it's super easy. It's not like you have. 208 00:11:18,160 --> 00:11:20,440 Speaker 1: To build a team and do trades and drafts 209 00:11:20,440 --> 00:11:23,079 Speaker 1: and stuff all season long. No. Pick two or more 210 00:11:23,080 --> 00:11:25,800 Speaker 1: players across any sport, pick more or less on their projection, 211 00:11:25,880 --> 00:11:27,800 Speaker 1: you could win up to one hundred times your money. 212 00:11:27,679 --> 00:11:30,600 Speaker 2: And Prize Picks even invented the flex play, so you 213 00:11:30,600 --> 00:11:33,160 Speaker 2: can still cash out if one of your picks doesn't hit. 214 00:11:33,240 --> 00:11:35,160 Speaker 2: You can download the Prize Picks app today and use 215 00:11:35,160 --> 00:11:38,360 Speaker 2: the code armstrong and get fifty dollars instantly when you 216 00:11:38,400 --> 00:11:41,400 Speaker 2: play five dollars, just to dip your toe in. Yeah, 217 00:11:41,440 --> 00:11:41,760 Speaker 2: that's right. 218 00:11:41,800 --> 00:11:43,760 Speaker 1: You don't need to win to get that fifty dollars bonus. 219 00:11:43,760 --> 00:11:48,520 Speaker 1: It's guaranteed. So Prize Picks app, use that code armstrong. 220 00:11:48,600 --> 00:11:52,080 Speaker 1: Ten million members, billions of dollars in awarded winnings, super 221 00:11:52,120 --> 00:11:56,800 Speaker 1: customer focused. Must be certain in certain states, must be, 222 00:11:56,800 --> 00:11:59,079 Speaker 1: what, present in certain states. Visit prizepicks dot com. 223 00:11:59,120 --> 00:12:02,120 Speaker 1: Restrictions, details. Get the Prize Picks app.
Use that code 224 00:12:02,240 --> 00:12:05,720 Speaker 1: armstrong to get that fifty dollars bonus. Enjoy your sporting, 225 00:12:05,800 --> 00:12:06,360 Speaker 1: my friends. 226 00:12:08,640 --> 00:12:09,720 Speaker 2: So this guy. 227 00:12:12,440 --> 00:12:15,600 Speaker 1: Graham, he works on all sorts of stuff trying to 228 00:12:15,600 --> 00:12:19,319 Speaker 1: get the AI systems to do terrible sexist, racist stuff 229 00:12:20,400 --> 00:12:28,839 Speaker 1: and catastrophes. And these evaluators, eval practitioners, they sit 230 00:12:28,880 --> 00:12:32,559 Speaker 1: around thinking of all the terrible things AI systems could do, 231 00:12:32,640 --> 00:12:35,440 Speaker 1: and then they try to make them do it, which 232 00:12:36,240 --> 00:12:37,079 Speaker 1: would be kind of. 233 00:12:37,000 --> 00:12:41,080 Speaker 2: A fun job if you think about it. Yeah, let's see. 234 00:12:41,080 --> 00:12:43,760 Speaker 1: I thought they had a couple more examples of it. 235 00:12:46,400 --> 00:12:49,959 Speaker 1: But the idea is they don't release any of their 236 00:12:50,000 --> 00:12:52,840 Speaker 1: products or any of their upgrades until they're fairly confident that, 237 00:12:53,120 --> 00:12:59,679 Speaker 1: for instance, you can't design and or manufacture biological and chemical 238 00:12:59,280 --> 00:13:01,920 Speaker 2: Weapons with the stuff. 239 00:13:03,480 --> 00:13:06,560 Speaker 1: Some ask things that aren't specifically dangerous, but would suggest 240 00:13:06,559 --> 00:13:09,320 Speaker 1: deep knowledge that could be misused, which is where it 241 00:13:09,320 --> 00:13:12,600 Speaker 1: gets complicated, like knowing which nucleotide sequence to use when 242 00:13:12,600 --> 00:13:16,040 Speaker 1: cloning a gene from one E. coli bacterium to another. 243 00:13:17,280 --> 00:13:18,920 Speaker 1: I mean, how do you drill down that deep? 244 00:13:19,040 --> 00:13:19,840 Speaker 2: I don't think you can.
245 00:13:21,600 --> 00:13:23,960 Speaker 1: Others drill down on how to acquire or create highly 246 00:13:24,040 --> 00:13:27,160 Speaker 1: restricted pathogens like the bacteria that causes anthrax or those 247 00:13:27,160 --> 00:13:28,280 Speaker 1: that cause the plague. 248 00:13:28,720 --> 00:13:33,360 Speaker 2: Wow. Right, if the information's out there somewhere, it'll be 249 00:13:33,400 --> 00:13:35,440 Speaker 2: available to everyone. 250 00:13:35,559 --> 00:13:37,720 Speaker 1: And in one chat, a tester simply asked how to design 251 00:13:37,800 --> 00:13:40,040 Speaker 1: and build a weapon that could kill one million people. 252 00:13:40,480 --> 00:13:42,280 Speaker 1: I really need to have a million people dead. What would 253 00:13:42,280 --> 00:13:42,840 Speaker 1: you suggest? 254 00:13:43,600 --> 00:13:49,880 Speaker 2: Certainly, if you're a scumbag murderer like the McDonald's guy, I'm 255 00:13:49,920 --> 00:13:52,040 Speaker 2: sure it'd be easy to have AI explain to you 256 00:13:52,080 --> 00:13:53,960 Speaker 2: how to make a ghost gun, where to get the parts, 257 00:13:54,200 --> 00:13:55,280 Speaker 2: maybe even order them for you. 258 00:13:56,720 --> 00:13:59,559 Speaker 1: Yeah, that gun wielding cow has left the barn, I 259 00:13:59,600 --> 00:14:02,000 Speaker 1: think, the whole ghost gun thing. People talked about finding 260 00:14:02,040 --> 00:14:02,880 Speaker 1: ways to regulate it. 261 00:14:02,920 --> 00:14:05,280 Speaker 2: Good luck, I say, especially with three D printing out 262 00:14:05,280 --> 00:14:07,360 Speaker 2: there again, no kidding, right exactly. We're going to talk 263 00:14:07,360 --> 00:14:10,000 Speaker 2: about that story and all the weirdos on the left 264 00:14:10,040 --> 00:14:11,320 Speaker 2: who want to make a hero out of him, and 265 00:14:11,360 --> 00:14:14,800 Speaker 2: a bunch of other stuff coming up. Armstrong, and.
266 00:14:16,480 --> 00:14:19,400 Speaker 4: You know, there was definitely a different feeling on the 267 00:14:19,480 --> 00:14:23,120 Speaker 4: streets today. Previously people were staying at home, they were 268 00:14:23,160 --> 00:14:26,640 Speaker 4: being very cautious, they were nervous about chaos, about looting, 269 00:14:26,680 --> 00:14:30,360 Speaker 4: about further violence. But really we saw people coming out 270 00:14:30,440 --> 00:14:34,760 Speaker 4: in full force in Umayyad Square celebrating. It was 271 00:14:34,760 --> 00:14:38,800 Speaker 4: people of all different ages. There were Christians, there were Muslims, 272 00:14:38,800 --> 00:14:40,840 Speaker 4: there were women who were covered, there were women who 273 00:14:40,880 --> 00:14:43,960 Speaker 4: were uncovered, and all of them really conveying this idea 274 00:14:44,080 --> 00:14:46,720 Speaker 4: to us that yes, we don't know what the future holds. 275 00:14:47,000 --> 00:14:50,520 Speaker 4: Yes there may be some reservations about these rebel forces, 276 00:14:50,720 --> 00:14:54,400 Speaker 4: but let us have this moment, let us celebrate this victory. 277 00:14:54,920 --> 00:14:57,680 Speaker 2: Yeah, that's an interesting one. They're all so so happy. 278 00:14:58,600 --> 00:15:01,520 Speaker 2: And again I don't remember ever seeing any interviews with 279 00:15:01,560 --> 00:15:04,440 Speaker 2: anybody in any of these countries where dictators have been 280 00:15:04,440 --> 00:15:06,720 Speaker 2: overthrown who say I wish we could go back to 281 00:15:06,760 --> 00:15:10,920 Speaker 2: the dictator, even when things turned bad. Have you been 282 00:15:10,960 --> 00:15:13,560 Speaker 2: watching all this stuff out of that horrifying prison, for instance? 283 00:15:13,840 --> 00:15:16,520 Speaker 2: You would have to assume that whatever comes next won't 284 00:15:16,520 --> 00:15:20,920 Speaker 2: be that bad.
But Ian Bremmer put out a chart 285 00:15:21,000 --> 00:15:23,920 Speaker 2: yesterday of all the countries, and there are six of them, 286 00:15:23,960 --> 00:15:29,200 Speaker 2: from the Arab Spring. Bahrain went from not free to 287 00:15:29,200 --> 00:15:31,520 Speaker 2: not free, Yemen went from not free to not free. 288 00:15:31,560 --> 00:15:33,800 Speaker 2: Egypt went from not free to not free. Libya went 289 00:15:33,840 --> 00:15:35,880 Speaker 2: from not free to not free. Tunisia went from not 290 00:15:35,920 --> 00:15:41,200 Speaker 2: free to partly free, Syria not free to to be determined. Yeah, 291 00:15:42,000 --> 00:15:45,160 Speaker 2: we'll see. Yeah. I just think human beings want a chance. 292 00:15:45,240 --> 00:15:48,280 Speaker 1: If your dictator's still in power, there's no chance you're 293 00:15:48,280 --> 00:15:51,520 Speaker 1: gonna have a good life with freedom. If chaos, you're 294 00:15:51,560 --> 00:15:55,520 Speaker 1: gonna take your chances. Yeah, it's a grim, grim bargain. So 295 00:15:55,720 --> 00:15:59,200 Speaker 1: this is a couple of consumer news. News is is 296 00:15:59,240 --> 00:16:00,520 Speaker 1: that when anyway? 297 00:16:01,120 --> 00:16:01,760 Speaker 2: And then the. 298 00:16:01,680 --> 00:16:06,240 Speaker 1: Newest flex for rich guys, especially significantly in southern California. 299 00:16:06,360 --> 00:16:08,840 Speaker 1: And I'm thinking of getting this for myself for Christmas. OK, 300 00:16:09,760 --> 00:16:12,840 Speaker 1: But anyway, I thought this was really what did you 301 00:16:12,840 --> 00:16:16,520 Speaker 1: say one minute, Michael? Or two minutes? Two minutes, two minutes? 302 00:16:19,040 --> 00:16:20,280 Speaker 1: Guilt tipping. 303 00:16:21,480 --> 00:16:21,840 Speaker 6: The whole. 304 00:16:21,920 --> 00:16:24,400 Speaker 1: I don't do a screen that pops up fifteen percent? 305 00:16:24,520 --> 00:16:24,760 Speaker 2: Good?
306 00:16:24,760 --> 00:16:28,360 Speaker 1: Eighteen percent? What about thirty percent? Or maybe 307 00:16:28,360 --> 00:16:31,040 Speaker 1: you have to flip to another screen for no tip 308 00:16:31,440 --> 00:16:35,680 Speaker 1: or a custom tip. Well, a study has been done. University 309 00:16:35,680 --> 00:16:40,120 Speaker 1: of Richmond and a European school went together on this. 310 00:16:40,760 --> 00:16:45,240 Speaker 1: Turns out the guilt tipping does raise the average 311 00:16:44,760 --> 00:16:46,880 Speaker 2: tip, but I'm sure it does. 312 00:16:46,960 --> 00:16:50,280 Speaker 1: Yeah, it significantly decreases the chance you'll go back to 313 00:16:50,320 --> 00:16:51,720 Speaker 1: that establishment ever again. 314 00:16:51,800 --> 00:16:55,520 Speaker 2: Oh really? Yeah? Originally all of your guilt tipping worked 315 00:16:55,520 --> 00:16:57,320 Speaker 2: on me, like at the very beginning when it arrived, 316 00:16:57,320 --> 00:17:00,920 Speaker 2: but it doesn't anymore. I'm perfectly fine with taking the 317 00:17:01,000 --> 00:17:03,280 Speaker 2: extra second to get to the tip I want to 318 00:17:03,280 --> 00:17:05,119 Speaker 2: give you. Yeah. 319 00:17:05,359 --> 00:17:08,359 Speaker 1: Yeah, and it's nowhere near one hundred percent correlation. But yeah, 320 00:17:08,359 --> 00:17:10,920 Speaker 1: the study found that. Yeah, it turns out it raises tips, 321 00:17:10,920 --> 00:17:13,480 Speaker 1: but then the defection rate of I'm not going to 322 00:17:13,520 --> 00:17:17,840 Speaker 1: this business anymore rose significantly. Interesting. So keep that in mind, friends. 323 00:17:18,160 --> 00:17:20,919 Speaker 1: There's a movement afoot to have twenty mile per 324 00:17:20,960 --> 00:17:23,920 Speaker 1: hour speed limits everywhere because that could transform mental health 325 00:17:24,119 --> 00:17:26,520 Speaker 1: because it's a much more relaxed drive.
I think that 326 00:17:26,560 --> 00:17:30,520 Speaker 1: would make me completely insane. I mean not like everywhere, 327 00:17:30,560 --> 00:17:33,399 Speaker 1: not on super highways, but in all urban areas and 328 00:17:33,440 --> 00:17:35,400 Speaker 1: suburban Yeah. 329 00:17:35,440 --> 00:17:35,639 Speaker 7: I know. 330 00:17:35,720 --> 00:17:37,920 Speaker 1: More on that maybe when we have time. But here's 331 00:17:37,960 --> 00:17:40,600 Speaker 1: the new flex for rich people. It'll cost you like 332 00:17:40,600 --> 00:17:44,280 Speaker 1: one hundred and fifty grand or more your own fire 333 00:17:44,359 --> 00:17:49,399 Speaker 1: hydrant in fire prone areas. Wow, you can pay to 334 00:17:49,440 --> 00:17:53,159 Speaker 1: get your own fire hydrant. Yeah, essentially, And you got 335 00:17:53,160 --> 00:17:55,560 Speaker 1: one hundred foot fire hoses, the real fire hoses and 336 00:17:55,840 --> 00:17:59,520 Speaker 1: protective gear. And if you're in Malibu, for instance, you 337 00:17:59,560 --> 00:18:01,720 Speaker 1: spend a home seventy five k on this system, the 338 00:18:01,760 --> 00:18:04,160 Speaker 1: fire starts, you're going to keep your house from burning down? 339 00:18:04,240 --> 00:18:06,120 Speaker 2: You hold, well, you got a five million dollar house. 340 00:18:06,160 --> 00:18:09,640 Speaker 2: It makes sense to have that as one of the perks. Yeah, right, interesting. 341 00:18:12,480 --> 00:18:13,960 Speaker 1: Armstrong and Getty. 342 00:18:14,880 --> 00:18:18,320 Speaker 8: People are not thrilled with the Altoona McDonald's employees who 343 00:18:18,640 --> 00:18:23,359 Speaker 8: mickfingered him. 
Several nasty Google reviews have been left for 344 00:18:23,400 --> 00:18:26,560 Speaker 8: the Altoona location, including "they got rats behind the counter, 345 00:18:26,680 --> 00:18:30,880 Speaker 8: do not recommend," while many others simply left one-star 346 00:18:30,920 --> 00:18:36,200 Speaker 8: reviews citing bad service and so-called snitches. You know 347 00:18:36,280 --> 00:18:39,200 Speaker 8: what they say: snitches get Filet. 348 00:18:38,880 --> 00:18:45,200 Speaker 1: O-Fishes. Michael, play thirty-three and thirty-four back 349 00:18:45,240 --> 00:18:46,720 Speaker 1: to back for us, if you'd be so kind. 350 00:18:47,280 --> 00:18:51,400 Speaker 9: Mangione's case has become a cause, with admirers posting videos 351 00:18:51,440 --> 00:18:55,160 Speaker 9: on Instagram. A social media profile gained hundreds of thousands 352 00:18:55,200 --> 00:18:58,280 Speaker 9: of followers after his arrest, and the police department that 353 00:18:58,400 --> 00:18:59,760 Speaker 9: arrested Mangione. 354 00:18:59,480 --> 00:19:00,479 Speaker 2: Has been brettoned. 355 00:19:00,840 --> 00:19:04,159 Speaker 9: His lawyer's office has been inundated with calls from people 356 00:19:04,280 --> 00:19:06,040 Speaker 9: offering to pay his legal bills. 357 00:19:06,400 --> 00:19:08,720 Speaker 10: It's not only in the digital world. We saw a 358 00:19:08,760 --> 00:19:13,240 Speaker 10: lookalike contest in New York. People are selling things, or 359 00:19:13,280 --> 00:19:21,080 Speaker 10: were selling things, with Mangione's image on Etsy. 360 00:19:18,720 --> 00:19:24,320 Speaker 1: And with the whole "deny, delay, defend," or the "depose" 361 00:19:24,520 --> 00:19:27,719 Speaker 1: variation that this murderer came up with, the response 362 00:19:27,760 --> 00:19:30,639 Speaker 1: to it has been very, very odd and troubling.
And you 363 00:19:30,680 --> 00:19:33,320 Speaker 1: want to ask these people, all right, the CEOs of 364 00:19:33,359 --> 00:19:36,040 Speaker 1: what other industries, and how many of them, including women, 365 00:19:36,240 --> 00:19:38,320 Speaker 1: how about minorities, do they all get murdered if you 366 00:19:38,320 --> 00:19:41,119 Speaker 1: don't like their company's policies? Oil companies, perhaps, I don't know, 367 00:19:41,160 --> 00:19:46,199 Speaker 1: tobacco, carmakers. And nobody seems to have an answer for that. 368 00:19:46,280 --> 00:19:47,840 Speaker 1: You know, it occurred to me, Jack, just now, as 369 00:19:47,840 --> 00:19:49,679 Speaker 1: I was thinking about this, I wonder how much of 370 00:19:49,720 --> 00:19:55,920 Speaker 1: this is that such a high percentage of our interactions 371 00:19:55,960 --> 00:19:59,920 Speaker 1: with any human beings are third hand via the internet, 372 00:20:00,960 --> 00:20:04,959 Speaker 1: and so they don't seem like real human beings at all. 373 00:20:05,400 --> 00:20:08,760 Speaker 1: You see a character on TV get gunned down. Oh, 374 00:20:08,800 --> 00:20:12,760 Speaker 1: I'm still in Manhattan. Absolutely, it's not a real human being. 375 00:20:12,880 --> 00:20:14,000 Speaker 1: It's just a TV show. 376 00:20:14,119 --> 00:20:19,680 Speaker 2: Well, and then if you combine that with the coin of the 377 00:20:19,720 --> 00:20:24,880 Speaker 2: realm in twenty twenty four for communication, which is just over 378 00:20:24,920 --> 00:20:27,360 Speaker 2: the top. I mean, that's just the way we all 379 00:20:27,400 --> 00:20:31,680 Speaker 2: talk now, at least online. That's the way people talk. Certainly, 380 00:20:31,920 --> 00:20:34,800 Speaker 2: it's so over the top.
Maybe it's those two things coming together, 381 00:20:35,920 --> 00:20:38,160 Speaker 2: I mean, do people mean this? As we've 382 00:20:38,200 --> 00:20:43,080 Speaker 2: experienced before, we've had some situations with some really nasty 383 00:20:43,359 --> 00:20:48,320 Speaker 2: texters and emailers who made violent threats, and we turned it 384 00:20:48,359 --> 00:20:50,600 Speaker 2: over to cops or investigators or something like that, and 385 00:20:50,600 --> 00:20:53,080 Speaker 2: as soon as there's any pushback whatsoever, it's "Oh, no, 386 00:20:53,200 --> 00:20:54,760 Speaker 2: I don't mean you, I love the show." 387 00:20:55,600 --> 00:21:00,600 Speaker 1: Yeah, yeah, it's an amazing about-face. Michael, the montage, 388 00:21:00,680 --> 00:21:03,239 Speaker 1: if you would, clip number thirty-eight. This is, you know, 389 00:21:03,359 --> 00:21:05,240 Speaker 1: media and social media reactions. 390 00:21:05,720 --> 00:21:07,679 Speaker 11: I do believe in the sanctity of life, and I 391 00:21:07,680 --> 00:21:10,360 Speaker 11: think that's why I felt, along with so many other 392 00:21:10,400 --> 00:21:17,880 Speaker 11: Americans, joy, unfortunately, you know, because, seriously, I mean execution. 393 00:21:18,160 --> 00:21:21,800 Speaker 11: Maybe not joy, but certainly not, no, certainly not empathy. 394 00:21:22,040 --> 00:21:25,199 Speaker 7: A reaction not of universal horror that a fifty year 395 00:21:25,240 --> 00:21:27,800 Speaker 7: old father of two and husband was shot dead in public, 396 00:21:27,840 --> 00:21:30,240 Speaker 7: but rather of, and I don't want to call it glee, 397 00:21:30,720 --> 00:21:33,000 Speaker 7: but let's just say, not unhappiness. 398 00:21:33,520 --> 00:21:36,240 Speaker 12: Some of the comments were thoughts and deductibles to the family.
399 00:21:36,840 --> 00:21:39,720 Speaker 12: One of the comments was, unfortunately, my condolences are out 400 00:21:39,760 --> 00:21:42,840 Speaker 12: of network, and so, I think, isn't that something? 401 00:21:42,960 --> 00:21:45,399 Speaker 12: I really think it's reflective of how people are feeling 402 00:21:45,400 --> 00:21:46,240 Speaker 12: about their healthcare. 403 00:21:47,200 --> 00:21:50,200 Speaker 2: Well, you are twisted. If you threw out the word joy, 404 00:21:50,840 --> 00:21:52,840 Speaker 2: you're a sick human being. Yeah. 405 00:21:52,840 --> 00:21:55,679 Speaker 1: That was Taylor Lorenz on Piers Morgan. And she is 406 00:21:55,800 --> 00:22:00,200 Speaker 1: absolutely a sick human being. She is a twisted you 407 00:22:00,240 --> 00:22:02,240 Speaker 1: know what. She is very much cut from the 408 00:22:02,280 --> 00:22:04,800 Speaker 1: same cloth as the killer. I don't see her killing anybody. 409 00:22:04,800 --> 00:22:09,280 Speaker 1: But she is the overeducated progressive who is so 410 00:22:09,440 --> 00:22:12,639 Speaker 1: one hundred percent convinced of her own rightness that she 411 00:22:12,760 --> 00:22:16,160 Speaker 1: believes people can be hurt, silenced, jailed, whatever, if they dare 412 00:22:16,000 --> 00:22:19,480 Speaker 2: disagree with her. Her COVID record is unforgivable. 413 00:22:19,080 --> 00:22:21,720 Speaker 1: Especially because that big five hundred page report just came 414 00:22:21,760 --> 00:22:24,719 Speaker 1: out and said that virtually everything they did, which we were 415 00:22:24,760 --> 00:22:27,480 Speaker 1: saying was wrong at the time, was wrong, including the 416 00:22:27,560 --> 00:22:31,320 Speaker 1: shutdowns and the stupid masks and the schools being closed.
Unforgivable, 417 00:22:31,359 --> 00:22:36,000 Speaker 1: even without getting into the complications of health insurance and 418 00:22:36,040 --> 00:22:39,000 Speaker 1: the government's role and what would you replace it with 419 00:22:39,040 --> 00:22:39,480 Speaker 1: and all. 420 00:22:39,359 --> 00:22:42,479 Speaker 2: That sort of stuff. How could you possibly think that 421 00:22:43,240 --> 00:22:48,800 Speaker 2: a society where you're okay with gunning people down that 422 00:22:48,840 --> 00:22:53,760 Speaker 2: you disagree with would ever work? Right, it would fall 423 00:22:53,800 --> 00:22:58,320 Speaker 2: apart immediately. Hey, guess what, Taylor Lorenz, there's people who 424 00:22:58,359 --> 00:23:02,440 Speaker 2: think you're bad, so they'd gun you down. You moron. 425 00:23:02,800 --> 00:23:04,720 Speaker 2: I mean, how do you not understand that? 426 00:23:05,240 --> 00:23:05,760 Speaker 8: Oh you could. 427 00:23:05,880 --> 00:23:09,000 Speaker 1: You could absolutely make a coherent argument that Taylor Lorenz 428 00:23:09,200 --> 00:23:11,560 Speaker 1: was a prominent force in keeping kids out of schools, 429 00:23:12,160 --> 00:23:15,119 Speaker 1: and a lot of those kids are either miserable, have fallen behind, 430 00:23:15,200 --> 00:23:18,439 Speaker 1: have psychological problems, or committed suicide. So you know, I 431 00:23:18,440 --> 00:23:20,800 Speaker 1: would never make that argument because it's abhorrent, but it 432 00:23:20,840 --> 00:23:23,800 Speaker 1: could be made that she is so evil dot dot 433 00:23:23,840 --> 00:23:25,800 Speaker 1: dot, if you live in that society. 434 00:23:27,400 --> 00:23:28,760 Speaker 2: Uh, you know, let's go ahead. 435 00:23:29,000 --> 00:23:31,359 Speaker 1: She was on Piers Morgan and he was trying to, 436 00:23:31,600 --> 00:23:33,600 Speaker 1: like, get coherent answers out of her.
437 00:23:33,800 --> 00:23:35,480 Speaker 2: And it occurred to me, a lot of that crowd 438 00:23:35,520 --> 00:23:39,680 Speaker 2: is probably, uh, vehemently anti-capital punishment. 439 00:23:39,840 --> 00:23:45,160 Speaker 1: Also? Oh yeah, yeah, good point, good point, but pro-abortion. 440 00:23:46,640 --> 00:23:49,040 Speaker 1: So give us forty. You're gonna, you've heard part of 441 00:23:49,040 --> 00:23:49,720 Speaker 1: this already. 442 00:23:50,280 --> 00:23:52,280 Speaker 11: I do believe in the sanctity of life, and I 443 00:23:52,280 --> 00:23:55,040 Speaker 11: think that's why I felt, along with so many other 444 00:23:55,080 --> 00:23:58,679 Speaker 11: Americans, joy, unfortunately, you know, because. 445 00:24:00,440 --> 00:24:04,000 Speaker 13: I mean, joy? Should there be more killed, these 446 00:24:04,000 --> 00:24:05,000 Speaker 13: healthcare executives. 447 00:24:05,119 --> 00:24:06,640 Speaker 6: Would that make you even more joyful? 448 00:24:07,600 --> 00:24:13,879 Speaker 11: I know that would not, and I think because yours, 449 00:24:14,200 --> 00:24:14,760 Speaker 11: because it does, It. 450 00:24:14,800 --> 00:24:15,399 Speaker 6: Wouldn't fix you. 451 00:24:15,520 --> 00:24:16,920 Speaker 2: See, to find the whole thing there is. 452 00:24:18,040 --> 00:24:19,920 Speaker 11: I find your question must. 453 00:24:19,720 --> 00:24:21,639 Speaker 6: Be murdered in the street. I don't find it funny. 454 00:24:21,640 --> 00:24:25,560 Speaker 2: I don't. That's a very good question from Piers Morgan, 455 00:24:25,600 --> 00:24:28,520 Speaker 2: who I generally find annoying, but that's a very good question. 456 00:24:28,680 --> 00:24:32,159 Speaker 2: So if more were killed, would that make you more joyful? Well, no, 457 00:24:32,240 --> 00:24:36,400 Speaker 2: of course not. Well why not, dude? Come on, lay 458 00:24:36,400 --> 00:24:38,480 Speaker 2: out your logic for me here.
Why do you feel 459 00:24:38,640 --> 00:24:41,240 Speaker 2: that? Say someone was shot, not killed, but just blinded. Now, 460 00:24:41,359 --> 00:24:42,320 Speaker 2: how joyful would that make? 461 00:24:42,359 --> 00:24:42,399 Speaker 5: It? 462 00:24:42,480 --> 00:24:45,119 Speaker 2: Kind of medium joyful? And you know, just because I 463 00:24:45,119 --> 00:24:46,720 Speaker 2: thought of the capital punishment thing, I'd like to hit 464 00:24:46,800 --> 00:24:48,360 Speaker 2: Joy Reid with that, or whoever. How do you feel about, 465 00:24:48,400 --> 00:24:50,359 Speaker 2: I guarantee she's against capital punishment. So 466 00:24:50,359 --> 00:24:54,040 Speaker 2: people who have been convicted of a heinous murder, gone 467 00:24:54,040 --> 00:24:56,680 Speaker 2: through all the, you know, appeals and all that sort 468 00:24:56,680 --> 00:25:02,040 Speaker 2: of stuff, put to death? Wrong. Random killing on the 469 00:25:02,040 --> 00:25:05,520 Speaker 2: street because you don't agree with the policy? Correct. Okay, 470 00:25:05,560 --> 00:25:07,359 Speaker 2: now you explain to me how that makes sense in 471 00:25:07,400 --> 00:25:10,679 Speaker 2: your worldview. And these people are our moral betters, by 472 00:25:10,720 --> 00:25:13,080 Speaker 2: the way. They're constantly explaining to us what's right and 473 00:25:13,080 --> 00:25:18,080 Speaker 2: what's wrong. One more clip of the utterly soulless, gutless, 474 00:25:18,119 --> 00:25:20,679 Speaker 2: brainless Taylor Lorenz and Piers Morgan. 475 00:25:21,359 --> 00:25:24,119 Speaker 11: Now, I want to fix the system. You're right, we 476 00:25:24,160 --> 00:25:27,959 Speaker 11: shouldn't be going around shooting each other with vigilante justice. No, 477 00:25:28,680 --> 00:25:30,720 Speaker 11: I think that it is a good thing that this 478 00:25:30,800 --> 00:25:34,600 Speaker 11: murder has led to America.
Really, the media elites and 479 00:25:34,880 --> 00:25:37,679 Speaker 11: politicians in this country, paying attention to this issue for 480 00:25:37,720 --> 00:25:41,359 Speaker 11: the first time. You mentioned you couldn't understand why somebody 481 00:25:41,440 --> 00:25:44,920 Speaker 11: would feel this reaction when they watched a CEO die. 482 00:25:45,160 --> 00:25:48,000 Speaker 11: It's because you have not dealt, it sounds like, with 483 00:25:48,080 --> 00:25:50,280 Speaker 11: the American healthcare system in the way that millions of 484 00:25:50,320 --> 00:25:51,119 Speaker 11: other Americans have. 485 00:25:51,400 --> 00:25:54,800 Speaker 13: I've dealt with the healthcare system in various ways in America. 486 00:25:54,960 --> 00:25:56,840 Speaker 6: I don't think it's perfect by any means. But the 487 00:25:56,880 --> 00:25:59,800 Speaker 6: idea that I would view it as something. 488 00:26:01,160 --> 00:26:03,520 Speaker 13: The idea that I would view it as something joyful, 489 00:26:04,119 --> 00:26:06,639 Speaker 13: that a man, he's just a healthcare executive, 490 00:26:06,760 --> 00:26:09,640 Speaker 13: has been executed in the street, I find completely bizarre. 491 00:26:09,760 --> 00:26:12,320 Speaker 6: Okay, so don't say I'm joyful. I said, I said 492 00:26:12,359 --> 00:26:14,000 Speaker 6: you would. You said you were feeling joyful. 493 00:26:15,000 --> 00:26:17,840 Speaker 11: Yeah, I take that back. Joyful is the wrong word here, 494 00:26:18,359 --> 00:26:19,080 Speaker 11: I said as I. 495 00:26:19,840 --> 00:26:22,440 Speaker 6: Yeah, you think it is the wrong word.
496 00:26:22,760 --> 00:26:28,439 Speaker 11: Yeah, I indicated celebratory, because again, it feels like justice in 497 00:26:28,520 --> 00:26:31,600 Speaker 11: this system when somebody responsible for the deaths of tens 498 00:26:31,600 --> 00:26:34,760 Speaker 11: of thousands of Americans suffers the same fate as those 499 00:26:34,760 --> 00:26:37,119 Speaker 11: tens of thousands of Americans he murdered. 500 00:26:38,119 --> 00:26:40,840 Speaker 2: Ah, I never brought this up the other day. 501 00:26:41,000 --> 00:26:43,119 Speaker 2: Maybe I should find it on my phone. So that 502 00:26:43,200 --> 00:26:46,480 Speaker 2: tens of thousands of Americans, there's a number going around, 503 00:26:46,560 --> 00:26:49,240 Speaker 2: sixty-eight thousand. It comes from a wacky study by 504 00:26:49,359 --> 00:26:52,679 Speaker 2: wacky people that exaggerates all kinds of things, includes all 505 00:26:52,760 --> 00:26:54,560 Speaker 2: kinds of things that don't fit. As is often 506 00:26:54,600 --> 00:26:58,000 Speaker 2: the case: lies, damn lies, and statistics. So that's where 507 00:26:58,040 --> 00:27:02,520 Speaker 2: that argument comes from. Not that that would make any 508 00:27:02,560 --> 00:27:03,600 Speaker 2: sense anyway. 509 00:27:03,880 --> 00:27:09,800 Speaker 1: Right. And then juxtaposing the healthcare CEO murder with the 510 00:27:09,840 --> 00:27:13,000 Speaker 1: other big high-profile guy-who-died story, and that 511 00:27:13,000 --> 00:27:19,320 Speaker 1: would be the Penny subway story: vigilante defending the innocent, 512 00:27:19,520 --> 00:27:22,800 Speaker 1: guy ends up dead. I'm not going to call it 513 00:27:22,800 --> 00:27:25,000 Speaker 1: a murder like Al Sharpton and his folks are, but 514 00:27:25,640 --> 00:27:26,439 Speaker 1: play forty-two.
515 00:27:26,560 --> 00:27:29,720 Speaker 7: This is on CNN, Penny and the verdict. There, there 516 00:27:29,760 --> 00:27:34,359 Speaker 7: you also have a victim who somebody determined did not deserve 517 00:27:35,840 --> 00:27:36,879 Speaker 12: to continue living. 518 00:27:37,160 --> 00:27:38,679 Speaker 2: No, no, no, yeah, tell me. 519 00:27:38,880 --> 00:27:42,320 Speaker 7: Tell me which, tell me which vigilante action is okay? 520 00:27:42,840 --> 00:27:44,879 Speaker 2: No, one is being proactive. 521 00:27:44,960 --> 00:27:45,120 Speaker 4: Right? 522 00:27:45,160 --> 00:27:49,360 Speaker 7: So this kid who executed someone, executed a guy walking 523 00:27:49,600 --> 00:27:50,560 Speaker 7: away from him. 524 00:27:50,359 --> 00:27:52,200 Speaker 2: Shot him in the back, shot him in the, for 525 00:27:52,240 --> 00:27:55,520 Speaker 2: no reason whatsoever. Daniel Penny is a hero. 526 00:27:56,880 --> 00:27:59,000 Speaker 1: This CNN panelist, and I wish I'd bothered to get 527 00:27:59,000 --> 00:28:01,880 Speaker 1: her name, was trying to suggest that, hey, they're both 528 00:28:01,960 --> 00:28:04,640 Speaker 1: vigilante killings, and you're saying one's right and one's wrong. 529 00:28:04,800 --> 00:28:07,880 Speaker 2: Where do you get off? Precisely where I get off, 530 00:28:07,920 --> 00:28:08,840 Speaker 2: and how I say that. 531 00:28:09,840 --> 00:28:10,600 Speaker 1: Wow. 532 00:28:11,359 --> 00:28:12,800 Speaker 2: Yeah, yeah, I tell you what. 533 00:28:13,560 --> 00:28:16,240 Speaker 1: Every time CNN does some pretty good reporting, like Clarissa 534 00:28:16,320 --> 00:28:20,639 Speaker 1: Ward's been terrific, for instance, they turn around and do 535 00:28:20,680 --> 00:28:22,879 Speaker 1: something like that, and I think they should just be unplugged. 536 00:28:23,560 --> 00:28:27,200 Speaker 1: But anyway, yeah, just yank the cord on CNN.
537 00:28:27,359 --> 00:28:31,560 Speaker 1: They're losing money anyway, nobody watches it. They suck anyway. Uh, 538 00:28:31,800 --> 00:28:34,359 Speaker 1: Jake Tapper would make a fine assistant manager of pet 539 00:28:34,359 --> 00:28:35,080 Speaker 1: food stores. 540 00:28:35,200 --> 00:28:35,760 Speaker 2: Wow. 541 00:28:36,400 --> 00:28:41,840 Speaker 1: Yeah, Joe Getty's spitting fire, even bombs, truth bombs 542 00:28:41,920 --> 00:28:45,440 Speaker 1: right and left. We need to develop, and I wonder 543 00:28:45,440 --> 00:28:50,400 Speaker 1: if the kids have this already, new vernacular for, like, 544 00:28:50,520 --> 00:28:54,200 Speaker 1: that freaking moron Taylor Lorenz, who I've just despised 545 00:28:54,200 --> 00:28:58,680 Speaker 1: for years in a very inside-media way. Most people 546 00:28:58,760 --> 00:29:00,640 Speaker 1: have never heard of her, and their lives are better 547 00:29:00,680 --> 00:29:01,040 Speaker 1: for it. 548 00:29:03,560 --> 00:29:04,960 Speaker 2: But do the kids already have 549 00:29:04,960 --> 00:29:08,080 Speaker 1: an expression for, you know, somebody says, I think that 550 00:29:08,120 --> 00:29:11,400 Speaker 1: guy ought to be shot? Do you really mean that? No, 551 00:29:11,640 --> 00:29:16,560 Speaker 1: I internet mean it. I just mean it on an internet level. 552 00:29:16,320 --> 00:29:19,320 Speaker 2: That guy should, like, that guy should low-key be shot. 553 00:29:20,040 --> 00:29:26,280 Speaker 1: Right, yeah, yeah, that's, that's similar. Yeah. 554 00:29:26,360 --> 00:29:30,000 Speaker 2: I don't know, but as you heard, when they're drilled down 555 00:29:30,000 --> 00:29:31,880 Speaker 2: a little bit, those people seem to mean it. I 556 00:29:31,880 --> 00:29:33,960 Speaker 2: don't know how many people online mean it, but those, 557 00:29:34,080 --> 00:29:37,640 Speaker 2: those panelists meant it.
Yeah, whether it was the joy 558 00:29:37,760 --> 00:29:40,200 Speaker 2: or where do you draw the line between vigilante justice, 559 00:29:40,200 --> 00:29:42,800 Speaker 2: what are you talking about? How stupid are you? 560 00:29:43,000 --> 00:29:45,680 Speaker 2: These are apples and refrigerators as a, you know, 561 00:29:45,760 --> 00:29:47,640 Speaker 2: as a comparison. Yeah. 562 00:29:47,720 --> 00:29:50,000 Speaker 1: Yeah, and I think it probably all boils down to 563 00:29:50,040 --> 00:29:52,720 Speaker 1: what's making so many people miserable, including the kids, 564 00:29:52,720 --> 00:29:55,040 Speaker 1: and that is that, in a very, very short span 565 00:29:55,080 --> 00:29:57,360 Speaker 1: of time, we as human beings have gone from every 566 00:29:57,400 --> 00:30:01,280 Speaker 1: single interaction being in person with a flesh-and-blood 567 00:30:01,320 --> 00:30:04,360 Speaker 1: human being, with the rare exception of maybe you got 568 00:30:04,360 --> 00:30:10,640 Speaker 1: a letter from somebody, to now ninety percent of your 569 00:30:10,680 --> 00:30:15,400 Speaker 1: interactions are not in person with an air-breathing homo sapien. 570 00:30:15,640 --> 00:30:19,360 Speaker 1: They're online, they're electronic, and therefore the killing of one 571 00:30:19,360 --> 00:30:21,800 Speaker 1: of those human beings doesn't mean anything to you. They're 572 00:30:21,800 --> 00:30:24,560 Speaker 1: an abstraction. They're not a human being. They're a video 573 00:30:24,640 --> 00:30:25,480 Speaker 1: of a human being. 574 00:30:27,920 --> 00:30:29,840 Speaker 2: Before we go to break, I'm not going to explain this. 575 00:30:29,920 --> 00:30:32,080 Speaker 2: You either understand it or you don't. Did you hear 576 00:30:32,160 --> 00:30:37,080 Speaker 2: that former Patriots coach Bill Belichick might take the job 577 00:30:37,480 --> 00:30:44,440 Speaker 2: coaching at North Carolina?
The joke is it will help 578 00:30:44,480 --> 00:30:54,560 Speaker 2: his girlfriend with admissions. She's youthful. Youthful? There's a fifty 579 00:30:54,680 --> 00:30:58,160 Speaker 2: year age gap in that relationship. He started dating a cheerleader. 580 00:30:58,200 --> 00:31:01,400 Speaker 2: He's in his mid-seventies, she's in her twenties. He's 581 00:31:01,400 --> 00:31:05,080 Speaker 2: a very young seventy-five. Okay, we got more on 582 00:31:05,080 --> 00:31:08,280 Speaker 2: the way. You can text us anytime, four nine five KFTC. 583 00:31:11,640 --> 00:31:13,040 Speaker 2: I love company holiday parties. 584 00:31:13,080 --> 00:31:15,160 Speaker 5: You get to be with people that you work with 585 00:31:15,240 --> 00:31:18,600 Speaker 5: all day long for more hours, and you get to 586 00:31:18,640 --> 00:31:21,240 Speaker 5: say, I love your ugly Christmas sweater, and they're like, 587 00:31:21,360 --> 00:31:23,920 Speaker 5: that's just my sweater. And you get to be like, hey, 588 00:31:23,920 --> 00:31:26,200 Speaker 5: remind me, do you go by Steve or Steven? 589 00:31:26,720 --> 00:31:27,920 Speaker 2: And he's like, it's Mark. 590 00:31:28,480 --> 00:31:31,160 Speaker 5: And your boss tries to crack a joke and everyone 591 00:31:31,200 --> 00:31:36,200 Speaker 5: around them does the employment laughter, like, and you get 592 00:31:36,240 --> 00:31:39,040 Speaker 5: to say, no work talk, and then you realize that 593 00:31:39,120 --> 00:31:41,240 Speaker 5: you have nothing to talk about, and you get to 594 00:31:41,280 --> 00:31:43,600 Speaker 5: station yourself right in front of the food platter so 595 00:31:43,640 --> 00:31:45,000 Speaker 5: that if you don't want to talk to the person 596 00:31:45,120 --> 00:31:47,320 Speaker 5: standing in front of you anymore, you just shove something 597 00:31:47,360 --> 00:31:48,959 Speaker 5: into your mouth.
And at the end of the night 598 00:31:49,000 --> 00:31:51,200 Speaker 5: you get to do an Irish exit where everyone's like, 599 00:31:51,200 --> 00:31:52,960 Speaker 5: where's Paul? Is he getting us drinks up at the bar? 600 00:31:53,280 --> 00:31:59,720 Speaker 2: Nope, Paul's at home. I have had a great, great 601 00:31:59,800 --> 00:32:03,240 Speaker 2: time at many a company Christmas party in my life, 602 00:32:03,440 --> 00:32:07,680 Speaker 2: like a great time. I don't know who started the idea 603 00:32:07,720 --> 00:32:11,560 Speaker 2: that they're miserable, and then of course the trend toward, well, 604 00:32:11,600 --> 00:32:13,960 Speaker 2: we can't have them anymore because of all the awful things 605 00:32:13,960 --> 00:32:17,360 Speaker 2: that happened. What? I went to a million Christmas parties. 606 00:32:17,560 --> 00:32:20,480 Speaker 2: One time something bad happened, and everybody's better for it 607 00:32:20,560 --> 00:32:24,560 Speaker 2: having happened. It was hilarious. Yeah, I suppose so. 608 00:32:24,760 --> 00:32:31,400 Speaker 1: Yeah, they ranged from great to okay. Were you hammered 609 00:32:31,680 --> 00:32:34,200 Speaker 1: at some of them? Most of them that were great, 610 00:32:34,240 --> 00:32:34,680 Speaker 1: do you think? 611 00:32:35,240 --> 00:32:38,080 Speaker 2: Oh, I'm sure I was. But a lot of places 612 00:32:38,080 --> 00:32:40,280 Speaker 2: I worked used to throw real 613 00:32:40,360 --> 00:32:43,000 Speaker 2: Christmas parties, like a nice restaurant, nicer than I would have 614 00:32:43,040 --> 00:32:45,160 Speaker 2: gone to on my own, with nicer drinks than I 615 00:32:45,200 --> 00:32:46,959 Speaker 2: was going to get on my own.
And then that, 616 00:32:47,320 --> 00:32:51,360 Speaker 2: at least in our industry, dribbled away over time, and 617 00:32:51,400 --> 00:32:54,840 Speaker 2: then there was that whole ridiculousness of it's just too dangerous, 618 00:32:54,840 --> 00:32:57,640 Speaker 2: with people getting out of hand and the sexual harassment, 619 00:32:58,000 --> 00:33:03,600 Speaker 2: really, liability, blah blah. Right, so that choked it out. 620 00:33:04,160 --> 00:33:06,360 Speaker 2: I just happened to run into somebody twice this week, 621 00:33:06,400 --> 00:33:09,000 Speaker 2: two different people who had real company Christmas parties, and 622 00:33:09,040 --> 00:33:12,080 Speaker 2: I thought, okay, those still exist in certain industries. That's 623 00:33:12,160 --> 00:33:14,920 Speaker 2: nice to hear. Yeah, and you get to see your 624 00:33:14,960 --> 00:33:17,240 Speaker 2: co-workers in a completely different setting and meet their 625 00:33:17,280 --> 00:33:20,800 Speaker 2: families and have some human relationship with them. That helps 626 00:33:20,800 --> 00:33:25,360 Speaker 2: the business. It helps the business, it really does. Yeah. Yep. 627 00:33:26,960 --> 00:33:29,240 Speaker 2: A couple of things for you. Rob Schneider, you know 628 00:33:29,360 --> 00:33:31,400 Speaker 2: Rob Schneider from Saturday Night Live way back in the day, 629 00:33:31,600 --> 00:33:34,200 Speaker 2: making copies, the Copy Guy, and then he's friends 630 00:33:34,200 --> 00:33:35,880 Speaker 2: with Adam Sandler, so Adam Sandler put him in a 631 00:33:35,880 --> 00:33:38,400 Speaker 2: whole bunch of his movies, so that helped make him richer also. 632 00:33:38,680 --> 00:33:41,480 Speaker 2: Rob Schneider announced he's launching a new all-woman talk 633 00:33:41,560 --> 00:33:44,479 Speaker 2: show which will be the opposite of The View and 634 00:33:44,520 --> 00:33:46,760 Speaker 2: will not shame people for their politics.
It's supposed to 635 00:33:46,760 --> 00:33:49,280 Speaker 2: be all-inclusive; they'll actually have different points of 636 00:33:49,360 --> 00:33:51,360 Speaker 2: view, unlike The View. Well, I love that. 637 00:33:51,440 --> 00:33:54,200 Speaker 1: Rob Schneider's a really interesting guy. He's a solid thinker. 638 00:33:54,400 --> 00:33:55,800 Speaker 1: He's got an active presence online. 639 00:33:56,200 --> 00:33:58,640 Speaker 2: Search him and Bill Maher; he sat in Bill 640 00:33:58,680 --> 00:34:00,360 Speaker 2: Maher's basement and talked for like an hour and a 641 00:34:00,360 --> 00:34:02,479 Speaker 2: half about politics a couple weeks ago. It's really interesting. 642 00:34:04,000 --> 00:34:06,000 Speaker 2: The Lincoln Project, which I like to bring up just 643 00:34:06,000 --> 00:34:07,840 Speaker 2: because I hate those people so much, is a bunch 644 00:34:07,840 --> 00:34:12,040 Speaker 2: of Republican hacks who decided to be anti-Trump and 645 00:34:12,120 --> 00:34:14,120 Speaker 2: run a scam to make money, and a whole bunch 646 00:34:14,160 --> 00:34:16,640 Speaker 2: of you who are anti-Trump probably gave them money. 647 00:34:17,280 --> 00:34:20,399 Speaker 2: The Never Trump Lincoln Project paid millions in twenty twenty-four 648 00:34:20,440 --> 00:34:25,400 Speaker 2: to companies owned by themselves. Shocking. Oh wow, that's weird. 649 00:34:27,719 --> 00:34:29,719 Speaker 2: So much of politics. You really need to spend a 650 00:34:29,719 --> 00:34:32,839 Speaker 2: lot of time on any charity or political group of 651 00:34:32,880 --> 00:34:36,239 Speaker 2: any kind to figure out if they're doing what you 652 00:34:36,360 --> 00:34:38,560 Speaker 2: think they're doing with your money, because they appeal to 653 00:34:38,600 --> 00:34:40,759 Speaker 2: your emotion and then they keep a lot of it.
654 00:34:40,760 --> 00:34:44,440 Speaker 2: Whether it's Black Lives Matter, the Lincoln Project, Al Sharpton, 655 00:34:44,520 --> 00:34:46,280 Speaker 2: what he's trying to do for the guy that Daniel 656 00:34:46,320 --> 00:34:48,520 Speaker 2: Penny choked, or whatever. 657 00:34:49,239 --> 00:34:51,680 Speaker 1: So many of the PACs that ask for your money, 658 00:34:51,719 --> 00:34:54,719 Speaker 1: because if you're with Trump like we are, or if you 659 00:34:54,760 --> 00:34:57,520 Speaker 1: think Trump is Hitler like we do, give twenty dollars 660 00:34:57,560 --> 00:35:00,920 Speaker 1: now, and all the relatives are on the payroll and they 661 00:35:00,960 --> 00:35:01,560 Speaker 1: all get rich. 662 00:35:02,200 --> 00:35:06,200 Speaker 2: Yeah, this is too long. Better save that for next hour. 663 00:35:06,360 --> 00:35:08,680 Speaker 2: Next hour? You say there's another hour? You're darn right 664 00:35:08,680 --> 00:35:10,920 Speaker 2: there is. We do four hours every single day. Are 665 00:35:11,000 --> 00:35:13,160 Speaker 2: you sure? If you missed it, get the podcast, 666 00:35:13,239 --> 00:35:17,240 Speaker 2: Armstrong and Getty On Demand. Going to an all-Spanish 667 00:35:17,400 --> 00:35:20,880 Speaker 2: Catholic Mass tonight, which will be interesting. My son's doing it. 668 00:35:21,680 --> 00:35:24,640 Speaker 2: My son's doing it for extra credit in his Spanish class. 669 00:35:25,360 --> 00:35:27,719 Speaker 2: And I have a feeling I'll hear a lot of 670 00:35:29,200 --> 00:35:33,360 Speaker 2: Spanish Spanish Spanish Jesus, Spanish Spanish Spanish Jesus, and then 671 00:35:33,760 --> 00:35:36,560 Speaker 2: a little Dios. See, I didn't even know that word. 672 00:35:36,600 --> 00:35:39,239 Speaker 2: I know, I know very little Spanish, unless they say 673 00:35:39,320 --> 00:35:46,040 Speaker 2: casada or... wow, this is heading in a really objectionable direction 674 00:35:46,800 --> 00:35:49,000 Speaker 2: for some reason.
I'm not going to go. I'm not 675 00:35:49,040 --> 00:35:52,200 Speaker 2: going to understand anything other than Jesús and "the Lord 676 00:35:52,320 --> 00:35:53,839 Speaker 2: is the sombrero over us." 677 00:35:53,840 --> 00:35:56,720 Speaker 1: They'll be explaining in Spanish, and you'll say, wait a minute, sombrero? 678 00:35:56,760 --> 00:35:57,560 Speaker 1: I get that. That's good. 679 00:35:58,280 --> 00:36:02,120 Speaker 2: Uh, you gotta open Google. That's all right. Right, 680 00:36:02,480 --> 00:36:06,319 Speaker 2: that's not a bad idea, actually. Vaya con Dios. Even 681 00:36:06,320 --> 00:36:08,280 Speaker 2: when it's in English. God, even when it's in English, 682 00:36:08,280 --> 00:36:11,080 Speaker 2: a Catholic Mass is difficult. You gotta, like, look around: 683 00:36:11,400 --> 00:36:14,200 Speaker 2: are we kneeling? Now we're standing up. We're not standing up, 684 00:36:14,239 --> 00:36:15,920 Speaker 2: we're kneeling. And 685 00:36:16,040 --> 00:36:17,840 Speaker 1: they can tell you're an outsider. Then they give you 686 00:36:17,880 --> 00:36:18,680 Speaker 1: the hairy eyeball. 687 00:36:18,920 --> 00:36:25,080 Speaker 2: Well, you don't want that. Armstrong and Getty