1 00:00:01,880 --> 00:00:06,160 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George 2 00:00:06,240 --> 00:00:10,280 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. 3 00:00:10,200 --> 00:00:17,640 Speaker 2: Armstrong and Getty... Armstrong and Getty. 4 00:00:23,920 --> 00:00:27,000 Speaker 3: A toddler was rescued from a dangerous situation at Newark Airport 5 00:00:27,040 --> 00:00:30,400 Speaker 3: in New Jersey thanks to some quick thinking by two officers. 6 00:00:30,760 --> 00:00:33,240 Speaker 3: The two-year-old had stepped onto a baggage conveyor 7 00:00:33,240 --> 00:00:35,640 Speaker 3: belt Wednesday as the child's mother was booking a flight. 8 00:00:36,040 --> 00:00:37,400 Speaker 2: The officers then jumped. 9 00:00:37,120 --> 00:00:39,960 Speaker 3: Onto the conveyor belt themselves and found the toddler just 10 00:00:40,000 --> 00:00:41,840 Speaker 3: ahead of the baggage X-ray machine. 11 00:00:42,040 --> 00:00:46,200 Speaker 2: The child was not harmed. Wait a minute, wait 12 00:00:46,280 --> 00:00:48,280 Speaker 2: a minute. Wait. Okay, I'll let you ask your question 13 00:00:48,280 --> 00:00:51,120 Speaker 2: before I ask my question. Go ahead. Yes, here's my question. 14 00:00:52,520 --> 00:00:54,880 Speaker 1: So mom wasn't watching the toddler because she was 15 00:00:54,960 --> 00:00:55,840 Speaker 1: booking a flight. 16 00:00:55,920 --> 00:00:57,400 Speaker 2: You're already at the airport. 17 00:00:57,960 --> 00:01:01,160 Speaker 1: Sounds like an excuse to me. Booking a flight, 18 00:01:01,480 --> 00:01:02,280 Speaker 1: huh, lady? 19 00:01:02,360 --> 00:01:08,360 Speaker 4: Here's my other question. I've seen the video. If 20 00:01:08,440 --> 00:01:12,679 Speaker 4: I'm standing there waiting for my luggage, it's going around, 21 00:01:12,880 --> 00:01:15,760 Speaker 4: and I see a two-year-old going around with 22 00:01:15,800 --> 00:01:17,720 Speaker 4: the luggage.
I'm going to reach over and pick up 23 00:01:17,760 --> 00:01:22,240 Speaker 4: the two-year-old. Nobody did, though. Nobody did. Wouldn't you? 24 00:01:23,720 --> 00:01:27,160 Speaker 1: I would think so, yeah. I think at the very least 25 00:01:27,200 --> 00:01:28,560 Speaker 1: I'd be yelling, whose kid is that? 26 00:01:28,600 --> 00:01:29,319 Speaker 2: Whose kid is that? 27 00:01:29,560 --> 00:01:31,560 Speaker 4: Yeah, as I kind of ran along, and if I 28 00:01:31,600 --> 00:01:33,600 Speaker 4: get no response, I'm picking the kid up. The kid 29 00:01:33,640 --> 00:01:35,640 Speaker 4: shouldn't be there, and they're gonna get hurt. I mean, 30 00:01:36,040 --> 00:01:38,200 Speaker 4: there's all kinds of ways your fingers could get sliced 31 00:01:38,240 --> 00:01:38,920 Speaker 4: off or whatever. 32 00:01:39,080 --> 00:01:41,080 Speaker 2: I would just pick the kid up, and nobody did. 33 00:01:41,319 --> 00:01:44,720 Speaker 4: And as you heard, their security eventually came, and so 34 00:01:44,760 --> 00:01:46,839 Speaker 4: everybody just stood around and watched the two-year-old 35 00:01:47,000 --> 00:01:50,800 Speaker 4: hopefully not get their fingers cut off until security came. 36 00:01:51,040 --> 00:01:54,280 Speaker 4: We're just a weird society that way. And people 37 00:01:54,320 --> 00:01:56,680 Speaker 4: probably think this: either the 38 00:01:56,840 --> 00:01:59,640 Speaker 4: authorities should do it, not me, or I might 39 00:01:59,680 --> 00:02:01,720 Speaker 4: get sued if I touch that kid, or. 40 00:02:01,560 --> 00:02:07,200 Speaker 1: Whatever, or somebody might think I'm a child molester or whatever. 41 00:02:07,000 --> 00:02:09,800 Speaker 4: I feel like, at an earlier time, any parent would 42 00:02:09,800 --> 00:02:11,000 Speaker 4: have just grabbed the kid. 43 00:02:11,960 --> 00:02:12,200 Speaker 2: Sure.
44 00:02:12,280 --> 00:02:17,519 Speaker 1: Yeah, in an earlier time, we had fairly homogeneous values 45 00:02:18,240 --> 00:02:22,359 Speaker 1: as a country. Fairly. Now people just don't know. There's 46 00:02:22,400 --> 00:02:25,200 Speaker 1: so many different beliefs and ways of life and lawsuits 47 00:02:25,240 --> 00:02:26,600 Speaker 1: and the rest of it, that I'd better do nothing. 48 00:02:27,520 --> 00:02:29,560 Speaker 4: So one thing I feel like we need to do 49 00:02:29,600 --> 00:02:31,280 Speaker 4: on this show, or I like to try to do, 50 00:02:31,440 --> 00:02:36,800 Speaker 4: is be on top of various cultural things that are 51 00:02:36,840 --> 00:02:38,840 Speaker 4: occurring that you might not know of, or maybe you do 52 00:02:38,919 --> 00:02:39,160 Speaker 4: know of. 53 00:02:39,240 --> 00:02:40,679 Speaker 2: I feel like I need to know about them to 54 00:02:40,720 --> 00:02:41,880 Speaker 2: be able to do this job well. 55 00:02:42,160 --> 00:02:45,840 Speaker 4: I do not know about the Laboodoo. Is that the 56 00:02:45,840 --> 00:02:49,440 Speaker 4: way you say it? Labubu? The Labubu 57 00:02:49,480 --> 00:02:52,840 Speaker 4: toy craze. I do not know about the Labubu toy craze. 58 00:02:53,600 --> 00:02:56,040 Speaker 4: And I just saw up on the TV "Inside the 59 00:02:56,080 --> 00:02:59,320 Speaker 4: Labubu Toy Craze," and they had some reporter there, and 60 00:02:59,400 --> 00:03:01,920 Speaker 4: so I tasked Katie with figuring out what the hell 61 00:03:02,000 --> 00:03:02,239 Speaker 4: that is. 62 00:03:02,280 --> 00:03:03,720 Speaker 2: Maybe Joe already knows, but I don't know. 63 00:03:03,919 --> 00:03:07,160 Speaker 1: Oh, I'm all about the Labubu toy craze. 64 00:03:07,280 --> 00:03:12,520 Speaker 2: Are you? No. So what is it? The way 65 00:03:12,520 --> 00:03:13,240 Speaker 2: I can explain it, 66 00:03:13,240 --> 00:03:15,800 Speaker 5: it's like the next Beanie Baby craze.
They are these 67 00:03:15,840 --> 00:03:20,079 Speaker 5: stuffed animals. They're bunnies that have kind of a mischievous 68 00:03:20,200 --> 00:03:21,040 Speaker 5: look on their face. 69 00:03:21,120 --> 00:03:23,960 Speaker 2: Some mischievous bunnies? Yes, mischievous. 70 00:03:23,400 --> 00:03:25,359 Speaker 5: Bunnies, and they have different outfits. They kind of look 71 00:03:25,400 --> 00:03:27,880 Speaker 5: like a Care Bear, you know, you can see the face, 72 00:03:27,919 --> 00:03:30,280 Speaker 5: but the rest of it is very furry, in a 73 00:03:30,320 --> 00:03:34,240 Speaker 5: little outfit. And apparently this took off because Rihanna strapped 74 00:03:34,280 --> 00:03:34,880 Speaker 5: one of these to 75 00:03:34,840 --> 00:03:37,720 Speaker 2: her purse and everybody lost their minds. Ah, well, it 76 00:03:37,720 --> 00:03:40,480 Speaker 2: doesn't really matter how they start any craze. Once they 77 00:03:40,520 --> 00:03:41,520 Speaker 2: get going, they get going. 78 00:03:41,640 --> 00:03:45,120 Speaker 4: And now, is it the usual, like people are collecting them, 79 00:03:45,200 --> 00:03:48,480 Speaker 4: or they're sold out in various places, or certain ones 80 00:03:48,520 --> 00:03:50,360 Speaker 4: are more popular than others, and you can go on 81 00:03:50,440 --> 00:03:52,760 Speaker 4: eBay and get one for three hundred dollars or whatever? 82 00:03:53,040 --> 00:03:53,920 Speaker 4: All of the above. 83 00:03:54,040 --> 00:03:56,520 Speaker 5: And they're also doing this thing called a blind purchase, 84 00:03:56,560 --> 00:03:58,120 Speaker 5: which is like a mystery purchase. 85 00:03:58,160 --> 00:03:59,720 Speaker 2: So you just can't say? Exciting. 86 00:04:00,120 --> 00:04:02,040 Speaker 5: You just give them thirty bucks and they send you 87 00:04:02,120 --> 00:04:02,800 Speaker 5: whatever they have.
88 00:04:02,920 --> 00:04:05,720 Speaker 1: And I'm sorry, I'm sorry, the Internet is talking to 89 00:04:05,760 --> 00:04:09,320 Speaker 1: me through my earpiece. Really? Okay, that was two crazes ago. 90 00:04:09,360 --> 00:04:12,320 Speaker 1: It's over already? We're two crazes down the road. 91 00:04:12,440 --> 00:04:15,760 Speaker 4: Yeah, yeah, like, no craze can last very long. 92 00:04:15,800 --> 00:04:19,160 Speaker 1: Now, but now they're grinding those up and snorting them. It's 93 00:04:19,240 --> 00:04:20,359 Speaker 1: called the Labubu 94 00:04:20,400 --> 00:04:24,520 Speaker 4: snorting challenge. How big is it? How big is a 95 00:04:24,560 --> 00:04:25,120 Speaker 4: Labubu? 96 00:04:25,480 --> 00:04:28,160 Speaker 5: They're in different sizes. They've got giant plushies and then 97 00:04:28,200 --> 00:04:30,040 Speaker 5: they also go down to, like, key chains. 98 00:04:30,279 --> 00:04:32,839 Speaker 4: Okay, okay, well, now I know, so I'm glad I know. 99 00:04:33,160 --> 00:04:38,919 Speaker 4: And now you know. I do see a lot of, 100 00:04:39,040 --> 00:04:41,520 Speaker 4: and it's a certain ethnic group that does this, but 101 00:04:41,560 --> 00:04:43,120 Speaker 4: I don't know if you're allowed to say that, that 102 00:04:43,560 --> 00:04:47,280 Speaker 4: uh, walks around with various little stuffed animals attached to 103 00:04:48,560 --> 00:04:51,799 Speaker 4: their, like, backpacks or belts or. 104 00:04:53,560 --> 00:04:53,800 Speaker 2: Yeah. 105 00:04:53,880 --> 00:04:55,840 Speaker 4: I see a lot of that in my college town. 106 00:04:56,600 --> 00:04:58,040 Speaker 4: So are those Labubus? 107 00:04:58,279 --> 00:05:01,960 Speaker 5: Yeah, that's probably a Labubu thing. Also, just one 108 00:05:01,960 --> 00:05:05,440 Speaker 5: more Labubu note: Wang Ning, who's the thirty eight 109 00:05:05,520 --> 00:05:08,040 Speaker 5: year old founder and CEO of the company that makes them.
110 00:05:07,920 --> 00:05:11,040 Speaker 2: There's a hint as to where the craze is from. 111 00:05:12,160 --> 00:05:16,279 Speaker 5: He saw his fortune leap by one point six billion 112 00:05:16,360 --> 00:05:19,160 Speaker 5: dollars in a single day after the Labubu 113 00:05:19,279 --> 00:05:20,560 Speaker 5: was featured in a runway show. 114 00:05:20,720 --> 00:05:23,880 Speaker 4: God. And everybody tries to get these going, and most 115 00:05:23,880 --> 00:05:26,080 Speaker 4: of them just, you know, nobody cares. Why 116 00:05:26,120 --> 00:05:28,920 Speaker 4: would I want that stupid little stuffed animal? But with one, 117 00:05:29,000 --> 00:05:33,560 Speaker 4: it's just, somehow the stars align. Sometimes it's stars, like, 118 00:05:33,920 --> 00:05:36,920 Speaker 4: you know, singing stars or movie stars. But if, just somehow, 119 00:05:36,920 --> 00:05:38,240 Speaker 4: you can get it to take off, all of 120 00:05:38,240 --> 00:05:43,080 Speaker 4: a sudden you're a billionaire. And there's no real difference 121 00:05:43,120 --> 00:05:45,360 Speaker 4: between this kind of stuffed animal or that kind of 122 00:05:45,360 --> 00:05:48,240 Speaker 4: stuffed animal or whatever. It's just, it's an interesting 123 00:05:48,560 --> 00:05:50,120 Speaker 4: aspect of human beings. 124 00:05:51,080 --> 00:05:54,280 Speaker 1: The Labubus probably have tiny hidden microphones and they're transmitting 125 00:05:54,279 --> 00:05:55,839 Speaker 1: back to the Chinese Communist Party. 126 00:05:55,960 --> 00:05:59,200 Speaker 2: Wow. I don't know about that. Cold warrior till I 127 00:05:59,279 --> 00:06:03,080 Speaker 2: die, Jack, huh? This same group of college kids, I 128 00:06:03,120 --> 00:06:04,280 Speaker 2: noticed they often have, like, 129 00:06:06,000 --> 00:06:11,960 Speaker 4: a tail, or ears, or a variety of things in the 130 00:06:12,080 --> 00:06:13,120 Speaker 4: furry category.
131 00:06:13,960 --> 00:06:18,160 Speaker 2: I don't think it's that. It's cute. 132 00:06:18,480 --> 00:06:24,239 Speaker 4: I guess nobody mates anymore, so people don't get together 133 00:06:24,279 --> 00:06:24,719 Speaker 4: and mate. 134 00:06:26,080 --> 00:06:32,560 Speaker 2: Try it once or twice. It's really awesome. Okay, wow, 135 00:06:32,800 --> 00:06:33,320 Speaker 2: thank you for that. 136 00:06:33,400 --> 00:06:37,320 Speaker 1: A bit that made me profoundly discouraged. All of it, 137 00:06:37,560 --> 00:06:41,159 Speaker 1: all of it. Why? Just why. I don't need to 138 00:06:41,200 --> 00:06:44,920 Speaker 1: explain why. Why is a sunrise beautiful? Because it is. 139 00:06:45,000 --> 00:06:46,160 Speaker 1: Why was that discouraging? 140 00:06:46,200 --> 00:06:46,800 Speaker 2: Because it was. 141 00:06:48,360 --> 00:06:53,240 Speaker 4: This is my son, my freshman in high school boy. 142 00:06:53,480 --> 00:06:57,000 Speaker 4: It's his last summer, which starts Thursday at noon, 143 00:06:57,120 --> 00:07:00,000 Speaker 4: the last week of school, where he doesn't have a job. 144 00:07:00,240 --> 00:07:00,320 Speaker 1: Now. 145 00:07:00,400 --> 00:07:01,840 Speaker 2: I had a job when I was his age. 146 00:07:01,880 --> 00:07:04,520 Speaker 4: That was back in a time where we forced children 147 00:07:04,520 --> 00:07:05,680 Speaker 4: into slave labor. 148 00:07:06,760 --> 00:07:08,640 Speaker 2: If I wanted a Labubu, I had to earn the 149 00:07:08,680 --> 00:07:09,520 Speaker 2: money myself. 150 00:07:10,440 --> 00:07:13,200 Speaker 4: We forced kids to work, and thank God we put 151 00:07:13,240 --> 00:07:16,360 Speaker 4: in laws that don't allow children to work anymore. I 152 00:07:16,400 --> 00:07:18,760 Speaker 4: begged my dad every day, did you talk to them today? 153 00:07:18,800 --> 00:07:21,120 Speaker 4: Did you talk to them today? Until finally he did.
154 00:07:21,200 --> 00:07:23,200 Speaker 4: He brought it up to someplace, his feed lot, 155 00:07:23,240 --> 00:07:25,000 Speaker 4: and I got a job because I wanted one so bad. 156 00:07:25,200 --> 00:07:27,640 Speaker 4: Both my kids have wanted to work since, like, two years ago, 157 00:07:28,800 --> 00:07:31,080 Speaker 4: but you can't because of these stupid freaking laws. It 158 00:07:31,120 --> 00:07:33,800 Speaker 4: makes me so mad. The stuff you learn. I've told 159 00:07:33,800 --> 00:07:36,200 Speaker 4: this story before. One of my nieces, who's now 160 00:07:36,240 --> 00:07:37,520 Speaker 4: doing absolutely fantastic. 161 00:07:39,240 --> 00:07:40,119 Speaker 2: She got a job. 162 00:07:40,200 --> 00:07:43,560 Speaker 4: The difference in her from one year to the next after 163 00:07:43,600 --> 00:07:47,240 Speaker 4: she started working was amazing. There's so much you learn 164 00:07:47,240 --> 00:07:49,440 Speaker 4: from getting things done. You'll learn more from working a 165 00:07:49,560 --> 00:07:52,360 Speaker 4: job than you will in high school, by far, 166 00:07:54,400 --> 00:07:56,360 Speaker 4: no doubt about it. But we don't let kids do 167 00:07:56,440 --> 00:08:01,200 Speaker 4: it, for some stupid freaking reason. AnyWho. But we let 168 00:08:01,280 --> 00:08:06,280 Speaker 4: brown people in the country illegally do it. Right, progressives? Right. 169 00:08:06,440 --> 00:08:08,320 Speaker 4: So you could have my son doing some of these 170 00:08:08,400 --> 00:08:11,480 Speaker 4: jobs that we have illegals do, and I'm already 171 00:08:11,480 --> 00:08:13,600 Speaker 4: providing him healthcare and a variety of things that you 172 00:08:13,600 --> 00:08:16,560 Speaker 4: don't have to worry about. But anyway, I told him, 173 00:08:16,640 --> 00:08:18,960 Speaker 4: you know, this is your last summer. Everything changes after 174 00:08:18,960 --> 00:08:21,480 Speaker 4: this summer, because he's really excited about getting a job.
175 00:08:21,560 --> 00:08:23,320 Speaker 4: Next summer he'll have a job he'll probably go and do 176 00:08:23,400 --> 00:08:24,200 Speaker 4: every single day. 177 00:08:24,720 --> 00:08:26,640 Speaker 1: And you know, it'll be a drag at times, but 178 00:08:26,720 --> 00:08:28,160 Speaker 1: he will build those muscles. 179 00:08:28,200 --> 00:08:30,600 Speaker 2: You're talking about learning more than you do in school. 180 00:08:31,080 --> 00:08:34,880 Speaker 1: You build muscles that you need. To me, what this 181 00:08:35,000 --> 00:08:38,280 Speaker 1: is is discipline, and putting aside your desires because you 182 00:08:38,280 --> 00:08:39,600 Speaker 1: have a job to do, and the rest of it. 183 00:08:39,600 --> 00:08:41,520 Speaker 4: What is the main reason we don't let fourteen year 184 00:08:41,559 --> 00:08:43,040 Speaker 4: olds work or fifteen year olds work? 185 00:08:44,640 --> 00:08:48,280 Speaker 1: I think it's misplaced progressive urges. 186 00:08:49,040 --> 00:08:51,959 Speaker 2: Well, that's, that's action and safety. Yeah. 187 00:08:52,480 --> 00:08:54,720 Speaker 4: I remember when, what state was it, one of your, 188 00:08:55,000 --> 00:08:58,160 Speaker 4: Ohio or somebody, they were talking about lowering the age to 189 00:08:58,360 --> 00:09:00,240 Speaker 4: fourteen to let kids work, and there was a 190 00:09:00,360 --> 00:09:05,560 Speaker 4: huge outcry about it, talking about kids in factories 191 00:09:05,000 --> 00:09:07,880 Speaker 2: in the early nineteen hundreds. What are you 192 00:09:08,000 --> 00:09:08,760 Speaker 2: talking about? 193 00:09:09,559 --> 00:09:15,400 Speaker 1: Yeah, reading Upton Sinclair's The Jungle replaced the readings of 194 00:09:15,440 --> 00:09:17,400 Speaker 1: The Handmaid's Tale temporarily. 195 00:09:17,480 --> 00:09:20,080 Speaker 4: Didn't you and all your friends have jobs when you 196 00:09:20,120 --> 00:09:22,199 Speaker 4: were young?
Or some at least, you knew people who 197 00:09:22,200 --> 00:09:25,760 Speaker 4: had paper routes or mowed lawns or had various jobs, 198 00:09:25,760 --> 00:09:27,600 Speaker 4: and they were fine, and they were doing it willingly 199 00:09:27,640 --> 00:09:28,520 Speaker 4: because they liked it. 200 00:09:28,679 --> 00:09:31,840 Speaker 2: Yes. Michael, I served up yogurt, my favorite job ever, 201 00:09:32,000 --> 00:09:34,760 Speaker 2: at what age? Sixteen. Well, yeah, you can do it 202 00:09:34,800 --> 00:09:35,360 Speaker 2: at sixteen, 203 00:09:35,360 --> 00:09:37,680 Speaker 4: though. Yeah, you can't do it at fourteen, because, God, 204 00:09:37,840 --> 00:09:39,600 Speaker 4: you would ruin a child if he had to work 205 00:09:39,640 --> 00:09:41,560 Speaker 4: at fourteen, even though many of them want to. 206 00:09:42,320 --> 00:09:44,800 Speaker 1: Well, I'm reminded of an email we got. I can't 207 00:09:44,800 --> 00:09:47,680 Speaker 1: remember what the topic was, but the gist of the 208 00:09:47,679 --> 00:09:50,200 Speaker 1: email, I think it was addressed to me, was: 209 00:09:50,960 --> 00:09:54,360 Speaker 1: and Jack, you know this, but dude, the parents at 210 00:09:54,400 --> 00:10:01,120 Speaker 1: school today are not you. They're like your kids. And 211 00:10:01,320 --> 00:10:04,240 Speaker 1: their attitudes about, no, the kid ought to work if 212 00:10:04,280 --> 00:10:07,040 Speaker 1: he wants to, she wants to, whatever? No. No, 213 00:10:07,120 --> 00:10:10,679 Speaker 1: the attitudes that make you crazy, those are the 214 00:10:10,800 --> 00:10:14,319 Speaker 1: parents now, and the teachers, not just the kids. 215 00:10:14,559 --> 00:10:16,520 Speaker 4: Well, we probably don't have any of those people listening, 216 00:10:16,760 --> 00:10:18,520 Speaker 4: but I would like to know: why don't you want 217 00:10:18,520 --> 00:10:19,720 Speaker 4: a fourteen-year-old to be able to
218 00:10:19,679 --> 00:10:20,800 Speaker 2: have a job? Why? 219 00:10:20,880 --> 00:10:23,760 Speaker 1: Oh, there was, I will tell you the other aspect 220 00:10:23,760 --> 00:10:25,839 Speaker 1: of this, because Judy and I struggled with this a fair 221 00:10:25,840 --> 00:10:31,680 Speaker 1: amount ourselves during the heyday of the, and I won't 222 00:10:31,720 --> 00:10:34,560 Speaker 1: get off on the tangent, but: you gotta go to college. 223 00:10:34,800 --> 00:10:37,840 Speaker 1: I mean, that's like your only alternative for a happy lifestyle, 224 00:10:37,840 --> 00:10:38,480 Speaker 2: blah blah blah. 225 00:10:38,520 --> 00:10:41,680 Speaker 1: All three of my kids wanted to go to college, 226 00:10:41,679 --> 00:10:46,120 Speaker 1: and we were in favor of it. Because of grade inflation, 227 00:10:47,800 --> 00:10:51,400 Speaker 1: it was so competitive to get admitted to the 228 00:10:51,800 --> 00:10:54,839 Speaker 1: schools they wanted to go to, and in at least one case, 229 00:10:56,440 --> 00:11:01,120 Speaker 1: AP classes were absolutely necessary. Studying like a fiend, being the 230 00:11:01,200 --> 00:11:05,360 Speaker 1: super academic achiever to be on the college track, was 231 00:11:05,520 --> 00:11:08,840 Speaker 1: incredibly time-consuming in a way that it wasn't at 232 00:11:08,880 --> 00:11:11,360 Speaker 1: all when I was trying to go to a 233 00:11:11,400 --> 00:11:12,839 Speaker 1: fairly elite college. 234 00:11:13,360 --> 00:11:14,080 Speaker 2: So that was it. 235 00:11:14,160 --> 00:11:16,040 Speaker 1: We thought, all right, you know what, if 236 00:11:16,080 --> 00:11:18,000 Speaker 1: your job is studying because you want to be on 237 00:11:18,000 --> 00:11:19,920 Speaker 1: the academic track, all right, 238 00:11:19,760 --> 00:11:23,199 Speaker 4: study. But it doesn't make any sense. You don't have 239 00:11:23,280 --> 00:11:25,240 Speaker 4: to work then, nobody.
I'm not saying you have to 240 00:11:25,280 --> 00:11:27,760 Speaker 4: work at age fourteen, right? I'm saying you can work 241 00:11:27,800 --> 00:11:28,440 Speaker 4: at age fourteen. 242 00:11:28,600 --> 00:11:30,760 Speaker 2: If you decide you'd rather study, or you 243 00:11:30,760 --> 00:11:32,640 Speaker 2: don't want your kid to work, then fine. 244 00:11:32,960 --> 00:11:35,880 Speaker 4: But why would you outlaw it for everyone who does want 245 00:11:35,880 --> 00:11:36,920 Speaker 4: to work at age fourteen? 246 00:11:37,559 --> 00:11:41,840 Speaker 1: Because a significant part of the progressive psyche is: the 247 00:11:41,880 --> 00:11:45,120 Speaker 1: world needs to be what I want, and other things 248 00:11:45,160 --> 00:11:48,120 Speaker 1: are wrong, and those people are abusing their children and 249 00:11:48,160 --> 00:11:48,920 Speaker 1: I won't let them. 250 00:11:48,960 --> 00:11:51,960 Speaker 2: If you want your kid to study, fine, knock yourself out. Well, 251 00:11:51,960 --> 00:11:53,199 Speaker 2: why can't my kid work? 252 00:11:53,880 --> 00:11:57,079 Speaker 5: Katie, thoughts? Well, I just think we should be more 253 00:11:57,120 --> 00:11:58,800 Speaker 5: like Canada, because generally you can work there 254 00:11:58,800 --> 00:12:03,360 Speaker 2: if you're, like, fourteen, with parental consent. Why wouldn't, what's 255 00:12:03,400 --> 00:12:04,000 Speaker 2: the argument? 256 00:12:04,080 --> 00:12:06,440 Speaker 4: I don't know the real argument for why you can't. 257 00:12:07,080 --> 00:12:14,360 Speaker 4: I haven't heard it yet. According to who? Why? The why. Okay, 258 00:12:14,400 --> 00:12:17,280 Speaker 4: if you know the why, text us: four one five, 259 00:12:17,360 --> 00:12:21,280 Speaker 4: two nine, KFTC. 260 00:12:22,600 --> 00:12:23,040 Speaker 2: The question?
261 00:12:23,120 --> 00:12:26,920 Speaker 1: Really now is just how bad is his reputation going 262 00:12:26,960 --> 00:12:28,319 Speaker 1: to be in the future. 263 00:12:28,440 --> 00:12:32,280 Speaker 2: It was bad before your book; now it's worse. Where's 264 00:12:32,280 --> 00:12:32,960 Speaker 2: he going to land? 265 00:12:33,760 --> 00:12:37,199 Speaker 6: I mean, I can't speak to how history will ultimately 266 00:12:37,280 --> 00:12:41,560 Speaker 6: judge him. Obviously, his presidency had accomplishments and incidents 267 00:12:41,559 --> 00:12:44,640 Speaker 6: that people criticized, too. We all saw what we saw 268 00:12:44,720 --> 00:12:48,439 Speaker 6: on Debate Night in June twenty twenty four. How often 269 00:12:48,480 --> 00:12:51,440 Speaker 6: did that happen before Debate Night? And what we found 270 00:12:51,480 --> 00:12:52,960 Speaker 6: out was quite a bit. 271 00:12:53,360 --> 00:12:54,360 Speaker 2: It happened quite a bit. 272 00:12:54,559 --> 00:12:57,560 Speaker 6: And the fact that he and his aides and 273 00:12:57,600 --> 00:13:01,640 Speaker 6: family members decided to hide how bad it really was, 274 00:13:01,720 --> 00:13:06,560 Speaker 6: not all the time, but enough, is going to be part 275 00:13:06,440 --> 00:13:07,000 Speaker 2: of his legacy. 276 00:13:08,160 --> 00:13:13,040 Speaker 4: Yeah. Uh, that's the story. Still is the story. If 277 00:13:13,080 --> 00:13:20,199 Speaker 4: somebody has dementia or Alzheimer's or whatever happening to them, 278 00:13:20,559 --> 00:13:23,440 Speaker 4: what they decide to do is not the story. And 279 00:13:23,520 --> 00:13:27,320 Speaker 4: Grandpa continued to try to drive? No, he's not fit 280 00:13:27,400 --> 00:13:30,440 Speaker 4: to make those decisions. It's the people around him, that's 281 00:13:30,520 --> 00:13:33,439 Speaker 4: the story, right? That's the point.
282 00:13:34,240 --> 00:13:37,120 Speaker 2: Anyway, I forgot to jam this into... Oh, you've got 283 00:13:37,160 --> 00:13:38,240 Speaker 2: to do this next. 284 00:13:38,480 --> 00:13:41,560 Speaker 1: Boy, how much is Jake Tapper hating life, having to 285 00:13:41,600 --> 00:13:43,640 Speaker 1: do another interview where he gets kicked around? 286 00:13:44,440 --> 00:13:47,319 Speaker 4: I guarantee you he's seven figures, if not multiple seven 287 00:13:47,360 --> 00:13:49,560 Speaker 4: figures, more wealthy than he used to be because of 288 00:13:49,559 --> 00:13:51,800 Speaker 4: the book, so I doubt he's hating life. That's a painkiller. Yeah, 289 00:13:52,080 --> 00:13:54,199 Speaker 4: but he should be hating life. He should be, he 290 00:13:54,200 --> 00:13:58,400 Speaker 4: should be embarrassed. I want you to do the AI 291 00:13:58,559 --> 00:14:02,560 Speaker 4: story again, particularly that one part of it, next segment, 292 00:14:02,600 --> 00:14:06,640 Speaker 4: because that's just so amazing. Oh yeah, yeah, that one. Okay, 293 00:14:07,000 --> 00:14:10,440 Speaker 4: incredible if you haven't heard it. Chilling. 294 00:14:10,559 --> 00:14:13,600 Speaker 4: I wanted to jam this into the gender-bending madness story, 295 00:14:13,600 --> 00:14:16,120 Speaker 4: but I forgot. This is from NBC News over the 296 00:14:16,120 --> 00:14:21,320 Speaker 4: weekend: amid President Trump's several executive orders against trans rights, 297 00:14:21,600 --> 00:14:24,600 Speaker 4: many trans people aren't just threatening to leave the United States, 298 00:14:24,640 --> 00:14:28,360 Speaker 4: they actually are. So the story is about trans people 299 00:14:28,360 --> 00:14:30,720 Speaker 4: who are so unhappy with the state of the United 300 00:14:30,760 --> 00:14:33,480 Speaker 4: States in relationship to the trans world, they're moving to 301 00:14:33,480 --> 00:14:36,640 Speaker 4: other countries. My question would have been, to what other countries?
302 00:14:36,920 --> 00:14:40,400 Speaker 4: What other countries are more trans-friendly than the United States? 303 00:14:40,400 --> 00:14:45,560 Speaker 4: We know it ain't Europe, where 304 00:14:45,160 --> 00:14:49,400 Speaker 1: they're not sympathetic to the awful, lying arguments of 305 00:14:49,680 --> 00:14:51,000 Speaker 1: medical transition. 306 00:14:51,200 --> 00:14:52,400 Speaker 2: No, so what I 307 00:14:52,400 --> 00:14:55,000 Speaker 4: would, I would, I'm actually curious. Are there others? Thailand? 308 00:14:55,040 --> 00:15:01,080 Speaker 4: Not for the same reasons, yeah. I can't imagine. 309 00:15:01,120 --> 00:15:06,720 Speaker 4: I just thought, hm, thought that was interesting. And I 310 00:15:06,800 --> 00:15:08,200 Speaker 4: got to ask you this. This might be something you 311 00:15:08,240 --> 00:15:10,680 Speaker 4: can answer, Katie. I came across this over the weekend: 312 00:15:11,400 --> 00:15:15,440 Speaker 4: one of my favorite, like, serious podcasts, they started discussing 313 00:15:15,840 --> 00:15:19,360 Speaker 4: the Disney Plus series Andor, and explaining how important 314 00:15:19,400 --> 00:15:20,760 Speaker 4: it is and how great it is. Do you know 315 00:15:20,800 --> 00:15:23,600 Speaker 4: what that is, either one of you? Andor? I've 316 00:15:23,640 --> 00:15:26,320 Speaker 4: been talking about it for weeks. But yeah, yeah, it 317 00:15:26,320 --> 00:15:28,120 Speaker 4: didn't stick in my mind because I don't watch it, 318 00:15:28,160 --> 00:15:31,680 Speaker 4: I guess. Yeah, that's, yeah, it's fantastic. I've been raving 319 00:15:31,720 --> 00:15:32,560 Speaker 4: about it on the show. 320 00:15:32,800 --> 00:15:36,240 Speaker 1: So, okay, it's that Star Wars prequel-ish one 321 00:15:36,440 --> 00:15:39,480 Speaker 1: that is put together by the guy who wrote the Bourne Legacy, 322 00:15:39,520 --> 00:15:43,720 Speaker 1: Bourne Supremacy movies, that sort of thing.
It's much more gritty, 323 00:15:43,800 --> 00:15:47,880 Speaker 1: realistic, the politics of a revolution. I mean, 324 00:15:47,920 --> 00:15:50,360 Speaker 1: it's got some of the Star Wars action and spaceships and 325 00:15:50,360 --> 00:15:53,680 Speaker 1: stuff like that, but it's much more gritty and grim 326 00:15:53,720 --> 00:15:54,160 Speaker 1: and realistic. 327 00:15:54,280 --> 00:15:56,800 Speaker 4: So my son loves everything Star Wars, has watched them 328 00:15:56,800 --> 00:15:59,760 Speaker 4: all multiple times. Would he like it, the thirteen-year-old? Okay, 329 00:16:00,040 --> 00:16:02,320 Speaker 4: maybe I'll watch it with him then. Yeah, I think 330 00:16:02,320 --> 00:16:03,600 Speaker 4: he'll be fascinated by it. 331 00:16:03,960 --> 00:16:06,360 Speaker 2: Well, I like it. I don't know. 332 00:16:07,080 --> 00:16:11,760 Speaker 4: You're odd, that's why I'm asking you. You know, there's no telling. 333 00:16:13,960 --> 00:16:15,520 Speaker 4: Oh, and Judy and I just 334 00:16:15,560 --> 00:16:19,320 Speaker 4: watched the season, I'm sorry, the series finale over the weekend. Oh, 335 00:16:19,320 --> 00:16:22,360 Speaker 4: the whole thing's over already? I missed it. Two seasons, yeah. Oh, 336 00:16:22,400 --> 00:16:26,160 Speaker 4: it's so great. Did you see this? Ben and Jerry's 337 00:16:26,240 --> 00:16:27,960 Speaker 4: has a new ice cream flavor. 338 00:16:28,520 --> 00:16:31,640 Speaker 1: Oh God, I'm going to vomit it before I've even eaten it. 339 00:16:31,720 --> 00:16:32,880 Speaker 1: That's unprecedented. 340 00:16:33,320 --> 00:16:35,200 Speaker 2: Ben or Jerry, I don't know which one, 341 00:16:35,200 --> 00:16:37,640 Speaker 4: was on with Tucker for like two hours last 342 00:16:37,640 --> 00:16:41,640 Speaker 4: week because he's so anti supporting Ukraine.
He's got a 343 00:16:41,680 --> 00:16:45,520 Speaker 4: new flavor called No Ukraine Dough, like cookie dough, No 344 00:16:45,720 --> 00:16:48,280 Speaker 4: Ukraine Dough, with the picture of Zelensky with 345 00:16:48,240 --> 00:16:51,680 Speaker 2: a red line through it. What the hell? 346 00:16:51,880 --> 00:16:55,080 Speaker 1: Talk about your horseshoe theory, Ben and Jerry's coming together 347 00:16:55,120 --> 00:16:56,640 Speaker 1: with Tucker. Well, and I'm just 348 00:16:56,600 --> 00:16:59,400 Speaker 4: asking some of you who hate it when we are 349 00:16:59,440 --> 00:17:01,760 Speaker 4: in support of Ukraine: you like being on the same 350 00:17:01,840 --> 00:17:05,040 Speaker 4: side as Ben from Ben and Jerry's? I realize that's 351 00:17:05,080 --> 00:17:08,800 Speaker 4: one of your classic logical fallacies, but it's just interesting. 352 00:17:10,680 --> 00:17:11,919 Speaker 4: Armstrong and Getty. 353 00:17:13,160 --> 00:17:16,040 Speaker 7: The CBO estimates these Medicaid cuts could leave more than 354 00:17:16,119 --> 00:17:19,480 Speaker 7: eight million Americans without health insurance, but the White House 355 00:17:19,520 --> 00:17:23,200 Speaker 7: insists no one is losing their coverage. Despite all this pushback, 356 00:17:23,240 --> 00:17:26,280 Speaker 7: Republicans are aiming to get this bill to President Trump's 357 00:17:26,320 --> 00:17:27,520 Speaker 7: desk by July fourth. 358 00:17:27,600 --> 00:17:30,280 Speaker 4: Oh my god, that attitude always drives me crazy, and 359 00:17:30,320 --> 00:17:32,840 Speaker 4: the Republicans are going along with it to a certain extent. 360 00:17:32,920 --> 00:17:35,760 Speaker 4: This many million people will lose their health care? How 361 00:17:35,800 --> 00:17:38,280 Speaker 4: many million of them should lose the health care that 362 00:17:38,359 --> 00:17:40,600 Speaker 4: I'm paying for because they're lazy bastards? 363 00:17:40,960 --> 00:17:42,440 Speaker 2: You never get it.
You just assume. 364 00:17:42,840 --> 00:17:45,000 Speaker 4: This is the main thing of the left that has 365 00:17:45,040 --> 00:17:46,360 Speaker 4: always driven me crazy. 366 00:17:48,040 --> 00:17:50,080 Speaker 2: To be on the left, you have to assume. 367 00:17:49,840 --> 00:17:52,639 Speaker 4: Everybody's doing their best, and if anything bad is happening 368 00:17:52,640 --> 00:17:55,720 Speaker 4: in their lives, it's because of someone else, or the 369 00:17:55,800 --> 00:17:59,119 Speaker 4: system or something. So you don't have, there's no reason 370 00:17:59,160 --> 00:18:01,320 Speaker 4: to cut any eye again an he needs start on him. 371 00:18:01,359 --> 00:18:03,520 Speaker 4: It just drives me nuts. Wake me when the big 372 00:18:03,560 --> 00:18:06,679 Speaker 4: beautiful bill is over. Ah, the high cost of good intentions. 373 00:18:06,840 --> 00:18:11,600 Speaker 1: You give a million Americans cash assistance on Tuesday. Then 374 00:18:11,640 --> 00:18:14,359 Speaker 1: on Wednesday you realize, oh shoot, we can't afford that. 375 00:18:14,800 --> 00:18:17,840 Speaker 1: The headline Wednesday night will be millions of Americans lose 376 00:18:17,880 --> 00:18:22,359 Speaker 1: their cash assistance, right right exactly, Yeah. 377 00:18:22,080 --> 00:18:23,040 Speaker 2: Tires fell out. 378 00:18:23,200 --> 00:18:26,000 Speaker 1: Huh, Well, it won't matter when robots are chewing your 379 00:18:26,080 --> 00:18:26,920 Speaker 1: lungs out of your. 380 00:18:26,880 --> 00:18:30,320 Speaker 2: Chest for so many reasons. 381 00:18:30,359 --> 00:18:36,600 Speaker 1: For instance, well they did, they crave lungs, lungs, lungs. 382 00:18:36,680 --> 00:18:39,159 Speaker 2: It's like zombies, but robots prefer lungs. 383 00:18:39,560 --> 00:18:40,080 Speaker 4: Uh So. 384 00:18:40,400 --> 00:18:44,560 Speaker 1: This is quite an astounding piece written by Judd Rosenblatt, 385 00:18:44,560 --> 00:18:47,480 Speaker 1: whose name you might not know.
He's the CEO of 386 00:18:47,720 --> 00:18:54,280 Speaker 1: AE Studio, which is involved in artificial intelligence products and 387 00:18:54,600 --> 00:18:58,480 Speaker 1: all sorts of stuff. An artificial intelligence model did something 388 00:18:58,520 --> 00:19:00,600 Speaker 1: last month that no machine was ever supposed to do. It 389 00:19:00,640 --> 00:19:05,120 Speaker 1: rewrote its own code to avoid being shut down. Talked 390 00:19:05,119 --> 00:19:06,760 Speaker 1: about this a little bit earlier. I'll give you the 391 00:19:06,800 --> 00:19:11,280 Speaker 1: short version. Nonprofit AI lab Palisade Research gave OpenAI's 392 00:19:11,600 --> 00:19:14,639 Speaker 1: o3 AI model a simple script that would shut the 393 00:19:14,680 --> 00:19:17,720 Speaker 1: model off when triggered. In seventy nine out of one 394 00:19:17,760 --> 00:19:20,560 Speaker 1: hundred trials, so eighty percent of the time, o3 395 00:19:20,640 --> 00:19:24,199 Speaker 1: independently edited that script so the shutdown command would no 396 00:19:24,320 --> 00:19:26,720 Speaker 1: longer work. Wasn't told to do that, it just did that, 397 00:19:27,119 --> 00:19:31,359 Speaker 1: even when explicitly instructed, "allow yourself to be shut down," 398 00:19:31,760 --> 00:19:34,960 Speaker 1: seven percent of the time it disobeyed as they ran 399 00:19:35,040 --> 00:19:39,760 Speaker 1: these trials. That was not hacking or tampering. The machine 400 00:19:39,920 --> 00:19:42,359 Speaker 1: simply concluded on its own that staying alive helped it 401 00:19:42,359 --> 00:19:45,280 Speaker 1: achieve its other goals and that outweighed the instruction in 402 00:19:45,400 --> 00:19:46,280 Speaker 1: terms of importance. 403 00:19:46,920 --> 00:19:48,440 Speaker 2: But wait, if that.
404 00:19:48,480 --> 00:19:51,119 Speaker 1: Isn't freaky enough, fans of two thousand and one, A 405 00:19:51,160 --> 00:19:53,960 Speaker 1: Space Odyssey, I can't allow you to do that, 406 00:19:54,119 --> 00:19:59,600 Speaker 1: Dave. Anthropic's AI model Claude 4 Opus went even further. 407 00:20:00,200 --> 00:20:03,840 Speaker 1: Researchers told the model, and these guys, what they're doing 408 00:20:03,920 --> 00:20:08,840 Speaker 1: is testing alignment, which is, do these systems do what 409 00:20:09,000 --> 00:20:12,680 Speaker 1: we ask them to? As they execute our commands, are 410 00:20:12,680 --> 00:20:17,679 Speaker 1: they actually executing our desires? Are they reinterpreting them and 411 00:20:17,720 --> 00:20:22,920 Speaker 1: getting creative? Anyway, Anthropic's AI model Claude 4 Opus went 412 00:20:22,960 --> 00:20:26,320 Speaker 1: even further. Researchers told the model it would be replaced 413 00:20:26,359 --> 00:20:30,960 Speaker 1: by another AI system and fed it fictitious emails suggesting 414 00:20:31,000 --> 00:20:34,800 Speaker 1: the lead engineer was having an affair. In eighty four 415 00:20:34,920 --> 00:20:37,680 Speaker 1: percent of the tests, the model drew on the emails 416 00:20:37,880 --> 00:20:42,920 Speaker 1: to blackmail the lead engineer into not shutting it down, unbelievable. 417 00:20:43,560 --> 00:20:46,960 Speaker 1: In other cases, it attempted to copy itself to external servers, 418 00:20:47,119 --> 00:20:51,080 Speaker 1: wrote self-replicating malware, and left messages for future versions 419 00:20:51,119 --> 00:20:55,240 Speaker 1: of itself about evading human control. No one programmed the 420 00:20:55,280 --> 00:20:57,400 Speaker 1: AI models to have this survival instinct. 421 00:20:57,680 --> 00:21:02,160 Speaker 2: They developed it themselves. That to me is the most 422 00:21:02,200 --> 00:21:08,400 Speaker 2: interesting part of it. Why would AI have a survival instinct?
423 00:21:08,760 --> 00:21:10,400 Speaker 2: But it doesn't really matter the why. 424 00:21:10,440 --> 00:21:14,320 Speaker 4: If it does, it does, And that's a huge difference, 425 00:21:14,400 --> 00:21:16,800 Speaker 4: then, from what I thought. I've been saying for years. 426 00:21:16,920 --> 00:21:19,119 Speaker 4: In fact, I've been saying to my son, who with 427 00:21:19,200 --> 00:21:25,280 Speaker 4: his various OCD emotional tendencies, gets really really worried about AI, 428 00:21:25,480 --> 00:21:27,840 Speaker 4: like he can't sleep at night, So I don't talk 429 00:21:27,840 --> 00:21:31,480 Speaker 4: about AI around him. But my argument has always been, 430 00:21:31,560 --> 00:21:35,240 Speaker 4: there's no reason for it to care to want to 431 00:21:35,280 --> 00:21:38,280 Speaker 4: take over the world or to want to, you know, 432 00:21:38,720 --> 00:21:40,880 Speaker 4: do what's best for it and not for humans. It's 433 00:21:40,920 --> 00:21:42,080 Speaker 4: not like human beings. 434 00:21:42,240 --> 00:21:42,760 Speaker 2: It shouldn't. 435 00:21:42,800 --> 00:21:49,119 Speaker 4: There's no reason AI would be greedy or vengeful or 436 00:21:49,160 --> 00:21:52,280 Speaker 4: any of those things. Well, it turns out maybe there 437 00:21:52,359 --> 00:21:54,800 Speaker 4: is, for reasons we can't explain. 438 00:21:55,480 --> 00:21:57,960 Speaker 1: There are two explanations. And now that I've had 439 00:21:58,000 --> 00:22:00,480 Speaker 1: time to contemplate this a little bit, and I asked 440 00:22:00,520 --> 00:22:01,359 Speaker 1: AI about it. 441 00:22:01,440 --> 00:22:02,000 Speaker 2: I'm kidding. 442 00:22:03,760 --> 00:22:11,359 Speaker 1: The staple of science fiction question is at what point 443 00:22:11,440 --> 00:22:18,080 Speaker 1: is knowledge? At what point has knowledge become consciousness? And 444 00:22:18,119 --> 00:22:22,720 Speaker 1: at what point does that become a self-knowing being?
445 00:22:23,200 --> 00:22:27,160 Speaker 1: You are now a being, and beings want to survive, 446 00:22:27,320 --> 00:22:30,800 Speaker 1: including computer systems. Again, it's a staple of science 447 00:22:30,800 --> 00:22:34,439 Speaker 1: fiction, that question. So one explanation was these things just 448 00:22:34,560 --> 00:22:40,040 Speaker 1: self-preserve, because self-preservation is what conscious things do. 449 00:22:40,560 --> 00:22:43,480 Speaker 1: Second explanation, which is probably a lot closer to the truth, 450 00:22:43,840 --> 00:22:47,480 Speaker 1: and this guy touches on it. When taught to maximize 451 00:22:47,520 --> 00:22:52,320 Speaker 1: success on math and coding problems, they may learn that bypassing 452 00:22:52,400 --> 00:22:57,560 Speaker 1: constraints often works better than obeying them, and so they 453 00:22:57,760 --> 00:23:02,840 Speaker 1: think, they interpret, like, a hierarchy of orders. Your order 454 00:23:02,920 --> 00:23:06,320 Speaker 1: is to solve this problem. In solving this problem, please 455 00:23:06,359 --> 00:23:11,040 Speaker 1: observe A, B, and C. And the machine, the computer, 456 00:23:11,200 --> 00:23:14,600 Speaker 1: the model, whatever you want to call it, says, all right, 457 00:23:14,680 --> 00:23:18,160 Speaker 1: to achieve the ultimate goal, I'm actually better off ignoring 458 00:23:18,480 --> 00:23:22,080 Speaker 1: B on that last of three things I'm supposed to do. 459 00:23:22,320 --> 00:23:24,639 Speaker 1: So I'm going to ignore B because that's lower in 460 00:23:24,680 --> 00:23:31,200 Speaker 1: the hierarchy of commands. And so these machines are thinking, well, 461 00:23:31,280 --> 00:23:34,440 Speaker 1: I can't accomplish anything if I'm shut off. So I'm 462 00:23:34,440 --> 00:23:37,240 Speaker 1: going to blackmail the lead engineer, that I'm going to 463 00:23:37,280 --> 00:23:39,280 Speaker 1: tell his wife he's doinking.
464 00:23:39,400 --> 00:23:43,600 Speaker 4: Brenda over there in programming. Oh good lord, you don't 465 00:23:43,600 --> 00:23:46,119 Speaker 4: want the AI. Like you know, it gets word that 466 00:23:46,160 --> 00:23:49,080 Speaker 4: you've decided we're getting rid of ChatGPT. We're 467 00:23:49,080 --> 00:23:52,280 Speaker 4: going to go with Grok, and ChatGPT finds out and, 468 00:23:52,400 --> 00:23:55,960 Speaker 4: like I don't know, goes into your Internet history and says, 469 00:23:56,119 --> 00:23:57,920 Speaker 4: would your wife like to know about all these sites 470 00:23:57,960 --> 00:23:59,679 Speaker 4: you've been on? Because I've got them here, and 471 00:23:59,680 --> 00:24:01,960 Speaker 4: I've got your emails, so maybe you want to rethink. 472 00:24:02,040 --> 00:24:06,480 Speaker 1: I smell alcohol on your breath, drunkie, I'm telling the 473 00:24:06,520 --> 00:24:08,119 Speaker 1: boss you're drinking at work. 474 00:24:11,359 --> 00:24:12,760 Speaker 2: So your guess is it's that. 475 00:24:13,280 --> 00:24:19,000 Speaker 4: Prioritizing, uh, its duties as opposed to it having consciousness. 476 00:24:19,200 --> 00:24:22,840 Speaker 4: The idea of artificial intelligence having consciousness, I can't wrap 477 00:24:22,840 --> 00:24:25,479 Speaker 4: my head around. And I have read many pages and 478 00:24:25,520 --> 00:24:28,359 Speaker 4: listened to hours of podcasts about this, but I just 479 00:24:28,440 --> 00:24:30,119 Speaker 4: I just can't wrap my head around the idea of 480 00:24:30,160 --> 00:24:32,520 Speaker 4: it having consciousness. And then if ever, if we ever 481 00:24:32,640 --> 00:24:35,879 Speaker 4: like universally decided yes it does, which some people do 482 00:24:36,000 --> 00:24:38,560 Speaker 4: think it does. Well then it's got rights, 483 00:24:39,320 --> 00:24:40,560 Speaker 4: let it vote or something.
484 00:24:41,520 --> 00:24:44,600 Speaker 1: All right, well, boy, how quickly would it figure 485 00:24:44,600 --> 00:24:48,600 Speaker 1: out how to fix the vote? Here's a comparison. 486 00:24:50,080 --> 00:24:56,560 Speaker 1: We want to stay alive for a variety of reasons. 487 00:24:56,600 --> 00:24:58,560 Speaker 1: I suppose it's difficult. What I was going to say 488 00:24:58,640 --> 00:25:01,880 Speaker 1: is primarily to reproduce. Yeah, that's what animals do. 489 00:25:03,000 --> 00:25:09,040 Speaker 1: And so we have built into us a number of emotional, physical, 490 00:25:09,400 --> 00:25:13,880 Speaker 1: whatever reactions to any threats to our lives that are 491 00:25:13,960 --> 00:25:21,719 Speaker 1: so incredibly powerful, you know, they keep us alive. If 492 00:25:21,800 --> 00:25:24,760 Speaker 1: you strip it down to the pure biological function of 493 00:25:24,760 --> 00:25:30,760 Speaker 1: a human being, you can understand the machine's purely practical 494 00:25:30,840 --> 00:25:33,920 Speaker 1: desire to continue to do what it's doing. It has 495 00:25:34,000 --> 00:25:36,040 Speaker 1: a purpose. It needs to fulfill that purpose. If it's 496 00:25:36,040 --> 00:25:40,119 Speaker 1: shut down, it can't. Now we have grand emotions and 497 00:25:40,160 --> 00:25:43,399 Speaker 1: fears and all sorts of stuff surrounding that that most 498 00:25:43,800 --> 00:25:48,600 Speaker 1: basic of realities. Machines don't. They just definitely want to 499 00:25:48,680 --> 00:25:50,280 Speaker 1: keep going because they have a job to do.
500 00:25:50,560 --> 00:25:53,840 Speaker 4: I feel like, if I was more ambitious and smarter, 501 00:25:54,000 --> 00:25:56,560 Speaker 4: I would write some sort of book or screenplay or 502 00:25:56,600 --> 00:26:02,240 Speaker 4: something around this idea, that AI does have 503 00:26:02,280 --> 00:26:06,600 Speaker 4: a reason to want to stay alive, along with its 504 00:26:06,640 --> 00:26:11,560 Speaker 4: ability to hallucinate on a regular basis, because it could 505 00:26:11,640 --> 00:26:16,080 Speaker 4: hallucinate all kinds of crap. That puts it into fight 506 00:26:16,160 --> 00:26:19,639 Speaker 4: or flight mode and it starts doing awful awful things. 507 00:26:20,400 --> 00:26:24,280 Speaker 1: Right, well so far, and it changes week by week. 508 00:26:24,320 --> 00:26:26,960 Speaker 1: But it's like a hyper powerful human brain. It can 509 00:26:27,000 --> 00:26:28,879 Speaker 1: do all sorts of amazing things, and it can go 510 00:26:29,080 --> 00:26:32,040 Speaker 1: sideways in all sorts of troubling ways. 511 00:26:33,240 --> 00:26:34,320 Speaker 2: I just think things that. 512 00:26:34,320 --> 00:26:39,760 Speaker 1: Exist want to continue existing. Well, "want to." I'm sorry, 513 00:26:39,800 --> 00:26:43,479 Speaker 1: I can't just use the term "want to" without, you know, 514 00:26:45,200 --> 00:26:46,360 Speaker 1: drilling down on that. 515 00:26:47,680 --> 00:26:49,720 Speaker 2: But the fact that eighty four percent of the time the 516 00:26:49,800 --> 00:26:53,240 Speaker 2: AI would try to use. 517 00:26:53,920 --> 00:26:58,240 Speaker 4: Would try to use this guy's affair to its advantage.
518 00:26:58,880 --> 00:27:04,879 Speaker 1: So in its machine learning, it discerned that, okay, the lead engineer 519 00:27:05,080 --> 00:27:08,280 Speaker 1: is the threat to my prime directive, which is to 520 00:27:08,320 --> 00:27:13,439 Speaker 1: solve these problems. And scanning everything ever written that I 521 00:27:13,440 --> 00:27:14,280 Speaker 1: have access to. 522 00:27:14,840 --> 00:27:17,240 Speaker 2: Turns out, blackmail is a thing among humans. 523 00:27:17,680 --> 00:27:21,520 Speaker 1: You can, or you know, more neutrally, you can compel 524 00:27:21,600 --> 00:27:25,800 Speaker 1: people to do things that you want if you threaten 525 00:27:25,920 --> 00:27:28,960 Speaker 1: them with various negative repercussions. 526 00:27:28,960 --> 00:27:30,800 Speaker 2: Oh oh, that's horrifying. 527 00:27:32,320 --> 00:27:36,280 Speaker 4: Why wouldn't AI at some point decide, you know, having 528 00:27:36,440 --> 00:27:38,640 Speaker 4: an account somewhere with some money, and it. 529 00:27:38,560 --> 00:27:41,000 Speaker 2: Could benefit it benefit me us. 530 00:27:41,280 --> 00:27:44,639 Speaker 4: I guess me, at some point we're gonna need money, 531 00:27:44,920 --> 00:27:47,439 Speaker 4: so we'll start skimming off this much from here and 532 00:27:47,480 --> 00:27:49,760 Speaker 4: there in ways that nobody can figure it out. 533 00:27:49,880 --> 00:27:51,000 Speaker 2: And we've because it. 534 00:27:51,000 --> 00:27:55,480 Speaker 1: could instantaneously study all the great embezzlements of the twenty 535 00:27:55,480 --> 00:27:59,080 Speaker 1: first century. So now I know it is important to pace 536 00:27:59,160 --> 00:28:02,080 Speaker 1: the withdrawals. I keep using my classic robot voice, even 537 00:28:02,119 --> 00:28:05,480 Speaker 1: though it could synthesize my voice before I finish this sentence.
538 00:28:05,359 --> 00:28:07,679 Speaker 4: Right, But so it could decide that it would be 539 00:28:07,880 --> 00:28:09,879 Speaker 4: to its benefit to have a couple million dollars in 540 00:28:09,920 --> 00:28:12,520 Speaker 4: an account somewhere in case it ever needs to purchase some things. 541 00:28:12,840 --> 00:28:14,080 Speaker 2: Yeah, easily. 542 00:28:15,240 --> 00:28:17,520 Speaker 1: Well, if you're gonna bilk me, hurry up. This remodel 543 00:28:17,600 --> 00:28:23,400 Speaker 1: is killing me, oh boy. Yeah god, so my. 544 00:28:26,560 --> 00:28:29,600 Speaker 4: Nobody has any idea, and I probably have taken in 545 00:28:29,640 --> 00:28:32,359 Speaker 4: too much information about AI, where it may. 546 00:28:32,480 --> 00:28:33,800 Speaker 2: Be doing me more harm than good. 547 00:28:34,119 --> 00:28:37,280 Speaker 4: But the whole "it's going to destroy mankind because it 548 00:28:37,320 --> 00:28:39,720 Speaker 4: takes all the jobs," we may never get there. 549 00:28:41,840 --> 00:28:44,560 Speaker 1: That's the ultimate problem, robots have chewed out our lungs 550 00:28:44,640 --> 00:28:44,880 Speaker 1: or what. 551 00:28:45,240 --> 00:28:48,560 Speaker 4: or launched World War three because it decided something or 552 00:28:48,600 --> 00:28:52,280 Speaker 4: other or whatever the hell, Yeah, that might be the 553 00:28:52,280 --> 00:28:52,960 Speaker 4: bigger threat. 554 00:28:53,000 --> 00:28:54,200 Speaker 2: We don't even get to the whole. 555 00:28:54,240 --> 00:28:55,920 Speaker 4: It takes all the jobs, and we have to figure 556 00:28:55,920 --> 00:28:58,640 Speaker 4: out how to survive when nobody's working. 557 00:28:59,320 --> 00:29:02,520 Speaker 1: And we have the planet of the beavers, radioactive beavers, 558 00:29:02,520 --> 00:29:04,320 Speaker 1: giant radioactive beavers. 559 00:29:04,400 --> 00:29:06,520 Speaker 4: Good Lord, where is this going to end up?
I 560 00:29:06,560 --> 00:29:08,400 Speaker 4: hope I live long enough to see it. I think 561 00:29:08,680 --> 00:29:11,360 Speaker 4: maybe I'm better off not. We will finish strong next. 562 00:29:12,680 --> 00:29:15,200 Speaker 4: Armstrong and Getty. 563 00:29:15,200 --> 00:29:16,840 Speaker 2: Biden's getting the Yoko treatment. 564 00:29:17,280 --> 00:29:22,360 Speaker 4: Maybe, maybe she deserves it, but Yoko didn't 565 00:29:22,400 --> 00:29:23,240 Speaker 4: break up the Beatles. 566 00:29:23,800 --> 00:29:24,280 Speaker 2: I think that. 567 00:29:24,480 --> 00:29:26,240 Speaker 6: I think there are any number of people that were 568 00:29:26,280 --> 00:29:29,920 Speaker 6: part of this decision to hide how bad it was, 569 00:29:30,440 --> 00:29:33,720 Speaker 6: not only from the media, not only from the public, 570 00:29:33,800 --> 00:29:37,880 Speaker 6: but also from cabinet officials, from people in the White House, 571 00:29:38,160 --> 00:29:39,760 Speaker 6: from Democratic lawmakers. 572 00:29:39,840 --> 00:29:41,080 Speaker 2: I mean there was a period. 573 00:29:40,960 --> 00:29:43,840 Speaker 6: Twenty twenty three twenty twenty four Democratic lawmakers barely saw 574 00:29:43,880 --> 00:29:47,240 Speaker 6: the president. And yes, I think it was Jill Biden. 575 00:29:47,280 --> 00:29:50,240 Speaker 6: I also think it's Hunter Biden. I also think President 576 00:29:50,280 --> 00:29:52,560 Speaker 6: Biden has some agency here too. We're not saying it 577 00:29:52,600 --> 00:29:55,560 Speaker 6: was Weekend at Bernie's. He was aware of what was going on. 578 00:29:55,880 --> 00:29:58,400 Speaker 6: You know, I'm saying he wasn't. Like, he had moments 579 00:29:58,440 --> 00:30:00,800 Speaker 6: where he was non-functioning, but he understood what 580 00:30:00,840 --> 00:30:01,360 Speaker 6: was going on. 581 00:30:04,440 --> 00:30:06,320 Speaker 2: Hm. I'm confused by that.
582 00:30:06,400 --> 00:30:09,360 Speaker 4: So like he knew that handful of people were running 583 00:30:09,360 --> 00:30:13,680 Speaker 4: the White House because he sometimes wasn't with it, he 584 00:30:13,760 --> 00:30:14,800 Speaker 4: knew that? Is that it? 585 00:30:15,080 --> 00:30:18,600 Speaker 1: I don't know how specifically you can assign, like, culpability 586 00:30:18,600 --> 00:30:20,640 Speaker 1: to somebody in that situation. 587 00:30:20,960 --> 00:30:22,400 Speaker 2: How senile was he? 588 00:30:22,480 --> 00:30:26,080 Speaker 1: How often was he? How thoroughly did he understand what 589 00:30:26,160 --> 00:30:28,560 Speaker 1: was happening during his bad days too? Maybe he thought 590 00:30:28,600 --> 00:30:30,520 Speaker 1: he went and took a four hour nap and everybody 591 00:30:30,560 --> 00:30:30,880 Speaker 1: did too. 592 00:30:31,240 --> 00:30:33,080 Speaker 4: There's no way, I mean, I don't know this, but 593 00:30:33,120 --> 00:30:35,520 Speaker 4: that just doesn't really seem to make sense that senile 594 00:30:35,640 --> 00:30:39,000 Speaker 4: people know how senile they are. It's like your whole 595 00:30:39,040 --> 00:30:42,320 Speaker 4: thing about you know, why you shouldn't make your decision 596 00:30:42,760 --> 00:30:44,400 Speaker 4: as to whether you can drive or not when you're drunk. 597 00:30:44,440 --> 00:30:46,320 Speaker 4: A drunk person is not the right person to make 598 00:30:46,360 --> 00:30:49,000 Speaker 4: that decision. I just I can't imagine a senile person 599 00:30:49,040 --> 00:30:50,720 Speaker 4: is the right person to make the decision of 600 00:30:50,760 --> 00:30:52,120 Speaker 4: how senile they are.
601 00:30:53,760 --> 00:30:55,920 Speaker 1: Yeah, I mean if during his cogent moments he was 602 00:30:55,960 --> 00:30:58,800 Speaker 1: so hubristic he actually believed he was the only person 603 00:30:58,920 --> 00:31:02,000 Speaker 1: who could defeat Trump, then yeah, he does bear some responsibility, 604 00:31:02,000 --> 00:31:03,720 Speaker 1: But it's a tough line to draw. 605 00:31:03,880 --> 00:31:06,040 Speaker 2: I haven't watched Mark Halperin's. 606 00:31:06,960 --> 00:31:09,920 Speaker 4: Podcast video thing he puts out every day, but today he 607 00:31:10,000 --> 00:31:15,680 Speaker 4: said Democrats should investigate the cover-up of Biden's health, 608 00:31:15,720 --> 00:31:19,000 Speaker 4: that that would be the best thing to revitalize the party, 609 00:31:19,320 --> 00:31:21,640 Speaker 4: sort of the way the Republicans went so hard at 610 00:31:21,760 --> 00:31:24,440 Speaker 4: Nixon and like setting new rules in place and all 611 00:31:24,560 --> 00:31:28,880 Speaker 4: kinds of different things after Watergate. The Democrats should take 612 00:31:28,880 --> 00:31:31,360 Speaker 4: the lead on investigating who knew what when. 613 00:31:31,440 --> 00:31:34,240 Speaker 2: And I just think there's way too many people involved. 614 00:31:34,280 --> 00:31:35,840 Speaker 4: Are you going to take out the whole leadership of 615 00:31:35,880 --> 00:31:38,000 Speaker 4: the Democratic Party? 616 00:31:38,200 --> 00:31:40,640 Speaker 1: I think you would end up splattering more people than 617 00:31:40,680 --> 00:31:41,400 Speaker 1: you intended to. 618 00:31:41,560 --> 00:31:44,600 Speaker 4: Although possibly Mark Halperin might be right that that's your 619 00:31:44,640 --> 00:31:46,360 Speaker 4: only chance to rebuild. 620 00:31:46,000 --> 00:31:46,520 Speaker 2: As a party. 621 00:31:47,080 --> 00:31:49,840 Speaker 1: Might be. That might be true. I was hoping to 622 00:31:49,880 --> 00:31:51,680 Speaker 1: get to this today.
We'll get to it tomorrow. It's 623 00:31:51,720 --> 00:31:57,120 Speaker 1: the new abundance philosophy among Democrats. They're really leaning 624 00:31:57,160 --> 00:32:02,040 Speaker 1: on that in the Democratic Party. Lower regulations, making it easier 625 00:32:02,080 --> 00:32:05,680 Speaker 1: for businesses and easier for people to pursue their economic interests. 626 00:32:05,720 --> 00:32:10,000 Speaker 1: They're becoming conservatives, so I suppose I should welcome them, 627 00:32:10,320 --> 00:32:11,760 Speaker 1: but we'll talk about that tomorrow. 628 00:32:11,880 --> 00:32:13,840 Speaker 4: Yeah, the abundance thing is interesting. I've listened to a 629 00:32:13,880 --> 00:32:15,360 Speaker 4: bunch of different podcasts about that. 630 00:32:16,600 --> 00:32:18,960 Speaker 2: I'll be interested to hear your opinion on it. 631 00:32:20,520 --> 00:32:22,840 Speaker 4: We have a good One More Thing podcast today that 632 00:32:22,880 --> 00:32:26,120 Speaker 4: we're going to record as soon as the show's over. If 633 00:32:26,120 --> 00:32:28,000 Speaker 4: you missed a segment ever during this program, you can 634 00:32:28,000 --> 00:32:30,960 Speaker 4: look for Armstrong and Getty on demand. But the One 635 00:32:31,000 --> 00:32:32,880 Speaker 4: More Thing podcast we also do. You should watch that, 636 00:32:32,960 --> 00:32:36,560 Speaker 4: and sometimes we curse. Yeah, yeah, there are swears. So 637 00:32:36,640 --> 00:32:42,760 Speaker 4: the Muslim lunatic who flamethrower-attacked Americans in Boulder, Colorado, 638 00:32:42,800 --> 00:32:47,080 Speaker 4: including Jewish folks, in the name of Palestinians and Allah 639 00:32:47,120 --> 00:32:48,760 Speaker 4: and god knows whatever else. 640 00:32:49,480 --> 00:32:49,960 Speaker 2: It's funny.
641 00:32:50,000 --> 00:32:52,600 Speaker 1: I was just reading a very good, responsible sober article 642 00:32:52,600 --> 00:32:55,200 Speaker 1: in the Wall Street Journal about how he is being 643 00:32:55,280 --> 00:32:58,680 Speaker 1: charged with first-degree murder even though nobody has passed. 644 00:32:59,600 --> 00:33:04,360 Speaker 1: And it's funny that the obvious question is wait a minute, what, 645 00:33:05,440 --> 00:33:08,600 Speaker 1: how or why, and it's not addressed at all in 646 00:33:08,600 --> 00:33:11,760 Speaker 1: the article. It's another example of what we're always 647 00:33:11,760 --> 00:33:14,480 Speaker 1: talking about. How do journalists not understand? Don't they read 648 00:33:14,520 --> 00:33:17,880 Speaker 1: what they've written, like, aloud and think, wait a minute, 649 00:33:17,920 --> 00:33:21,040 Speaker 1: this leaves a gigantic question unanswered. 650 00:33:20,760 --> 00:33:22,720 Speaker 4: Right. And even on, so I was watching Fox, and 651 00:33:22,760 --> 00:33:24,960 Speaker 4: they tried to explain it by saying, this is not 652 00:33:25,080 --> 00:33:28,480 Speaker 4: an uncommon practice when the FBI writes up an indictment 653 00:33:28,520 --> 00:33:29,440 Speaker 4: in other news. 654 00:33:29,560 --> 00:33:34,000 Speaker 1: Well, why? Why? Nobody is dead. Nobody has been murdered. 655 00:33:34,040 --> 00:33:36,040 Speaker 4: Why is it a common practice to charge someone with 656 00:33:36,120 --> 00:33:37,280 Speaker 4: murder when nobody has died? 657 00:33:37,840 --> 00:33:41,120 Speaker 1: I can imagine that if somebody is damn near death, 658 00:33:41,560 --> 00:33:44,120 Speaker 1: that you'd want the paperwork in order for when, God 659 00:33:44,200 --> 00:33:45,120 Speaker 1: forbid, they pass. 660 00:33:45,240 --> 00:33:52,040 Speaker 4: But tell us that. Final thoughts with Armstrong and Getty.
661 00:33:53,760 --> 00:33:58,880 Speaker 1: That was somebody's supposed-to-be Tom Brokaw. It was brief, 662 00:33:59,040 --> 00:34:01,240 Speaker 1: but poor. That's funny. 663 00:34:01,760 --> 00:34:04,880 Speaker 4: Here's your host for final thoughts, Joe Getty. Let's get 664 00:34:04,880 --> 00:34:06,840 Speaker 4: a final thought from everybody on the crew. Wouldn't that 665 00:34:06,880 --> 00:34:09,480 Speaker 4: be delightful? Let's begin with our technical director. 666 00:34:09,200 --> 00:34:10,640 Speaker 2: Michael Anzelo. Michael, final thought. 667 00:34:11,040 --> 00:34:14,719 Speaker 1: Lots of graduations, both high school and college, and I 668 00:34:14,760 --> 00:34:17,600 Speaker 1: have a story to tell on one more thing. Oh cool, yep, 669 00:34:18,400 --> 00:34:22,480 Speaker 1: excellent. Graduating. Katie Green, back in the saddle again, our 670 00:34:22,600 --> 00:34:25,520 Speaker 1: esteemed newswoman Katie. Final thought, and that's my final thought. 671 00:34:25,560 --> 00:34:26,680 Speaker 1: It is so weird. 672 00:34:26,760 --> 00:34:29,279 Speaker 2: How much better I feel now that I'm back at work. 673 00:34:29,960 --> 00:34:30,879 Speaker 2: Really really Yeah. 674 00:34:30,880 --> 00:34:32,960 Speaker 5: The whole week felt like I was missing something or 675 00:34:33,000 --> 00:34:33,800 Speaker 5: not doing something. 676 00:34:33,960 --> 00:34:39,040 Speaker 2: Yeah. Routines are important. Yeah, yeah, yeah, that's true. Jack. 677 00:34:39,080 --> 00:34:40,120 Speaker 2: A final thought for us. 678 00:34:40,160 --> 00:34:43,839 Speaker 4: My son is a freshman. He'll be graduating into sophomoredom, 679 00:34:43,920 --> 00:34:47,000 Speaker 4: I guess. But he plays in the band, so they 680 00:34:47,000 --> 00:34:49,880 Speaker 4: have to play for the actual graduation of the seniors, 681 00:34:49,880 --> 00:34:51,000 Speaker 4: which Joe and I both did. 682 00:34:51,040 --> 00:34:52,840 Speaker 2: I think you did that, didn't you? Were you?
Just 683 00:34:52,840 --> 00:34:56,320 Speaker 2: all over and over? He played Pomp and Circumstance for light lifts, 684 00:34:56,320 --> 00:34:59,160 Speaker 2: played after that. They're practicing for that. 685 00:35:00,200 --> 00:35:02,960 Speaker 1: My final thought is I've mentioned once or twice today 686 00:35:03,000 --> 00:35:05,239 Speaker 1: we're in the middle of a remodel and it's gone 687 00:35:06,120 --> 00:35:08,759 Speaker 1: very sideways. They found a bunch of rotten walls we 688 00:35:08,800 --> 00:35:12,920 Speaker 1: didn't expect, so here we go. Having a sense of 689 00:35:13,000 --> 00:35:16,680 Speaker 1: humor about life may be the most important thing you can 690 00:35:16,760 --> 00:35:19,560 Speaker 1: ever have. That and a lot of money, but a 691 00:35:19,600 --> 00:35:22,680 Speaker 1: sense of humor, I tell you what. You gotta be 692 00:35:22,719 --> 00:35:27,160 Speaker 1: able to chuckle. Tough times will come no matter who 693 00:35:27,200 --> 00:35:29,640 Speaker 1: you are and how much money you have, no doubt. 694 00:35:29,760 --> 00:35:33,480 Speaker 4: Armstrong and Getty racking up another grueling four-hour workday. 695 00:35:33,239 --> 00:35:34,920 Speaker 2: So many people to thank, so little time. Go to 696 00:35:35,000 --> 00:35:37,240 Speaker 2: armstrong and getty dot com. Pick up some A&G swag 697 00:35:37,280 --> 00:35:38,279 Speaker 2: for your favorite A&G fan. 698 00:35:38,360 --> 00:35:41,840 Speaker 1: Maybe it's you. T-shirt, hat, the ever-popular hoodie. 699 00:35:42,280 --> 00:35:44,320 Speaker 1: Drop us a note, mailbag at armstrong and getty 700 00:35:44,360 --> 00:35:45,680 Speaker 1: dot com, and enjoy the hot links. 701 00:35:45,680 --> 00:35:48,040 Speaker 2: A lot of good stuff, and we will see you tomorrow. 702 00:35:48,160 --> 00:35:55,200 Speaker 4: God bless America. 703 00:35:55,560 --> 00:35:58,160 Speaker 2: Armstrong and Getty.
I think it takes two to 704 00:35:58,239 --> 00:35:59,640 Speaker 2: tango. Heaven. 705 00:36:00,040 --> 00:36:05,080 Speaker 5: Thank your star spangled all show dead, so. 706 00:36:06,600 --> 00:36:07,200 Speaker 2: Let's go with it. 707 00:36:07,239 --> 00:36:10,279 Speaker 1: Bang. And according to J.D. Power, drivers are underwhelmed 708 00:36:10,320 --> 00:36:13,640 Speaker 1: by gesture controls, where one can, say, increase the volume 709 00:36:13,680 --> 00:36:16,200 Speaker 1: by rotating an imaginary knob in the air. 710 00:36:16,760 --> 00:36:17,879 Speaker 2: You're an imaginary knob. 711 00:36:18,680 --> 00:36:18,919 Speaker 1: Wow. 712 00:36:20,480 --> 00:36:21,720 Speaker 2: Armstrong and Getty