1 00:00:01,800 --> 00:00:06,080 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George 2 00:00:06,160 --> 00:00:10,200 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. 3 00:00:10,080 --> 00:00:17,200 Speaker 2: Armstrong and Getty, and he's Armstrong and Getty. 4 00:00:23,320 --> 00:00:27,080 Speaker 3: The latest Epstein files released by the Justice Department appeared 5 00:00:27,080 --> 00:00:31,480 Speaker 3: to show the former Prince sending sensitive information to Epstein. 6 00:00:32,080 --> 00:00:36,639 Speaker 3: Misconduct in a public office refers to quote serious, willful 7 00:00:36,680 --> 00:00:40,400 Speaker 3: abuse or neglect of powers relating to the role. The 8 00:00:40,440 --> 00:00:45,280 Speaker 3: former Prince has denied any wrongdoing involving his relationship with 9 00:00:45,520 --> 00:00:49,000 Speaker 3: Jeffrey Epstein, the convicted sex offender, and right now he 10 00:00:49,120 --> 00:00:51,159 Speaker 3: has not been charged with a crime. 11 00:00:52,000 --> 00:00:54,240 Speaker 4: I heard it described as, if you haven't heard the headline, 12 00:00:54,280 --> 00:00:55,880 Speaker 4: Prince Andrew has actually been arrested. 13 00:00:55,920 --> 00:00:58,160 Speaker 2: He's under arrest, and it's his birthday. 14 00:00:58,200 --> 00:01:03,440 Speaker 4: Happy birthday, here's some shiny handcuffs for you. I've heard 15 00:01:03,480 --> 00:01:07,600 Speaker 4: it described as, like, they got Al Capone for tax evasion. 16 00:01:07,680 --> 00:01:09,320 Speaker 4: It was hard to nail him. All kinds of crimes 17 00:01:09,319 --> 00:01:10,920 Speaker 4: they knew he was committing, so they got him for 18 00:01:10,959 --> 00:01:14,680 Speaker 4: tax evasion. It's hard to nail Prince Andrew.
Everybody assumes 19 00:01:14,720 --> 00:01:17,360 Speaker 4: that he was sexing up underage girls, but proving it 20 00:01:17,480 --> 00:01:19,800 Speaker 4: enough to actually charge him with it was hard. But they are 21 00:01:19,880 --> 00:01:22,720 Speaker 4: able to prove that he was giving away information he 22 00:01:22,720 --> 00:01:26,280 Speaker 4: shouldn't share, which by definition is a state secret, to 23 00:01:26,920 --> 00:01:30,759 Speaker 4: an American billionaire, which ain't cool. Yeah, yeah, I've got 24 00:01:30,760 --> 00:01:32,959 Speaker 4: to admit that charge sounds a little like, yeah, we 25 00:01:33,000 --> 00:01:34,000 Speaker 4: don't know what you've done, but. 26 00:01:33,959 --> 00:01:36,679 Speaker 1: We'll figure something out. Yeah, yeah. Now, since he was 27 00:01:36,720 --> 00:01:40,560 Speaker 1: in office, yeah, severe penalties too. Old Prince Andy might 28 00:01:40,600 --> 00:01:42,959 Speaker 1: be, well, former Prince Andy might be going to the clink. 29 00:01:43,000 --> 00:01:44,240 Speaker 1: But we're not going to dwell on this. There are 30 00:01:44,319 --> 00:01:47,560 Speaker 1: much more important stories to talk about. But yeah, it's shocking. 31 00:01:48,120 --> 00:01:50,400 Speaker 1: There are a couple other Epstein fallout things. In spite 32 00:01:50,440 --> 00:01:53,120 Speaker 1: of myself, I can't help but be at least somewhat 33 00:01:53,160 --> 00:01:56,280 Speaker 1: interested in them, because it is a glimpse into the 34 00:01:56,480 --> 00:02:02,640 Speaker 1: high flying, utterly unaccountable elite and their lifestyle.
Anyway, a 35 00:02:02,680 --> 00:02:07,000 Speaker 1: couple of somewhat economics-related stories, and one of the 36 00:02:07,040 --> 00:02:09,799 Speaker 1: people I'm going to be quoting is a professor from 37 00:02:09,840 --> 00:02:12,480 Speaker 1: the London School of Economics, where I wish I had 38 00:02:12,520 --> 00:02:16,000 Speaker 1: gone for a couple of reasons, mostly that Mick Jagger 39 00:02:16,040 --> 00:02:16,440 Speaker 1: went there. 40 00:02:17,200 --> 00:02:19,680 Speaker 2: But it's funny. 41 00:02:19,680 --> 00:02:22,320 Speaker 1: I was thinking back to my education, and I had 42 00:02:22,320 --> 00:02:24,840 Speaker 1: what was the equivalent of a minor in economics, and it 43 00:02:24,919 --> 00:02:30,720 Speaker 1: was just dreadful. It was horrifyingly dull and tedious, and 44 00:02:30,840 --> 00:02:33,919 Speaker 1: just the memorizing of formulas and that sort of thing. 45 00:02:33,840 --> 00:02:36,480 Speaker 4: Is that why they call it the dismal science? 46 00:02:37,000 --> 00:02:40,240 Speaker 2: Yeah? That and just nobody can prove anything, that's all. Yeah. 47 00:02:40,320 --> 00:02:42,760 Speaker 4: I remember asking a graduate student in economics once. I said, 48 00:02:42,760 --> 00:02:44,480 Speaker 4: how's it going? And he said, it's all theory, Jack, 49 00:02:44,560 --> 00:02:45,480 Speaker 4: it's all theory. 50 00:02:47,280 --> 00:02:50,920 Speaker 1: Anyway, I wish I'd run into behavioral economics. 51 00:02:51,480 --> 00:02:53,440 Speaker 2: That is some of the modern stuff. Me, I didn't even 52 00:02:53,520 --> 00:02:54,280 Speaker 2: know that existed. 53 00:02:54,360 --> 00:02:56,520 Speaker 4: I didn't even know that was under the umbrella 54 00:02:56,520 --> 00:02:59,320 Speaker 4: of economics until, like, the Freakonomics people came along 55 00:02:59,320 --> 00:02:59,960 Speaker 4: and that sort of stuff. 56 00:03:00,360 --> 00:03:01,880 Speaker 2: All of that is fascinating.
57 00:03:02,240 --> 00:03:04,680 Speaker 1: It might be my favorite thing to think about and 58 00:03:04,720 --> 00:03:08,639 Speaker 1: read about. And there just wasn't a sniff of it at, by the way, 59 00:03:08,639 --> 00:03:10,880 Speaker 1: the University of Illinois. And I'm not donating a cent 60 00:03:10,919 --> 00:03:14,160 Speaker 1: to you idiots, so quit bugging me. 61 00:03:14,960 --> 00:03:17,520 Speaker 4: You sound like John Mulaney. He always does that routine 62 00:03:17,520 --> 00:03:21,920 Speaker 4: about college wanting more money. We had an agreement: I 63 00:03:22,000 --> 00:03:24,200 Speaker 4: gave you money, you gave me an education, and. 64 00:03:24,080 --> 00:03:29,840 Speaker 1: It's over. Right, right. You know what, it's funny, a 65 00:03:29,919 --> 00:03:34,200 Speaker 1: tangent on a tangent. But I've often said making history 66 00:03:34,360 --> 00:03:38,680 Speaker 1: dull is an astounding achievement. I mean, you've got to 67 00:03:38,800 --> 00:03:42,640 Speaker 1: work hard and show real creativity to make history dull, 68 00:03:43,000 --> 00:03:47,920 Speaker 1: and yet teachers and professors do it constantly, and economics, I think, 69 00:03:47,960 --> 00:03:51,520 Speaker 1: is the same. It reminds me, my sister, God bless her, 70 00:03:51,560 --> 00:03:54,880 Speaker 1: and my dad are on a cruise of some of 71 00:03:54,920 --> 00:03:59,360 Speaker 1: the beautiful parts of the southeastern United States, and it's 72 00:03:59,480 --> 00:04:01,320 Speaker 1: nice that my sis is just going along with my dad. Yeah, 73 00:04:01,320 --> 00:04:04,840 Speaker 1: that's awesome. There was a lot of piracy in 74 00:04:04,840 --> 00:04:07,480 Speaker 1: that area, and oh my god. And they went to 75 00:04:07,560 --> 00:04:12,080 Speaker 1: a talk on pirates, and the guy was as dull as me 76 00:04:12,240 --> 00:04:17,080 Speaker 1: trying to memorize economics equations. Wow, how can you make 77 00:04:17,360 --> 00:04:18,599 Speaker 1: pirates dull?
78 00:04:19,279 --> 00:04:21,520 Speaker 2: Wow. That's a talent. God. 79 00:04:21,680 --> 00:04:24,680 Speaker 1: Yeah, they had people walk the plank, literally, like, hey, look, 80 00:04:24,960 --> 00:04:26,880 Speaker 1: this is crazy, but we're gonna do this. 81 00:04:27,000 --> 00:04:31,839 Speaker 2: Ah no, we're not kidding. Anyway, where was I? Ah, yes. 82 00:04:32,080 --> 00:04:38,840 Speaker 1: So the piece in the Free Press is responding to 83 00:04:38,920 --> 00:04:42,720 Speaker 1: the viral essay that, Jack, you brought us from Matt 84 00:04:42,760 --> 00:04:45,479 Speaker 1: Schumer, titled Something Big Is Happening. 85 00:04:45,560 --> 00:04:47,640 Speaker 2: It got a lot of attention. You want to characterize 86 00:04:47,640 --> 00:04:47,880 Speaker 2: it real briefly? 87 00:04:47,920 --> 00:04:51,520 Speaker 4: The way he wrote it is, he is a, 88 00:04:51,880 --> 00:04:53,599 Speaker 4: he's an entrepreneur that has been in the world of 89 00:04:53,640 --> 00:04:55,880 Speaker 4: AI for quite a few years, a bunch of different startups, 90 00:04:55,880 --> 00:04:58,479 Speaker 4: and he said he was writing this to alert his 91 00:04:58,560 --> 00:05:01,720 Speaker 4: friends and family who were asking him about AI. 92 00:05:01,839 --> 00:05:03,400 Speaker 2: You know, is this as big a deal as everybody 93 00:05:03,440 --> 00:05:03,880 Speaker 2: says it is? 94 00:05:03,920 --> 00:05:06,600 Speaker 4: He was saying, it is. Trust us, those of us 95 00:05:06,640 --> 00:05:09,680 Speaker 4: on the inside, it is a big deal. We're losing 96 00:05:09,680 --> 00:05:12,200 Speaker 4: our jobs left and right, and it's coming for 97 00:05:12,279 --> 00:05:13,280 Speaker 4: you next.
98 00:05:13,600 --> 00:05:16,159 Speaker 1: And the point that really grabbed me was he said, 99 00:05:16,400 --> 00:05:20,040 Speaker 1: not only do this year's systems make last year's systems 100 00:05:20,080 --> 00:05:25,520 Speaker 1: look hilarious, this month's systems make last month's look primitive. 101 00:05:25,560 --> 00:05:28,960 Speaker 2: Oof. Yeah, oof is right. So anyway, this is all. 102 00:05:29,080 --> 00:05:31,760 Speaker 1: They talked to a bunch of thinkers for their reaction 103 00:05:32,200 --> 00:05:33,640 Speaker 1: to this, and I'm just gonna hit you with the 104 00:05:33,720 --> 00:05:37,239 Speaker 1: very broad outlines. Noah Smith, who is a writer 105 00:05:37,279 --> 00:05:41,400 Speaker 1: about technology and a thinker and stuff like that; his response 106 00:05:41,480 --> 00:05:45,159 Speaker 1: is, people needed to hear this warning. It alerted a 107 00:05:45,160 --> 00:05:47,640 Speaker 1: lot of people to the incredibly rapid progress being made 108 00:05:47,680 --> 00:05:49,200 Speaker 1: in AI-enabled coding. 109 00:05:50,320 --> 00:05:52,400 Speaker 2: And it's here; the fear is valid. 110 00:05:52,560 --> 00:05:57,440 Speaker 1: Okay. Next person, Gary Marcus, similar background: AI is big, 111 00:05:57,560 --> 00:06:00,200 Speaker 1: so is the hype. Schumer's essay is a masterpiece 112 00:06:00,240 --> 00:06:02,120 Speaker 1: of hype, written in the style of the old direct 113 00:06:02,120 --> 00:06:05,200 Speaker 1: marketing campaigns, with boldfaced callouts like, I know this 114 00:06:05,320 --> 00:06:07,440 Speaker 1: is real because it happened to me first, and I 115 00:06:07,480 --> 00:06:09,720 Speaker 1: am no longer needed for the actual technical work in 116 00:06:09,800 --> 00:06:13,799 Speaker 1: my job. It's chock-full of singularity vibes.
Schumer just says, 117 00:06:13,800 --> 00:06:15,960 Speaker 1: he just tells an AI, I want to build this app, 118 00:06:16,000 --> 00:06:18,760 Speaker 1: and it writes thousands of lines of code, allegedly near perfect. 119 00:06:19,120 --> 00:06:21,800 Speaker 1: What's missing from the discussion is data. Blah blah blah. 120 00:06:21,800 --> 00:06:25,040 Speaker 4: He's a skeptic. Yeah, I'm a doomer. 121 00:06:26,160 --> 00:06:27,360 Speaker 2: I want those people to be right. 122 00:06:27,440 --> 00:06:30,440 Speaker 4: I want to be hilariously wrong and embarrassed about it for. 123 00:06:30,360 --> 00:06:32,839 Speaker 2: The rest of my life. But that's not the direction 124 00:06:32,880 --> 00:06:34,480 Speaker 2: I actually lean. Yeah. 125 00:06:34,720 --> 00:06:38,239 Speaker 1: His take is that coding is such a narrow question 126 00:06:39,120 --> 00:06:43,280 Speaker 1: and so obviously suited to a computer that you can't 127 00:06:43,320 --> 00:06:47,920 Speaker 1: extrapolate from there. Anyway, he's a professor 128 00:06:47,920 --> 00:06:50,400 Speaker 1: of psychology and neural science, founder of a couple of companies. Well, 129 00:06:50,279 --> 00:06:51,039 Speaker 2: then Tyler Cowen. 130 00:06:51,240 --> 00:06:54,000 Speaker 4: Yes, the piece we're talking about. He opened with comparing 131 00:06:54,040 --> 00:06:56,560 Speaker 4: it to the beginning of COVID, when everybody was like, is 132 00:06:56,600 --> 00:06:57,719 Speaker 4: this really gonna happen, 133 00:06:57,760 --> 00:06:58,840 Speaker 2: all this stuff they're talking about?
134 00:06:58,960 --> 00:07:01,480 Speaker 4: And I remember the first time I heard Scott 135 00:07:01,520 --> 00:07:04,840 Speaker 4: Gottlieb on one of the Sunday shows say, somebody's got 136 00:07:04,880 --> 00:07:06,600 Speaker 4: to have the guts to shut down a city and 137 00:07:06,600 --> 00:07:09,680 Speaker 4: say everything is shut down. And I thought, what, shut down 138 00:07:09,720 --> 00:07:13,400 Speaker 4: a city? What are you talking about? That's impossible. And 139 00:07:13,440 --> 00:07:16,240 Speaker 4: like a week later every city in America was shut down. 140 00:07:16,400 --> 00:07:18,440 Speaker 1: Well, you were right, and it should have been impossible, 141 00:07:18,560 --> 00:07:22,280 Speaker 1: but we won't relitigate that. Tyler Cowen says progress will 142 00:07:22,280 --> 00:07:24,920 Speaker 1: be slower than people think. Strong artificial intelligence is going 143 00:07:24,960 --> 00:07:27,800 Speaker 1: to change all of our lives, and a lot of jobs, but progress, 144 00:07:27,800 --> 00:07:30,400 Speaker 1: some would say regress, will be much slower than many people think. 145 00:07:30,760 --> 00:07:33,360 Speaker 1: Artificial intelligence can invent all sorts of new drugs, or 146 00:07:33,360 --> 00:07:36,480 Speaker 1: at least come up with a new drug idea. True, 147 00:07:36,520 --> 00:07:38,920 Speaker 1: but FDA approval can take ten years, and furthermore, the 148 00:07:38,960 --> 00:07:41,760 Speaker 1: system of clinical trials is pretty time consuming. You 149 00:07:41,760 --> 00:07:44,520 Speaker 1: think lawyers are going out of business? No, not top 150 00:07:44,560 --> 00:07:48,160 Speaker 1: quality ones. White collar employment, not going away anytime soon. 151 00:07:48,160 --> 00:07:51,560 Speaker 1: So he's a skeptic too. Andrew Yang, you remember him: 152 00:07:51,600 --> 00:07:53,760 Speaker 1: the social contract will be vaporized. 153 00:07:54,120 --> 00:07:56,040 Speaker 2: Whoa, back to the doomers.
154 00:07:56,480 --> 00:07:58,600 Speaker 1: Well, I don't know. AI is coming. It's going to wipe 155 00:07:58,640 --> 00:08:01,360 Speaker 1: out millions of jobs. He said that three years ago when running 156 00:08:01,360 --> 00:08:02,880 Speaker 1: for office. What do you call the people that are 157 00:08:02,920 --> 00:08:03,800 Speaker 1: the opposite of the doomers? 158 00:08:03,800 --> 00:08:04,520 Speaker 2: There's a name for that. 159 00:08:04,680 --> 00:08:09,840 Speaker 4: But, uh, that saying top quality lawyers will still be needed? Okay, 160 00:08:09,960 --> 00:08:12,160 Speaker 4: that's a bit of a hedge to me. Okay. 161 00:08:12,240 --> 00:08:15,520 Speaker 4: There are gazillions of lawyers. What percentage are going to 162 00:08:16,000 --> 00:08:20,000 Speaker 4: continue to be employed? That's, that's not a minor deal. 163 00:08:20,480 --> 00:08:23,600 Speaker 4: Then Yang says things like, these changes are happening faster than 164 00:08:23,600 --> 00:08:25,240 Speaker 4: we thought. Do you sit at a desk and look 165 00:08:25,240 --> 00:08:27,960 Speaker 4: at a computer much of the day? Take this very seriously. 166 00:08:28,200 --> 00:08:30,640 Speaker 1: The social contract of study hard, go to school, get 167 00:08:30,640 --> 00:08:32,560 Speaker 1: a good job, live a decent life is about to 168 00:08:32,600 --> 00:08:35,800 Speaker 1: be vaporized. Mobility for most will be a thing of 169 00:08:35,840 --> 00:08:37,880 Speaker 1: the past. Oh my. People are not going to take it well. 170 00:08:37,960 --> 00:08:40,959 Speaker 2: Oh my god, I hope he's wrong. 171 00:08:41,800 --> 00:08:45,959 Speaker 1: And finally, no, not finally, Eric Markowitz: we must be more 172 00:08:46,000 --> 00:08:49,760 Speaker 1: than our tools. He gets very human about it, 173 00:08:49,800 --> 00:08:52,240 Speaker 1: and blah blah blah. And then the professor I mentioned, 174 00:08:52,320 --> 00:08:55,080 Speaker 1: London School of Economics.
I believe AI is a huge 175 00:08:55,080 --> 00:08:56,880 Speaker 1: deal and will radically change the world. But the world, 176 00:08:56,880 --> 00:08:59,240 Speaker 1: and by extension many white collar jobs, is messy. Automating 177 00:08:59,280 --> 00:09:02,520 Speaker 1: the automatable tasks within them is nowhere near automating 178 00:09:02,600 --> 00:09:08,720 Speaker 1: the job. And he goes into an example with housing 179 00:09:10,440 --> 00:09:13,560 Speaker 1: and all the challenges of housing in London, and I 180 00:09:13,600 --> 00:09:16,040 Speaker 1: won't bore you with the details. But does anyone 181 00:09:16,080 --> 00:09:18,079 Speaker 1: think AI will fix this? 182 00:09:19,080 --> 00:09:20,679 Speaker 2: Well, so, I 183 00:09:20,440 --> 00:09:27,679 Speaker 4: think a life of leisure and lack of purpose is tragic. 184 00:09:28,320 --> 00:09:31,760 Speaker 2: But that's just me. What if the 185 00:09:31,800 --> 00:09:37,559 Speaker 4: average person, especially young people, you tell them, look, universal 186 00:09:39,679 --> 00:09:43,079 Speaker 4: basic income: you're not gonna have 187 00:09:43,080 --> 00:09:45,080 Speaker 4: a giant house, but you're gonna have a decent sized place. 188 00:09:45,400 --> 00:09:49,200 Speaker 4: You're gonna have food and healthcare, and you can smoke pot, 189 00:09:49,280 --> 00:09:51,160 Speaker 4: drink beer, play video games all day the rest of 190 00:09:51,160 --> 00:09:51,600 Speaker 4: your life. 191 00:09:51,880 --> 00:09:55,360 Speaker 1: What percentage of people would say, sign me the hell up? 192 00:09:56,200 --> 00:09:58,800 Speaker 1: A lot, a lot, a lot, probably the majority. A lot. 193 00:09:58,880 --> 00:10:02,520 Speaker 4: I think so. But that's with, you know, a 194 00:10:02,559 --> 00:10:05,840 Speaker 4: little distance from my youth, and even in my youth, 195 00:10:05,840 --> 00:10:06,680 Speaker 4: I wouldn't have wanted that.
196 00:10:06,720 --> 00:10:07,960 Speaker 2: But yeah, but. 197 00:10:07,880 --> 00:10:13,680 Speaker 1: They will find themselves miserable. But I agree. Yeah, yeah, 198 00:10:13,720 --> 00:10:14,240 Speaker 1: I don't. 199 00:10:14,040 --> 00:10:15,720 Speaker 4: Think that sounds, my point is, I don't think that 200 00:10:15,760 --> 00:10:20,920 Speaker 4: sounds like a horror to a large, large portion of society. No, 201 00:10:21,000 --> 00:10:23,400 Speaker 4: so I'm gonna play golf every day and not have 202 00:10:23,520 --> 00:10:25,520 Speaker 4: to go into work and listen to my boss. 203 00:10:25,600 --> 00:10:27,560 Speaker 2: Okay, why am I upset about this? 204 00:10:28,800 --> 00:10:30,880 Speaker 1: Yeah, except you won't be able to get a tee time. 205 00:10:32,200 --> 00:10:34,160 Speaker 1: The AI will have to build more 206 00:10:34,240 --> 00:10:35,400 Speaker 1: golf courses. 207 00:10:35,600 --> 00:10:36,120 Speaker 2: Get busy. 208 00:10:36,840 --> 00:10:41,839 Speaker 1: What was I gonna say? One final thing, oh, uh, 209 00:10:42,120 --> 00:10:46,200 Speaker 1: lately, when a question like you posed comes up, I think, 210 00:10:47,000 --> 00:10:51,439 Speaker 1: what is the percentage of Americans who, when out of nowhere 211 00:10:51,920 --> 00:10:55,839 Speaker 1: we're told, see that man? That's a woman. Say it's 212 00:10:55,840 --> 00:10:59,720 Speaker 1: a woman. And they said, it's a woman. That's the percentage, 213 00:11:01,400 --> 00:11:05,040 Speaker 1: the percentage who don't have the confidence to think for themselves, 214 00:11:05,520 --> 00:11:09,079 Speaker 1: to call out lies when they hear them. They just 215 00:11:09,160 --> 00:11:11,120 Speaker 1: want to go along to get along. That's a pretty 216 00:11:11,160 --> 00:11:13,080 Speaker 1: high percentage.
I think that's one of the reasons the 217 00:11:13,080 --> 00:11:15,600 Speaker 1: Founding Fathers designed the system the way they did, because 218 00:11:15,600 --> 00:11:17,640 Speaker 1: the majority of people don't yearn for freedom. 219 00:11:18,280 --> 00:11:20,040 Speaker 4: I don't think I've seen any polling on this, but 220 00:11:20,040 --> 00:11:24,720 Speaker 4: I'd love to see it, presented as, you're not gonna have to work 221 00:11:26,480 --> 00:11:26,959 Speaker 2: at all. 222 00:11:30,240 --> 00:11:31,480 Speaker 4: I don't know how you would lay it out as 223 00:11:31,520 --> 00:11:36,120 Speaker 4: a question. I gotta believe a majority of people 224 00:11:36,160 --> 00:11:39,520 Speaker 4: would say, sign me the hell up. Yeah, a lot 225 00:11:39,520 --> 00:11:41,360 Speaker 4: of people going to a job today they don't like 226 00:11:41,760 --> 00:11:44,640 Speaker 4: would love to be able to say, the hell with you. 227 00:11:45,360 --> 00:11:47,360 Speaker 4: I'm gonna do whatever I want today. It's a nice day, 228 00:11:47,400 --> 00:11:49,560 Speaker 4: I'm gonna play golf. It's a rainy day, I'm gonna 229 00:11:49,559 --> 00:11:51,560 Speaker 4: stay inside and, you know, play Xbox. 230 00:11:52,720 --> 00:11:56,800 Speaker 1: One more caveat: those of you who, especially in a 231 00:11:56,840 --> 00:12:00,960 Speaker 1: place like California, where if you'd have referred to 232 00:12:01,120 --> 00:12:04,520 Speaker 1: that man as a man, you'd have lost your career 233 00:12:05,400 --> 00:12:06,600 Speaker 1: or, you know, whatever. 234 00:12:07,240 --> 00:12:07,960 Speaker 2: I give you a pass.
235 00:12:08,000 --> 00:12:09,600 Speaker 1: I wish you'd have spoken up, but I get why 236 00:12:09,640 --> 00:12:13,480 Speaker 1: you didn't, because if you don't live in a very, 237 00:12:13,559 --> 00:12:19,120 Speaker 1: very blue place, you have no idea how totalitarian the 238 00:12:19,160 --> 00:12:21,720 Speaker 1: progressive point of view is in a lot of workplaces, 239 00:12:21,800 --> 00:12:24,359 Speaker 1: especially schools and government offices. 240 00:12:24,120 --> 00:12:27,240 Speaker 4: And social pressure isn't something to be scoffed at. You 241 00:12:27,280 --> 00:12:31,120 Speaker 4: don't want to be a pariah in your neighborhood, your school, 242 00:12:31,200 --> 00:12:31,960 Speaker 4: your workplace. 243 00:12:32,480 --> 00:12:32,640 Speaker 2: Right. 244 00:12:32,679 --> 00:12:34,880 Speaker 1: See, I am, so I don't have as much sympathy, 245 00:12:34,960 --> 00:12:38,040 Speaker 1: but for most people, you're right. Yeah, we're going to talk 246 00:12:38,080 --> 00:12:41,160 Speaker 1: to a reporter about the social media trial. Yesterday Zuckerberg 247 00:12:41,240 --> 00:12:42,839 Speaker 1: was on the stand. Find out where that is, and a 248 00:12:42,880 --> 00:12:43,880 Speaker 1: lot of other news on the way. 249 00:12:43,960 --> 00:12:44,360 Speaker 2: Stay here. 250 00:12:48,360 --> 00:12:51,440 Speaker 5: A man in Illinois had sued Buffalo Wild Wings, claiming that 251 00:12:51,520 --> 00:12:55,439 Speaker 5: his order of boneless chicken wings was not actually deboned wings, 252 00:12:55,480 --> 00:12:58,840 Speaker 5: but was instead chicken breast assembled to look like a wing. 253 00:12:59,240 --> 00:13:02,560 Speaker 5: Buffalo Wild Wings admitted it's true, posting, our boneless wings 254 00:13:02,559 --> 00:13:05,840 Speaker 5: are all white meat chicken, adding their hamburgers contain no 255 00:13:05,960 --> 00:13:09,160 Speaker 5: ham and their buffalo wings are zero percent buffalo.
256 00:13:09,280 --> 00:13:12,600 Speaker 5: So was this man deceived? The judge ruled, and he 257 00:13:12,640 --> 00:13:15,920 Speaker 5: said no, he was not. In that ten page ruling, 258 00:13:15,960 --> 00:13:18,400 Speaker 5: he said the man's claim had quote no meat on 259 00:13:18,480 --> 00:13:19,120 Speaker 5: its bones. 260 00:13:20,320 --> 00:13:23,280 Speaker 4: Wow, kudos to the judge, who went on to say 261 00:13:23,280 --> 00:13:27,760 Speaker 4: that you can shape cauliflower like wings and call that 262 00:13:27,920 --> 00:13:35,680 Speaker 4: vegetarian buffalo wings, right? They're neither wings nor buffalo, which 263 00:13:35,720 --> 00:13:37,079 Speaker 4: I understand is a reference 264 00:13:36,840 --> 00:13:37,320 Speaker 2: to the city. 265 00:13:37,400 --> 00:13:39,560 Speaker 1: How did that get far enough to even have a 266 00:13:39,679 --> 00:13:42,880 Speaker 1: judge look at it? It's a measure of how crazy 267 00:13:42,920 --> 00:13:47,360 Speaker 1: we've gotten as a society with the, you know, lawsuit mania. Oh, 268 00:13:47,400 --> 00:13:50,840 Speaker 1: we need tort reform so bad. That guy seriously ought 269 00:13:50,880 --> 00:13:53,520 Speaker 1: to be jailed or something, or, I don't know, 270 00:13:53,800 --> 00:13:55,280 Speaker 1: the lawyers, everybody involved. 271 00:13:55,720 --> 00:13:59,800 Speaker 4: Just speaking of lawsuits, multi billion dollar lawsuits possibly, with 272 00:14:00,040 --> 00:14:02,520 Speaker 4: many, many cases across the country against the big social 273 00:14:02,559 --> 00:14:07,600 Speaker 4: media platforms. Mark Zuckerberg was on the stand yesterday defending 274 00:14:07,640 --> 00:14:11,720 Speaker 4: Facebook and Instagram specifically, and we'll have a reporter on 275 00:14:11,760 --> 00:14:12,880 Speaker 4: that coming up next segment. 276 00:14:13,559 --> 00:14:14,760 Speaker 2: So we were talking about.
277 00:14:14,559 --> 00:14:16,760 Speaker 1: AI last segment, and I was going to say in 278 00:14:16,800 --> 00:14:18,800 Speaker 1: a related story, but we ran out of time. 279 00:14:18,880 --> 00:14:20,520 Speaker 2: So now, in a related. 280 00:14:20,080 --> 00:14:24,320 Speaker 1: Story, what to make of this very very weird stock market. 281 00:14:25,200 --> 00:14:26,840 Speaker 2: And the long and short of this. 282 00:14:27,040 --> 00:14:33,600 Speaker 1: Is, it's extremely unusual that you have three of the 283 00:14:33,640 --> 00:14:37,640 Speaker 1: big six or so sectors way up and three of 284 00:14:37,680 --> 00:14:39,360 Speaker 1: them pretty down. 285 00:14:39,960 --> 00:14:43,920 Speaker 2: But the market itself is stable. It doesn't happen. Have 286 00:14:44,000 --> 00:14:48,600 Speaker 2: you looked at your four ohing K lately? Woo? I 287 00:14:48,640 --> 00:14:52,080 Speaker 2: almost don't want to. It's really high. Yeah it is, 288 00:14:52,600 --> 00:14:52,920 Speaker 2: it is. 289 00:14:53,280 --> 00:14:58,520 Speaker 1: But so you've got energy materials and construction staples and 290 00:14:58,560 --> 00:15:02,560 Speaker 1: I won't bother describing what's in them. Who cares, way way, way, 291 00:15:02,600 --> 00:15:08,320 Speaker 1: way up? But technology, consumer discretionary goods, and financials are 292 00:15:08,360 --> 00:15:12,440 Speaker 1: all down. And yet the market has been just growing 293 00:15:12,480 --> 00:15:15,840 Speaker 1: slowly to record heights, but is pretty stable right now, 294 00:15:15,920 --> 00:15:17,840 Speaker 1: And there've only been a couple other times you've seen 295 00:15:17,880 --> 00:15:21,480 Speaker 1: that sort of divergence between sectors and a fairly stable market. 
296 00:15:21,800 --> 00:15:23,720 Speaker 1: And that was, would you like to guess? You know, 297 00:15:23,840 --> 00:15:27,520 Speaker 1: right after, or just at the beginning of, COVID, 298 00:15:28,200 --> 00:15:33,400 Speaker 1: or the mid nineties, or just before the Silicon Valley Bank 299 00:15:33,400 --> 00:15:38,200 Speaker 1: collapse that prompted a global banking panic and federal rescues. Before 300 00:15:38,240 --> 00:15:40,640 Speaker 1: that, it was during the global financial crisis of twenty oh eight, 301 00:15:40,720 --> 00:15:43,560 Speaker 1: twenty oh nine, et cetera. So the market is very, 302 00:15:43,760 --> 00:15:50,520 Speaker 1: very strange. Well, don't get me started about options. I 303 00:15:50,560 --> 00:15:52,240 Speaker 1: don't suppose there's a lot I can do about it. 304 00:15:52,760 --> 00:15:59,920 Speaker 2: No, but you could buy the dip. Coming up, AOC. 305 00:16:00,000 --> 00:16:04,880 Speaker 1: She calls reporters and tries to cover up her embarrassing 306 00:16:04,920 --> 00:16:08,560 Speaker 1: performance in Munich. Continuing live coverage. 307 00:16:08,720 --> 00:16:12,280 Speaker 4: Prince Andrew in jail. Nobody knows where he is, where 308 00:16:12,280 --> 00:16:14,040 Speaker 4: they're holding him. I don't know why they're keeping that 309 00:16:14,040 --> 00:16:16,720 Speaker 4: a secret. Today? Tower of London, I'm telling you. I 310 00:16:16,760 --> 00:16:17,280 Speaker 4: was just there. 311 00:16:17,400 --> 00:16:21,120 Speaker 1: Yeah, probably down in the basement, got him on the rack. 312 00:16:21,520 --> 00:16:26,920 Speaker 4: So the Windsor-Mountbattens, are they blood relatives to 313 00:16:26,960 --> 00:16:31,240 Speaker 4: any of the famous royal families of the past? Like, 314 00:16:31,320 --> 00:16:33,760 Speaker 4: is Henry the Eighth his great great grandpa or anything 315 00:16:33,800 --> 00:16:33,920 Speaker 4: like that?
316 00:16:34,080 --> 00:16:36,480 Speaker 2: Or is that, because you had your Tudors, you had 317 00:16:36,520 --> 00:16:41,040 Speaker 2: your Stuarts. I don't know where the Windsor-Mountbattens 318 00:16:41,080 --> 00:16:44,360 Speaker 2: came in. I don't know. I don't remember. 319 00:16:44,360 --> 00:16:47,040 Speaker 1: I think probably somebody told me in Britain. But man, 320 00:16:47,640 --> 00:16:50,240 Speaker 1: tour guides there are crazy about the whole line of kings 321 00:16:50,240 --> 00:16:52,240 Speaker 1: and stuff. And I got to admit, once you get 322 00:16:52,280 --> 00:16:56,040 Speaker 1: past James the Fifth and Elizabeth the Fourteenth, my eyes 323 00:16:55,840 --> 00:16:58,640 Speaker 2: were glazing, like, I'm not going to remember any of this. 324 00:16:59,200 --> 00:17:02,280 Speaker 4: I don't feel like Zuckerberg did himself any 325 00:17:02,320 --> 00:17:05,960 Speaker 4: favors on the stand yesterday. I'm surprised by the legal 326 00:17:06,000 --> 00:17:08,680 Speaker 4: angle they're taking in their defense. But we'll bring you 327 00:17:08,760 --> 00:17:11,000 Speaker 4: up to speed on that whole case, because it could, 328 00:17:11,040 --> 00:17:13,280 Speaker 4: it could be a big, big deal for everything. 329 00:17:14,960 --> 00:17:16,119 Speaker 1: Armstrong and Getty. 330 00:17:17,640 --> 00:17:21,560 Speaker 6: Lead plaintiff lawyer Mark Lanier pulling out document after document, 331 00:17:21,560 --> 00:17:25,240 Speaker 6: including one twenty eighteen study stating that roughly a third 332 00:17:25,280 --> 00:17:28,120 Speaker 6: of kids ages ten to twelve in the US were 333 00:17:28,160 --> 00:17:32,080 Speaker 6: on Instagram. Plaintiff Kayley in this case was nine years 334 00:17:32,080 --> 00:17:35,360 Speaker 6: old when she signed up for it.
Zuckerberg maintains company 335 00:17:35,400 --> 00:17:38,560 Speaker 6: policy is that users must be over the age of thirteen, 336 00:17:38,840 --> 00:17:42,479 Speaker 6: but acknowledged kids can lie and that age verification is 337 00:17:42,520 --> 00:17:43,359 Speaker 6: not foolproof. 338 00:17:44,000 --> 00:17:46,840 Speaker 4: Yeah, we're gonna have to come up with something for that. 339 00:17:47,000 --> 00:17:49,960 Speaker 4: I mean, that's not Mark Zuckerberg's fault. There is 340 00:17:50,000 --> 00:17:54,239 Speaker 4: no way currently to really verify age. And you've got 341 00:17:54,280 --> 00:17:58,840 Speaker 4: the whole parent angle. We talked about this yesterday, 342 00:17:58,880 --> 00:18:02,600 Speaker 4: this particular plaintiff who was spending many, many hours looking 343 00:18:02,600 --> 00:18:04,880 Speaker 4: at a screen at a very young age, and does 344 00:18:04,880 --> 00:18:06,280 Speaker 4: parenting play a role at all? 345 00:18:07,040 --> 00:18:09,919 Speaker 1: However this all comes out, it's of enormous significance, I 346 00:18:09,920 --> 00:18:13,439 Speaker 1: think, to modern society. And Ian Sherr, who's CBS News 347 00:18:13,760 --> 00:18:16,520 Speaker 1: Radio's tech expert these days, has spent his career writing 348 00:18:16,560 --> 00:18:18,880 Speaker 1: for all sorts of great publications about tech and social 349 00:18:18,960 --> 00:18:20,560 Speaker 1: media and that sort of thing. He's covering the trial 350 00:18:20,560 --> 00:18:22,600 Speaker 1: as well, and joins us now. Ian, how are you? 351 00:18:24,160 --> 00:18:25,159 Speaker 7: I'm doing all right. How are you? 352 00:18:25,600 --> 00:18:26,080 Speaker 2: Horrific. 353 00:18:26,400 --> 00:18:29,040 Speaker 4: How was Zuckerberg on the stand yesterday defending himself, and 354 00:18:29,080 --> 00:18:29,879 Speaker 4: what was his angle?
355 00:18:31,600 --> 00:18:35,040 Speaker 7: You know, his angle was very much about trying to 356 00:18:35,160 --> 00:18:40,399 Speaker 7: look reasonable, right. A lot of what he said was to, 357 00:18:40,880 --> 00:18:42,679 Speaker 7: you know, and remember he's in front of a jury, 358 00:18:43,480 --> 00:18:46,600 Speaker 7: was that, you know, he constantly is trying to make 359 00:18:46,640 --> 00:18:49,800 Speaker 7: sure that he does right by people. He was talking 360 00:18:49,840 --> 00:18:53,160 Speaker 7: about how it used to be that they had metrics 361 00:18:53,200 --> 00:18:57,159 Speaker 7: for engagement inside the company, and he's backed off a 362 00:18:57,240 --> 00:19:01,880 Speaker 7: lot of that, and even stuff like you know that 363 00:19:01,920 --> 00:19:06,199 Speaker 7: they've tried to keep younger kids off the platform, and 364 00:19:06,520 --> 00:19:10,119 Speaker 7: you know, essentially trying to show that there's good faith 365 00:19:10,240 --> 00:19:15,640 Speaker 7: effort happening here. Of course, there's a lot of criticism 366 00:19:15,840 --> 00:19:19,399 Speaker 7: in the other direction about that, including the fact that 367 00:19:19,680 --> 00:19:22,800 Speaker 7: his own company has done a lot of research over 368 00:19:22,840 --> 00:19:25,800 Speaker 7: the years that's come out through whistleblowers and others that 369 00:19:25,960 --> 00:19:29,320 Speaker 7: shows that they were aware that its apps and its 370 00:19:29,359 --> 00:19:33,159 Speaker 7: services were having a negative impact on children and teenagers 371 00:19:33,200 --> 00:19:38,159 Speaker 7: in particular, and yet they continued to release those apps 372 00:19:38,200 --> 00:19:41,359 Speaker 7: and they continued to build features that were popular with 373 00:19:41,440 --> 00:19:42,000 Speaker 7: young people. 
374 00:19:43,760 --> 00:19:45,879 Speaker 4: First of all, was that it for Zuckerberg or is 375 00:19:45,920 --> 00:19:48,480 Speaker 4: he going on the stand again today or for more days? 376 00:19:49,160 --> 00:19:53,400 Speaker 7: So he should be on the stand again today. I think 377 00:19:53,440 --> 00:19:58,080 Speaker 7: the big question will be exactly what else he says. 378 00:19:58,240 --> 00:20:01,920 Speaker 7: You know, compare him to one of his lieutenants, 379 00:20:02,280 --> 00:20:07,000 Speaker 7: Adam Mosseri, who runs Instagram. He was on the 380 00:20:07,080 --> 00:20:12,000 Speaker 7: stand earlier this week and he made a little bit 381 00:20:12,000 --> 00:20:14,800 Speaker 7: of a headline because he said, you know, he believes, 382 00:20:15,080 --> 00:20:21,480 Speaker 7: he does not believe that Instagram can create a clinical addiction. 383 00:20:22,240 --> 00:20:29,359 Speaker 7: And when the, when the lawyers asked him, what 384 00:20:29,400 --> 00:20:31,359 Speaker 7: do you mean by that, he said, well, I 385 00:20:31,400 --> 00:20:34,720 Speaker 7: don't, I'm not actually a doctor. I can't say it's clinical. 386 00:20:34,760 --> 00:20:37,399 Speaker 7: And of course he's the one who makes money 387 00:20:37,440 --> 00:20:40,639 Speaker 7: when Instagram does well. So I think on some levels, 388 00:20:40,760 --> 00:20:44,639 Speaker 7: Zuckerberg is trying to avoid saying anything that's going to 389 00:20:44,760 --> 00:20:47,800 Speaker 7: upset anyone, which is kind of his M.O. He's been 390 00:20:47,920 --> 00:20:50,840 Speaker 7: very clear about the fact that he doesn't handle public 391 00:20:50,880 --> 00:20:52,080 Speaker 7: situations very well. 392 00:20:52,320 --> 00:20:52,800 Speaker 2: I didn't know that. 393 00:20:53,760 --> 00:20:56,199 Speaker 1: I was just about to ask, did they get into, 394 00:20:56,840 --> 00:21:00,800 Speaker 1: at all, the very murky question of what is addiction versus
395 00:21:00,960 --> 00:21:03,119 Speaker 1: what is, this is really fun, I want to do 396 00:21:03,160 --> 00:21:05,480 Speaker 1: it some more, with Zuckerberg, you. 397 00:21:05,440 --> 00:21:09,200 Speaker 7: Know, I think it'll be interesting to see. So one 398 00:21:09,240 --> 00:21:11,879 Speaker 7: of the unfortunate parts of the way that a lot 399 00:21:11,920 --> 00:21:14,719 Speaker 7: of these trials happen is that there is not a 400 00:21:14,800 --> 00:21:19,080 Speaker 7: live audio feed that's public and uh it's you know, 401 00:21:19,240 --> 00:21:23,239 Speaker 7: often a lot of, a lot of courts still have 402 00:21:24,119 --> 00:21:28,680 Speaker 7: rules against recording what's said inside the courtroom, even 403 00:21:28,760 --> 00:21:32,919 Speaker 7: though it's public record. So it's really frustrating because we 404 00:21:32,960 --> 00:21:37,960 Speaker 7: don't know everything. But one of the kind of interesting 405 00:21:38,080 --> 00:21:42,440 Speaker 7: aspects of this conversation has been what does addiction mean? 406 00:21:42,920 --> 00:21:46,879 Speaker 7: And if you think about the tobacco industry and what 407 00:21:47,080 --> 00:21:51,840 Speaker 7: they went through several decades ago, part of the key 408 00:21:51,960 --> 00:21:55,040 Speaker 7: there was not just what is addiction, like there is 409 00:21:55,520 --> 00:21:57,880 Speaker 7: a physical element to that, we can talk about 410 00:21:57,920 --> 00:22:00,239 Speaker 7: brain chemistry and all that, but there's also, I'd say, 411 00:22:00,359 --> 00:22:05,760 Speaker 7: the responsibility of knowing the impact your product has and 412 00:22:05,800 --> 00:22:08,679 Speaker 7: then what do you do about it afterwards? And we 413 00:22:08,880 --> 00:22:12,399 Speaker 7: do know that Zuckerberg knew the impact that his product 414 00:22:12,520 --> 00:22:17,480 Speaker 7: was having. The question is, did he then act responsibly?
415 00:22:17,600 --> 00:22:20,560 Speaker 7: And I think that is more what the lawyers are after. 416 00:22:21,560 --> 00:22:23,439 Speaker 4: Well, I think the most important thing you've said in 417 00:22:23,480 --> 00:22:25,600 Speaker 4: this report was, like your first sentence, that this is 418 00:22:25,600 --> 00:22:27,760 Speaker 4: in front of a jury, because it's about what a 419 00:22:27,920 --> 00:22:31,800 Speaker 4: jury thinks, not what is necessarily accurately right or wrong. 420 00:22:32,440 --> 00:22:35,119 Speaker 4: And I thought it was interesting that Zuckerberg got up 421 00:22:35,119 --> 00:22:38,680 Speaker 4: there yesterday and I'm reading from some of the reporting 422 00:22:38,720 --> 00:22:41,080 Speaker 4: of what he said, and like you just said that 423 00:22:41,119 --> 00:22:44,840 Speaker 4: there's not an actual transcript, but Zuckerberg saying that they, 424 00:22:45,320 --> 00:22:48,840 Speaker 4: quote, are not trying to maximize the amount of 425 00:22:48,880 --> 00:22:51,440 Speaker 4: time people spend every month on Instagram. 426 00:22:51,680 --> 00:22:54,520 Speaker 2: That just seems like an all-out lie to me. 427 00:22:57,000 --> 00:22:59,800 Speaker 7: You know, I think that the way that he 428 00:23:00,040 --> 00:23:05,000 Speaker 7: avoids, you know, perjury and getting in trouble over that 429 00:23:05,119 --> 00:23:08,360 Speaker 7: is that, you know, the argument he makes, and this 430 00:23:08,440 --> 00:23:11,760 Speaker 7: came out with some of the other reporting around that sentence, right, 431 00:23:11,960 --> 00:23:14,879 Speaker 7: is that the context of that statement was that he 432 00:23:15,000 --> 00:23:18,840 Speaker 7: believes that if people find it useful, then they will 433 00:23:18,880 --> 00:23:22,000 Speaker 7: keep coming back.
And so you know, whether or not 434 00:23:22,160 --> 00:23:25,560 Speaker 7: you can, you can say, well, cynically, you know, he's 435 00:23:25,600 --> 00:23:28,159 Speaker 7: just trying to make himself sleep better at night. He 436 00:23:28,359 --> 00:23:30,920 Speaker 7: believes that if you're enjoying it, then you're going to 437 00:23:30,960 --> 00:23:34,320 Speaker 7: come back, versus trying to trick you into always coming 438 00:23:34,359 --> 00:23:35,560 Speaker 7: back as much as possible. 439 00:23:36,160 --> 00:23:39,720 Speaker 1: Ian Sherr, CBS News Radio tech expert, is on the line. Ian, 440 00:23:40,080 --> 00:23:43,000 Speaker 1: any other witnesses in particular we ought to be looking 441 00:23:43,000 --> 00:23:43,959 Speaker 1: forward to hearing from? 442 00:23:45,400 --> 00:23:50,160 Speaker 7: Well, we're definitely going to be seeing more from all 443 00:23:50,240 --> 00:23:53,920 Speaker 7: of these companies. You know, the witnesses themselves are interesting, 444 00:23:53,960 --> 00:23:55,960 Speaker 7: and of course getting Zuckerberg on the stand 445 00:23:56,440 --> 00:23:59,280 Speaker 7: is enough to get us all talking about it. I 446 00:23:59,359 --> 00:24:03,000 Speaker 7: think what's going to be very meaningful is the data 447 00:24:03,080 --> 00:24:06,840 Speaker 7: and the documents that come out of these companies. Having 448 00:24:06,920 --> 00:24:10,680 Speaker 7: followed Apple and Samsung and some of the other major trials, 449 00:24:10,880 --> 00:24:13,159 Speaker 7: you know, what the executives say on the stand 450 00:24:13,280 --> 00:24:16,919 Speaker 7: is kind of rehearsed, right, and it's often very polished.
451 00:24:17,240 --> 00:24:20,760 Speaker 7: But their internal communications are really where we get a 452 00:24:20,760 --> 00:24:24,080 Speaker 7: lot of the conversation and where, honestly, where a 453 00:24:24,080 --> 00:24:27,640 Speaker 7: lot of lawmakers end up making choices about what they 454 00:24:27,640 --> 00:24:30,760 Speaker 7: will do in terms of regulations. So you know, I'm 455 00:24:30,800 --> 00:24:32,920 Speaker 7: going to be really keeping an eye on what these 456 00:24:33,000 --> 00:24:36,280 Speaker 7: exhibits and evidence are coming out over the next couple 457 00:24:36,359 --> 00:24:36,920 Speaker 7: of weeks. 458 00:24:37,160 --> 00:24:39,120 Speaker 1: Well, can't wait to check in with you again, Ian 459 00:24:39,200 --> 00:24:41,960 Speaker 1: Sherr of CBS News. Ian, great job. 460 00:24:42,000 --> 00:24:45,040 Speaker 2: Good to talk to you. Thanks absolutely. 461 00:24:45,320 --> 00:24:48,960 Speaker 4: And my point also, though, is I think it's ridiculous 462 00:24:49,000 --> 00:24:51,280 Speaker 4: for Mark Zuckerberg to stand up there or sit there 463 00:24:51,280 --> 00:24:54,240 Speaker 4: and say we are not trying to maximize the amount 464 00:24:54,240 --> 00:24:56,120 Speaker 4: of time people spend every month on Instagram. 465 00:24:56,280 --> 00:25:00,879 Speaker 2: That's hilarious. Really, that's not a crime. 466 00:25:01,560 --> 00:25:03,800 Speaker 4: The restaurant's trying to get you to come back. We're 467 00:25:03,840 --> 00:25:05,920 Speaker 4: trying to get you to stay tuned to the next segment. 468 00:25:06,280 --> 00:25:09,960 Speaker 4: I mean, just every Ford hopes you'll buy Fords for 469 00:25:09,960 --> 00:25:12,200 Speaker 4: the rest of your life. Pepsi wants you to stay 470 00:25:12,200 --> 00:25:15,159 Speaker 4: a Pepsi person. I mean, it's just come on again. 471 00:25:15,280 --> 00:25:18,080 Speaker 4: I wish this weren't being tried in a court at all.
472 00:25:18,640 --> 00:25:22,800 Speaker 4: It's not about criminal liability. It's about, oh, these companies. 473 00:25:22,359 --> 00:25:25,680 Speaker 1: Are insidious and their products do bad things to your 474 00:25:25,720 --> 00:25:29,479 Speaker 1: brain and your life. People just need to 475 00:25:29,680 --> 00:25:32,239 Speaker 1: know that, and then you know they have agency they 476 00:25:32,240 --> 00:25:35,960 Speaker 1: can exercise in their own lives. And because it's going 477 00:25:36,040 --> 00:25:38,960 Speaker 1: to be extremely difficult to make the case, I think, 478 00:25:39,080 --> 00:25:42,600 Speaker 1: that it's what the plaintiffs say it is, that this 479 00:25:43,320 --> 00:25:47,119 Speaker 1: young woman who had terrible parenting and was just 480 00:25:47,440 --> 00:25:50,640 Speaker 1: streaming videos twelve hours a day till her eyes went 481 00:25:50,800 --> 00:25:52,760 Speaker 1: googly from age six on. 482 00:25:52,800 --> 00:25:55,000 Speaker 2: Are you kidding me? Of course she's screwed up. 483 00:25:54,880 --> 00:25:59,199 Speaker 4: And they've got documents that she had mental, emotional problems 484 00:26:00,440 --> 00:26:03,520 Speaker 4: prior to getting online. 485 00:26:03,160 --> 00:26:04,639 Speaker 2: So she was troubled anyway. 486 00:26:05,080 --> 00:26:07,120 Speaker 1: I'm afraid for the drive by media and the drive 487 00:26:07,119 --> 00:26:11,199 Speaker 1: by news consumer that when the plaintiffs fail and the 488 00:26:11,240 --> 00:26:14,480 Speaker 1: defendants prevail, people say, oh, okay, so I guess it 489 00:26:14,520 --> 00:26:16,679 Speaker 1: isn't addictive and it's actually good for me, and 490 00:26:16,680 --> 00:26:19,480 Speaker 1: I'm gonna doomscroll for the rest of my miserable life. 491 00:26:19,560 --> 00:26:21,719 Speaker 4: I think we all, well, I mean, I don't know.
492 00:26:21,760 --> 00:26:23,560 Speaker 4: Maybe I'm wrong about this, I was about to say, 493 00:26:23,640 --> 00:26:27,080 Speaker 4: and then you can contradict me. I was about to say, 494 00:26:27,119 --> 00:26:29,440 Speaker 4: we all intuitively know that this stuff. 495 00:26:29,240 --> 00:26:30,040 Speaker 2: Is bad for us. 496 00:26:30,560 --> 00:26:34,440 Speaker 4: Maybe everybody doesn't, but I feel like everybody intuitively knows 497 00:26:34,480 --> 00:26:36,360 Speaker 4: that this is just bad. We're spending too much time 498 00:26:36,359 --> 00:26:38,080 Speaker 4: on it. This isn't doing my life any good. But 499 00:26:38,119 --> 00:26:40,680 Speaker 4: I keep doing it anyway and I can't stop, which 500 00:26:40,720 --> 00:26:44,600 Speaker 4: is the definition of an addiction, really. But how the 501 00:26:44,720 --> 00:26:47,440 Speaker 4: hell would you outlaw it and not get into all 502 00:26:47,560 --> 00:26:50,840 Speaker 4: kinds of other things like potato. 503 00:26:50,480 --> 00:26:54,679 Speaker 1: Chips or booze or whatever. I will not contradict you. 504 00:26:55,160 --> 00:26:57,040 Speaker 1: I think you're, you're right, except that I think a 505 00:26:57,080 --> 00:26:58,920 Speaker 1: lot of people are lacking in that intuition. 506 00:26:59,000 --> 00:27:03,240 Speaker 2: You would need to know intuitively. But it's a jury. 507 00:27:03,680 --> 00:27:05,800 Speaker 4: Like if you have a bunch of parents on there, 508 00:27:06,080 --> 00:27:08,439 Speaker 4: they could get into a discussion of can you believe 509 00:27:08,480 --> 00:27:11,400 Speaker 4: she let her kid watch her iPad all day long? 510 00:27:11,440 --> 00:27:14,520 Speaker 2: And she's six? But man, I wouldn't be shocked to 511 00:27:14,560 --> 00:27:15,240 Speaker 2: have a jury.
512 00:27:15,280 --> 00:27:19,440 Speaker 4: Think Facebook and Google got eight quadrillion dollars, right, 513 00:27:19,520 --> 00:27:23,680 Speaker 4: and they knew this was addictive. I'd charge them four 514 00:27:23,840 --> 00:27:27,800 Speaker 4: billion dollars in fines. Wouldn't shock me. 515 00:27:28,640 --> 00:27:31,360 Speaker 1: Yeah, it's terrifying being on a jury. Anybody who's been 516 00:27:31,359 --> 00:27:33,600 Speaker 1: on a jury, well at least a lot of us, 517 00:27:33,680 --> 00:27:36,600 Speaker 1: know that. Yeah, who knows where they go. I'm picturing 518 00:27:36,640 --> 00:27:39,720 Speaker 1: special warnings being put on this stuff like Instagram. Like, 519 00:27:39,760 --> 00:27:42,320 Speaker 1: when you click on it, there's a warning, you click okay, 520 00:27:42,359 --> 00:27:43,639 Speaker 1: and then you can just scroll. 521 00:27:43,840 --> 00:27:46,600 Speaker 4: Oh yeah. So that's almost the worst case scenario. These 522 00:27:46,640 --> 00:27:51,680 Speaker 4: companies get dinged for ridiculous amounts of money for basically 523 00:27:51,720 --> 00:27:55,800 Speaker 4: doing what every consumer product does, trying to get you 524 00:27:55,840 --> 00:27:58,480 Speaker 4: to use it as much as possible. And the only 525 00:27:58,520 --> 00:28:00,520 Speaker 4: thing that comes out of it that makes society any 526 00:28:00,640 --> 00:28:03,119 Speaker 4: better is some warning that you have to click on 527 00:28:03,280 --> 00:28:07,240 Speaker 4: before you scroll all day long on Instagram. 528 00:28:06,320 --> 00:28:06,920 Speaker 2: Right right. 529 00:28:07,080 --> 00:28:10,040 Speaker 1: The only reason every other product in the world isn't 530 00:28:10,200 --> 00:28:13,600 Speaker 1: trying to rearrange your brain chemistry for their profit is 531 00:28:13,640 --> 00:28:21,640 Speaker 1: they don't know how. Right, right, exactly.
Oh man, uh, 532 00:28:22,160 --> 00:28:24,960 Speaker 1: I can't believe Zuckerberg got up there and his lawyers 533 00:28:25,040 --> 00:28:28,320 Speaker 1: thought the best angle was to claim they're not trying 534 00:28:28,359 --> 00:28:33,440 Speaker 1: to maximize engagement. Wow, I would have laughed out 535 00:28:33,480 --> 00:28:34,680 Speaker 1: loud if I was in the jury. 536 00:28:34,920 --> 00:28:38,120 Speaker 2: Fack what please bless? 537 00:28:38,800 --> 00:28:41,320 Speaker 1: Yeah, the judge might frown on that. Quick word from 538 00:28:41,360 --> 00:28:44,520 Speaker 1: our friends at SimpliSafe home security. Traditional security systems 539 00:28:44,920 --> 00:28:48,240 Speaker 1: set off a buzzer or something after somebody's already broken 540 00:28:48,240 --> 00:28:49,920 Speaker 1: into your house and probably grabbed all your stuff. 541 00:28:49,960 --> 00:28:50,520 Speaker 2: That's too late. 542 00:28:50,800 --> 00:28:54,000 Speaker 1: SimpliSafe's outdoor guard, outdoor, I'm sorry, ActiveGuard outdoor 543 00:28:54,040 --> 00:28:57,320 Speaker 1: protection can help prevent break-ins before they happen. It's 544 00:28:57,360 --> 00:29:00,200 Speaker 1: all about the AI powered cameras backed up by the 545 00:29:00,280 --> 00:29:02,320 Speaker 1: live professional monitoring agents. 546 00:29:02,360 --> 00:29:04,560 Speaker 4: All I need to know is no long term contract 547 00:29:04,680 --> 00:29:09,000 Speaker 4: or cancellation fee. That's a company that believes you're gonna 548 00:29:09,040 --> 00:29:11,000 Speaker 4: like the product and use it. They're not trying to 549 00:29:11,040 --> 00:29:13,120 Speaker 4: just get you roped in for two weeks when it's all about 550 00:29:13,360 --> 00:29:15,040 Speaker 4: the two years and getting you to close the deal 551 00:29:15,040 --> 00:29:15,760 Speaker 4: and sign on the line. 552 00:29:15,800 --> 00:29:16,520 Speaker 2: Cool, we got them.
553 00:29:16,520 --> 00:29:18,600 Speaker 4: No, they think you'll keep using SimpliSafe because 554 00:29:18,600 --> 00:29:19,280 Speaker 4: you're gonna like it. 555 00:29:19,640 --> 00:29:21,920 Speaker 1: And if some junkie scumbag is trying to break into 556 00:29:21,960 --> 00:29:24,560 Speaker 1: your house, they don't just, like, give you an alert, hey, 557 00:29:24,560 --> 00:29:26,880 Speaker 1: somebody's trying to break in your house, like all the 558 00:29:27,000 --> 00:29:30,280 Speaker 1: other big companies do. SimpliSafe agents take care of that, 559 00:29:30,600 --> 00:29:31,680 Speaker 1: call the cops, et cetera. 560 00:29:32,000 --> 00:29:33,240 Speaker 2: Why wait? Protect your home today. 561 00:29:33,320 --> 00:29:35,480 Speaker 1: Enjoy fifty percent off a new SimpliSafe system 562 00:29:35,480 --> 00:29:38,800 Speaker 1: with professional monitoring at SimpliSafe dot com slash Armstrong. 563 00:29:39,160 --> 00:29:43,360 Speaker 1: That's SimpliSafe dot com slash Armstrong. Super affordable too, around 564 00:29:43,360 --> 00:29:45,960 Speaker 1: a buck a day for that incredible monitoring. SimpliSafe 565 00:29:46,000 --> 00:29:48,520 Speaker 1: dot com slash Armstrong. There's no safe like SimpliSafe. 566 00:29:48,600 --> 00:29:49,760 Speaker 1: Finally on this for now. 567 00:29:49,880 --> 00:29:54,320 Speaker 4: Yesterday, Zuckerberg said he does know, as our reporter said, 568 00:29:54,360 --> 00:29:57,720 Speaker 4: that underage people are using their product when they're not 569 00:29:57,760 --> 00:30:01,480 Speaker 4: supposed to. They're lying about their age and whatever. But 570 00:30:01,520 --> 00:30:05,120 Speaker 4: whenever they can identify somebody as underage, they boot them off.
571 00:30:05,200 --> 00:30:08,520 Speaker 4: But the lawyer for the plaintiff shot back, you expect 572 00:30:08,560 --> 00:30:10,520 Speaker 4: a nine year old to read all of that fine print, 573 00:30:10,840 --> 00:30:14,320 Speaker 4: which I'm not exactly sure is a screen? Is 574 00:30:14,320 --> 00:30:18,160 Speaker 4: it a screen? I mean, I don't know what it is. 575 00:30:18,280 --> 00:30:20,320 Speaker 4: On Instagram, I don't think I have one, but I've 576 00:30:20,360 --> 00:30:22,200 Speaker 4: seen sites where it says, are you eighteen? 577 00:30:22,920 --> 00:30:24,800 Speaker 2: Yeah? Yes, I mean that's not a lot of fun. 578 00:30:24,800 --> 00:30:25,440 Speaker 2: You've got me. 579 00:30:27,920 --> 00:30:33,160 Speaker 1: So coming up, America's favorite Islamunist Zohran Mamdani trying to 580 00:30:33,520 --> 00:30:38,160 Speaker 1: take New York hostage. And how far has CNN fallen? 581 00:30:39,080 --> 00:30:42,000 Speaker 1: Why did I even mention them just now? Good one, 582 00:30:42,120 --> 00:30:44,000 Speaker 1: that's good stuff all the way. 583 00:30:44,000 --> 00:30:46,600 Speaker 4: And the dog that got loose at the Olympics. Oh my god, 584 00:30:46,680 --> 00:30:47,880 Speaker 4: how many people were bitten? 585 00:30:48,320 --> 00:30:48,959 Speaker 2: Stay tuned. 586 00:30:53,200 --> 00:30:55,720 Speaker 5: And a late entry at the Olympics, not chasing gold, 587 00:30:55,880 --> 00:30:59,440 Speaker 5: just chasing skiers. A family dog got loose and joined 588 00:30:59,480 --> 00:31:03,280 Speaker 5: a women's cross country qualifier. He didn't win a medal, 589 00:31:03,400 --> 00:31:06,560 Speaker 5: but he did cross the finish line and get some scratches. 590 00:31:06,160 --> 00:31:09,680 Speaker 2: Behind the ear. Yeah, it was euthanized as a lesson 591 00:31:09,680 --> 00:31:13,240 Speaker 2: to other dogs. Don't be running around. And of course.
592 00:31:15,160 --> 00:31:17,720 Speaker 4: In the Olympics, everybody was cheering like crazy as the 593 00:31:17,760 --> 00:31:19,880 Speaker 4: dog crossed the finish line, whoo. 594 00:31:20,400 --> 00:31:22,120 Speaker 2: It was as cute as can be. 595 00:31:22,280 --> 00:31:26,080 Speaker 1: It was, yeah. Yeah. Oh, so a couple of stories 596 00:31:26,080 --> 00:31:28,840 Speaker 1: we'll do our best to cram in. First of all, 597 00:31:29,240 --> 00:31:33,400 Speaker 1: America's favorite little Islamunist Zohran Mamdani, according to the 598 00:31:33,400 --> 00:31:37,440 Speaker 1: Wall Street Journal, is taking New York hostage. His first 599 00:31:37,440 --> 00:31:40,600 Speaker 1: big move is threatening to raise property taxes unless Democrats 600 00:31:40,640 --> 00:31:43,360 Speaker 1: in the capital raise taxes on top earners and business, 601 00:31:43,480 --> 00:31:46,080 Speaker 1: with an ultimatum: fleece the rich for him, or he'll 602 00:31:46,080 --> 00:31:49,360 Speaker 1: fleece them and the middle class. Then they go into 603 00:31:49,360 --> 00:31:52,440 Speaker 1: what you brought us yesterday, Jack. New York City's budget 604 00:31:52,520 --> 00:31:56,600 Speaker 1: is ten billion dollars more than the state of Florida's, 605 00:31:57,480 --> 00:32:00,720 Speaker 1: though it has only forty percent of the population of 606 00:32:00,760 --> 00:32:05,440 Speaker 1: the Sunshine State. It's an amazing, enormous city. But, and then 607 00:32:05,480 --> 00:32:08,480 Speaker 1: this from the invaluable JT in Livermore: New York City's 608 00:32:08,520 --> 00:32:11,440 Speaker 1: mayor is on record for advocating freezing rents for 609 00:32:11,480 --> 00:32:15,760 Speaker 1: four years. Presumably this is to help keep housing costs down. However, 610 00:32:15,760 --> 00:32:17,600 Speaker 1: at the same time he's threatening a nine and 611 00:32:17,600 --> 00:32:20,760 Speaker 1: a half percent increase in property taxes.
Presumably this is going 612 00:32:20,800 --> 00:32:22,800 Speaker 1: to raise housing costs. So on the one hand, he 613 00:32:22,880 --> 00:32:25,680 Speaker 1: thinks keeping housing costs down via rent freezes is good, 614 00:32:25,800 --> 00:32:29,080 Speaker 1: while on the other hand, raising property taxes blah blah blah 615 00:32:29,240 --> 00:32:32,440 Speaker 1: is good. In both cases, he's hurting the property owners. 616 00:32:32,480 --> 00:32:34,840 Speaker 1: But I don't want to hear a single yip from 617 00:32:34,880 --> 00:32:38,400 Speaker 1: Democrats about the double whammy attack on property ownership. Mamdani 618 00:32:38,400 --> 00:32:40,840 Speaker 1: and those who anointed him warned New Yorkers they would 619 00:32:40,880 --> 00:32:44,120 Speaker 1: have a very different relationship with property going forward, to say 620 00:32:44,200 --> 00:32:47,080 Speaker 1: nothing about the fact that home ownership is white supremacy. 621 00:32:47,240 --> 00:32:50,520 Speaker 4: Do you think... Mamdani, that's the first time I've 622 00:32:50,560 --> 00:32:51,440 Speaker 4: ever said it correctly? 623 00:32:51,800 --> 00:32:54,400 Speaker 2: Well done. Uh, probably the last. 624 00:32:54,600 --> 00:32:57,720 Speaker 4: Do you think if he had a magic wand 625 00:32:58,280 --> 00:33:00,320 Speaker 4: he'd do away with private home ownership? 626 00:33:01,760 --> 00:33:05,680 Speaker 2: Yes, I think communists do. I think so too. And 627 00:33:05,760 --> 00:33:07,120 Speaker 2: an Islamist, which is, which is rare. 628 00:33:07,680 --> 00:33:11,040 Speaker 1: Usually they tolerate each other just to overthrow Western civilization, 629 00:33:11,200 --> 00:33:14,600 Speaker 1: liberal democracy, Islamists and communists, and then they are at each 630 00:33:14,600 --> 00:33:17,200 Speaker 1: other's throats immediately. He's the rare hybrid, I think, 631 00:33:17,360 --> 00:33:18,360 Speaker 1: although it is hard.
632 00:33:18,600 --> 00:33:20,640 Speaker 2: I don't know. He might think socialism is hot. 633 00:33:21,080 --> 00:33:24,280 Speaker 1: I'll ride this to get Islamism into the United States. 634 00:33:24,320 --> 00:33:25,800 Speaker 2: I actually believe that's a strategy. 635 00:33:26,440 --> 00:33:29,800 Speaker 1: Could get into AOC calling a New York Times reporter 636 00:33:29,920 --> 00:33:33,040 Speaker 1: from Berlin in an effort to clean up the coverage 637 00:33:33,080 --> 00:33:36,440 Speaker 1: of her cringeworthy missteps. But I want to pay 638 00:33:36,440 --> 00:33:38,320 Speaker 1: this off for sure. Maybe we'll get to that next hour. 639 00:33:38,360 --> 00:33:38,720 Speaker 2: I don't know. 640 00:33:40,040 --> 00:33:43,160 Speaker 1: New numbers are out, the ratings for cable news viewing, 641 00:33:43,600 --> 00:33:44,320 Speaker 1: the top shows. 642 00:33:44,600 --> 00:33:49,680 Speaker 4: I love to hate this because it always reminds me, 643 00:33:49,760 --> 00:33:52,080 Speaker 4: why do I ever, why do we ever talk about 644 00:33:52,080 --> 00:33:55,560 Speaker 4: cable news shows? 645 00:33:55,960 --> 00:33:59,600 Speaker 1: Well, your number one show is The Five. Interestingly enough, 646 00:34:00,240 --> 00:34:05,920 Speaker 1: it's okay. Well, number two, Jesse Watters Primetime. Number three, 647 00:34:06,080 --> 00:34:09,400 Speaker 1: Gutfeld with an exclamation point. Number four, Special Report with 648 00:34:09,400 --> 00:34:13,240 Speaker 1: Bret Baier. Number five, The Ingraham Angle. No, you didn't 649 00:34:13,280 --> 00:34:19,279 Speaker 1: lose track. The top, hang on now, seventeen shows are 650 00:34:19,360 --> 00:34:23,480 Speaker 1: all on Fox News, which is, you know, because there's 651 00:34:23,560 --> 00:34:25,480 Speaker 1: like one outlet for that 652 00:34:25,560 --> 00:34:29,080 Speaker 4: point of view and fifty outlets for the other point 653 00:34:29,080 --> 00:34:29,520 Speaker 4: of view.
654 00:34:29,719 --> 00:34:31,759 Speaker 2: Yeah, virtually all of them. Yeah, all of the rest 655 00:34:31,760 --> 00:34:32,080 Speaker 2: of them. 656 00:34:32,920 --> 00:34:38,040 Speaker 1: Then you finally get an MS NOW show, MSNBC, the execrable, 657 00:34:38,520 --> 00:34:42,719 Speaker 1: horrible show, or channel, for stupid progressives. 658 00:34:42,840 --> 00:34:44,920 Speaker 2: What's their top show? Morning Joe? 659 00:34:46,320 --> 00:34:49,880 Speaker 4: Deadline White House with Nicolle Wallace. Oh my god, I 660 00:34:49,960 --> 00:34:50,879 Speaker 4: can't stand her. 661 00:34:52,000 --> 00:34:55,800 Speaker 1: Morning Joe doesn't appear until number twenty seven. 662 00:34:56,480 --> 00:34:59,080 Speaker 1: But so once you get into like the eighteens and on, 663 00:34:59,160 --> 00:35:02,400 Speaker 1: it's MSNBC and Fox News, a few of those. 664 00:35:02,239 --> 00:35:04,280 Speaker 2: Few of those, blah blah blah blah blah. 665 00:35:04,000 --> 00:35:09,400 Speaker 1: Notably absent until you get to number freaking thirty four: CNN. 666 00:35:09,600 --> 00:35:11,640 Speaker 1: And do we have how many people are actually watching 667 00:35:11,680 --> 00:35:18,080 Speaker 1: the show? Eight hundred and seven thousand average viewers, and 668 00:35:18,120 --> 00:35:20,200 Speaker 1: that'd be all people, but the money demo would 669 00:35:20,239 --> 00:35:21,719 Speaker 1: be significantly smaller than that. 670 00:35:22,080 --> 00:35:23,520 Speaker 2: Yeah, they're mostly ninety plus. 671 00:35:23,640 --> 00:35:27,720 Speaker 4: Yeah, or people standing at airports waiting for their connection. 672 00:35:28,560 --> 00:35:33,400 Speaker 1: They still remember... what was the, what was the Scud Stud? 673 00:35:35,000 --> 00:35:36,480 Speaker 2: Wow, that's going way back.
674 00:35:36,719 --> 00:35:38,080 Speaker 1: Oh, that would have been a great pull if I 675 00:35:38,120 --> 00:35:40,600 Speaker 1: could come up with that, but I couldn't, so anyway. Yeah, 676 00:35:40,680 --> 00:35:44,240 Speaker 1: the best show on CNN is the thirty fourth most 677 00:35:44,280 --> 00:35:51,799 Speaker 1: watched cable news show. They are nothing. They don't exist. Yeah, 678 00:35:51,800 --> 00:35:55,360 Speaker 1: good for clips. Yep, they'll probably land a presidential debate. 679 00:35:57,120 --> 00:36:01,839 Speaker 1: Good lord, you're right, or at least, yeah, they'll host it. 680 00:36:02,080 --> 00:36:05,319 Speaker 4: Yeah, Jake Tapper and Dana what's-her-name will be 681 00:36:05,360 --> 00:36:06,400 Speaker 4: the moderators. 682 00:36:06,840 --> 00:36:11,320 Speaker 1: They are the seventy eight RPM record of cable news. 683 00:36:12,040 --> 00:36:14,680 Speaker 1: I'm aware of its existence, and it was a quaint 684 00:36:14,719 --> 00:36:19,399 Speaker 1: old thing. But yeah, who'd watch that? I was gonna say, 685 00:36:19,400 --> 00:36:22,600 Speaker 1: great insights from one of our beloved listeners about possible 686 00:36:22,640 --> 00:36:23,720 Speaker 1: conflict with Iran. 687 00:36:24,200 --> 00:36:26,480 Speaker 2: Next hour, Armstrong and Getty