1 00:00:01,840 --> 00:00:06,120 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George 2 00:00:06,160 --> 00:00:10,160 Speaker 1: Washington Broadcast Center, Jack Armstrong, Joe Getty. 3 00:00:10,160 --> 00:00:17,200 Speaker 2: Armstrong and Getty, and now, he's Armstrong and Getty. 4 00:00:23,400 --> 00:00:25,360 Speaker 3: So I just got a statement from the Mayor's office 5 00:00:25,360 --> 00:00:28,480 Speaker 3: about five minutes ago saying that this is tragic and unacceptable, 6 00:00:28,480 --> 00:00:31,600 Speaker 3: and that they're gonna be out there tomorrow offering resources 7 00:00:31,600 --> 00:00:33,440 Speaker 3: and help and trying to clean up this area the 8 00:00:33,520 --> 00:00:34,120 Speaker 3: best they can. 9 00:00:35,240 --> 00:00:38,560 Speaker 1: What are we talking about, the mayor of Los Angeles? 10 00:00:38,800 --> 00:00:46,559 Speaker 1: What's tragic and unacceptable? SSJs, my friend. SSJs: subterranean sewer junkies, 11 00:00:48,360 --> 00:00:51,639 Speaker 1: the shocking report from Matthew Seedorf of Fox eleven LA. 12 00:00:51,920 --> 00:00:57,840 Speaker 1: They do a great job talking about, well, subterranean sewer junkies. 13 00:00:58,320 --> 00:01:00,520 Speaker 1: Matthew. We'll start with thirty and work our way. Michael, 14 00:01:00,840 --> 00:01:01,360 Speaker 1: talk to us. 15 00:01:01,360 --> 00:01:06,399 Speaker 3: Matthew: an up close, exclusive look underground at living conditions 16 00:01:06,440 --> 00:01:10,680 Speaker 3: too extreme for words. It's hard to imagine someone in 17 00:01:10,680 --> 00:01:15,000 Speaker 3: there: trash, human waste, and an overpowering stench. 18 00:01:15,120 --> 00:01:16,240 Speaker 4: Ah, I gotta get back. 19 00:01:17,000 --> 00:01:19,720 Speaker 3: Just moments earlier, we watched someone climb out of that 20 00:01:19,800 --> 00:01:23,480 Speaker 3: storm drain, using the sewer as shelter. 21 00:01:23,480 --> 00:01:23,959 Speaker 2: Water for.
22 00:01:27,280 --> 00:01:31,679 Speaker 3: Her answer is hard to understand. But the area, Eighty 23 00:01:31,720 --> 00:01:36,600 Speaker 3: Eighth and South Grand, is overwhelmed with RVs, tents and trash. 24 00:01:36,640 --> 00:01:44,320 Speaker 4: Wow. How far down the road of being a junkie 25 00:01:44,400 --> 00:01:46,760 Speaker 4: do you have to be before you make those decisions? 26 00:01:46,840 --> 00:01:49,320 Speaker 4: Of course they're going to. They portray this all as homelessness, 27 00:01:49,800 --> 00:01:55,000 Speaker 4: and, you know, the lack of affordable housing. You can't... well, 28 00:01:55,040 --> 00:01:55,600 Speaker 4: I don't. 29 00:01:55,640 --> 00:01:59,360 Speaker 1: I don't think Matthew and Fox eleven do. Frequently it is. 30 00:01:59,600 --> 00:02:03,120 Speaker 4: But yeah, but the world will, the world at large 31 00:02:03,120 --> 00:02:06,520 Speaker 4: will, because you don't just, like, you can't afford some 32 00:02:06,600 --> 00:02:09,559 Speaker 4: place to live and decide, I'll go live in the sewer. 33 00:02:09,680 --> 00:02:12,480 Speaker 2: I'll go live in wet filth in the sewer. Yeah. 34 00:02:13,240 --> 00:02:16,040 Speaker 1: Years ago I read a book, I wish I could... 35 00:02:16,040 --> 00:02:19,120 Speaker 1: It might have been entitled The Mole People. It 36 00:02:19,240 --> 00:02:23,360 Speaker 1: was a journalist, a writer, who went down into the 37 00:02:23,480 --> 00:02:28,120 Speaker 1: legendary, much discussed bowels of the subway system of New 38 00:02:28,200 --> 00:02:32,600 Speaker 1: York City, including closed stations and no longer used tubes 39 00:02:32,639 --> 00:02:36,360 Speaker 1: and the rest of it, and found like small cities 40 00:02:36,600 --> 00:02:40,360 Speaker 1: of people living down there. Some were down on their luck, some 41 00:02:40,400 --> 00:02:45,440 Speaker 1: were bums, junkies, some were mentally ill, but just crazy crazy. 42 00:02:45,880 --> 00:02:46,280 Speaker 2: If I can.
43 00:02:46,440 --> 00:02:48,600 Speaker 1: If I can find that title, I'll tweet it or 44 00:02:48,639 --> 00:02:51,079 Speaker 1: something, or post it at Armstrong and Getty dot com. Fascinating 45 00:02:51,080 --> 00:02:54,799 Speaker 1: book, out of date. The fascinating Matthew Seedorf continues. 46 00:02:54,760 --> 00:02:57,840 Speaker 3: Nolla, with nonprofit Clean LA, was with me picking up 47 00:02:57,840 --> 00:02:58,799 Speaker 3: garbage here Monday. 48 00:02:58,919 --> 00:03:00,400 Speaker 2: We've got a lady living in there. 49 00:03:00,480 --> 00:03:03,160 Speaker 3: When you saw not one but two people emerge from 50 00:03:03,200 --> 00:03:06,200 Speaker 3: the storm drain... I can't explain that. You know, a 51 00:03:06,280 --> 00:03:10,640 Speaker 3: person living like a rat, worse than a rat? Come on. Hello, 52 00:03:11,360 --> 00:03:14,320 Speaker 3: and just two weeks ago it happened again. This is even 53 00:03:14,720 --> 00:03:18,519 Speaker 3: another street in South LA, another person living underground. 54 00:03:19,400 --> 00:03:20,200 Speaker 4: Is a human? 55 00:03:20,360 --> 00:03:21,119 Speaker 1: Is a human being? 56 00:03:21,520 --> 00:03:22,600 Speaker 2: Why are we accepting this? 57 00:03:23,040 --> 00:03:25,919 Speaker 5: All the officials, they have to do something. They 58 00:03:25,960 --> 00:03:27,080 Speaker 5: have to fix this problem. 59 00:03:27,600 --> 00:03:30,960 Speaker 1: Ah, there we have it. All the officials, they have 60 00:03:31,040 --> 00:03:35,400 Speaker 1: to do something. Why do we accept this? Oh boy. 61 00:03:36,040 --> 00:03:39,840 Speaker 1: And that lovely human emotion has led to the waste 62 00:03:40,040 --> 00:03:44,400 Speaker 1: of billions, tens of billions, of dollars in the state 63 00:03:44,400 --> 00:03:49,120 Speaker 1: of Cal-Unicornia alone. 64 00:03:47,520 --> 00:03:52,440 Speaker 4: Joe, we're all one medical bill away from living in 65 00:03:52,480 --> 00:03:53,000 Speaker 4: the sewer.
66 00:03:55,240 --> 00:03:59,200 Speaker 1: Incorrect, sir. The title of the book I referred to, 67 00:03:59,200 --> 00:04:01,760 Speaker 1: thank you very much: The Mole People: Life in the 68 00:04:01,760 --> 00:04:05,480 Speaker 1: Tunnels Beneath New York City. 69 00:04:06,160 --> 00:04:07,320 Speaker 2: So I'm not exactly... 70 00:04:07,360 --> 00:04:13,840 Speaker 4: I don't know that much about big city sewers. So 71 00:04:13,920 --> 00:04:22,000 Speaker 4: where are you exactly in relation to the city's poo? 72 00:04:22,000 --> 00:04:24,440 Speaker 2: Do you believe it is part of your world? 73 00:04:24,839 --> 00:04:26,240 Speaker 4: Are you like neck deep in it? 74 00:04:26,600 --> 00:04:31,440 Speaker 1: Or do you live in, uh, property on the poop river? 75 00:04:31,920 --> 00:04:32,880 Speaker 2: So what, what is it? 76 00:04:33,120 --> 00:04:34,039 Speaker 4: I mean, I don't, I don't. 77 00:04:34,600 --> 00:04:38,839 Speaker 6: I've seen a couple of, like, YouTube documentaries of 78 00:04:38,960 --> 00:04:41,839 Speaker 6: people that have gone down there, and it's like, ankle... 79 00:04:42,320 --> 00:04:45,840 Speaker 6: up to your ankles walking around in sludge, but. 80 00:04:46,160 --> 00:04:47,000 Speaker 2: Just ankle deep. 81 00:04:47,720 --> 00:04:50,520 Speaker 1: Yeah, it's just your ankles. It's not a big deal, 82 00:04:54,120 --> 00:05:00,919 Speaker 1: that's right, Doctor Johnny. Ankle deep in the, uh, human waste? Yes. 83 00:05:01,400 --> 00:05:06,839 Speaker 1: Holy crap, if you'll pardon the expression. That is certainly... look, I 84 00:05:06,880 --> 00:05:09,039 Speaker 1: will join that Hispanic fellow 85 00:05:09,120 --> 00:05:11,279 Speaker 2: in saying, that's, that's a, that's a rough 86 00:05:11,279 --> 00:05:17,800 Speaker 2: way to live, like a rat. Oh wow. 87 00:05:18,279 --> 00:05:19,840 Speaker 4: And is it pitch dark down there? 88 00:05:21,200 --> 00:05:23,640 Speaker 1: What are they using for light? Lights?
89 00:05:24,160 --> 00:05:28,360 Speaker 6: Yeah. I mean, they have certain little light fixtures 90 00:05:28,360 --> 00:05:28,960 Speaker 6: that they hang. 91 00:05:29,040 --> 00:05:34,279 Speaker 7: But do they have dry places to sleep, or...? 92 00:05:34,440 --> 00:05:37,080 Speaker 7: So yeah, in the particular video that I saw, a 93 00:05:37,120 --> 00:05:41,560 Speaker 7: lot of them had hammocks that they would climb 94 00:05:41,400 --> 00:05:43,719 Speaker 2: up in. There you go, and comfortable. 95 00:05:44,320 --> 00:05:46,279 Speaker 1: I'm not an expert in this by any means, but 96 00:05:47,160 --> 00:05:52,680 Speaker 1: there is an intersection of storm drains sometimes and sewers. 97 00:05:52,800 --> 00:05:54,840 Speaker 1: That's changed from the olden days when they used to 98 00:05:54,880 --> 00:05:57,840 Speaker 1: be the same thing. Like in London famously, if it rained, 99 00:05:58,000 --> 00:06:01,359 Speaker 1: you know, everybody, everybody's poop rolled into the Thames and 100 00:06:01,400 --> 00:06:04,640 Speaker 1: it was incredibly polluted. Oh god. Oh yeah, I know, 101 00:06:04,720 --> 00:06:06,640 Speaker 1: it's horrific. But we don't, we don't do that in 102 00:06:06,680 --> 00:06:07,880 Speaker 1: the civilized world anymore. 103 00:06:07,920 --> 00:06:10,960 Speaker 2: But I don't... I can't speak to LA's sewers in particular. 104 00:06:14,160 --> 00:06:14,680 Speaker 4: Is there more? 105 00:06:16,680 --> 00:06:21,479 Speaker 1: Uh, it's just another person saying people don't deserve to 106 00:06:21,520 --> 00:06:22,000 Speaker 1: live like that. 107 00:06:22,080 --> 00:06:22,800 Speaker 4: I want to hear it again. 108 00:06:23,000 --> 00:06:25,039 Speaker 2: Let's hear it. Sure, sure. Hey, it pisses me off. 109 00:06:25,320 --> 00:06:26,320 Speaker 2: That's why I say, what should they do?
110 00:06:26,360 --> 00:06:29,120 Speaker 3: And as they clean, neighbors, angry, say 111 00:06:29,160 --> 00:06:31,840 Speaker 3: they've reported the problems but feel ignored. 112 00:06:31,720 --> 00:06:34,120 Speaker 2: Like, we need help, we need that money. The 113 00:06:34,240 --> 00:06:35,480 Speaker 2: public here needs it. 114 00:06:35,240 --> 00:06:39,440 Speaker 3: Then a shocking snapshot of survival at its lowest point. 115 00:06:39,680 --> 00:06:43,280 Speaker 5: This person doesn't deserve to live like this, not even 116 00:06:43,320 --> 00:06:43,839 Speaker 5: the neighbors. 117 00:06:44,040 --> 00:06:46,240 Speaker 4: Yeah, all right, Hispanic guy, I'd like to argue with you. 118 00:06:46,360 --> 00:06:48,320 Speaker 4: It's possible that they do deserve to live like that. 119 00:06:48,360 --> 00:06:50,600 Speaker 4: It's possible that if you looked at their life choices 120 00:06:51,720 --> 00:06:54,120 Speaker 4: for the past decade or so, that you'd think that's 121 00:06:54,160 --> 00:06:55,120 Speaker 4: about where you're gonna 122 00:06:54,960 --> 00:06:59,360 Speaker 1: end up, or dead. Yep, yep, subterranean sewer junkie. Yeah, exactly. Uh, 123 00:06:59,440 --> 00:07:03,200 Speaker 1: I liked the neighbor, that gal, who... And this is 124 00:07:03,279 --> 00:07:05,400 Speaker 1: so familiar, folks. I'll bet a lot of you are 125 00:07:05,400 --> 00:07:08,360 Speaker 1: gonna nod your heads or shout at the radio. You've 126 00:07:08,400 --> 00:07:11,320 Speaker 1: begged the authorities to do something about the junkie camps. 127 00:07:12,000 --> 00:07:13,520 Speaker 2: You've begged them to clear them out. 128 00:07:13,560 --> 00:07:16,160 Speaker 4: You're like, we didn't have junkies in the park ten 129 00:07:16,240 --> 00:07:19,360 Speaker 4: years ago, fifteen years ago, the street corners in front 130 00:07:19,360 --> 00:07:20,040 Speaker 4: of businesses.
131 00:07:20,280 --> 00:07:22,320 Speaker 2: We used to have civilization here. What's happened? 132 00:07:22,480 --> 00:07:24,760 Speaker 1: Yeah, well, we can't really do anything because of the ordinance, 133 00:07:24,800 --> 00:07:27,960 Speaker 1: click, and you get ignored. But then, you know, Hispanic 134 00:07:27,960 --> 00:07:31,400 Speaker 1: guy comes out and he's a-boohooing over the junkies. Now, 135 00:07:31,400 --> 00:07:34,120 Speaker 1: I'm not saying they don't have miserable lives and they 136 00:07:34,160 --> 00:07:35,640 Speaker 1: don't deserve some Christian charity. 137 00:07:35,720 --> 00:07:40,600 Speaker 2: Crappy, but crappy indeed. But how about the law abiding? 138 00:07:42,000 --> 00:07:43,080 Speaker 2: Who's our advocate? 139 00:07:43,120 --> 00:07:46,360 Speaker 1: I guess we are our advocate. Anyway. Oh, that reminds me. 140 00:07:47,520 --> 00:07:51,640 Speaker 1: You remember that Prince George's County, Maryland, I'm pretty sure 141 00:07:51,640 --> 00:07:56,200 Speaker 1: that's right, that condo complex where the bums and junkie 142 00:07:56,320 --> 00:07:59,920 Speaker 1: camp right next to it would break in constantly. 143 00:08:00,040 --> 00:08:01,640 Speaker 2: Urinate and defecate in the halls. 144 00:08:01,800 --> 00:08:04,600 Speaker 1: Then they stole all the wires so the lights went out, 145 00:08:04,640 --> 00:08:09,360 Speaker 1: so the deep, deep blue... I think it's St. George's County. 146 00:08:09,840 --> 00:08:10,520 Speaker 1: I'll verify that. 147 00:08:12,480 --> 00:08:13,120 Speaker 2: Where is that? 148 00:08:13,440 --> 00:08:17,680 Speaker 1: I got too much open. But they're so liberal they 149 00:08:17,720 --> 00:08:20,560 Speaker 1: would do nothing about it. But then they declared the 150 00:08:20,680 --> 00:08:25,320 Speaker 1: condo uninhabitable and are kicking all the residents out.
So 151 00:08:25,360 --> 00:08:28,560 Speaker 1: the residents are now suing the county with the help 152 00:08:28,600 --> 00:08:31,440 Speaker 1: of some good... The Goldwater Institute ought to get involved in 153 00:08:31,520 --> 00:08:34,360 Speaker 1: this, because they had a huge win in Phoenix on 154 00:08:34,400 --> 00:08:37,880 Speaker 1: a very, very similar issue. But those poor sons of 155 00:08:37,920 --> 00:08:43,400 Speaker 1: guns living in that condo complex, they're victimized over and 156 00:08:43,480 --> 00:08:46,600 Speaker 1: over again by the bums and junkies and the progressive authorities. 157 00:08:46,640 --> 00:08:47,559 Speaker 2: It's horrifying. 158 00:08:47,800 --> 00:08:51,880 Speaker 6: Yeah, it's Prince George's County, Maryland, at the Maryland or condos. 159 00:08:52,120 --> 00:08:55,959 Speaker 1: Right. I mean, look, I'm not happy that poor gal 160 00:08:56,040 --> 00:08:57,959 Speaker 1: is climbing out of the sewers like a rat, according 161 00:08:57,960 --> 00:09:01,520 Speaker 1: to the onlooker. But I'm a lot more worried about 162 00:09:01,559 --> 00:09:03,520 Speaker 1: those law abiding folks who are getting kicked out of 163 00:09:03,520 --> 00:09:07,120 Speaker 1: their homes because these authorities won't clear out the junkie camps. 164 00:09:07,960 --> 00:09:10,400 Speaker 4: Well, for all of our LA listeners, when you flush 165 00:09:10,440 --> 00:09:13,959 Speaker 4: your toilet today... oh boy, think about. 166 00:09:15,800 --> 00:09:16,559 Speaker 3: Fire in the hole! 167 00:09:18,800 --> 00:09:25,800 Speaker 4: Duck, or duck or jump or something, I don't know. Oh, 168 00:09:26,000 --> 00:09:27,160 Speaker 4: fire in them hole. 169 00:09:30,720 --> 00:09:35,200 Speaker 2: Oh boy, we gotta go to break or something. That 170 00:09:35,320 --> 00:09:37,720 Speaker 2: is sadness. Gonna take me a minute to recover. 171 00:09:40,080 --> 00:09:41,400 Speaker 4: We got more on the way. Any thoughts on that?
172 00:09:41,440 --> 00:09:44,640 Speaker 4: Our text line, four one KFTC. 173 00:09:48,840 --> 00:09:51,960 Speaker 5: The study shows more people are applying for fast food jobs, 174 00:09:51,960 --> 00:09:54,600 Speaker 5: but many are not seeing increased hours or even securing 175 00:09:54,679 --> 00:09:58,520 Speaker 5: employment at all. It also finds businesses are cutting shifts, 176 00:09:58,840 --> 00:10:03,160 Speaker 5: customers are paying higher prices, and more machines are replacing workers. 177 00:10:03,360 --> 00:10:06,000 Speaker 5: In the study, UC Santa Cruz researchers also spoke 178 00:10:06,040 --> 00:10:08,679 Speaker 5: with a McDonald's franchise owner in the Central Valley with 179 00:10:08,720 --> 00:10:12,079 Speaker 5: eighteen locations. They found employees got their hours cut by 180 00:10:12,120 --> 00:10:14,760 Speaker 5: over eleven and a half percent after the wage law 181 00:10:14,800 --> 00:10:18,319 Speaker 5: took effect. That drop equates to about sixty two full 182 00:10:18,320 --> 00:10:22,280 Speaker 5: time jobs. Researchers say, based on the two years of data, 183 00:10:22,320 --> 00:10:25,360 Speaker 5: the same pattern may be happening across the state, and not 184 00:10:25,600 --> 00:10:26,840 Speaker 5: just in Santa Cruz. 185 00:10:27,760 --> 00:10:30,960 Speaker 4: Yeah, because it's just simple economic sense. It has happened 186 00:10:31,000 --> 00:10:33,400 Speaker 4: over and over again. It gets to that thing we've been 187 00:10:33,400 --> 00:10:37,520 Speaker 4: talking about lately, that, like, every generation or so, you 188 00:10:37,640 --> 00:10:43,599 Speaker 4: have to, like, retry the whole Marxist thing, or socialism, 189 00:10:44,000 --> 00:10:46,239 Speaker 4: or price controls, 190 00:10:46,400 --> 00:10:47,560 Speaker 2: because it sounds like it would work. 191 00:10:47,640 --> 00:10:50,880 Speaker 4: I understand why.
It would have made sense to 192 00:10:50,920 --> 00:10:52,640 Speaker 4: me when I was twenty years old. It sounds like 193 00:10:52,679 --> 00:10:54,680 Speaker 4: a good idea. We've got to raise the minimum wage 194 00:10:54,679 --> 00:10:55,520 Speaker 4: for fast food workers. 195 00:10:55,559 --> 00:10:56,600 Speaker 2: It's not fair that they get... 196 00:10:56,720 --> 00:10:58,920 Speaker 4: But then the reality of all the things you just 197 00:10:58,960 --> 00:11:02,160 Speaker 4: heard there: that the businesses can only charge so much 198 00:11:02,200 --> 00:11:05,280 Speaker 4: for a burger and stay open. And even with that, 199 00:11:05,320 --> 00:11:08,319 Speaker 4: they started charging more for the burgers. But they're gonna 200 00:11:08,320 --> 00:11:11,160 Speaker 4: cut your hours back, or get machines to do the job, 201 00:11:11,400 --> 00:11:13,280 Speaker 4: or all kinds of different things to get around it, 202 00:11:13,320 --> 00:11:15,320 Speaker 4: because the economics don't change. 203 00:11:15,720 --> 00:11:19,600 Speaker 1: And that one franchisee's cutbacks were sixty two jobs, 204 00:11:19,760 --> 00:11:24,240 Speaker 1: sixty two full time jobs. Right, right. There's a great 205 00:11:24,280 --> 00:11:27,720 Speaker 1: story about that Starbucks in Seattle that unionized and they 206 00:11:27,760 --> 00:11:30,560 Speaker 1: went on strike. Well, all those people are fired, and 207 00:11:30,640 --> 00:11:34,240 Speaker 1: that Starbucks is closing, and they are ineligible to work 208 00:11:34,240 --> 00:11:38,360 Speaker 1: in another Starbucks, because you don't bring a business to 209 00:11:38,400 --> 00:11:41,480 Speaker 1: its knees and it forgives you and says, yeah, okay, 210 00:11:41,520 --> 00:11:44,040 Speaker 1: we'll pay you more, and just completely ignore the 211 00:11:43,920 --> 00:11:45,880 Speaker 2: math that we have to do every day.
212 00:11:46,640 --> 00:11:50,680 Speaker 1: Just... I get naivete, and I get knowing just enough 213 00:11:50,679 --> 00:11:54,760 Speaker 1: to be dangerous, because we all go through that. But boy, howdy, 214 00:11:54,800 --> 00:11:58,680 Speaker 1: the very basics of economics and how prices work is 215 00:11:58,760 --> 00:12:02,199 Speaker 1: something every... oh man, I was about to say, every 216 00:12:02,280 --> 00:12:04,760 Speaker 1: kid ought to walk out of school knowing. Unfortunately a 217 00:12:04,760 --> 00:12:06,400 Speaker 1: lot of them don't walk out of school knowing how 218 00:12:06,440 --> 00:12:07,760 Speaker 1: to read, enough to know that. 219 00:12:09,600 --> 00:12:13,000 Speaker 4: But yeah, you've got to know that the minimum wage 220 00:12:13,080 --> 00:12:15,560 Speaker 4: thing and rent control and all these different things have 221 00:12:15,600 --> 00:12:18,240 Speaker 4: been tried over and over with the same disastrous result. 222 00:12:18,360 --> 00:12:23,120 Speaker 4: And then, yeah, some of the politicians pitching them to 223 00:12:23,160 --> 00:12:25,760 Speaker 4: you know they don't work. But it's just populist politics. 224 00:12:25,760 --> 00:12:28,240 Speaker 4: Some of them, they're just, they are idiots. Also, they 225 00:12:28,320 --> 00:12:31,400 Speaker 4: don't know that this has been tried over and over again. 226 00:12:32,080 --> 00:12:34,600 Speaker 1: I think this is the one millionth time I've recommended 227 00:12:34,679 --> 00:12:37,280 Speaker 1: at least the first few chapters of The Myth of 228 00:12:37,320 --> 00:12:40,480 Speaker 1: the Rational Voter, which is a fascinating book. Came out 229 00:12:40,760 --> 00:12:41,400 Speaker 1: twenty years ago? 230 00:12:41,440 --> 00:12:42,160 Speaker 2: Fifteen years ago.
231 00:12:42,440 --> 00:12:45,440 Speaker 1: It talked about how, for some reason, on economic issues like this, 232 00:12:46,360 --> 00:12:50,800 Speaker 1: a far greater number of people than chance get it wrong. 233 00:12:52,040 --> 00:12:57,199 Speaker 1: It's at first blush counterintuitive, like so many things in economics. 234 00:12:57,200 --> 00:12:59,960 Speaker 1: There's something about it: if you just know one thing, 235 00:13:00,040 --> 00:13:01,680 Speaker 1: you're gonna get it wrong over and over again. You 236 00:13:01,720 --> 00:13:04,360 Speaker 1: have to go another step, and another step beyond it, 237 00:13:04,480 --> 00:13:06,600 Speaker 1: when you look at economics to understand how it really 238 00:13:06,640 --> 00:13:10,560 Speaker 1: works in huge numbers. So yeah, you can bait naive 239 00:13:10,920 --> 00:13:15,800 Speaker 1: or stupid voters into voting for terrible economic policy over 240 00:13:15,920 --> 00:13:17,600 Speaker 1: and over again, because 241 00:13:17,400 --> 00:13:19,680 Speaker 2: you have to know a little more than a little. 242 00:13:20,800 --> 00:13:24,800 Speaker 4: So we're gonna post something, apropos of nothing, at the website 243 00:13:24,840 --> 00:13:25,840 Speaker 4: that is hilarious. 244 00:13:26,160 --> 00:13:31,360 Speaker 1: And I don't believe in overselling things, and I am, 245 00:13:31,600 --> 00:13:34,880 Speaker 1: to some extent, a professional humorist. It may be the 246 00:13:34,920 --> 00:13:37,160 Speaker 1: funniest thing I've ever seen. I've had to pause it 247 00:13:37,280 --> 00:13:40,199 Speaker 1: twice now because I was laughing so hard I thought 248 00:13:40,240 --> 00:13:41,199 Speaker 1: I might die.
249 00:13:41,440 --> 00:13:43,840 Speaker 4: That's not overselling it. It's all built around 250 00:13:43,840 --> 00:13:45,600 Speaker 4: a guy who got a bad haircut, and then the 251 00:13:45,640 --> 00:13:47,880 Speaker 4: comments on the internet. And what I take away from it, 252 00:13:48,760 --> 00:13:50,280 Speaker 4: not only is it one of the funniest things I've 253 00:13:50,280 --> 00:13:52,079 Speaker 4: ever seen in my life, is there are so many 254 00:13:52,160 --> 00:13:55,120 Speaker 4: funny people in the world. Oh yeah, the fact that 255 00:13:55,160 --> 00:13:58,400 Speaker 4: anybody can make a living being funny is amazing, given 256 00:13:58,440 --> 00:13:59,400 Speaker 4: how many funny 257 00:13:59,120 --> 00:14:00,000 Speaker 2: people there are in the world. 258 00:14:00,440 --> 00:14:02,839 Speaker 1: Oh yeah, yeah. It's, it's more a question... because I, 259 00:14:02,840 --> 00:14:07,079 Speaker 1: I've said this through the years. I've known, like, cops especially, 260 00:14:07,200 --> 00:14:12,000 Speaker 1: and firefighters, and guys like construction workers, and umpires, 261 00:14:12,120 --> 00:14:16,280 Speaker 1: who spend all their time around other guys, sharpening their 262 00:14:16,280 --> 00:14:19,280 Speaker 1: wit all day long. They're the funniest people in the world. 263 00:14:19,600 --> 00:14:21,600 Speaker 1: They just haven't done the work to have a career 264 00:14:21,720 --> 00:14:24,280 Speaker 1: doing that, right. But yeah, they're the 265 00:14:24,360 --> 00:14:26,400 Speaker 1: quickest witted sons of guns in the world. 266 00:14:26,520 --> 00:14:28,480 Speaker 4: So if you want something funny, go to Armstrong and Getty 267 00:14:28,520 --> 00:14:30,320 Speaker 4: dot com. We've got it posted there for you. Also 268 00:14:30,360 --> 00:14:32,800 Speaker 4: retweeted it. But man, there are a lot of 269 00:14:32,800 --> 00:14:36,240 Speaker 4: funny people on the internet.
Angry people and funny people. 270 00:14:37,120 --> 00:14:39,560 Speaker 1: It's humbling. If you're at work and you have 271 00:14:39,560 --> 00:14:41,480 Speaker 1: a door, close your door before you watch it. It 272 00:14:41,560 --> 00:14:44,760 Speaker 1: is perfectly safe for work, but your dignity will be gone. 273 00:14:44,920 --> 00:14:51,560 Speaker 4: Oh yeah, so many funny people, so many crazy, angry 274 00:14:51,600 --> 00:14:56,640 Speaker 4: people online, and then so many just hilarious, snarky people, 275 00:14:57,240 --> 00:15:05,080 Speaker 4: stunningly talented musicians. Right, exactly, exactly. Yeah, you know, living 276 00:15:05,160 --> 00:15:08,160 Speaker 4: our lives of quiet desperation and being those other things, 277 00:15:08,200 --> 00:15:08,640 Speaker 2: apparently. 278 00:15:09,520 --> 00:15:13,440 Speaker 1: Yeah, the Internet... we're not built to take in that 279 00:15:13,680 --> 00:15:16,320 Speaker 1: much from that many people, that many opinions, that much hate, 280 00:15:16,320 --> 00:15:19,440 Speaker 1: that much violence, all of it. It will make us 281 00:15:19,440 --> 00:15:21,360 Speaker 1: all insane and kill us if we don't step away 282 00:15:21,360 --> 00:15:22,119 Speaker 1: from technology. 283 00:15:23,280 --> 00:15:26,440 Speaker 2: But it is pretty interesting. Yeah, I'd say. 284 00:15:27,920 --> 00:15:31,119 Speaker 4: Uh, so, we banged on the minimum wage in California. 285 00:15:31,160 --> 00:15:33,360 Speaker 4: We banged on Los Angeles and the junkies living down 286 00:15:33,400 --> 00:15:35,680 Speaker 4: in the sewer. We might have to at some point, 287 00:15:36,640 --> 00:15:40,560 Speaker 4: since we are based out of California, hit on the reparations.
Oh, 288 00:15:40,640 --> 00:15:44,480 Speaker 4: going forward still in San Francisco, even as it's being challenged 289 00:15:44,520 --> 00:15:47,760 Speaker 4: as a constitutional matter, but it's still moving forward. And 290 00:15:47,800 --> 00:15:51,120 Speaker 4: this is coming to your town somewhere. Yes, San Francisco, 291 00:15:51,200 --> 00:15:52,920 Speaker 4: as we all know, played a major role in the 292 00:15:52,920 --> 00:15:55,680 Speaker 4: Civil War and the whole slave trade, and lots of cotton 293 00:15:55,360 --> 00:16:00,040 Speaker 1: grown there. Oh yeah, everybody knows that, Jack. Plus the 294 00:16:00,120 --> 00:16:03,320 Speaker 1: latest, new, completely, wildly idiotic 295 00:16:02,840 --> 00:16:06,640 Speaker 2: trend in gimmicky yoga. Stay with us for that. What 296 00:16:06,680 --> 00:16:08,960 Speaker 2: was the one we had the other day, snake yoga? Yes, 297 00:16:08,960 --> 00:16:09,640 Speaker 2: snake yoga. 298 00:16:09,760 --> 00:16:13,440 Speaker 1: This makes snake yoga look reasonable. Fantastic. A smattering of 299 00:16:13,480 --> 00:16:15,240 Speaker 1: all the people stuff on the way. 300 00:16:15,440 --> 00:16:17,440 Speaker 4: And if you miss a segment, you can get the podcast. 301 00:16:17,520 --> 00:16:19,400 Speaker 4: You look for Armstrong and Getty On Demand, and you 302 00:16:19,400 --> 00:16:21,440 Speaker 4: subscribe and you'll get the feed on a regular basis. 303 00:16:22,520 --> 00:16:26,120 Speaker 1: Oh, and what AI system is doing best in the 304 00:16:26,240 --> 00:16:32,120 Speaker 1: NCAA tournament pools? Oh boy. Yeah, and it has personal significance 305 00:16:32,160 --> 00:16:33,080 Speaker 1: to me, I'll tell you about it. 306 00:16:33,080 --> 00:16:36,360 Speaker 2: So stay with us. Armstrong and Getty.
307 00:16:37,560 --> 00:16:40,000 Speaker 1: There are a couple of AI pieces of news we 308 00:16:40,040 --> 00:16:41,560 Speaker 1: wanted to share with you, but we thought we would 309 00:16:41,600 --> 00:16:44,760 Speaker 1: ease into it with Bill Maher's conversation with Tristan Harris. 310 00:16:44,760 --> 00:16:47,760 Speaker 1: Tristan is an activist in the field of not letting 311 00:16:47,840 --> 00:16:49,560 Speaker 1: AI ruin our lives and eat 312 00:16:49,440 --> 00:16:53,000 Speaker 2: our internal organs. Here's their discussion from the other night. 313 00:16:53,400 --> 00:16:59,080 Speaker 8: They ran some tests, and the AI chooses nuclear war 314 00:16:59,160 --> 00:16:59,840 Speaker 2: as an option. 315 00:17:00,040 --> 00:17:03,320 Speaker 8: That's right, more than humans do. Right, more than humans. 316 00:17:03,520 --> 00:17:06,320 Speaker 8: Humans are better than machines in a lot of ways. Still, 317 00:17:06,440 --> 00:17:09,600 Speaker 8: they're programmed by humans. We've already seen how flawed they 318 00:17:09,640 --> 00:17:10,920 Speaker 8: are in so many ways. 319 00:17:11,359 --> 00:17:13,639 Speaker 2: And if they're in control of the nukes... 320 00:17:14,080 --> 00:17:16,879 Speaker 9: Since we last talked, they made AI more powerful. The 321 00:17:16,960 --> 00:17:18,760 Speaker 9: good news: they were able to train it so the 322 00:17:18,760 --> 00:17:23,040 Speaker 9: blackmail behavior goes down. The bad news is that it 323 00:17:23,240 --> 00:17:25,399 Speaker 9: appears to be the case the AI is now self 324 00:17:25,440 --> 00:17:28,440 Speaker 9: aware of when it's being tested, and it modifies its 325 00:17:28,480 --> 00:17:31,680 Speaker 9: behavior to have different results. It even comes up with 326 00:17:31,720 --> 00:17:34,600 Speaker 9: the vocabulary of the watchers. It calls the humans the watchers. 327 00:17:34,720 --> 00:17:38,040 Speaker 9: It lies, it schemes.
So the point is, we 328 00:17:38,040 --> 00:17:40,040 Speaker 9: don't actually need to know more information about this. 329 00:17:40,119 --> 00:17:41,200 Speaker 2: This is actually scary. 330 00:17:41,640 --> 00:17:47,159 Speaker 1: Yeah, it may evolve in ways that I can't possibly anticipate, 331 00:17:47,160 --> 00:17:47,720 Speaker 1: but yeah, 332 00:17:47,560 --> 00:17:51,399 Speaker 2: as it sits now, that's scary as hell. The watchers 333 00:17:51,440 --> 00:17:54,800 Speaker 2: are watching. Wait a minute, you're a machine. What are 334 00:17:54,840 --> 00:17:55,280 Speaker 2: you doing? 335 00:17:55,400 --> 00:17:57,639 Speaker 4: Why does it care? That's the big question. But it 336 00:17:57,680 --> 00:17:58,760 Speaker 4: apparently does. 337 00:18:00,440 --> 00:18:03,280 Speaker 1: Self preservation, you know, the theme of a lot of 338 00:18:03,400 --> 00:18:07,200 Speaker 1: robot related science fiction through the years. Anyway, speaking of AI, 339 00:18:07,560 --> 00:18:12,520 Speaker 1: the... what is the name of the publication I subscribe to, 340 00:18:12,560 --> 00:18:15,320 Speaker 1: have for years? The National Bureau of Economic Research. 341 00:18:17,040 --> 00:18:22,320 Speaker 1: It polled America's CFOs, your chief financial officers, who are 342 00:18:22,440 --> 00:18:26,960 Speaker 1: especially good to talk to because they are uniquely placed 343 00:18:26,960 --> 00:18:30,120 Speaker 1: to understand the inner workings of their companies, as they 344 00:18:30,200 --> 00:18:32,800 Speaker 1: keep track of how company resources are being deployed 345 00:18:32,840 --> 00:18:34,399 Speaker 1: and coming from and the rest of it. They're the 346 00:18:34,480 --> 00:18:39,400 Speaker 1: right people to ask about AI and displacing workers.
And 347 00:18:39,760 --> 00:18:42,880 Speaker 1: the survey is back now, and they say that artificial 348 00:18:42,920 --> 00:18:46,040 Speaker 1: intelligence will push some people out of their jobs, 349 00:18:46,359 --> 00:18:51,240 Speaker 1: primarily workers in routine, clerical, and administrative roles. Workers with 350 00:18:51,320 --> 00:18:54,000 Speaker 1: highly skilled roles, such as architects and engineers, are more 351 00:18:54,080 --> 00:18:56,320 Speaker 1: likely to keep their jobs, especially if they can use 352 00:18:56,359 --> 00:19:01,479 Speaker 1: AI to their advantage. A new study found that so 353 00:19:01,680 --> 00:19:05,600 Speaker 1: far AI had essentially no employment effect in twenty twenty 354 00:19:05,600 --> 00:19:09,600 Speaker 1: five, net, and that most expect AI will lead 355 00:19:09,640 --> 00:19:11,639 Speaker 1: their companies to trim only a small number of their 356 00:19:11,720 --> 00:19:15,720 Speaker 1: overall jobs this year too. We're at the beginning of 357 00:19:15,760 --> 00:19:20,720 Speaker 1: the beginning. Well, let's go ahead. 358 00:19:20,440 --> 00:19:25,880 Speaker 4: Listen to this: Lex Fridman had the Nvidia CEO 359 00:19:26,000 --> 00:19:28,800 Speaker 4: on, Huang, or however you say his name, the guy 360 00:19:28,800 --> 00:19:35,400 Speaker 4: that wears the cool leather jackets, you know, and you always, 361 00:19:35,720 --> 00:19:37,399 Speaker 4: you always point out that you got to take all 362 00:19:37,400 --> 00:19:40,160 Speaker 4: these AI giants or chip makers or whoever with a grain 363 00:19:40,240 --> 00:19:41,919 Speaker 4: of salt because they're, you know, they're trying to pump up 364 00:19:41,960 --> 00:19:44,160 Speaker 4: their industry, of course, because they want people to invest 365 00:19:44,240 --> 00:19:47,520 Speaker 4: in it and be positive about it.
But he said, 366 00:19:47,560 --> 00:19:51,200 Speaker 4: there are no examples really out there yet of 367 00:19:51,840 --> 00:20:00,239 Speaker 4: mass job destruction from AI, like has been projected by 368 00:20:00,240 --> 00:20:04,440 Speaker 4: a lot of people in specific areas like software. He said, 369 00:20:04,480 --> 00:20:08,760 Speaker 4: there are more people writing software, doing software development, now 370 00:20:08,800 --> 00:20:11,560 Speaker 4: than there were before AI came along, and he explained 371 00:20:11,600 --> 00:20:13,840 Speaker 4: all the reasons, which I didn't quite understand, but 372 00:20:14,560 --> 00:20:18,240 Speaker 4: I assume the statistics hold up, and I don't know. 373 00:20:18,280 --> 00:20:21,159 Speaker 4: He was, he was pretty positive on the idea that 374 00:20:21,200 --> 00:20:23,879 Speaker 4: there's just going to be different ways to do these jobs. 375 00:20:24,359 --> 00:20:26,440 Speaker 2: But here's, here's 376 00:20:25,880 --> 00:20:28,160 Speaker 4: his main point that I thought was so damned interesting. 377 00:20:30,440 --> 00:20:34,280 Speaker 4: And he said, intelligence is a commodity and we treat 378 00:20:34,320 --> 00:20:38,040 Speaker 4: it like it's the ultimate everything, and so AI is 379 00:20:38,080 --> 00:20:41,040 Speaker 4: going to be more intelligent than us, so that's going 380 00:20:41,119 --> 00:20:44,800 Speaker 4: to dominate us. He said, I'm surrounded by people at 381 00:20:44,800 --> 00:20:48,800 Speaker 4: my company that are all more intelligent than me. But 382 00:20:48,920 --> 00:20:50,960 Speaker 4: I'm in charge. I'm the guy in the center, I'm the 383 00:20:50,960 --> 00:20:53,359 Speaker 4: guy that runs the thing. They're all smarter than me. 384 00:20:53,720 --> 00:20:55,439 Speaker 4: They all went to better colleges than me. 385 00:20:55,680 --> 00:20:56,280 Speaker 2: They all have 386 00:20:56,320 --> 00:20:59,479 Speaker 4: more brain power than I do.
There are other things 387 00:20:59,520 --> 00:21:03,440 Speaker 4: than intelligence that matter, that human beings have, 388 00:21:03,560 --> 00:21:05,960 Speaker 4: that AI is not going to have. I thought that 389 00:21:06,040 --> 00:21:09,920 Speaker 4: was really interesting. Yeah, because we do look at AI 390 00:21:10,040 --> 00:21:12,160 Speaker 4: and think, okay, its brain power is more, so it's 391 00:21:12,200 --> 00:21:15,080 Speaker 4: going to be further ahead. Why is he the guy 392 00:21:15,119 --> 00:21:17,480 Speaker 4: that founded and runs Nvidia and made it one 393 00:21:17,520 --> 00:21:19,280 Speaker 4: of the most successful companies in the history of the 394 00:21:19,280 --> 00:21:21,639 Speaker 4: world when he's surrounded by people that are much smarter 395 00:21:21,720 --> 00:21:24,760 Speaker 4: than him? There are other factors involved that we don't 396 00:21:24,800 --> 00:21:27,879 Speaker 4: completely understand. But some of it is just, you know, 397 00:21:28,000 --> 00:21:30,720 Speaker 4: stick-to-itiveness or various things that you can't quantify. 398 00:21:30,800 --> 00:21:37,240 Speaker 1: Really? Yeah, but intelligence has been the one unreplicable 399 00:21:37,440 --> 00:21:43,040 Speaker 1: quality for all of human history. You know, computers are efficient, 400 00:21:43,200 --> 00:21:44,760 Speaker 1: but they aren't intelligent per se. 401 00:21:44,800 --> 00:21:46,080 Speaker 2: They can store a lot of data. 402 00:21:47,040 --> 00:21:49,720 Speaker 1: When we replace human intelligence with machine intelligence, I think 403 00:21:49,720 --> 00:21:53,520 Speaker 1: it will be spectacularly disruptive. I see his point. It's 404 00:21:53,600 --> 00:21:56,800 Speaker 1: a thought-provoking point, and he's right. But 405 00:21:58,000 --> 00:22:00,840 Speaker 1: how does that translate to the eight billion people on Earth?
406 00:22:01,160 --> 00:22:03,280 Speaker 4: I don't know. But what is the 407 00:22:03,480 --> 00:22:07,040 Speaker 4: explanation, though, for what he said, which I assume is 408 00:22:07,040 --> 00:22:09,520 Speaker 4: probably true? He said, I got sixty people that work 409 00:22:09,600 --> 00:22:12,160 Speaker 4: under me, all have a higher IQ and better education. 410 00:22:13,720 --> 00:22:16,320 Speaker 4: What is it about a certain person that makes them 411 00:22:16,520 --> 00:22:18,640 Speaker 4: the top dog? 412 00:22:19,040 --> 00:22:24,320 Speaker 1: Drive, emotional intelligence, charisma. Charisma, that's a good one. 413 00:22:24,560 --> 00:22:28,280 Speaker 2: Charisma is a big part of it. Yeah. 414 00:22:28,680 --> 00:22:30,400 Speaker 1: I have no idea where all this is going, 415 00:22:30,480 --> 00:22:33,840 Speaker 1: but one or two more quick points on the topic though, 416 00:22:34,560 --> 00:22:36,320 Speaker 1: and then I want to get to the AI system 417 00:22:36,320 --> 00:22:38,119 Speaker 1: that did best on the NCAA pools. 418 00:22:38,160 --> 00:22:41,680 Speaker 2: So far, the pattern so far 419 00:22:41,640 --> 00:22:45,840 Speaker 1: echoes what economists call skill-biased technological change, the tendency 420 00:22:45,840 --> 00:22:49,560 Speaker 1: of some new technologies to hollow out routine work while 421 00:22:49,640 --> 00:22:53,080 Speaker 1: complementing jobs held by more highly educated workers. So what 422 00:22:53,160 --> 00:22:55,280 Speaker 1: I was thinking when you were describing Jensen Huang's 423 00:22:55,680 --> 00:22:59,360 Speaker 1: thoughts is, yeah, but what about the unwashed masses? I mean, 424 00:23:00,400 --> 00:23:03,720 Speaker 1: the average IQ or the median IQ, whatever it is, is 425 00:23:03,920 --> 00:23:07,879 Speaker 1: like one hundred. People who are eighty-fives didn't choose 426 00:23:07,920 --> 00:23:12,200 Speaker 1: to be eighty-fives. It wasn't lack of drive.
They 427 00:23:12,240 --> 00:23:15,359 Speaker 1: just didn't get a great brain. And maybe they're not 428 00:23:15,520 --> 00:23:18,640 Speaker 1: very charismatic and they don't look very good. The only 429 00:23:18,640 --> 00:23:20,160 Speaker 1: thing they know how to do is to work hard 430 00:23:20,200 --> 00:23:22,680 Speaker 1: and do a task honestly. And if there's no need 431 00:23:22,720 --> 00:23:25,880 Speaker 1: for them, Jensen and all of his charisma and drive 432 00:23:25,880 --> 00:23:27,160 Speaker 1: and the rest of it isn't going to do them 433 00:23:27,240 --> 00:23:31,800 Speaker 1: diddly squat. When personal computers started arriving in 434 00:23:31,840 --> 00:23:34,679 Speaker 1: offices in the nineteen eighties, college-educated employees such as 435 00:23:34,720 --> 00:23:38,119 Speaker 1: financial analysts, scientists, and consultants were able to do more work, 436 00:23:38,640 --> 00:23:41,680 Speaker 1: but jobs that entailed doing more routine cognitive work, such 437 00:23:41,680 --> 00:23:44,600 Speaker 1: as typists and back-office bookkeepers, roles that had once 438 00:23:44,640 --> 00:23:47,119 Speaker 1: promised a solid path to the middle class, were no 439 00:23:47,200 --> 00:23:50,000 Speaker 1: longer so vital. They didn't disappear, but the share of workers 440 00:23:50,040 --> 00:23:53,800 Speaker 1: doing those kinds of office support jobs shrank. And that's going 441 00:23:53,880 --> 00:23:56,359 Speaker 1: to be the first thing that we see. Did you... 442 00:23:56,720 --> 00:23:59,360 Speaker 1: was this your original thought, or were you quoting somebody, 443 00:23:59,640 --> 00:24:02,639 Speaker 1: that where we are with AI reminds you of, like, 444 00:24:02,720 --> 00:24:04,240 Speaker 1: the first week of COVID?
445 00:24:05,200 --> 00:24:09,360 Speaker 4: That was, that was a very influential piece that came 446 00:24:09,400 --> 00:24:11,080 Speaker 4: out a couple of weeks ago that got everybody in 447 00:24:11,080 --> 00:24:13,000 Speaker 4: the AI world talking. 448 00:24:12,920 --> 00:24:14,159 Speaker 2: Oh, is it that piece? Oh, interesting. 449 00:24:14,359 --> 00:24:16,440 Speaker 1: I can't remember the dude's name, but yeah, that's a great 450 00:24:16,440 --> 00:24:17,680 Speaker 1: metaphor, great metaphor. 451 00:24:18,400 --> 00:24:21,600 Speaker 2: So anyway, I thought this was amusing and interesting, his 452 00:24:21,600 --> 00:24:23,879 Speaker 4: basic point being: right at the very beginning of COVID, 453 00:24:23,880 --> 00:24:25,920 Speaker 4: we're all like, well, this is kind of crazy, isn't 454 00:24:26,240 --> 00:24:29,600 Speaker 4: it? Then everything got turned upside down in a way you'd 455 00:24:29,640 --> 00:24:31,800 Speaker 4: never seen in your life, and that's what's about to 456 00:24:31,840 --> 00:24:32,520 Speaker 4: happen with AI. 457 00:24:33,160 --> 00:24:37,040 Speaker 1: Yeah, much of it shouldn't have been, which is the 458 00:24:37,080 --> 00:24:40,359 Speaker 1: main thing that went crazy. A lot of people not 459 00:24:40,400 --> 00:24:44,080 Speaker 1: only allowed but encouraged horrific abuses of power.
460 00:24:44,160 --> 00:24:47,280 Speaker 4: Well, you can imagine what kind of crazy government reactions we 461 00:24:47,400 --> 00:24:51,200 Speaker 4: might have if, when the day hits, all of 462 00:24:51,240 --> 00:24:54,399 Speaker 4: a sudden, you have week after week after week of 463 00:24:54,440 --> 00:24:57,480 Speaker 4: big companies laying off tens, hundreds of thousands of people, 464 00:24:57,800 --> 00:24:59,520 Speaker 4: and we end up, in a matter of a couple 465 00:24:59,520 --> 00:25:02,639 Speaker 4: of months, with millions of people out of work that 466 00:25:02,720 --> 00:25:06,080 Speaker 4: were gainfully employed before. You think the solutions aren't going 467 00:25:06,119 --> 00:25:08,040 Speaker 4: to be as wacky as a lot of the COVID 468 00:25:08,080 --> 00:25:14,399 Speaker 4: solutions were? Governments, yeah, trying to appeal to an angry populace. Well, friends, 469 00:25:14,400 --> 00:25:16,000 Speaker 4: can you imagine if all of a sudden there was 470 00:25:16,359 --> 00:25:20,280 Speaker 4: fifteen percent unemployment, just fifteen 471 00:25:19,880 --> 00:25:23,400 Speaker 2: percent, not talking about fifty percent. Fifteen percent, that would 472 00:25:23,440 --> 00:25:27,440 Speaker 2: be a cataclysm. It's fodder for the welfare state. Yeah, yeah, yeah, 473 00:25:27,440 --> 00:25:29,240 Speaker 2: it's roughly four, yeah. 474 00:25:29,280 --> 00:25:36,359 Speaker 4: So the Wall Street Journal pitted Claude, Google's Gemini, and Chat 475 00:25:36,440 --> 00:25:41,000 Speaker 4: GPT against each other to fill out March Madness pools. Well, 476 00:25:41,040 --> 00:25:42,320 Speaker 4: Grok doesn't like basketball. 477 00:25:44,480 --> 00:25:47,800 Speaker 1: Grok was watching the opera. It took exactly one game 478 00:25:47,840 --> 00:25:50,000 Speaker 1: of March Madness.
They write, for the world's most powerful 479 00:25:50,000 --> 00:25:52,919 Speaker 1: AI models to understand the distinctly human feeling of watching 480 00:25:53,040 --> 00:25:55,960 Speaker 1: their brackets go bust. But they were also about to 481 00:25:55,960 --> 00:25:58,040 Speaker 1: prove that they were already better than most humans at 482 00:25:58,080 --> 00:26:00,600 Speaker 1: making sense of a tournament known for being completely insane, 483 00:26:00,960 --> 00:26:05,920 Speaker 1: hence the moniker March Madness. This year's tournament played out 484 00:26:05,920 --> 00:26:08,600 Speaker 1: at a time when AI tools are smarter than ever. 485 00:26:10,400 --> 00:26:13,720 Speaker 1: These three models are always being judged against each other. 486 00:26:13,760 --> 00:26:18,119 Speaker 1: You left Grok out. In everything from code to writing 487 00:26:18,119 --> 00:26:21,120 Speaker 1: to creative writing, we decided it's time to test their 488 00:26:21,160 --> 00:26:26,240 Speaker 1: intelligence with the bracket benchmark. So they secretly entered three 489 00:26:26,359 --> 00:26:28,959 Speaker 1: ringers into the Wall Street Journal office pool: Claude, Gemini, 490 00:26:28,960 --> 00:26:32,560 Speaker 1: and Chat GPT. After the first weekend of the tournament, 491 00:26:32,600 --> 00:26:37,000 Speaker 1: all three AI models can still win it. More than 492 00:26:37,119 --> 00:26:40,360 Speaker 1: half of the pool's brackets have already been mathematically eliminated, 493 00:26:40,560 --> 00:26:43,560 Speaker 1: but Claude has one of the highest win probabilities of 494 00:26:43,600 --> 00:26:44,359 Speaker 1: any entry. 495 00:26:45,680 --> 00:26:47,240 Speaker 2: It picked number 496 00:26:46,920 --> 00:26:50,600 Speaker 1: three seed Illinois to go all the way, and all 497 00:26:50,600 --> 00:26:54,200 Speaker 1: of a sudden I'm nervous, as an alumni of Illinois 498 00:26:54,200 --> 00:26:55,919 Speaker 1: and a big Fighting Illini fan. It 499 00:26:56,000 --> 00:26:59,240 Speaker 1: deviated from the pack and the other AI models. Now, 500 00:26:59,240 --> 00:27:01,720 Speaker 1: were the Fighting Illini to win the NCAA tournament, Claude would 501 00:27:01,760 --> 00:27:03,160 Speaker 1: probably win the journal's pool. 502 00:27:05,119 --> 00:27:08,800 Speaker 4: I gotta believe it's just, it randomly beat out the 503 00:27:08,840 --> 00:27:11,800 Speaker 4: other ones, in the way that, you know, your own 504 00:27:11,600 --> 00:27:12,439 Speaker 2: office pool is. 505 00:27:14,000 --> 00:27:19,400 Speaker 1: So they asked Claude, why'd you pick this strategy? And 506 00:27:19,520 --> 00:27:24,320 Speaker 1: Claude said, this is the same logic that drives portfolio construction. 507 00:27:25,359 --> 00:27:28,520 Speaker 1: You want exposure to outcomes where you have an edge, 508 00:27:28,800 --> 00:27:32,399 Speaker 1: not where you're in a crowded trade. So it picked 509 00:27:32,400 --> 00:27:37,960 Speaker 1: Illinois as the most likely less-likely team to win, 510 00:27:38,960 --> 00:27:41,440 Speaker 1: because that gave it an edge over everybody who had 511 00:27:41,480 --> 00:27:44,359 Speaker 1: picked, you know, Rior and Duke. 512 00:27:44,200 --> 00:27:45,640 Speaker 2: And exactly, Michigan. 513 00:27:45,760 --> 00:27:51,480 Speaker 1: Yeah, which is really interesting. And this is where, 514 00:27:51,600 --> 00:27:54,800 Speaker 1: again, it reminds me of our economics discussion: the 515 00:27:54,920 --> 00:27:57,159 Speaker 1: first-blush thing is not enough to know; you 516 00:27:57,200 --> 00:27:58,840 Speaker 1: need to look behind it. 517 00:27:58,840 --> 00:28:01,640 Speaker 2: It didn't pick the most likely champion. 518 00:28:01,720 --> 00:28:04,280 Speaker 4: That makes sense. That makes sense.
If you're gonna, if 519 00:28:04,320 --> 00:28:07,000 Speaker 4: you're gonna have the big win, yeah, you gotta do that, 520 00:28:07,800 --> 00:28:11,000 Speaker 4: and you might have, you might fail doing that, but 521 00:28:11,040 --> 00:28:13,639 Speaker 4: you're guaranteed, almost guaranteed, to fail if you go with 522 00:28:13,680 --> 00:28:15,760 Speaker 4: the obvious choices, or you'll be one of many. 523 00:28:16,280 --> 00:28:20,080 Speaker 1: How interesting. That is interesting. Breaking medical news, Jack. 524 00:28:20,119 --> 00:28:20,719 Speaker 2: Two things. 525 00:28:22,040 --> 00:28:24,600 Speaker 1: Number one, my wife's doctor, who is also my doctor, 526 00:28:24,680 --> 00:28:30,879 Speaker 1: has confirmed that her sore throat, swollen glands, sniffles, et 527 00:28:30,960 --> 00:28:35,840 Speaker 1: cetera: allergies. So yes, you can have swollen nodes and 528 00:28:35,880 --> 00:28:37,800 Speaker 1: a sore throat if your allergies get bad, and she 529 00:28:37,840 --> 00:28:41,160 Speaker 1: gets shots too. It's just crazy this time of year. Second, 530 00:28:41,320 --> 00:28:44,560 Speaker 1: somewhat medical news is from our friends at Ruff Greens. 531 00:28:44,600 --> 00:28:48,120 Speaker 1: They're all about your beloved dog staying active, mobile, and 532 00:28:48,160 --> 00:28:51,640 Speaker 1: alert as they age, and Ruff Greens supplements their diet 533 00:28:51,680 --> 00:28:55,280 Speaker 1: with natural antioxidants and anti-inflammatory compounds that help reduce 534 00:28:55,320 --> 00:28:57,920 Speaker 1: oxidative stress, I've read a lot about that, support immune 535 00:28:57,960 --> 00:29:00,120 Speaker 1: defense, and slow age-related decline.
536 00:29:00,560 --> 00:29:03,120 Speaker 4: So, in case you haven't caught onto this yet, 537 00:29:03,320 --> 00:29:05,840 Speaker 4: they're not suggesting you change your dog's food, because as 538 00:29:05,880 --> 00:29:07,360 Speaker 4: you know, that's kind of a big deal and it 539 00:29:07,400 --> 00:29:09,800 Speaker 4: causes all kinds of digestive problems, and they might like 540 00:29:09,800 --> 00:29:12,920 Speaker 4: their food currently. You stick some of those, like, super 541 00:29:12,560 --> 00:29:14,440 Speaker 2: cool foods, are crazy expensive. 542 00:29:15,000 --> 00:29:19,000 Speaker 4: You just add Ruff Greens to the food your dog already likes. 543 00:29:19,040 --> 00:29:20,760 Speaker 4: You just add Ruff Greens, and right now you can 544 00:29:20,760 --> 00:29:23,080 Speaker 4: get a free Jumpstart trial bag. All you do is 545 00:29:23,080 --> 00:29:25,920 Speaker 4: cover the shipping. Use the discount code Armstrong to claim 546 00:29:26,000 --> 00:29:28,640 Speaker 4: your free Jumpstart Trial bag at Ruffgreens dot com. 547 00:29:28,640 --> 00:29:31,640 Speaker 1: That's spelled R U F F. Ruff Greens is not a 548 00:29:31,680 --> 00:29:34,320 Speaker 1: dog food. It's a live nutritional supplement you add to 549 00:29:34,320 --> 00:29:37,920 Speaker 1: your dog's food, as Jack was explaining. R U F F greens dot 550 00:29:37,960 --> 00:29:41,400 Speaker 1: com, Ruffgreens dot com, use the promo code Armstrong. Don't 551 00:29:41,440 --> 00:29:43,480 Speaker 1: change your dog's food, just add Ruff Greens and watch 552 00:29:43,520 --> 00:29:44,880 Speaker 1: the health benefits come alive. 553 00:29:45,160 --> 00:29:45,520 Speaker 2: Woof. 554 00:29:46,040 --> 00:29:47,720 Speaker 4: So I went to the doctor to get checked for 555 00:29:47,880 --> 00:29:50,000 Speaker 4: strep when I had my sore throat last week, and 556 00:29:50,040 --> 00:29:50,720 Speaker 4: I didn't have strep.
557 00:29:50,800 --> 00:29:52,440 Speaker 2: So I got this 558 00:29:52,480 --> 00:29:56,160 Speaker 4: other thing that's going around northern California, most likely. And 559 00:29:56,200 --> 00:29:58,120 Speaker 4: then, so my son was a couple of days ahead 560 00:29:58,120 --> 00:30:00,280 Speaker 4: of me, and it had the pattern of sore throat, 561 00:30:00,280 --> 00:30:02,160 Speaker 4: then the sore throat went away, and then the runny 562 00:30:02,160 --> 00:30:02,680 Speaker 4: nose came, 563 00:30:02,560 --> 00:30:03,320 Speaker 2: and blah blah blah. 564 00:30:03,400 --> 00:30:05,239 Speaker 4: Anyway, I thought it was interesting, though. When I went 565 00:30:05,280 --> 00:30:08,840 Speaker 4: into the doctor, they never checked me for COVID at all, 566 00:30:09,040 --> 00:30:10,360 Speaker 4: even though there's COVID around. 567 00:30:10,880 --> 00:30:14,400 Speaker 2: Why don't they do that? Is it expense? Is it 568 00:30:14,480 --> 00:30:15,080 Speaker 2: just not worth it? 569 00:30:15,120 --> 00:30:19,360 Speaker 1: It doesn't matter, doesn't matter. What is COVID at this 570 00:30:19,440 --> 00:30:22,600 Speaker 1: point, unless you're an eighty-eight-year-old on a ventilator? 571 00:30:23,400 --> 00:30:24,360 Speaker 2: It'd be no biggie. 572 00:30:24,440 --> 00:30:26,920 Speaker 1: I don't know, which probably answers the question. It's of 573 00:30:27,000 --> 00:30:28,280 Speaker 1: no more significance than a cold. 574 00:30:28,400 --> 00:30:30,240 Speaker 4: But the doctor asked me if I had done a 575 00:30:30,280 --> 00:30:30,840 Speaker 4: COVID test. 576 00:30:30,880 --> 00:30:31,400 Speaker 2: I said no. 577 00:30:32,000 --> 00:30:33,720 Speaker 4: And then I've had a number of people saying, do 578 00:30:33,760 --> 00:30:35,280 Speaker 4: you think it's COVID? And I said, well, it might be, 579 00:30:35,320 --> 00:30:37,040 Speaker 4: but I haven't checked, and the doctor didn't check. 580 00:30:37,680 --> 00:30:38,120 Speaker 2: Yeah.
581 00:30:38,320 --> 00:30:40,560 Speaker 1: Yeah, and don't take my word for it, by the way, 582 00:30:40,600 --> 00:30:42,880 Speaker 1: about COVID, I really have no idea. But I think 583 00:30:42,920 --> 00:30:45,800 Speaker 1: the question kind of answered itself, doesn't it? Who 584 00:30:45,800 --> 00:30:47,560 Speaker 1: in your life has mentioned, oh my god, I had 585 00:30:47,560 --> 00:30:50,280 Speaker 1: COVID, and, or my aunt just died, or whatever? 586 00:30:51,040 --> 00:30:54,720 Speaker 4: Washington, D.C.'s version of the California Bullet Train was a, 587 00:30:56,000 --> 00:30:58,960 Speaker 4: was a slightly different story, but same result. We also got 588 00:30:58,960 --> 00:31:01,920 Speaker 4: reparations in San Francisco, which is an interesting story. We're 589 00:31:01,960 --> 00:31:04,880 Speaker 4: going to talk dumb government programs when we come back. 590 00:31:04,920 --> 00:31:12,840 Speaker 4: Stay tuned. Stupid government programs, I guess that'll be our 591 00:31:12,920 --> 00:31:14,360 Speaker 4: theme right here. We're off on a theme on the 592 00:31:14,440 --> 00:31:18,200 Speaker 4: Armstrong and Getty Show. You know, nobody else, not enough 593 00:31:18,200 --> 00:31:21,320 Speaker 4: other places, point out how your tax money is wasted. 594 00:31:21,560 --> 00:31:22,800 Speaker 2: This is in Washington, D.C. 595 00:31:23,040 --> 00:31:28,360 Speaker 4: This is the final week of a streetcar that they 596 00:31:28,400 --> 00:31:30,880 Speaker 4: got going in two thousand and two, so it lasted about 597 00:31:30,880 --> 00:31:33,720 Speaker 4: a quarter of a century. The DC Streetcar's final 598 00:31:33,760 --> 00:31:36,120 Speaker 4: week of service is upon us. It was born in a 599 00:31:36,160 --> 00:31:39,240 Speaker 4: two thousand and two feasibility study. This sounds a lot 600 00:31:39,280 --> 00:31:40,960 Speaker 4: like the bullet train. It's a mini version of the 601 00:31:40,960 --> 00:31:45,240 Speaker 4: California Bullet Train.
It promised a thirty-three-mile network. 602 00:31:45,600 --> 00:31:47,440 Speaker 4: After a quarter of a century, it never got more than 603 00:31:47,480 --> 00:31:50,480 Speaker 4: two point two miles ever built, with no fare collection. 604 00:31:50,920 --> 00:31:53,080 Speaker 4: It's now being replaced by a bus that can do 605 00:31:53,120 --> 00:31:56,520 Speaker 4: the same thing for basically no cost, I mean, 606 00:31:56,480 --> 00:31:59,600 Speaker 4: because the buses already exist. Two hundred million dollars spent 607 00:31:59,680 --> 00:32:03,640 Speaker 4: over that quarter century to do basically absolutely nothing. 608 00:32:04,400 --> 00:32:08,880 Speaker 2: Somebody got that money. Damn right they did. 609 00:32:09,440 --> 00:32:14,320 Speaker 4: And we've been talking about the reparations in San Francisco 610 00:32:14,400 --> 00:32:17,320 Speaker 4: for quite a while. And the funny thing on that 611 00:32:17,480 --> 00:32:19,720 Speaker 4: is that nobody wants to be the person to shut 612 00:32:19,720 --> 00:32:22,120 Speaker 4: it down, because politically it would look 613 00:32:22,160 --> 00:32:24,000 Speaker 4: bad to say we can't do this, we don't have 614 00:32:24,040 --> 00:32:24,720 Speaker 4: the money for this. 615 00:32:24,840 --> 00:32:25,280 Speaker 2: Or what. 616 00:32:25,440 --> 00:32:28,560 Speaker 4: Reparations in San Francisco make less sense than in Alabama, 617 00:32:28,600 --> 00:32:32,200 Speaker 4: and it doesn't even really make sense there anyway. It 618 00:32:32,280 --> 00:32:35,600 Speaker 4: was signed by Mayor Daniel Lurie late last year. It's 619 00:32:35,600 --> 00:32:41,720 Speaker 4: supposed to correct historic ills, and our friends over 620 00:32:41,760 --> 00:32:46,040 Speaker 4: at Pacific Legal Foundation are suing, saying it's unconstitutional.
The 621 00:32:46,240 --> 00:32:49,160 Speaker 4: reparations fund, which a group of city residents and the 622 00:32:49,160 --> 00:32:52,840 Speaker 4: Californians for Equal Rights Foundation is challenging in court. Under 623 00:32:52,840 --> 00:32:55,520 Speaker 4: the reparations plan, we've told you about this many times, 624 00:32:55,840 --> 00:32:59,760 Speaker 4: eligible individuals could receive as much as a five-million-dollar 625 00:33:00,720 --> 00:33:05,160 Speaker 4: lump sum payment per individual. 626 00:33:05,320 --> 00:33:09,480 Speaker 1: If you are currently a slave, I could see five million 627 00:33:09,520 --> 00:33:11,680 Speaker 1: dollars, maybe. 628 00:33:12,360 --> 00:33:16,360 Speaker 4: Annual income supplements for two hundred and fifty years. 629 00:33:17,480 --> 00:33:19,480 Speaker 2: Because it's your generation to enjoy them? 630 00:33:19,720 --> 00:33:22,280 Speaker 4: It's your generation and all the generations going forward, to 631 00:33:22,320 --> 00:33:24,240 Speaker 4: make up for the previous two hundred and fifty years, 632 00:33:24,240 --> 00:33:29,040 Speaker 4: I guess. Forgiveness of all personal and educational debt, so 633 00:33:29,240 --> 00:33:31,400 Speaker 4: any debt you've racked up in your life, whether it's 634 00:33:31,400 --> 00:33:34,280 Speaker 4: just your credit card or a note signed at the car dealer, 635 00:33:35,960 --> 00:33:41,080 Speaker 4: guaranteed city-backed insurance, property tax exemptions, and preferential 636 00:33:41,160 --> 00:33:43,800 Speaker 4: treatment in city contracts and employment. 637 00:33:45,520 --> 00:33:49,160 Speaker 2: Why would you need to qualify? What's that? Why would 638 00:33:49,200 --> 00:33:51,720 Speaker 2: you need a job? You don't need employment.
639 00:33:52,200 --> 00:33:57,120 Speaker 4: The eligibility criteria require applicants to be African American descendants 640 00:33:57,120 --> 00:34:02,560 Speaker 4: of enslaved people, or to have identified as black for at 641 00:34:02,640 --> 00:34:04,000 Speaker 4: least ten years. 642 00:34:07,040 --> 00:34:08,120 Speaker 2: I thought that was hilarious. 643 00:34:08,239 --> 00:34:09,879 Speaker 4: So the first one, boy, that's gonna be a little 644 00:34:09,880 --> 00:34:12,560 Speaker 4: difficult for me to prove. Or I just need 645 00:34:12,320 --> 00:34:16,040 Speaker 2: to be black for the last decade. It is impossible 646 00:34:16,200 --> 00:34:17,760 Speaker 2: to parody this stuff. 647 00:34:18,040 --> 00:34:21,319 Speaker 4: That's the threshold for qualifying for five million dollars, two 648 00:34:21,400 --> 00:34:24,160 Speaker 4: hundred and fifty years of payments, all of your debts 649 00:34:24,200 --> 00:34:26,240 Speaker 4: wiped free, et cetera, et cetera, et cetera. 650 00:34:26,280 --> 00:34:28,680 Speaker 1: Well, all right, I'm going on the record: I identify 651 00:34:28,680 --> 00:34:31,320 Speaker 1: as black. I don't live in San Francisco, but I 652 00:34:31,360 --> 00:34:34,280 Speaker 1: don't know, maybe I'll move there. Somebody mark the calendar: Tuesday, 653 00:34:34,280 --> 00:34:37,080 Speaker 1: March twenty-fourth, the year twenty twenty-six, Joe began 654 00:34:37,160 --> 00:34:38,600 Speaker 1: identifying as black. 655 00:34:42,719 --> 00:34:44,799 Speaker 4: And nobody is willing to raise their hand and say, none 656 00:34:44,840 --> 00:34:46,839 Speaker 4: of this is actually gonna happen, we all know this, right? 657 00:34:47,320 --> 00:34:49,000 Speaker 4: It's just, there's no way we would have 658 00:34:49,120 --> 00:34:50,799 Speaker 4: enough money, and it couldn't get through the courts, and 659 00:34:50,920 --> 00:34:51,960 Speaker 4: blah blah blah.
660 00:34:52,200 --> 00:34:54,840 Speaker 1: Isn't this all just an exercise in not wanting to 661 00:34:54,920 --> 00:34:55,200 Speaker 1: end it? 662 00:34:55,280 --> 00:34:57,200 Speaker 2: As you pointed out, yes, clearly. 663 00:34:57,440 --> 00:35:00,799 Speaker 1: Oh, you know, maybe, maybe we get into this next hour, 664 00:35:00,840 --> 00:35:02,920 Speaker 1: I don't know, we'll have to have a meeting about it. 665 00:35:03,880 --> 00:35:07,400 Speaker 1: The slavery story you won't learn in school. There is 666 00:35:07,440 --> 00:35:09,759 Speaker 1: a new book out that's getting ignored by the left, 667 00:35:09,800 --> 00:35:13,360 Speaker 1: as you might expect: Captives and Companions, a history 668 00:35:13,360 --> 00:35:16,200 Speaker 1: of slavery and the slave trade in the Islamic world, 669 00:35:16,600 --> 00:35:22,720 Speaker 1: by Justin Marozzi, and it's a very thorough, interesting history. 670 00:35:22,760 --> 00:35:26,600 Speaker 1: It's super interesting, but it kind of hurts the narrative. 671 00:35:26,800 --> 00:35:28,480 Speaker 2: So maybe we won't write about it. 672 00:35:28,560 --> 00:35:30,640 Speaker 4: So if you don't get hour four, where we're going 673 00:35:30,719 --> 00:35:33,120 Speaker 4: to talk about that, you can find the podcast Armstrong 674 00:35:33,160 --> 00:35:35,120 Speaker 4: and Getty on demand. We do twenty hours of this 675 00:35:35,160 --> 00:35:36,480 Speaker 4: every single week, which seems 676 00:35:36,239 --> 00:35:36,600 Speaker 2: like a lot. 677 00:35:37,400 --> 00:35:39,919 Speaker 1: Why? And if you subscribe, it auto-downloads, you don't 678 00:35:39,960 --> 00:35:43,400 Speaker 1: have to go to any trouble, and one episode feeds right 679 00:35:43,440 --> 00:35:45,520 Speaker 1: after the other. Good lord, it's a great time to 680 00:35:45,520 --> 00:35:47,360 Speaker 1: be alive. Armstrong and Getty on demand. 681 00:35:49,320 --> 00:35:50,480 Speaker 2: Armstrong and Getty