1 00:00:01,840 --> 00:00:06,160 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio the George 2 00:00:06,200 --> 00:00:10,280 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. 3 00:00:10,160 --> 00:00:17,240 Speaker 2: Armstrong and Getty and now, he, Armstrong and Getty. 4 00:00:23,239 --> 00:00:24,960 Speaker 2: We're in the singularity. You're at the top of the 5 00:00:25,040 --> 00:00:26,799 Speaker 2: roller coaster. You're about to go. I don't just 6 00:00:26,880 --> 00:00:27,760 Speaker 2: have courtside seats. 7 00:00:27,760 --> 00:00:30,240 Speaker 3: I'm on the court exactly, and it blows my, and 8 00:00:30,320 --> 00:00:32,840 Speaker 3: still blows my mind sometimes multiple times a week. 9 00:00:33,080 --> 00:00:34,239 Speaker 2: Yeah. 10 00:00:34,320 --> 00:00:36,760 Speaker 3: And so just when I think I'm like wow, and 11 00:00:36,760 --> 00:00:39,720 Speaker 3: then it's like two days later, more wow. 12 00:00:39,960 --> 00:00:45,360 Speaker 2: Yeah, exponential wow. Yeah. I think we'll hit AGI next 13 00:00:45,400 --> 00:00:47,080 Speaker 2: year, in twenty six. Yeah. 14 00:00:47,159 --> 00:00:48,880 Speaker 1: So he said that right at the end of the year. 15 00:00:49,680 --> 00:00:51,960 Speaker 1: Elon Musk there, if you don't recognize his voice. The 16 00:00:52,000 --> 00:00:57,840 Speaker 1: singularity being, basically, and different people have different, uh, definitions, 17 00:00:57,840 --> 00:01:01,720 Speaker 1: but in general, that AI can start programming itself 18 00:01:02,760 --> 00:01:06,039 Speaker 1: and then things get really crazy, really really fast. 19 00:01:06,200 --> 00:01:08,040 Speaker 2: Things are getting weird, I think, getting weird fast.
20 00:01:08,680 --> 00:01:12,720 Speaker 1: Oh, not even Elon, and with the caveat always of 21 00:01:12,760 --> 00:01:17,280 Speaker 1: these people have funding reasons to keep this, perhaps, AI 22 00:01:17,400 --> 00:01:20,720 Speaker 1: bubble going, for talking about we're this close to reaching 23 00:01:20,800 --> 00:01:23,199 Speaker 1: the really critical point. But I mean he and Altman 24 00:01:23,280 --> 00:01:24,880 Speaker 1: and lots of other people. And then, you know, that 25 00:01:24,959 --> 00:01:27,280 Speaker 1: guy that wrote that book last year that I was 26 00:01:27,280 --> 00:01:30,319 Speaker 1: talking about, If Anyone Builds It, Everyone Dies. I mean, 27 00:01:30,360 --> 00:01:32,600 Speaker 1: he worded it great. 28 00:01:32,360 --> 00:01:33,640 Speaker 2: It's one of the best titles. 29 00:01:33,840 --> 00:01:36,520 Speaker 1: Yeah, and he was a proponent of AI for decades, 30 00:01:36,560 --> 00:01:37,800 Speaker 1: going back to like the seventies. 31 00:01:37,840 --> 00:01:39,800 Speaker 2: He used to be on Phil Donahue. He was so 32 00:01:39,920 --> 00:01:40,280 Speaker 2: into it. 33 00:01:40,319 --> 00:01:42,720 Speaker 1: And then we got close enough and he decided, no, 34 00:01:43,080 --> 00:01:46,440 Speaker 1: this is bad for humanity. We have got to pull 35 00:01:46,520 --> 00:01:52,320 Speaker 1: the exit cord on this or whatever. Too late, too late, sorry, chum. 36 00:01:52,960 --> 00:01:56,440 Speaker 4: So we've gotten a couple of really interesting emails about 37 00:01:56,480 --> 00:01:59,800 Speaker 4: that discussion yesterday. Thought we'd share at least one for you. 38 00:02:00,040 --> 00:02:06,280 Speaker 4: Let me do a scan, see if anything...
Boy, you know 39 00:02:06,640 --> 00:02:10,359 Speaker 4: what's interesting is, as I scan emails, people quote unquote 40 00:02:10,400 --> 00:02:12,600 Speaker 4: correct us for being wrong, and they're completely 41 00:02:12,639 --> 00:02:16,560 Speaker 4: wrong with their correction anyway. Okay, correct away. 42 00:02:16,880 --> 00:02:17,640 Speaker 2: Oh, let's see. 43 00:02:18,080 --> 00:02:21,680 Speaker 4: So it's Matt the land surveyor, who I got the 44 00:02:21,680 --> 00:02:25,160 Speaker 4: pleasure to know a little bit personally, and, and when 45 00:02:25,200 --> 00:02:27,480 Speaker 4: I grow up, I want to be like Matt. Anyway, 46 00:02:27,520 --> 00:02:29,839 Speaker 4: he says, yesterday, during the first segment of the show, 47 00:02:29,840 --> 00:02:33,080 Speaker 4: you were discussing what Elon Musk said about AGI and UHI, 48 00:02:33,480 --> 00:02:37,280 Speaker 4: which is universal high income, not basic income. 49 00:02:37,919 --> 00:02:41,280 Speaker 1: We're gonna have so much that he doesn't even understand 50 00:02:41,280 --> 00:02:47,920 Speaker 1: why money will be a thing anymore. Everybody will have housing, energy, healthcare, food. 51 00:02:48,400 --> 00:02:52,400 Speaker 1: All the things you need. Entertainment? Yeah, entertainment. 52 00:02:52,400 --> 00:02:56,960 Speaker 4: Pleasures of the flesh. So you just go down to 53 00:02:57,000 --> 00:02:59,200 Speaker 4: the grocery store, fill up your cart, and walk out again. 54 00:03:00,040 --> 00:03:02,960 Speaker 4: Of course that happens in, like, you know, big cities anyway, 55 00:03:03,160 --> 00:03:07,840 Speaker 4: but this will be on the up and up. You 56 00:03:07,960 --> 00:03:10,000 Speaker 4: go down to the... well, I'm not sure why you 57 00:03:10,000 --> 00:03:12,160 Speaker 4: would go to the Home Depot, because a robot will 58 00:03:12,200 --> 00:03:13,800 Speaker 4: fix your garden. 59 00:03:13,880 --> 00:03:16,680 Speaker 2: Shit. Yeah, which is, you know.
I'll go down to the 60 00:03:16,600 --> 00:03:18,760 Speaker 4: golf store and get a new driver just to try 61 00:03:18,760 --> 00:03:20,040 Speaker 4: it out, and if I don't like it, I'll throw 62 00:03:20,040 --> 00:03:25,320 Speaker 4: it in the woods. I hit this one sideways too. 63 00:03:25,680 --> 00:03:30,320 Speaker 4: This is a bad club anyway. Golfers get that joke anyway. 64 00:03:30,400 --> 00:03:33,840 Speaker 4: So anyway, moving along, uh, Elon said that we will 65 00:03:33,880 --> 00:03:36,960 Speaker 4: all have free healthcare. Oh, oh, I'm sorry, I didn't 66 00:03:36,960 --> 00:03:39,800 Speaker 4: finish Matt's initial thought. We're talking about AGI, 67 00:03:39,880 --> 00:03:41,880 Speaker 4: UHI. And of course the old ROB, the 68 00:03:42,000 --> 00:03:42,760 Speaker 4: River of Blood. 69 00:03:42,880 --> 00:03:45,800 Speaker 2: Oh yeah, that's what, that's, that's what we expect. Yeah, 70 00:03:45,800 --> 00:03:46,720 Speaker 2: that's what I'm predicting. 71 00:03:48,160 --> 00:03:52,520 Speaker 4: Actually, the reason I predicted a river of blood 72 00:03:54,360 --> 00:03:58,080 Speaker 4: was, and Matt doesn't really get into this, but as 73 00:03:58,120 --> 00:04:01,080 Speaker 4: a, as a guy schooled a little bit in political science, 74 00:04:01,160 --> 00:04:09,200 Speaker 4: my question is, after money is made irrelevant and there's 75 00:04:09,440 --> 00:04:12,920 Speaker 4: everything for everybody, as much as you want, because our 76 00:04:13,040 --> 00:04:17,880 Speaker 4: gross domestic product has tenfold increased, right, and we're just, 77 00:04:17,960 --> 00:04:19,160 Speaker 4: we're already the wealthiest nation. 78 00:04:19,200 --> 00:04:21,880 Speaker 2: We're just so incredibly wealthy, right.
79 00:04:22,320 --> 00:04:27,320 Speaker 4: What will be the mechanism for taking that wealth from 80 00:04:27,360 --> 00:04:31,280 Speaker 4: the very, very few who generate it and distributing it 81 00:04:31,320 --> 00:04:32,440 Speaker 4: to the rest of us? 82 00:04:33,120 --> 00:04:36,599 Speaker 2: Who will do it? Under what legal framework? And why? 83 00:04:37,240 --> 00:04:40,120 Speaker 2: And why? And what if they don't? 84 00:04:40,400 --> 00:04:40,560 Speaker 4: All 85 00:04:40,680 --> 00:04:45,159 Speaker 1: right. And I take in so much AI stuff, and 86 00:04:45,200 --> 00:04:46,880 Speaker 1: I've never heard anybody 87 00:04:46,440 --> 00:04:49,600 Speaker 2: address that, which astounds me. It's the first question I have. 88 00:04:50,000 --> 00:04:51,800 Speaker 4: Well, you've got to ignore all of human history and 89 00:04:51,800 --> 00:04:55,120 Speaker 4: all of human nature to miss that question, which I 90 00:04:55,160 --> 00:04:58,599 Speaker 4: think frequently the tech guys do. And even if you 91 00:04:58,720 --> 00:05:01,480 Speaker 4: were to come up with the perfect system, the perfect 92 00:05:01,480 --> 00:05:04,120 Speaker 4: answers to those questions, how long would it take to 93 00:05:04,200 --> 00:05:10,960 Speaker 4: implement that framework, distribute that money, open those stores? 94 00:05:11,640 --> 00:05:14,760 Speaker 2: Blah, blah, blah. 95 00:05:14,800 --> 00:05:17,200 Speaker 4: And I know the robots and the computers would be 96 00:05:17,279 --> 00:05:18,960 Speaker 4: in charge of it, I guess, in this scenario. But 97 00:05:19,680 --> 00:05:22,440 Speaker 4: what happens in the interim where people are starving, desperate, 98 00:05:22,640 --> 00:05:26,040 Speaker 4: or, I don't know. Anyway, moving along, back to Matt.
99 00:05:25,960 --> 00:05:28,560 Speaker 1: And just, just, I could really get bogged down in this. 100 00:05:28,960 --> 00:05:32,680 Speaker 1: But, so, God, when you're distributing the money, you're going 101 00:05:32,760 --> 00:05:38,760 Speaker 1: to give more to people in San Francisco than in Oxford, Mississippi, 102 00:05:38,960 --> 00:05:42,440 Speaker 1: just because it requires more to have the same lifestyle. 103 00:05:42,560 --> 00:05:43,520 Speaker 2: You know, the whole 104 00:05:45,920 --> 00:05:47,840 Speaker 1: being poor in one place is different than in another place on an 105 00:05:47,839 --> 00:05:48,960 Speaker 1: amount of money. Wouldn't you 106 00:05:48,839 --> 00:05:53,760 Speaker 2: have to... well? And even, and why, in some other town? 107 00:05:54,400 --> 00:05:58,119 Speaker 4: Even if there's no scarcity of most goods, there will 108 00:05:58,120 --> 00:06:03,040 Speaker 4: definitely be a scarcity of penthouses overlooking, in your scenario, 109 00:06:03,200 --> 00:06:05,080 Speaker 4: the Golden Gate Bridge, right? Or 110 00:06:05,000 --> 00:06:08,120 Speaker 1: in climates, you know, great climates versus crappy climates, are 111 00:06:08,160 --> 00:06:09,200 Speaker 1: all kinds of different 112 00:06:08,960 --> 00:06:11,479 Speaker 4: things. Or in Aspen or on the beach or whatever. Yeah, 113 00:06:11,520 --> 00:06:16,320 Speaker 4: and so who gets those? Who decides? Might take a 114 00:06:16,360 --> 00:06:21,240 Speaker 4: minute to work that out, I would say. Moving along, 115 00:06:21,800 --> 00:06:24,000 Speaker 4: Elon said that we will all have free healthcare. But 116 00:06:24,040 --> 00:06:26,880 Speaker 4: if AGI is providing mankind with all of its wants 117 00:06:26,880 --> 00:06:30,200 Speaker 4: and needs, so much now that money is obsolete.
Who 118 00:06:30,320 --> 00:06:32,600 Speaker 4: is going to study for years to become a doctor? 119 00:06:33,000 --> 00:06:35,279 Speaker 4: Better yet, why would that individual want the headache 120 00:06:35,320 --> 00:06:38,120 Speaker 4: of dealing with the public if they have no financial incentive? 121 00:06:38,440 --> 00:06:40,200 Speaker 4: It's one of the issues I have always had with 122 00:06:40,240 --> 00:06:43,400 Speaker 4: socialism slash communism. If everyone is given the same amount 123 00:06:43,400 --> 00:06:45,880 Speaker 4: of money, who's going to do the hard or dirty jobs? 124 00:06:46,200 --> 00:06:47,400 Speaker 2: Why would anyone work 125 00:06:47,279 --> 00:06:49,120 Speaker 4: at the sewage treatment plant if you can make the 126 00:06:49,160 --> 00:06:53,039 Speaker 4: same amount as a poet, quote unquote? They wouldn't. Therefore, 127 00:06:53,080 --> 00:06:55,960 Speaker 4: whatever government entity is in charge will have to force 128 00:06:56,000 --> 00:06:57,760 Speaker 4: some people to do jobs they don't want to do 129 00:06:57,800 --> 00:07:00,320 Speaker 4: for no extra compensation, and freedom is dead. 130 00:07:00,520 --> 00:07:02,920 Speaker 1: I don't know about that example. I'm trying to come 131 00:07:03,000 --> 00:07:04,880 Speaker 1: up with a dirty job example that AI couldn't do, 132 00:07:04,920 --> 00:07:10,440 Speaker 1: because clearly AI will run the sewage plant. Yes, robots 133 00:07:10,520 --> 00:07:14,560 Speaker 1: and computers and everything. How about the doctoring, the doctoring one? 134 00:07:14,600 --> 00:07:16,679 Speaker 1: I have no answer for why you'd go to medical 135 00:07:16,720 --> 00:07:19,560 Speaker 1: school for ten years just because I want to help 136 00:07:19,560 --> 00:07:21,600 Speaker 1: people out.
I mean, you're kind of getting into socialism 137 00:07:21,680 --> 00:07:24,600 Speaker 1: there, where you're just assuming that everybody's gonna go out 138 00:07:24,640 --> 00:07:27,320 Speaker 1: and do the best for mankind with no incentive 139 00:07:27,360 --> 00:07:28,880 Speaker 1: other than I'm a good person. 140 00:07:29,400 --> 00:07:33,000 Speaker 4: I think the answer is the AI will supervise the 141 00:07:33,120 --> 00:07:37,239 Speaker 4: medical robots that are incapable of making a mistake. Old 142 00:07:37,280 --> 00:07:42,120 Speaker 4: school, right, exactly. Now, are you thrashing? 143 00:07:44,320 --> 00:07:48,200 Speaker 2: That hurts! I think you're doing it wrong. And this 144 00:07:48,480 --> 00:07:52,840 Speaker 2: unit will zap and visit you. Oh my god, zap. 145 00:07:53,640 --> 00:07:57,240 Speaker 4: Yeah. Now I know that Elon would say AGI 146 00:07:57,440 --> 00:08:00,880 Speaker 4: will provide the healthcare and run the sewage treatment plant. 147 00:08:01,120 --> 00:08:03,320 Speaker 4: I'm guessing he is envisioning robots that are built and 148 00:08:03,360 --> 00:08:05,520 Speaker 4: programmed by AGI that will do every job for us. 149 00:08:05,520 --> 00:08:07,600 Speaker 4: But if you want a good sci fi dystopian future, 150 00:08:07,880 --> 00:08:10,480 Speaker 4: imagine humans stop passing on the knowledge that they've gained 151 00:08:10,480 --> 00:08:13,360 Speaker 4: to the next generations because, quote, AI does a better 152 00:08:13,440 --> 00:08:14,800 Speaker 4: job than any human ever could. 153 00:08:15,120 --> 00:08:17,600 Speaker 2: One hundred years goes by? It wouldn't take nearly that, 154 00:08:17,680 --> 00:08:20,440 Speaker 2: no, one generation. We'll go with Matt's scenario.
155 00:08:20,480 --> 00:08:23,200 Speaker 4: One hundred years go by, and humans have stopped using 156 00:08:23,200 --> 00:08:25,560 Speaker 4: their brains to think critically, and there is a glitch, 157 00:08:25,920 --> 00:08:28,280 Speaker 4: some computer bug, that shuts AI down. No one is 158 00:08:28,320 --> 00:08:31,800 Speaker 4: skilled at anything. We have officially re entered the dark ages. AI 159 00:08:31,880 --> 00:08:34,040 Speaker 4: won't need to come for our vital juices. Because 160 00:08:34,040 --> 00:08:36,679 Speaker 4: it is a machine, it will eventually fail, and people 161 00:08:36,720 --> 00:08:38,840 Speaker 4: will be back to killing each other with sticks. 162 00:08:39,720 --> 00:08:41,360 Speaker 2: How dumb will we get? How fast? 163 00:08:41,480 --> 00:08:44,800 Speaker 4: I mean, it's hard for me... for myself, very, very 164 00:08:44,880 --> 00:08:47,679 Speaker 4: dumb and blind drunk. 165 00:08:49,480 --> 00:08:53,880 Speaker 2: So fast? Oh yeah. I 166 00:08:53,840 --> 00:08:57,120 Speaker 1: have trouble motivating my oldest kid, particularly right now, in 167 00:08:57,280 --> 00:09:00,320 Speaker 1: high school, on why it's important to try. What's my 168 00:09:00,400 --> 00:09:02,120 Speaker 1: argument going to be? So you can get out there 169 00:09:02,160 --> 00:09:04,079 Speaker 1: and earn exactly what everybody else does? 170 00:09:04,120 --> 00:09:06,920 Speaker 2: Who does? Try. So finish your math homework? 171 00:09:07,480 --> 00:09:09,360 Speaker 4: You want to end up a failed poet who makes 172 00:09:09,400 --> 00:09:12,440 Speaker 4: the same amount as you as a skilled neurosurgeon, do you? 173 00:09:13,840 --> 00:09:15,000 Speaker 2: I mean, that would really 174 00:09:14,840 --> 00:09:16,480 Speaker 1: be pretty hard, to make the argument of why you 175 00:09:16,600 --> 00:09:21,160 Speaker 1: gotta learn anything, really. Seriously, I don't know what the 176 00:09:21,280 --> 00:09:22,000 Speaker 1: argument would be.
177 00:09:22,880 --> 00:09:25,920 Speaker 4: And then you get into the thumbing your nose at 178 00:09:25,960 --> 00:09:30,719 Speaker 4: the principles of every great religion, philosophy, and study of 179 00:09:30,800 --> 00:09:33,800 Speaker 4: human beings and happiness that have ever existed. You need 180 00:09:33,880 --> 00:09:36,120 Speaker 4: a purpose. You need a reason to get up in 181 00:09:36,160 --> 00:09:38,760 Speaker 4: the morning and do what you do, and those reasons 182 00:09:38,840 --> 00:09:42,280 Speaker 4: can vary from I want to come up with 183 00:09:42,400 --> 00:09:44,959 Speaker 4: art that inspires people's souls, to I want to help 184 00:09:45,000 --> 00:09:47,920 Speaker 4: the poor kids, to I want to make more money 185 00:09:47,920 --> 00:09:51,680 Speaker 4: than any salesman ever has. Or a hundred other examples, 186 00:09:51,720 --> 00:09:52,720 Speaker 4: a thousand other examples. 187 00:09:52,800 --> 00:09:55,960 Speaker 1: The thing is, all of those are supplanted by the GD 188 00:09:56,120 --> 00:10:01,480 Speaker 1: computers and the robots. The robots, the super geniuses, are 189 00:10:01,520 --> 00:10:06,360 Speaker 1: constantly having conversations, writing books, podcasts, all about the alignment 190 00:10:06,440 --> 00:10:08,920 Speaker 1: problem and hallucinations and all these different things. 191 00:10:09,080 --> 00:10:11,400 Speaker 2: And I just, I'm telling you, I haven't come across it. 192 00:10:11,400 --> 00:10:13,520 Speaker 1: If you have a book or podcast where they have 193 00:10:13,520 --> 00:10:15,600 Speaker 1: addressed this issue, let me know about it. I'd love it. 194 00:10:15,880 --> 00:10:20,440 Speaker 1: They don't address, to me, the most fundamental, obvious problem 195 00:10:20,760 --> 00:10:21,880 Speaker 1: of this whole enterprise. 196 00:10:23,240 --> 00:10:24,640 Speaker 2: This...
197 00:10:24,960 --> 00:10:27,880 Speaker 4: You know, I'm always going on about how intelligence doesn't 198 00:10:27,920 --> 00:10:28,760 Speaker 4: equal wisdom. 199 00:10:28,840 --> 00:10:32,440 Speaker 2: It's a good example of that, too. God, 200 00:10:32,320 --> 00:10:35,960 Speaker 1: I would think, if you're trying to create a world 201 00:10:35,960 --> 00:10:38,040 Speaker 1: where you're going to become a trillionaire, you'd want 202 00:10:38,040 --> 00:10:45,280 Speaker 1: to look into this. Perhaps, you know, a wrench in your plans. 203 00:10:47,280 --> 00:10:53,760 Speaker 4: Where you're among the few who become ultra, unimaginably wealthy, 204 00:10:54,679 --> 00:10:59,199 Speaker 4: like you would flip MBS in Saudi Arabia a buck 205 00:10:59,760 --> 00:11:01,040 Speaker 4: to get through the week. 206 00:11:01,320 --> 00:11:02,480 Speaker 2: You're so freaking rich. 207 00:11:04,120 --> 00:11:07,120 Speaker 4: None of this matters to you, I think, or you're 208 00:11:07,160 --> 00:11:09,040 Speaker 4: just, you're so unwise you don't even see it. 209 00:11:09,240 --> 00:11:11,360 Speaker 2: I don't know, you know. More, 210 00:11:11,400 --> 00:11:13,800 Speaker 1: I think about the, and Elon talks about this in 211 00:11:14,360 --> 00:11:19,080 Speaker 1: very positive terms, about the, uh, you know, the high 212 00:11:19,160 --> 00:11:21,000 Speaker 1: level of wealth that we're all gonna have. You don't 213 00:11:21,000 --> 00:11:26,040 Speaker 1: even need money, because the... he always mentions healthcare, food, shelter. Shelter 214 00:11:26,559 --> 00:11:29,960 Speaker 1: alone, explain to me how that's gonna work. Every house and 215 00:11:30,040 --> 00:11:33,280 Speaker 1: location is not the same. Unless you're gonna tear down 216 00:11:33,320 --> 00:11:37,480 Speaker 1: every home and dwelling in America and rebuild them all 217 00:11:37,520 --> 00:11:40,800 Speaker 1: the same size, you're going to have envy problems.
218 00:11:40,920 --> 00:11:42,360 Speaker 2: There are gonna be people in a 219 00:11:42,960 --> 00:11:44,840 Speaker 4: nice house with a nice view, and you're tearing down 220 00:11:44,920 --> 00:11:47,120 Speaker 4: my house and putting me in the government projects in 221 00:11:47,160 --> 00:11:48,119 Speaker 4: the name of equality. 222 00:11:48,360 --> 00:11:50,040 Speaker 2: Excuse me, where do I complain? 223 00:11:50,160 --> 00:11:52,240 Speaker 1: Okay, so you don't do that, you don't tear them 224 00:11:52,240 --> 00:11:55,320 Speaker 1: all down. The people that have the nicer places, now 225 00:11:55,400 --> 00:11:57,440 Speaker 1: they get free food and medical care and everything like that. 226 00:11:57,480 --> 00:11:59,560 Speaker 1: But I was in a crappy place, and my family 227 00:11:59,679 --> 00:12:03,960 Speaker 1: is just stuck in this forever. Generations of my 228 00:12:04,040 --> 00:12:06,600 Speaker 1: family will live in a house this size, generations of your 229 00:12:06,640 --> 00:12:08,439 Speaker 1: family will live in a house of that size. Or 230 00:12:08,480 --> 00:12:10,800 Speaker 1: because you live in San Francisco you'll get, you know, 231 00:12:11,559 --> 00:12:15,080 Speaker 1: that lifestyle, or LA, you get that weather. If you're in Minnesota, 232 00:12:15,080 --> 00:12:17,320 Speaker 1: I mean, I don't understand how it works. 233 00:12:17,480 --> 00:12:20,720 Speaker 4: Or the construction bots will build a mansion for everyone. 234 00:12:20,600 --> 00:12:22,559 Speaker 2: Nobody can move. What if you decide you want 235 00:12:22,559 --> 00:12:26,160 Speaker 2: to move from Minnesota to LA? What house do you get? 236 00:12:26,280 --> 00:12:28,440 Speaker 2: What house do you... do you buy a house? Are 237 00:12:28,480 --> 00:12:29,480 Speaker 2: the houses provided? 238 00:12:30,360 --> 00:12:34,680 Speaker 4: Ask a robot building it, I don't know.
239 00:12:34,760 --> 00:12:37,400 Speaker 1: I, like, I don't even have the slightest start to 240 00:12:37,480 --> 00:12:40,120 Speaker 1: answering that question, the whole housing thing, on how that 241 00:12:40,160 --> 00:12:44,560 Speaker 1: would be equal and nobody would... You're gonna have a 242 00:12:44,640 --> 00:12:48,840 Speaker 1: revolution in the streets if, if people can't become mobile 243 00:12:49,000 --> 00:12:50,440 Speaker 1: but some people have more than 244 00:12:50,320 --> 00:12:54,600 Speaker 4: others. It will be the end of humankind as we 245 00:12:54,640 --> 00:12:57,800 Speaker 4: know it. Maybe not literally. Certain humans will survive and 246 00:12:57,840 --> 00:13:01,320 Speaker 4: probably craft a new reality. But, you know, I'm always 247 00:13:01,320 --> 00:13:03,560 Speaker 4: making jokes about the planet of the beavers, or more 248 00:13:03,640 --> 00:13:06,680 Speaker 4: likely the insects or something. I could picture it happening. 249 00:13:07,160 --> 00:13:13,120 Speaker 4: You remove the difference between what people have, what they are, 250 00:13:13,440 --> 00:13:15,880 Speaker 4: what they would like to... no, I mean, where they 251 00:13:15,880 --> 00:13:18,600 Speaker 4: are and where they would like to be. You remove 252 00:13:18,920 --> 00:13:27,280 Speaker 4: that, all human striving, financial, intellectual, romantic, presumably. You remove 253 00:13:27,400 --> 00:13:27,880 Speaker 4: all of 254 00:13:27,800 --> 00:13:30,880 Speaker 2: that, and we'll all be happy? If you say 255 00:13:30,640 --> 00:13:34,960 Speaker 4: that, the word fool isn't nearly powerful enough to describe 256 00:13:34,960 --> 00:13:35,959 Speaker 4: your lack of wisdom. 257 00:13:36,160 --> 00:13:36,640 Speaker 2: No, it's not. 258 00:13:37,160 --> 00:13:40,120 Speaker 1: And they kind of yada yada yada that whole problem, right? 259 00:13:40,200 --> 00:13:43,800 Speaker 1: And it could be happening, according to Elon, like next year. Oh,
260 00:13:43,640 --> 00:13:46,720 Speaker 2: goodie. Down this road lies madness. 261 00:13:46,840 --> 00:13:49,400 Speaker 1: I agree. Any thoughts on this? Text line four one 262 00:13:49,520 --> 00:13:51,319 Speaker 1: five two nine five KFTC. 263 00:13:55,360 --> 00:13:59,400 Speaker 5: We have extensive basing rights in Greenland under a treaty 264 00:13:59,400 --> 00:14:03,080 Speaker 5: reached in nineteen fifty one through NATO that allowed us to 265 00:14:03,200 --> 00:14:06,640 Speaker 5: have a lot of bases in Greenland. And we did 266 00:14:06,640 --> 00:14:09,440 Speaker 5: have a lot of bases in Greenland. They were all, 267 00:14:09,520 --> 00:14:12,760 Speaker 5: all but one were taken down. So the practical 268 00:14:12,760 --> 00:14:15,640 Speaker 5: difference is a little unclear to me, but it might be 269 00:14:15,800 --> 00:14:18,199 Speaker 5: enough to get out of this controversy. 270 00:14:18,320 --> 00:14:21,040 Speaker 1: Yeah, that's what I thought was interesting about that. Brit Hume, 271 00:14:21,360 --> 00:14:27,040 Speaker 1: senior political analyst from Fox, saying this deal on Greenland 272 00:14:27,040 --> 00:14:30,280 Speaker 1: that President Trump says there's a framework, we'll get 273 00:14:30,280 --> 00:14:32,600 Speaker 1: the details later, but we got most of what we 274 00:14:32,640 --> 00:14:35,840 Speaker 1: want, and it's a permanent deal and everything. I think 275 00:14:36,040 --> 00:14:38,400 Speaker 1: Brit Hume is right. It's a way out of the 276 00:14:38,480 --> 00:14:42,240 Speaker 1: controversy, is a lot of it. Yeah. And even if, 277 00:14:42,240 --> 00:14:45,320 Speaker 1: even if we get the deal, as John Bolton, 278 00:14:45,360 --> 00:14:48,960 Speaker 1: his former national security advisor, said yesterday, it's basically like 279 00:14:49,040 --> 00:14:51,240 Speaker 1: the treaty we had in nineteen fifty one, which is 280 00:14:51,240 --> 00:14:53,040 Speaker 1: what Brit Hume was referencing there.
281 00:14:54,080 --> 00:14:56,000 Speaker 2: So, which is fine. This is fine. 282 00:14:56,120 --> 00:14:58,520 Speaker 1: The ending is fine. The ending is great. It's good 283 00:14:58,520 --> 00:15:01,920 Speaker 1: for America. It's just, did we have to go through 284 00:15:01,960 --> 00:15:05,080 Speaker 1: all that to get to where we got? I don't 285 00:15:05,120 --> 00:15:05,680 Speaker 1: know that we did. 286 00:15:06,560 --> 00:15:10,240 Speaker 4: Yeah, I would suggest we absolutely didn't. And I'm guessing 287 00:15:10,400 --> 00:15:15,479 Speaker 4: Trump's advisors, allies, and, like, all the Euros he actually respected 288 00:15:16,040 --> 00:15:20,400 Speaker 4: said, you're going to lose five hundred times more than 289 00:15:20,440 --> 00:15:22,440 Speaker 4: you're gaining if you don't ease up. 290 00:15:23,160 --> 00:15:23,680 Speaker 2: Ease up. 291 00:15:23,880 --> 00:15:25,960 Speaker 1: So, a lot going on in Davos with all the 292 00:15:25,960 --> 00:15:30,200 Speaker 1: world leaders there, and this morning Trump and his son 293 00:15:30,280 --> 00:15:32,640 Speaker 1: in law Jared Kushner and a number of other people 294 00:15:33,480 --> 00:15:36,200 Speaker 1: gave a little speech about the Board of Peace or 295 00:15:36,240 --> 00:15:38,880 Speaker 1: Ministry of Peace or whatever the heck they're calling it. 296 00:15:40,360 --> 00:15:42,760 Speaker 1: Among other things, it's going to solve world conflicts. It 297 00:15:42,800 --> 00:15:45,360 Speaker 1: looks like it's an alternate UN, which might 298 00:15:45,400 --> 00:15:46,120 Speaker 1: be a good idea. 299 00:15:46,320 --> 00:15:47,360 Speaker 2: Well, the current one sucks. 300 00:15:47,400 --> 00:15:50,280 Speaker 1: The current one sucks, yes, and is corrupt and all 301 00:15:50,360 --> 00:15:52,600 Speaker 1: kinds of different problems. But this is, they're going to 302 00:15:52,680 --> 00:15:55,000 Speaker 1: run Gaza and they're going to make it into a wonderland.
303 00:15:55,000 --> 00:15:56,560 Speaker 1: And here's Jared Kushner explaining it. 304 00:15:56,760 --> 00:15:57,880 Speaker 2: So we did a master plan. 305 00:15:58,160 --> 00:16:00,240 Speaker 6: We brought in, I thank you, Kygabaya, who's one of 306 00:16:00,280 --> 00:16:02,920 Speaker 6: the most successful real estate developers and brilliant people 307 00:16:02,960 --> 00:16:03,200 Speaker 7: I know. 308 00:16:03,480 --> 00:16:05,920 Speaker 6: He's volunteered to do this not for profit, really because 309 00:16:05,960 --> 00:16:06,400 Speaker 6: of his heart. 310 00:16:06,440 --> 00:16:07,120 Speaker 2: He wants to do this. 311 00:16:07,520 --> 00:16:10,720 Speaker 6: And we've developed ways to redevelop Gaza. Gaza, as President Trump's 312 00:16:10,440 --> 00:16:12,360 Speaker 2: been saying, has amazing potential, and 313 00:16:12,320 --> 00:16:13,640 Speaker 6: this is for the people of Gaza. 314 00:16:13,720 --> 00:16:15,080 Speaker 2: We've developed it into zones. 315 00:16:15,440 --> 00:16:17,560 Speaker 6: In the beginning, we were toying with the idea of saying, 316 00:16:17,760 --> 00:16:19,760 Speaker 6: let's build a free zone, and then we have a 317 00:16:19,840 --> 00:16:21,760 Speaker 6: Hamas zone. And then we said, you know what, let's 318 00:16:21,760 --> 00:16:24,240 Speaker 6: just plan for catastrophic success. We have a signed 319 00:16:24,280 --> 00:16:26,360 Speaker 6: deal, demilitarized. That is what we are going to enforce. 320 00:16:26,680 --> 00:16:28,480 Speaker 6: People ask us what our plan B is. We do 321 00:16:28,520 --> 00:16:30,520 Speaker 6: not have a plan B. We have a plan. We 322 00:16:30,560 --> 00:16:32,800 Speaker 6: signed an agreement. We are all committed to making that 323 00:16:32,840 --> 00:16:35,160 Speaker 6: agreement work. There's a master plan. We'll be doing it 324 00:16:35,160 --> 00:16:37,400 Speaker 6: in phases.
In the Middle East, they build cities like 325 00:16:37,440 --> 00:16:40,240 Speaker 6: this, you know, two, three million people. They build 326 00:16:40,240 --> 00:16:41,960 Speaker 6: this in three years, and so stuff like this is 327 00:16:42,040 --> 00:16:42,640 Speaker 6: very doable. 328 00:16:42,800 --> 00:16:44,640 Speaker 1: And he went on to say there'd be one hundred 329 00:16:44,640 --> 00:16:47,040 Speaker 1: percent employment right off the bat for the people of 330 00:16:47,040 --> 00:16:48,160 Speaker 1: Gaza, who've been needing that. 331 00:16:48,240 --> 00:16:52,160 Speaker 4: They're already moving the rubble out of there. Yeah, this 332 00:16:52,200 --> 00:16:54,920 Speaker 4: is not a but, it's an and. And how's the 333 00:16:55,000 --> 00:16:58,080 Speaker 4: enforcement of the disarmament going to proceed? I'm not being 334 00:16:58,080 --> 00:17:00,000 Speaker 4: a cynic about it, because something's got to be done, 335 00:17:00,160 --> 00:17:02,960 Speaker 4: and they're trying to do something. But that's obviously a 336 00:17:03,000 --> 00:17:05,840 Speaker 4: hurdle to be jumped over. It could become one of 337 00:17:05,840 --> 00:17:09,520 Speaker 4: the great vacation spots in the world. It actually could 338 00:17:09,600 --> 00:17:12,400 Speaker 4: be, sure, if things broke right. But it ain't gonna 339 00:17:12,400 --> 00:17:16,120 Speaker 4: be easy. Anyway, a lot more on the way. Armstrong 340 00:17:16,280 --> 00:17:16,800 Speaker 4: and Getty. 341 00:17:18,040 --> 00:17:20,720 Speaker 1: A video has gone viral of a spooked carriage horse 342 00:17:20,720 --> 00:17:24,200 Speaker 1: in New York City running directly down Sixth Avenue into traffic. 343 00:17:24,480 --> 00:17:36,480 Speaker 2: What spooked the horse? Socialism? What we were just talking about?
344 00:17:36,520 --> 00:17:38,600 Speaker 1: AI and Elon Musk, and I don't want to kill 345 00:17:38,640 --> 00:17:41,680 Speaker 1: you with this, but Elon just said something interesting at Davos. 346 00:17:41,960 --> 00:17:43,760 Speaker 1: He showed up in Davos today at the end of 347 00:17:43,800 --> 00:17:49,240 Speaker 1: the conference and criticized the World Economic Forum that's been 348 00:17:49,240 --> 00:17:53,360 Speaker 1: going on every year for quite some time, saying it's 349 00:17:53,400 --> 00:17:57,440 Speaker 1: increasingly becoming an unelected world government that people never asked 350 00:17:57,480 --> 00:18:02,520 Speaker 1: for and don't want, which is interesting given the criticisms 351 00:18:02,560 --> 00:18:05,440 Speaker 1: of him and others like him. For instance, the fact 352 00:18:05,480 --> 00:18:07,200 Speaker 1: that he was able to get involved in the war 353 00:18:07,320 --> 00:18:11,320 Speaker 1: in Ukraine and in Iran recently by dropping his Starlink 354 00:18:11,400 --> 00:18:14,240 Speaker 1: thing. That, like, one guy has the power to decide 355 00:18:14,280 --> 00:18:18,120 Speaker 1: to do this or not do this is really pretty amazing. 356 00:18:19,320 --> 00:18:24,280 Speaker 2: But so that's that. And then when you get into... 357 00:18:23,800 --> 00:18:26,679 Speaker 4: I'm sorry, he's such a complicated, complicated guy, because he 358 00:18:26,840 --> 00:18:31,120 Speaker 4: has that standing up for the common people. We don't 359 00:18:31,160 --> 00:18:35,439 Speaker 4: appreciate you, you elites, you know, telling us how to 360 00:18:35,520 --> 00:18:38,639 Speaker 4: live, and blah blah blah, when financially he's one of 361 00:18:38,720 --> 00:18:39,720 Speaker 4: the elites of the elites. 362 00:18:39,760 --> 00:18:42,560 Speaker 2: But he's different, the richest of all time.
363 00:18:42,880 --> 00:18:46,239 Speaker 1: He's worth seven hundred billion dollars and and and and 364 00:18:46,400 --> 00:18:50,320 Speaker 1: his particular line of work is maybe. 365 00:18:50,200 --> 00:18:51,720 Speaker 2: Going to alter the human race. 366 00:18:51,880 --> 00:18:54,159 Speaker 1: So yeah, it is kind of interesting, these unelected 367 00:18:54,160 --> 00:19:00,919 Speaker 1: people coming here and telling you how to live. Anyway, bah, 368 00:19:01,000 --> 00:19:03,000 Speaker 1: let me skip down to ask about the goals of 369 00:19:03,040 --> 00:19:09,560 Speaker 1: his companies, SpaceX, Tesla, which is now mostly robots and 370 00:19:09,560 --> 00:19:14,320 Speaker 1: everything like that. Elon said that Tesla's mission includes sustainable 371 00:19:14,400 --> 00:19:18,199 Speaker 1: abundance through the development of robotics. Tesla's currently developing all 372 00:19:18,200 --> 00:19:21,320 Speaker 1: these different robots. With robotics and AI, there is really 373 00:19:21,359 --> 00:19:24,320 Speaker 1: a path to abundance for all. People often talk about 374 00:19:24,320 --> 00:19:27,040 Speaker 1: solving global poverty. How do we give everyone a very 375 00:19:27,119 --> 00:19:29,119 Speaker 1: high standard of living? The only way to do that 376 00:19:29,240 --> 00:19:32,000 Speaker 1: is AI and robotics. So that gets to the everybody 377 00:19:32,040 --> 00:19:34,399 Speaker 1: in the world, not just the United States, is going 378 00:19:34,480 --> 00:19:37,399 Speaker 1: to have a high standard of living. My prediction is 379 00:19:37,400 --> 00:19:41,200 Speaker 1: that soon there'll be more robots than people. So it's 380 00:19:41,240 --> 00:19:45,239 Speaker 1: even more complicated than what the fanciful this or that 381 00:19:45,600 --> 00:19:48,160 Speaker 1: or hypotheticals we're talking about. 382 00:19:47,960 --> 00:19:48,520 Speaker 2: In the United States.
383 00:19:48,560 --> 00:19:51,520 Speaker 1: If you're going to go with a global high standard 384 00:19:51,560 --> 00:19:54,520 Speaker 1: of living, still going to be people in godforsaken 385 00:19:54,560 --> 00:19:56,440 Speaker 1: parts of the world and people in cool parts of 386 00:19:56,480 --> 00:20:00,440 Speaker 1: the world, and envy doesn't go away. And I just, 387 00:20:00,080 --> 00:20:03,360 Speaker 1: I don't understand how anyway. We're trying 388 00:20:03,119 --> 00:20:05,159 Speaker 2: to play this. I'm trying to play this out to like, 389 00:20:05,320 --> 00:20:06,399 Speaker 2: you know, the nth degree. 390 00:20:06,440 --> 00:20:08,640 Speaker 4: All right, so I live in an s-hole country, 391 00:20:09,840 --> 00:20:15,080 Speaker 4: but we have now, uh, never ending unlimited wealth about armies. 392 00:20:15,200 --> 00:20:17,040 Speaker 2: If you're run by a dictator, he's going to keep 393 00:20:17,040 --> 00:20:17,320 Speaker 2: it all. 394 00:20:17,480 --> 00:20:19,800 Speaker 1: How are you gonna, what, Elon's gonna have an army 395 00:20:19,840 --> 00:20:22,680 Speaker 1: going there and distribute all the food and the medicine 396 00:20:22,680 --> 00:20:24,199 Speaker 1: and everything to make sure they all get it? 397 00:20:24,280 --> 00:20:26,000 Speaker 2: I mean, see, that's what Hamas. 398 00:20:25,640 --> 00:20:28,040 Speaker 1: Did with Gaza. The people at the top are going 399 00:20:28,080 --> 00:20:30,480 Speaker 1: to keep it all. There's no stopping that. How can 400 00:20:30,560 --> 00:20:32,480 Speaker 1: he, I don't even know what he's talking about. 401 00:20:32,840 --> 00:20:36,360 Speaker 4: Well, like you have been digging into Iran past and present, 402 00:20:37,040 --> 00:20:41,720 Speaker 4: and the Islamic fundamentalists are not that way because they 403 00:20:41,840 --> 00:20:45,560 Speaker 4: really want a bigger color TV.
They're not going to 404 00:20:45,600 --> 00:20:48,399 Speaker 4: go away because Elon and his robots, which sounds like 405 00:20:48,480 --> 00:20:52,840 Speaker 4: you know, a doo-wop group or something, are 406 00:20:52,880 --> 00:20:57,359 Speaker 4: going to, uh, you know, distribute fine consumer goods. They're 407 00:20:57,359 --> 00:20:59,919 Speaker 4: gonna blow up the robots and the AI centers and 408 00:21:00,040 --> 00:21:03,200 Speaker 4: everything in the name of Allah. And you know, frankly, 409 00:21:04,600 --> 00:21:06,520 Speaker 4: I'm not in favor of blowing up anything in the 410 00:21:06,560 --> 00:21:09,639 Speaker 4: name of Allah, but I could definitely see people saying 411 00:21:09,680 --> 00:21:11,679 Speaker 4: this is not good for the human soul. It's not 412 00:21:11,720 --> 00:21:13,760 Speaker 4: good for humans in general. I'm against it and I'm 413 00:21:13,760 --> 00:21:15,720 Speaker 4: gonna do what I can to keep it out of here. 414 00:21:16,160 --> 00:21:19,440 Speaker 1: Or just, this is going to empower the populace and 415 00:21:19,480 --> 00:21:21,199 Speaker 1: they're going to rise up against us, and we like 416 00:21:21,240 --> 00:21:24,640 Speaker 1: being in charge. But so, how do, excellent, because that's 417 00:21:24,680 --> 00:21:27,639 Speaker 1: a different challenge. How does, how do Elon and the 418 00:21:27,720 --> 00:21:33,480 Speaker 1: robots, tonight at the Apollo Theater, how do they respond? 419 00:21:33,520 --> 00:21:37,680 Speaker 1: They send a robot army in, right, and kill their 420 00:21:37,880 --> 00:21:40,760 Speaker 1: idea of who the evildoers are. Elon on his own? 421 00:21:41,680 --> 00:21:43,399 Speaker 1: No, his army of robots. Say, well, I know, but 422 00:21:43,520 --> 00:21:46,520 Speaker 1: he makes that decision. So Elon's got an army he 423 00:21:46,640 --> 00:21:47,440 Speaker 1: sends around the world. 424 00:21:47,840 --> 00:21:48,520 Speaker 2: Well exactly.
425 00:21:48,560 --> 00:21:51,680 Speaker 4: So you have the ultra powerful deciding, deciding who should 426 00:21:51,720 --> 00:21:54,439 Speaker 4: die and who should live, and enforcing that at the 427 00:21:54,480 --> 00:21:56,040 Speaker 4: point of a robotic gun. 428 00:21:56,119 --> 00:21:59,600 Speaker 1: And he's in Davos saying this sucks because we shouldn't 429 00:21:59,600 --> 00:22:03,439 Speaker 1: have an unelected government deciding things for the people. 430 00:22:04,000 --> 00:22:06,360 Speaker 2: I guess Elon has a response here, well, I. 431 00:22:06,280 --> 00:22:09,080 Speaker 3: Think generally, I think my last words would be I 432 00:22:09,119 --> 00:22:13,000 Speaker 3: would encourage everyone to be optimistic and excited about the future. 433 00:22:13,240 --> 00:22:18,199 Speaker 3: Good, and generally I think for quality of life it 434 00:22:18,240 --> 00:22:20,800 Speaker 3: is actually better to err on the side of being 435 00:22:20,800 --> 00:22:23,720 Speaker 3: an optimist and wrong rather than a pessimist. 436 00:22:23,800 --> 00:22:24,400 Speaker 2: And right. 437 00:22:26,160 --> 00:22:29,480 Speaker 1: On that note, and just one more thing, is there anybody 438 00:22:29,480 --> 00:22:31,600 Speaker 1: here that would like me to impregnate them? I'd like 439 00:22:31,640 --> 00:22:33,640 Speaker 1: to knock out a quick kid before I get out 440 00:22:33,640 --> 00:22:35,359 Speaker 1: of the country. I got about ten minutes before I 441 00:22:35,440 --> 00:22:37,680 Speaker 1: catch my G-whatever-the-heck jet. 442 00:22:37,800 --> 00:22:43,320 Speaker 2: Yes? Oh yeah, yeah, yeah, all right, man, we are. 443 00:22:44,720 --> 00:22:46,879 Speaker 1: Gonna live through, are living through, one of the most 444 00:22:46,880 --> 00:22:48,960 Speaker 1: interesting times in human history. 445 00:22:49,359 --> 00:22:51,240 Speaker 2: Things are gonna get weird and they're gonna get weird fast.
446 00:22:51,640 --> 00:22:54,600 Speaker 4: Well, that was back in the slow getting weird days. 447 00:22:54,760 --> 00:22:57,800 Speaker 4: This is like super accelerated. Got a great note from 448 00:22:57,880 --> 00:23:01,040 Speaker 4: a friend of the show who's in tech and says, guys, relax, 449 00:23:01,080 --> 00:23:01,639 Speaker 4: it's a bubble. 450 00:23:01,920 --> 00:23:04,800 Speaker 2: Won't happen. We can touch on that tomorrow when we 451 00:23:04,840 --> 00:23:08,159 Speaker 2: have a little more time. That there just won't be AGI. 452 00:23:08,280 --> 00:23:13,399 Speaker 2: It'll never get there. Yeah. Essentially, yeah, they're raising money. 453 00:23:13,480 --> 00:23:16,879 Speaker 2: I hope he's right. I want that to be right. Yeah, 454 00:23:16,920 --> 00:23:19,040 Speaker 2: me too. Yeah. 455 00:23:19,080 --> 00:23:23,080 Speaker 4: Anyway. So, on a completely different topic, I've found this 456 00:23:23,320 --> 00:23:28,399 Speaker 4: troubling and interesting on several different levels. China, as you 457 00:23:28,480 --> 00:23:32,080 Speaker 4: might imagine, is employing, you know how we're always talking 458 00:23:32,080 --> 00:23:35,159 Speaker 4: about how they've got a whole-of-society campaign to 459 00:23:35,200 --> 00:23:38,040 Speaker 4: bring down the United States? Well, God knows, they've got 460 00:23:38,080 --> 00:23:41,680 Speaker 4: a whole-of-society campaign to take over Taiwan. I mean, 461 00:23:41,680 --> 00:23:46,520 Speaker 4: that's like their global ambitions distilled down into, what, 462 00:23:46,800 --> 00:23:50,720 Speaker 4: step number one, and that includes espionage and. 463 00:23:51,240 --> 00:23:56,840 Speaker 2: Xi's personal desire. True, right, but they. 464 00:23:56,720 --> 00:24:01,200 Speaker 4: are working like crazy to undermine Taiwan, especially 465 00:24:01,280 --> 00:24:05,960 Speaker 4: the Taiwanese military.
And one of the interesting aspects of 466 00:24:05,960 --> 00:24:07,919 Speaker 4: this, before we get into like the big picture, is 467 00:24:08,720 --> 00:24:13,080 Speaker 4: they start this article in the Journal about a specific 468 00:24:14,800 --> 00:24:20,520 Speaker 4: military police battalion guy who got undermined because he was 469 00:24:20,560 --> 00:24:24,159 Speaker 4: in debt and short of cash. He's searching online for 470 00:24:24,200 --> 00:24:27,840 Speaker 4: a loan to keep himself afloat. The Chinese agents say, 471 00:24:27,880 --> 00:24:30,439 Speaker 4: we got a live one, hooked him up with an 472 00:24:30,440 --> 00:24:33,640 Speaker 4: opportunity for easy money. Just snap a couple of photos 473 00:24:33,640 --> 00:24:36,920 Speaker 4: of sensitive security details with your cell phone and your 474 00:24:37,000 --> 00:24:39,760 Speaker 4: bills are paid. And then, of course, once they sink 475 00:24:39,800 --> 00:24:43,080 Speaker 4: the hook into you, you're doomed. They ask more and more, 476 00:24:43,960 --> 00:24:48,439 Speaker 4: but Taiwanese officials fear that agents already placed on the 477 00:24:48,440 --> 00:24:53,399 Speaker 4: island would aid a military attack by China, which has 478 00:24:53,400 --> 00:24:56,400 Speaker 4: obviously threatened to seize it by force. China's spying operations 479 00:24:56,440 --> 00:25:01,639 Speaker 4: are rapidly advancing, using complex operations and new technology along 480 00:25:01,640 --> 00:25:04,720 Speaker 4: with the old school stuff that together quote pose a 481 00:25:04,760 --> 00:25:07,000 Speaker 4: potential serious threat to our national security. 482 00:25:08,200 --> 00:25:10,360 Speaker 2: Had a couple of high-profile arrests.
483 00:25:10,400 --> 00:25:13,920 Speaker 4: You had a retired lieutenant general given seven and a 484 00:25:13,960 --> 00:25:17,000 Speaker 4: half years for accepting Chinese funds to establish an armed 485 00:25:17,080 --> 00:25:20,600 Speaker 4: organization in Taiwan that would target military bases. 486 00:25:21,440 --> 00:25:24,479 Speaker 1: Wow. Well, did you read that article the other day 487 00:25:24,480 --> 00:25:26,840 Speaker 1: about the problems they're having in their government? There's 488 00:25:26,880 --> 00:25:31,840 Speaker 1: a growing friction between the two parties in Taiwan. 489 00:25:31,880 --> 00:25:34,280 Speaker 1: One party that's like, we need to, we 490 00:25:34,320 --> 00:25:37,200 Speaker 1: need to be closer to China and work more with China, 491 00:25:37,240 --> 00:25:40,200 Speaker 1: and you're, you're overreacting to this whole China taking 492 00:25:40,280 --> 00:25:43,359 Speaker 1: us over thing. And then the other party is hardcore, we 493 00:25:43,400 --> 00:25:45,760 Speaker 1: need to build up our defenses, we need to spend 494 00:25:45,760 --> 00:25:47,840 Speaker 1: more money than we've ever spent to be ready to go 495 00:25:47,880 --> 00:25:50,360 Speaker 1: to war with China. I mean, that's a very different 496 00:25:51,720 --> 00:25:54,639 Speaker 1: view of how to handle things. That's not a minor 497 00:25:54,960 --> 00:25:56,159 Speaker 1: political dispute. 498 00:25:56,560 --> 00:25:58,879 Speaker 4: Well, it's the, pardon me, but the fight to the 499 00:25:58,920 --> 00:26:01,399 Speaker 4: death party versus the lay back and try to enjoy 500 00:26:01,440 --> 00:26:01,879 Speaker 4: it party. 501 00:26:02,000 --> 00:26:07,159 Speaker 2: Oh yeah, I know, I know. I look at Taiwan. 502 00:26:07,520 --> 00:26:10,119 Speaker 4: I'm sorry, I look at Hong Kong and think, what 503 00:26:10,200 --> 00:26:11,040 Speaker 4: are you thinking?
504 00:26:11,359 --> 00:26:16,359 Speaker 1: I would vote, happily I think, fight to the death party. 505 00:26:16,400 --> 00:26:19,800 Speaker 1: But the lay back and enjoy it party may be right 506 00:26:20,080 --> 00:26:21,720 Speaker 1: that it's gonna happen no matter what. 507 00:26:22,520 --> 00:26:25,639 Speaker 2: So what are we even talking about? Or will it? 508 00:26:28,520 --> 00:26:31,160 Speaker 4: My wishes are the father of my thoughts here. I hope, 509 00:26:31,240 --> 00:26:35,639 Speaker 4: I really really hope it doesn't. I'm hoping China's economic 510 00:26:35,800 --> 00:26:41,639 Speaker 4: and demographic problems, because that sort of thing can go 511 00:26:41,720 --> 00:26:44,720 Speaker 4: in two different directions. Number one, it empowers you to 512 00:26:44,880 --> 00:26:48,760 Speaker 4: do adventurous stuff because you've got to rally the people 513 00:26:48,800 --> 00:26:52,800 Speaker 4: behind something or other. Sometimes it does that. Sometimes it 514 00:26:52,880 --> 00:26:54,720 Speaker 4: makes you think, oh, we don't have the time or 515 00:26:54,720 --> 00:26:56,080 Speaker 4: the money for that, we need to take care of 516 00:26:56,119 --> 00:26:58,400 Speaker 4: things closer to home. Who knows. I hope it's that one, 517 00:26:58,480 --> 00:27:00,880 Speaker 4: but you know, it's wishful. 518 00:27:00,960 --> 00:27:02,760 Speaker 1: The reason the stakes are so important, in case you 519 00:27:02,760 --> 00:27:06,240 Speaker 1: don't know this, is that if China takes Taiwan, the 520 00:27:06,640 --> 00:27:09,080 Speaker 1: seas will no longer be open and free like they 521 00:27:09,080 --> 00:27:11,359 Speaker 1: have been since the end of World War Two, dominated 522 00:27:11,400 --> 00:27:14,440 Speaker 1: by us keeping them open. China will control at least 523 00:27:14,520 --> 00:27:16,560 Speaker 1: a quarter of shipping in the world, and there'll be 524 00:27:16,560 --> 00:27:17,280 Speaker 1: no stopping that.
525 00:27:18,720 --> 00:27:21,160 Speaker 2: That'll be a completely different situation, right? 526 00:27:21,320 --> 00:27:24,959 Speaker 4: Yeah, and final note on this. I'm reminded of one 527 00:27:25,000 --> 00:27:27,480 Speaker 4: of my favorite sayings that I've heard in this chunk 528 00:27:27,560 --> 00:27:30,280 Speaker 4: of my life, and it was used to describe, I 529 00:27:30,280 --> 00:27:34,600 Speaker 4: think, fundamentalist Islam: I will demand my 530 00:27:34,920 --> 00:27:39,080 Speaker 4: rights because that is according to your principles. Then when 531 00:27:39,119 --> 00:27:41,639 Speaker 4: I take power, I will deny you your rights because 532 00:27:41,640 --> 00:27:46,080 Speaker 4: that is my principle. China is exploiting the openness of Taiwan, 533 00:27:46,200 --> 00:27:50,679 Speaker 4: the freedom, diversity, openness, free press, etc., to recruit gangs, 534 00:27:50,680 --> 00:27:53,439 Speaker 4: the media, commentators, political parties, and even active duty and 535 00:27:53,480 --> 00:27:56,560 Speaker 4: retired members of the armed forces, and then the minute 536 00:27:56,600 --> 00:28:00,480 Speaker 4: they get power, they'll crush that sort of freedom. China, 537 00:28:01,680 --> 00:28:05,800 Speaker 4: that's right, that's their summary. Yeah. So one of the 538 00:28:05,840 --> 00:28:10,719 Speaker 4: great and interesting, you know, philosophical questions of running a 539 00:28:10,760 --> 00:28:15,919 Speaker 4: society like ours is to what extent do we allow 540 00:28:15,960 --> 00:28:19,280 Speaker 4: our principles to be suicidal. 541 00:28:22,960 --> 00:28:23,520 Speaker 2: Tolerance? 542 00:28:24,119 --> 00:28:28,440 Speaker 4: What's the great saying, you know, tolerating, being tolerant until 543 00:28:28,480 --> 00:28:32,520 Speaker 4: you're killed. That's the essence of the thought that I 544 00:28:32,560 --> 00:28:34,320 Speaker 4: mean, suicidal levels of tolerance.
545 00:28:34,400 --> 00:28:35,760 Speaker 2: Yeah, but abandoning those. 546 00:28:35,600 --> 00:28:40,240 Speaker 1: Principles is the great way to get people to give 547 00:28:40,240 --> 00:28:43,320 Speaker 1: you power in the history of doing that sort of. 548 00:28:43,360 --> 00:28:46,600 Speaker 2: Thing. Ah, the balancing act, Jack. 549 00:28:47,360 --> 00:28:52,400 Speaker 1: So you got about a two thousand mile wide giant 550 00:28:52,480 --> 00:28:55,520 Speaker 1: winter storm that's about to hit like two thirds of 551 00:28:55,520 --> 00:28:56,000 Speaker 1: the country. 552 00:28:56,320 --> 00:28:57,040 Speaker 2: That is correct, sir. 553 00:28:57,080 --> 00:28:59,880 Speaker 1: I know for where my brother lives, it's supposed to 554 00:28:59,880 --> 00:29:01,960 Speaker 1: be minus three on Monday and. 555 00:29:02,240 --> 00:29:05,680 Speaker 4: Entire states covered with ice, with all the power lines 556 00:29:06,480 --> 00:29:07,800 Speaker 4: cascading to the ground. 557 00:29:07,960 --> 00:29:13,880 Speaker 1: Major airlines are already issuing travel waivers because they're canceling 558 00:29:13,920 --> 00:29:17,400 Speaker 1: flights just in anticipation. It's, I'm glad I'm not traveling 559 00:29:17,440 --> 00:29:20,160 Speaker 1: anywhere this weekend because it's, well, you ain't gonna travel 560 00:29:20,160 --> 00:29:21,960 Speaker 1: this weekend. If you've got plans, it ain't gonna happen. 561 00:29:22,040 --> 00:29:25,520 Speaker 1: You might want to look into it. I'll be safe 562 00:29:25,520 --> 00:29:28,760 Speaker 1: and comfortable in California where the weather's fine, and I'm 563 00:29:29,160 --> 00:29:31,040 Speaker 1: hounded by drug addict homeless people. 564 00:29:31,720 --> 00:29:35,360 Speaker 2: There you go, there you go. Home, sweet home. We'll 565 00:29:35,360 --> 00:29:37,160 Speaker 2: finish strong, laugh.
566 00:29:37,240 --> 00:29:43,680 Speaker 1: I laugh, he laughs, Sparkle boring, we'll finish strong next. 567 00:29:46,760 --> 00:29:49,080 Speaker 1: So Oscar nominations came out today, and I use all 568 00:29:49,120 --> 00:29:52,640 Speaker 1: these awards shows as a way to know what albums 569 00:29:52,640 --> 00:29:55,280 Speaker 1: are popular, movies are popular, whatever, and then I usually 570 00:29:55,360 --> 00:29:57,240 Speaker 1: check them out, and sometimes I like them and sometimes 571 00:29:57,280 --> 00:30:00,680 Speaker 1: I don't. But today they announced this movie Sinners got 572 00:30:00,720 --> 00:30:04,840 Speaker 1: sixteen nominations, the most in Oscar history. And when that happens, 573 00:30:04,880 --> 00:30:06,520 Speaker 1: I think, well, I got to at least check this 574 00:30:06,600 --> 00:30:10,160 Speaker 1: movie out. It's got the most nominations in the history 575 00:30:10,160 --> 00:30:12,240 Speaker 1: of the Oscars. So let me read a little recap 576 00:30:12,280 --> 00:30:14,560 Speaker 1: here of what it is, which sounds pretty damned interesting. 577 00:30:14,600 --> 00:30:16,720 Speaker 1: I just, I saw some clips up on TV today, 578 00:30:17,000 --> 00:30:19,080 Speaker 1: and it looks like a great period piece. I love 579 00:30:19,120 --> 00:30:22,520 Speaker 1: a period piece, the period being the Mississippi Delta in 580 00:30:22,560 --> 00:30:26,240 Speaker 1: the early thirties. Sinners is a supernat, it's a vampire movie. 581 00:30:26,400 --> 00:30:29,760 Speaker 1: Sinners is a supernatural horror film, written, produced, and directed 582 00:30:29,800 --> 00:30:31,680 Speaker 1: by Ryan Coogler, starring Michael B. 583 00:30:31,840 --> 00:30:33,560 Speaker 2: Jordan in dual roles. 584 00:30:33,600 --> 00:30:37,040 Speaker 1: He plays twins, so he plays both twins, Elijah "Smoke" Moore 585 00:30:37,080 --> 00:30:41,040 Speaker 1: and Elias "Stack" Moore.
They're both World War One veterans and 586 00:30:41,080 --> 00:30:44,800 Speaker 1: former Chicago Mob associates who returned to their hometown of 587 00:30:44,880 --> 00:30:48,400 Speaker 1: Clarksdale, Mississippi, in nineteen thirty-two, during the Jim 588 00:30:48,440 --> 00:30:51,280 Speaker 1: Crow era. The plot follows the brothers as they use 589 00:30:51,360 --> 00:30:54,080 Speaker 1: money from their criminal past to buy an abandoned sawmill 590 00:30:54,400 --> 00:30:57,480 Speaker 1: from a local landowner and convert it into a juke joint, 591 00:30:57,800 --> 00:31:00,200 Speaker 1: a safe place for the black community where they all 592 00:31:00,280 --> 00:31:03,400 Speaker 1: enjoy blues music and escape the dangers of the time. 593 00:31:04,360 --> 00:31:08,040 Speaker 1: Their fresh start and lively opening night celebration are disrupted 594 00:31:08,120 --> 00:31:13,120 Speaker 1: by a greater evil: vampires that invade, turning the night 595 00:31:13,160 --> 00:31:17,040 Speaker 1: into a bloody siege. The film blends elements of period drama, 596 00:31:17,040 --> 00:31:22,360 Speaker 1: Southern gothic, redemption, revenge, horror, family legacy, and lynchings in 597 00:31:22,960 --> 00:31:24,720 Speaker 1: a single twenty-four-hour period. 598 00:31:25,080 --> 00:31:25,320 Speaker 2: Yeah. 599 00:31:25,400 --> 00:31:27,840 Speaker 1: I'll check it out hopefully this weekend and see what 600 00:31:27,880 --> 00:31:28,360 Speaker 1: I think of it. 601 00:31:28,480 --> 00:31:29,760 Speaker 2: I love movies that are set. 602 00:31:29,800 --> 00:31:32,120 Speaker 1: I love any TV show or movie that's set over 603 00:31:32,160 --> 00:31:34,160 Speaker 1: a one day period. I've always liked that as a 604 00:31:34,800 --> 00:31:37,880 Speaker 1: as a starting point.
But it's a single twenty-four 605 00:31:37,880 --> 00:31:41,880 Speaker 1: hours fighting vampires from a blues juke joint in the South, 606 00:31:42,360 --> 00:31:43,000 Speaker 1: so there you go. 607 00:31:43,680 --> 00:31:43,920 Speaker 2: Yeah. 608 00:31:43,960 --> 00:31:48,240 Speaker 4: The rep on the conservative side is the vampires are 609 00:31:48,360 --> 00:31:52,600 Speaker 4: a not-even-disguised metaphor for systemic racism, and all 610 00:31:52,640 --> 00:31:53,800 Speaker 4: the bad people are white people. 611 00:31:53,960 --> 00:31:56,520 Speaker 1: Maybe that's true, maybe it's not. I have many, many 612 00:31:56,560 --> 00:31:59,400 Speaker 1: times disagreed with those takes when I watch movies. I 613 00:31:59,400 --> 00:32:03,320 Speaker 1: think some of you don't grasp the concept of 614 00:32:03,360 --> 00:32:06,680 Speaker 1: sometimes a cigar is just a cigar. If Mississippi Burning, 615 00:32:06,720 --> 00:32:08,920 Speaker 1: one of my favorite movies of all time, came out today, 616 00:32:09,200 --> 00:32:12,880 Speaker 1: would it be accused of portraying white people in the 617 00:32:12,960 --> 00:32:15,480 Speaker 1: South as racist? Because it portrayed white people in the 618 00:32:15,480 --> 00:32:17,920 Speaker 1: South who were racist at the time, it's being racist? 619 00:32:18,160 --> 00:32:20,840 Speaker 2: I think you're overcorrecting. But that's just my opinion. 620 00:32:23,520 --> 00:32:25,880 Speaker 1: Well, oh, I know, I can watch a movie 621 00:32:25,880 --> 00:32:27,800 Speaker 1: and think, I didn't get that at all from. 622 00:32:27,680 --> 00:32:33,760 Speaker 2: It. Many, many. Okay, that's fine. And how do you, 623 00:32:33,800 --> 00:32:35,640 Speaker 2: how and what makes a person judge a movie before 624 00:32:35,680 --> 00:32:36,520 Speaker 2: they've even seen it? 625 00:32:38,160 --> 00:32:39,360 Speaker 6: Uh? 626 00:32:39,400 --> 00:32:40,240 Speaker 2: Well, I don't know.
627 00:32:42,200 --> 00:32:44,400 Speaker 4: Otherwise you'd have to watch every movie in the world, 628 00:32:44,680 --> 00:32:47,640 Speaker 4: well, or every one that seemed interesting. True, that. 629 00:32:47,960 --> 00:32:51,160 Speaker 1: But surely you've had the experience of somebody saying this 630 00:32:51,200 --> 00:32:54,680 Speaker 1: album is great or horrible, and then you experience it and think, no, oh yeah, 631 00:32:54,760 --> 00:32:55,880 Speaker 1: I don't agree with that at all. 632 00:32:56,360 --> 00:32:58,960 Speaker 2: Oh yeah. In fact, I've called for lining critics up 633 00:32:58,960 --> 00:33:01,560 Speaker 2: against the wall. Yeah, I'm gonna try to work it 634 00:33:01,560 --> 00:33:03,640 Speaker 2: into my weekend. Feel free. 635 00:33:03,720 --> 00:33:07,680 Speaker 4: Let us know, breaking again. And I stand by, judging 636 00:33:07,720 --> 00:33:11,880 Speaker 4: by what I've read, there's no way it's what you're suggesting. 637 00:33:12,240 --> 00:33:14,960 Speaker 4: But I love the period piece aspect of it, and 638 00:33:15,000 --> 00:33:18,200 Speaker 4: the juke joint and all that, that sounds great, and 639 00:33:18,240 --> 00:33:20,680 Speaker 4: a guitar player making a deal with the devil, a classic. 640 00:33:21,280 --> 00:33:23,400 Speaker 1: My biggest problem is not going to be how to 641 00:33:23,480 --> 00:33:26,080 Speaker 1: handle racism. It's gonna be I don't like vampire movies 642 00:33:26,120 --> 00:33:27,040 Speaker 1: and stuff like that. 643 00:33:27,560 --> 00:33:31,360 Speaker 4: Well, that's one objection. But congratulations on breaking the 644 00:33:31,400 --> 00:33:36,080 Speaker 4: nomination record of the transgender cartel boss movie last year. 645 00:33:36,840 --> 00:33:38,080 Speaker 2: That was tied with Titanic. 646 00:33:38,160 --> 00:33:40,880 Speaker 1: So I'd rather credit Titanic for holding that record than 647 00:33:41,520 --> 00:33:42,920 Speaker 1: whatever that movie was.
648 00:33:44,520 --> 00:33:48,640 Speaker 2: It's final. 649 00:33:49,760 --> 00:34:01,480 Speaker 7: [indistinct blues singing] 650 00:34:01,680 --> 00:34:03,280 Speaker 7: [singing continues] 651 00:34:05,840 --> 00:34:07,560 Speaker 1: That's some of the music they're playing at the juke 652 00:34:07,640 --> 00:34:11,560 Speaker 1: joint there before the vampires came. There's Harmony. Here's your 653 00:34:11,680 --> 00:34:13,240 Speaker 1: host for final thoughts, Joe Getty. 654 00:34:13,440 --> 00:34:15,319 Speaker 4: Let's get a final thought from everybody on the crew 655 00:34:15,360 --> 00:34:17,360 Speaker 4: to wrap up the day. There is Michaelangelo pressing the 656 00:34:17,400 --> 00:34:18,799 Speaker 4: buttons. Michaelangelo, final thought. 657 00:34:18,840 --> 00:34:20,880 Speaker 6: You know, one of the interesting things about the Oscars 658 00:34:20,920 --> 00:34:23,000 Speaker 6: is, starting in twenty twenty-nine, it goes to 659 00:34:23,080 --> 00:34:25,759 Speaker 6: YouTube only, so just a few more years on broadcast 660 00:34:25,800 --> 00:34:28,080 Speaker 6: TV and then. Has anything. 661 00:34:28,160 --> 00:34:30,759 Speaker 2: Shoot, then I'll have to not watch it on YouTube. 662 00:34:30,960 --> 00:34:34,399 Speaker 1: Has anything fallen faster, harder than the whole Oscars thing? 663 00:34:34,520 --> 00:34:34,719 Speaker 2: Yeah? 664 00:34:34,760 --> 00:34:37,839 Speaker 4: I don't think so. Katie Green, our esteemed newswoman, has 665 00:34:37,840 --> 00:34:39,320 Speaker 4: a final thought. Katie. 666 00:34:39,000 --> 00:34:41,600 Speaker 6: I almost had a friendship end because this woman would 667 00:34:41,640 --> 00:34:44,759 Speaker 6: invite me to her Oscars party, and every time I 668 00:34:44,840 --> 00:34:46,320 Speaker 6: just said, I just, I don't care. 669 00:34:46,920 --> 00:34:50,759 Speaker 1: Yeah, not coming. Yeah, Jack, 670 00:34:50,880 --> 00:34:51,399 Speaker 2: final thought.
671 00:34:51,840 --> 00:34:54,560 Speaker 1: The interesting thing on that is how well earned it 672 00:34:54,640 --> 00:34:56,520 Speaker 1: is, the fall of the Oscars. 673 00:34:56,680 --> 00:34:57,640 Speaker 2: You all earned it. 674 00:34:57,960 --> 00:35:00,359 Speaker 1: You got up there and lectured us for years, 675 00:35:00,400 --> 00:35:03,680 Speaker 1: and then finally enough people said F you. 676 00:35:03,560 --> 00:35:06,239 Speaker 4: And now that you told me you hated me 677 00:35:06,360 --> 00:35:09,919 Speaker 4: fifty times, I left your party and you screamed, where 678 00:35:09,960 --> 00:35:10,600 Speaker 4: are you going? 679 00:35:11,640 --> 00:35:14,799 Speaker 2: That is exactly right. That is the perfect definition of 680 00:35:14,840 --> 00:35:15,400 Speaker 2: what happened. 681 00:35:15,840 --> 00:35:18,880 Speaker 4: My final thought is from the article we were discussing 682 00:35:18,920 --> 00:35:21,840 Speaker 4: about the Mississippi turnaround of its schools. 683 00:35:22,400 --> 00:35:23,280 Speaker 2: Here's the quote. 684 00:35:23,800 --> 00:35:27,560 Speaker 4: The answer to America's core educational challenge is in Hattiesburg, 685 00:35:27,680 --> 00:35:28,320 Speaker 4: not Harvard. 686 00:35:30,120 --> 00:35:30,920 Speaker 2: That's a good one. 687 00:35:31,160 --> 00:35:35,560 Speaker 4: Look at what works. Quit asking PhDs to reinvent a 688 00:35:35,640 --> 00:35:37,600 Speaker 4: perfectly functioning wheel. 689 00:35:37,719 --> 00:35:41,720 Speaker 1: Small town Southerners can't possibly come up with a good idea. 690 00:35:41,239 --> 00:35:43,000 Speaker 2: That's not new, it's a return to a good idea. 691 00:35:43,080 --> 00:35:46,680 Speaker 1: Yeah, Armstrong and Getty wrapping up another grueling four-hour workday. 692 00:35:46,880 --> 00:35:47,880 Speaker 1: So many people 693 00:35:47,680 --> 00:35:49,719 Speaker 4: to thank, so little time. Go to Armstrong and Getty dot com.
694 00:35:49,800 --> 00:35:50,640 Speaker 2: Let us know what you think. 695 00:35:50,719 --> 00:35:54,480 Speaker 4: Mailbag at Armstrong and Getty dot com is our email address. 696 00:35:54,920 --> 00:35:58,200 Speaker 4: Weigh in, keep it briefish if you can. We'll see 697 00:35:58,200 --> 00:35:58,640 Speaker 4: you tomorrow. 698 00:35:58,719 --> 00:36:03,720 Speaker 1: God bless America. Armstrong and Getty. 699 00:36:03,840 --> 00:36:06,680 Speaker 2: But in case you missed it, there is something 700 00:36:06,400 --> 00:36:11,360 Speaker 4: important: that without us right now you'd all be speaking 701 00:36:11,600 --> 00:36:13,520 Speaker 4: German and a little Japanese. 702 00:36:13,600 --> 00:36:18,360 Speaker 2: Perhaps, you get it, you understand? You'd be speaking German. 703 00:36:18,760 --> 00:36:21,520 Speaker 2: So you get in line and your DNA says you 704 00:36:21,640 --> 00:36:24,279 Speaker 2: got a picture of the Germans, and they're thinking, we 705 00:36:24,320 --> 00:36:27,160 Speaker 2: are speaking. Is that bad? 706 00:36:27,239 --> 00:36:27,319 Speaker 6: Am 707 00:36:27,400 --> 00:36:29,720 Speaker 2: I not supposed to be thinking German? It's a nice language. 708 00:36:29,800 --> 00:36:31,040 Speaker 2: Armstrong and Getty