1 00:00:01,840 --> 00:00:06,120 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio the George 2 00:00:06,160 --> 00:00:10,240 Speaker 1: Washington Broadcast Center, Jack Armstrong and Joe Getty. 3 00:00:10,160 --> 00:00:24,080 Speaker 2: Armstrong and Getty... and he, Armstrong and Getty, thank. 4 00:00:23,880 --> 00:00:26,840 Speaker 1: You and using this opportunity. 5 00:00:26,880 --> 00:00:27,920 Speaker 2: Many thanks to your wife. 6 00:00:28,640 --> 00:00:32,640 Speaker 1: Just didn't understand she sounds a letter to put in 7 00:00:32,680 --> 00:00:38,199 Speaker 1: about all shouldn aduct children at my wife? 8 00:00:39,560 --> 00:00:42,360 Speaker 2: You green, She gave the letters on it's not to 9 00:00:42,400 --> 00:00:48,920 Speaker 2: you to your wife. Yeah, so we had this sense topic. 10 00:00:49,440 --> 00:00:50,920 Speaker 3: I don't know if you could understand a word of that. 11 00:00:51,040 --> 00:00:54,840 Speaker 3: So that does anybody any good? That was Zelensky showing up. 12 00:00:54,840 --> 00:00:58,520 Speaker 3: He had a letter from his wife to Melania, Trump's wife, 13 00:00:58,720 --> 00:01:00,639 Speaker 3: and he handed it to Trump, and Trump said thank you. Zelensky said no, no, 14 00:01:00,720 --> 00:01:02,600 Speaker 3: it's for your wife, and people laughed. 15 00:01:04,280 --> 00:01:04,640 Speaker 2: Charming. 16 00:01:04,760 --> 00:01:06,399 Speaker 3: Let's get the laughing out of the way. I hope 17 00:01:06,400 --> 00:01:08,960 Speaker 3: this one's easier to understand. This is about the whole 18 00:01:09,160 --> 00:01:10,600 Speaker 3: Zelensky-showed-up-in-a-suit thing. 19 00:01:10,640 --> 00:01:14,959 Speaker 2: Finally, last week. 20 00:01:16,400 --> 00:01:17,160 Speaker 3: I got a question for. 21 00:01:17,959 --> 00:01:22,640 Speaker 2: All this president. You look fabulous in that suit. Yeah, 22 00:01:22,840 --> 00:01:25,960 Speaker 2: you look good. I said the same thing.
Yeah, I said, 23 00:01:26,000 --> 00:01:26,479 Speaker 2: the one that. 24 00:01:26,440 --> 00:01:30,240 Speaker 1: Attacked you last time, I remember, I apologize to you. 25 00:01:30,240 --> 00:01:33,320 Speaker 2: Look you look wonderful. No, my first question for you President. 26 00:01:33,880 --> 00:01:35,280 Speaker 4: In the same suit. 27 00:01:37,800 --> 00:01:41,280 Speaker 2: I changed, you not. 28 00:01:41,600 --> 00:01:44,119 Speaker 3: I thought that was kind of a funny exchange. It's 29 00:01:44,240 --> 00:01:46,720 Speaker 3: it's incredible that it has ever become a topic. I 30 00:01:46,720 --> 00:01:49,840 Speaker 3: think that guy is a dick for ever bringing it up 31 00:01:49,880 --> 00:01:53,120 Speaker 3: in the first place, and then, uh, and then saying 32 00:01:53,120 --> 00:01:56,080 Speaker 3: he looked good in his suit yesterday, and Trump saying 33 00:01:56,080 --> 00:01:58,680 Speaker 3: that's the guy that attacked you last time. Zelensky says, yeah. 34 00:01:58,560 --> 00:02:02,600 Speaker 2: I remember. You're wearing the same suit, he said to 35 00:02:02,640 --> 00:02:05,240 Speaker 2: the guy, because I'm sure he can picture him. 36 00:02:05,280 --> 00:02:09,880 Speaker 3: Oh yeah, I was just... well, speaking of it, I generally, well, 37 00:02:09,919 --> 00:02:11,200 Speaker 3: I don't know, do we want to go this direction? 38 00:02:11,240 --> 00:02:12,959 Speaker 3: Maybe we don't. What was the direction we're going to go? 39 00:02:13,120 --> 00:02:19,679 Speaker 3: Let's go this direction. This is a hot mic thing where... Man, God, 40 00:02:19,680 --> 00:02:23,040 Speaker 3: when you're... when there are microphones everywhere, it would be 41 00:02:23,120 --> 00:02:27,520 Speaker 3: tough to... you have to really be careful of what 42 00:02:27,560 --> 00:02:30,760 Speaker 3: you say.
Oh yeah, but anyway, there were microphones around 43 00:02:30,760 --> 00:02:34,840 Speaker 3: Trump and Macron right before they sat down to talk, 44 00:02:34,919 --> 00:02:35,800 Speaker 3: and this got picked up. 45 00:02:35,800 --> 00:02:38,040 Speaker 2: If you can't understand it, we'll fill it in. 46 00:02:38,840 --> 00:02:40,600 Speaker 5: Yeah, set up the lad. 47 00:02:40,680 --> 00:02:48,560 Speaker 2: Yeah, I think he think you wants to make a deal. 48 00:02:48,919 --> 00:02:53,440 Speaker 1: Yeah, set up the lad Yeah. 49 00:02:53,440 --> 00:02:54,200 Speaker 5: I think he was. 50 00:02:56,200 --> 00:02:59,400 Speaker 1: So that's I think he wants to make a deal 51 00:03:00,000 --> 00:03:00,960 Speaker 1: as crazy as it sounds. 52 00:03:02,320 --> 00:03:03,520 Speaker 2: Sit down, everybody. 53 00:03:03,919 --> 00:03:06,960 Speaker 3: So that's President Trump saying to President Macron of France, I 54 00:03:07,000 --> 00:03:09,280 Speaker 3: think he wants to make a deal with me, as 55 00:03:09,280 --> 00:03:11,040 Speaker 3: crazy as that sounds. I think he wants to make 56 00:03:11,080 --> 00:03:14,320 Speaker 3: a deal with me. Like it's... I mean, I took 57 00:03:14,360 --> 00:03:18,200 Speaker 3: it as like as a favor to me, or because 58 00:03:18,200 --> 00:03:19,760 Speaker 3: he likes me, or. 59 00:03:20,000 --> 00:03:22,000 Speaker 2: I don't know. You see it written as he wants 60 00:03:22,000 --> 00:03:26,520 Speaker 2: to make a deal for me somewhere. I don't want to 61 00:03:26,800 --> 00:03:30,320 Speaker 2: misquote him, but yeah, I don't. I don't think he does, 62 00:03:30,520 --> 00:03:33,639 Speaker 2: is the thing.
But I will tell you this, I've 63 00:03:33,680 --> 00:03:39,040 Speaker 2: become completely convinced of this: trying to determine where Trump 64 00:03:39,280 --> 00:03:44,840 Speaker 2: is going with his flatteries and his threats and his 65 00:03:45,040 --> 00:03:50,360 Speaker 2: red carpet and his promises to both sides, and that's it. 66 00:03:51,080 --> 00:03:55,520 Speaker 2: You're just taking wild guesses. And more on that to come. 67 00:03:55,520 --> 00:03:57,320 Speaker 2: But there have been a couple of things recently that 68 00:03:57,320 --> 00:04:00,960 Speaker 2: Trump's done that have been really detrimental to Vladimir Putin 69 00:04:01,400 --> 00:04:04,440 Speaker 2: and his goals. So it's not all butt kissing and 70 00:04:04,560 --> 00:04:07,400 Speaker 2: desperation for a deal at all. It's way more complicated than that, 71 00:04:07,440 --> 00:04:08,080 Speaker 2: more on that to come. 72 00:04:08,280 --> 00:04:10,760 Speaker 3: So this is from today, just trying to stay with 73 00:04:10,840 --> 00:04:15,720 Speaker 3: a theme of topic matter rather than chronological. This is 74 00:04:15,720 --> 00:04:19,640 Speaker 3: Trump this morning on Fox and Friends talking about Putin 75 00:04:19,800 --> 00:04:23,520 Speaker 3: and Zelensky getting together, or him and Putin getting together again. 76 00:04:24,880 --> 00:04:27,200 Speaker 1: They lose, think of it, five to seven thousand 77 00:04:27,240 --> 00:04:29,560 Speaker 1: a week. So even these guys will say, Okay, let's 78 00:04:29,600 --> 00:04:31,679 Speaker 1: meet next week. I say you can't meet next week. 79 00:04:32,080 --> 00:04:34,839 Speaker 1: Next week you're gonna have seven thousand more dead people. 80 00:04:34,960 --> 00:04:37,400 Speaker 1: You can't do that. You got to get this thing done. 81 00:04:37,520 --> 00:04:38,599 Speaker 2: One of the guys at. 82 00:04:38,480 --> 00:04:41,480 Speaker 1: The table was one of the gentlemen who's a great guy.
83 00:04:41,520 --> 00:04:46,040 Speaker 1: But he said, I hope I didn't insult him. He said, well, 84 00:04:46,120 --> 00:04:48,200 Speaker 1: let's meet in another month or two and let's see 85 00:04:48,200 --> 00:04:50,400 Speaker 1: if we can start, you know, making some money. I said, 86 00:04:50,440 --> 00:04:53,520 Speaker 1: a month or two? You're gonna have another forty thousand 87 00:04:53,520 --> 00:04:55,360 Speaker 1: people dead in a month or two. You have to 88 00:04:55,400 --> 00:04:59,040 Speaker 1: do it tonight, and I did. Actually, I called President 89 00:04:59,080 --> 00:05:01,000 Speaker 1: Putin and we're trying to work out a meeting 90 00:05:01,080 --> 00:05:04,480 Speaker 1: with President Zelensky. We'll see what happens there and then 91 00:05:04,560 --> 00:05:07,440 Speaker 1: if that works out. If it works out, then I'll 92 00:05:07,480 --> 00:05:09,320 Speaker 1: go to the trilat and close it up. 93 00:05:10,600 --> 00:05:11,359 Speaker 2: That's interesting. 94 00:05:11,839 --> 00:05:15,400 Speaker 3: Trump seems to be personally offended by people dying 95 00:05:15,400 --> 00:05:17,600 Speaker 3: in wars, like it actually seems to bother him. 96 00:05:17,760 --> 00:05:19,000 Speaker 2: And I also agree with that. 97 00:05:19,160 --> 00:05:21,680 Speaker 3: I never understood, I've never understood why things have to 98 00:05:21,800 --> 00:05:26,279 Speaker 3: unfold so slowly. And obviously you've seen in the last 99 00:05:26,320 --> 00:05:31,200 Speaker 3: four days with these world leaders, with the money that 100 00:05:31,279 --> 00:05:33,880 Speaker 3: they've got and private planes and their ability to book 101 00:05:33,880 --> 00:05:38,039 Speaker 3: hotel rooms anywhere anytime. Why are you waiting two months to 102 00:05:38,040 --> 00:05:39,880 Speaker 3: get together? How about we meet Tuesday? 103 00:05:40,480 --> 00:05:44,359 Speaker 2: Well?
The only counter to that is that it 104 00:05:44,400 --> 00:05:47,479 Speaker 2: won't do any good to get together unless you have 105 00:05:47,640 --> 00:05:50,320 Speaker 2: hashed out enough that you're close enough to an agreement 106 00:05:50,360 --> 00:05:55,320 Speaker 2: to get there. If it's just preliminary discussions, what's the point, 107 00:05:56,760 --> 00:05:59,680 Speaker 2: especially with Europe, because you can't get together with any 108 00:05:59,680 --> 00:06:02,000 Speaker 2: one or two or three of them and they commit 109 00:06:02,120 --> 00:06:03,520 Speaker 2: Europe to whatever. 110 00:06:05,600 --> 00:06:07,960 Speaker 3: Right, But they're having a meeting and they're saying, let's 111 00:06:07,960 --> 00:06:09,720 Speaker 3: get together and do this again in two months, and 112 00:06:09,880 --> 00:06:11,600 Speaker 3: Trump's saying, why in two months? 113 00:06:11,680 --> 00:06:12,000 Speaker 2: How about? 114 00:06:12,040 --> 00:06:12,160 Speaker 1: Well? 115 00:06:12,240 --> 00:06:15,160 Speaker 2: Right? I think the unspoken thing is we'll have the 116 00:06:15,640 --> 00:06:18,520 Speaker 2: staff get together and get all the ducks in a row, 117 00:06:18,560 --> 00:06:22,120 Speaker 2: get all the details together, then we'll ratify it. You're 118 00:06:22,120 --> 00:06:27,840 Speaker 2: not wrong. I just... here's the danger here for Trump, 119 00:06:28,000 --> 00:06:31,320 Speaker 2: and Trump could be very disappointed and pissed off about this. 120 00:06:32,839 --> 00:06:35,680 Speaker 2: It seems like we're close to a deal. We are 121 00:06:35,760 --> 00:06:40,560 Speaker 2: a thousand miles from a deal. The biggest, most important 122 00:06:40,600 --> 00:06:44,279 Speaker 2: points: what does a security guarantee for Ukraine mean? What 123 00:06:44,279 --> 00:06:47,000 Speaker 2: does it look like? Whose boots are on the ground 124 00:06:47,040 --> 00:06:50,320 Speaker 2: with what weaponry? You know?
What land is exchanged, what 125 00:06:50,440 --> 00:06:54,520 Speaker 2: land is kept? These are enormous questions. I'm not sure 126 00:06:54,520 --> 00:06:57,000 Speaker 2: they matter at all. I think it's a... I think 127 00:06:57,000 --> 00:06:58,320 Speaker 2: it's a complete waste of time. 128 00:06:58,400 --> 00:07:01,040 Speaker 3: Like I opened the show with, has anybody looked into 129 00:07:01,080 --> 00:07:05,320 Speaker 3: the fact that Putin probably has no interest whatsoever in 130 00:07:05,400 --> 00:07:08,240 Speaker 3: stopping this war no matter what you offer him. He 131 00:07:08,320 --> 00:07:11,760 Speaker 3: thinks he can keep going and wear down Ukraine and 132 00:07:11,800 --> 00:07:14,720 Speaker 3: the West until he gets what he wants, and all 133 00:07:14,720 --> 00:07:16,720 Speaker 3: indications are he can. So. 134 00:07:17,600 --> 00:07:20,400 Speaker 2: And we have called off a demand for a ceasefire 135 00:07:20,480 --> 00:07:23,520 Speaker 2: and any hardcore sanctions based on the fact that 136 00:07:23,560 --> 00:07:29,040 Speaker 2: he's engaging in negotiation, and he knows that. Well, he 137 00:07:29,120 --> 00:07:33,160 Speaker 2: will engage like crazy as he continues to try to 138 00:07:33,160 --> 00:07:37,120 Speaker 2: grind down the Ukrainians. Well, there's this. Trump said this 139 00:07:37,120 --> 00:07:37,600 Speaker 2: this morning. 140 00:07:38,000 --> 00:07:40,840 Speaker 1: I hope President Putin is going to be good, and 141 00:07:40,880 --> 00:07:43,360 Speaker 1: if he's not, it's going to be a rough situation. 142 00:07:43,560 --> 00:07:49,280 Speaker 1: And I hope that Zelensky... President Zelensky will do what 143 00:07:49,320 --> 00:07:52,280 Speaker 1: he has to do. He has to show some flexibility. Also, 144 00:07:52,440 --> 00:07:53,440 Speaker 1: that thing is a mess.
145 00:07:53,800 --> 00:07:55,560 Speaker 3: I hope Putin will be good, and if he's not, 146 00:07:55,600 --> 00:07:58,240 Speaker 3: it's going to be a rough situation. I don't know 147 00:07:58,280 --> 00:08:01,480 Speaker 3: exactly what that means. But Mark Halperin's newsletter today, his 148 00:08:01,640 --> 00:08:05,720 Speaker 3: take on this is that if Putin fails to legitimately 149 00:08:05,760 --> 00:08:08,400 Speaker 3: go down the peace path, Trump has given every 150 00:08:08,440 --> 00:08:11,960 Speaker 3: indication that he will regather his European allies and take 151 00:08:12,040 --> 00:08:17,320 Speaker 3: the steps of sanctions, arming Ukraine, all the things that 152 00:08:17,360 --> 00:08:23,160 Speaker 3: he has threatened. That's his take. Maybe, or maybe Putin 153 00:08:23,200 --> 00:08:25,080 Speaker 3: doesn't believe that, or. 154 00:08:25,120 --> 00:08:28,360 Speaker 2: Putin thinks, sorry, worst case is that'll take, I don't know, six, eight, 155 00:08:28,440 --> 00:08:31,520 Speaker 2: ten weeks. Maybe I can capture a few more towns. 156 00:08:33,120 --> 00:08:36,280 Speaker 2: That's fine with me, he's thinking. So I want to 157 00:08:36,280 --> 00:08:37,520 Speaker 2: get this on. I don't want... I don't know. 158 00:08:37,559 --> 00:08:39,400 Speaker 3: I didn't watch him on Fox and Friends this morning. 159 00:08:39,400 --> 00:08:40,760 Speaker 3: I don't know what he's talking about here. 160 00:08:41,000 --> 00:08:44,319 Speaker 1: If I can save seven thousand people a week from 161 00:08:44,360 --> 00:08:46,880 Speaker 1: being killed, I'll take that too. I want to try 162 00:08:46,920 --> 00:08:49,080 Speaker 1: and get to heaven if possible. I'm hearing I'm not 163 00:08:49,160 --> 00:08:51,760 Speaker 1: doing well. I'm really at the bottom of the 164 00:08:51,760 --> 00:08:55,079 Speaker 1: totem pole. If I can get to heaven, this will 165 00:08:55,160 --> 00:08:56,720 Speaker 1: be one of the reasons.
166 00:09:00,000 --> 00:09:02,040 Speaker 2: I want to get to heaven. I hear I'm not 167 00:09:02,160 --> 00:09:04,800 Speaker 2: doing well. You hear from who? I'm at the bottom 168 00:09:04,840 --> 00:09:07,240 Speaker 2: of the totem pole. Who's giving you. 169 00:09:07,240 --> 00:09:09,840 Speaker 3: This information that you're not doing well on the whether 170 00:09:09,920 --> 00:09:11,280 Speaker 3: or not you get to heaven scale? 171 00:09:11,679 --> 00:09:14,040 Speaker 2: Oh, Heaven leaks like crazy. It's like the State Department. 172 00:09:14,240 --> 00:09:16,640 Speaker 2: There's always somebody calling the New York Times. 173 00:09:18,679 --> 00:09:20,360 Speaker 3: I don't know, you're at the bottom of the totem 174 00:09:20,360 --> 00:09:22,880 Speaker 3: pole because of what? What is keeping Trump out of Heaven? 175 00:09:22,920 --> 00:09:25,040 Speaker 2: I don't know. Maybe there's things. Was it January sixth? 176 00:09:25,120 --> 00:09:28,040 Speaker 2: I don't know. I don't know what God's opinion was 177 00:09:28,040 --> 00:09:32,920 Speaker 2: of that damned real estate fraud that somehow became felonies, 178 00:09:33,800 --> 00:09:35,720 Speaker 2: stuck in the appeals process. By the way, I was 179 00:09:35,760 --> 00:09:38,200 Speaker 2: just reading about that. Normally that case would be long 180 00:09:38,320 --> 00:09:41,440 Speaker 2: over. The appeals court in New York is just like, 181 00:09:42,000 --> 00:09:44,199 Speaker 2: put it over there, we'll think about that later. Was 182 00:09:44,240 --> 00:09:47,080 Speaker 2: that one hundred years ago that that was going on? Yeah, 183 00:09:47,080 --> 00:09:49,480 Speaker 2: I know. It was last summer, wasn't it? I don't know. 184 00:09:49,520 --> 00:09:52,000 Speaker 3: Am I playing this for any reason other than 185 00:09:52,040 --> 00:09:54,440 Speaker 3: to be mad at Peter Doocy? I suppose not, who 186 00:09:54,520 --> 00:09:58,200 Speaker 3: I generally like.
But this was among the first. 187 00:09:58,040 --> 00:10:05,360 Speaker 4: Questions. President Zelensky, are you prepared to keep sending men and 188 00:10:05,440 --> 00:10:08,200 Speaker 4: troops to their deaths for another couple of years, or 189 00:10:08,600 --> 00:10:11,280 Speaker 4: are you going to agree to redraw the maps? 190 00:10:12,520 --> 00:10:13,720 Speaker 3: Thank you for your questions. 191 00:10:13,760 --> 00:10:16,679 Speaker 2: So, first of all, you know we live under their 192 00:10:17,040 --> 00:10:19,360 Speaker 2: attacks. You know that today there have been a lot 193 00:10:19,400 --> 00:10:21,800 Speaker 2: of attacks and a lot of wounded people. 194 00:10:21,840 --> 00:10:23,960 Speaker 3: And he goes with a diplomatic answer that is what 195 00:10:24,000 --> 00:10:28,080 Speaker 3: you'd expect. But boy, that was a jerk-ass move 196 00:10:28,320 --> 00:10:31,800 Speaker 3: from Peter Doocy to ask that question, and I... I 197 00:10:31,880 --> 00:10:33,720 Speaker 3: lost a tremendous amount of respect for him. 198 00:10:33,720 --> 00:10:35,559 Speaker 2: What an awful. 199 00:10:35,200 --> 00:10:37,520 Speaker 3: Thing to say to a guy who's... who's gone to 200 00:10:37,640 --> 00:10:40,200 Speaker 3: so many funerals. Are you going to continue sending people 201 00:10:40,200 --> 00:10:41,920 Speaker 3: to their deaths? Sending them to. 202 00:10:41,920 --> 00:10:44,320 Speaker 2: Their deaths. You're going to give up some land, because this 203 00:10:44,480 --> 00:10:45,120 Speaker 2: is on you. 204 00:10:45,760 --> 00:10:49,320 Speaker 3: These are men who are fighting to the death to 205 00:10:49,520 --> 00:10:53,840 Speaker 3: keep Russians from raping their wives and stealing their children. 206 00:10:54,240 --> 00:10:57,040 Speaker 3: That's what they're doing.
That was just... that was designed 207 00:10:57,440 --> 00:11:00,920 Speaker 3: purely to try to get Zelensky to lose his temper 208 00:11:01,280 --> 00:11:03,840 Speaker 3: so Doocy would have the viral moment of the day. 209 00:11:04,200 --> 00:11:06,560 Speaker 2: That's what that was. And that is a jerk move. 210 00:11:07,240 --> 00:11:09,199 Speaker 2: It was grandstanding, disgusting. 211 00:11:09,360 --> 00:11:12,000 Speaker 3: I hate that sort of thing. God, and the topic 212 00:11:12,080 --> 00:11:15,600 Speaker 3: matter is so important. Can't we have more people, and 213 00:11:15,640 --> 00:11:19,080 Speaker 3: Trump included, but can't we have more people that are 214 00:11:19,160 --> 00:11:23,160 Speaker 3: less interested in getting a TikTok clip that goes viral 215 00:11:23,320 --> 00:11:26,000 Speaker 3: and more interested in actually trying to have a conversation like adults? 216 00:11:26,520 --> 00:11:30,800 Speaker 2: No, man. No, you won't have that. There's not enough profit. No, 217 00:11:32,480 --> 00:11:35,959 Speaker 2: So I missed her. Thank you, Michael. I was really 218 00:11:36,000 --> 00:11:39,040 Speaker 2: interested in this. Walter Russell Mead, writing in the Wall 219 00:11:39,040 --> 00:11:41,640 Speaker 2: Street Journal, and he does some great stuff about the 220 00:11:41,679 --> 00:11:45,640 Speaker 2: history of Russia and the United States and cultural differences 221 00:11:45,760 --> 00:11:48,720 Speaker 2: and that sort of thing. But here's the key takeaway. 222 00:11:49,000 --> 00:11:53,320 Speaker 2: The geopolitics of US-Russian relations are fiendishly complex. Much 223 00:11:53,320 --> 00:11:56,400 Speaker 2: Western commentary on the Alaska summit reflected an assumption that 224 00:11:56,440 --> 00:11:59,840 Speaker 2: the Trump administration was making massive concessions to the Russians. 225 00:12:01,080 --> 00:12:04,360 Speaker 2: It looked like that, it looked overly friendly.
That probably 226 00:12:04,400 --> 00:12:06,680 Speaker 2: isn't how mister Putin sees things when he looks at 227 00:12:06,720 --> 00:12:09,920 Speaker 2: America's Russia policy in a global context. The Trump... and 228 00:12:10,000 --> 00:12:13,199 Speaker 2: I did not know this... the Trump administration's recent intervention 229 00:12:13,360 --> 00:12:17,160 Speaker 2: in the Armenia-Azerbaijan dispute was a direct shot at 230 00:12:17,160 --> 00:12:20,280 Speaker 2: the heart of Russian power and at mister Putin's strategy 231 00:12:20,320 --> 00:12:23,439 Speaker 2: to recover territories controlled by Moscow in the Soviet era. 232 00:12:23,880 --> 00:12:26,559 Speaker 2: By convening the leaders of Armenia and Azerbaijan in the 233 00:12:26,559 --> 00:12:29,280 Speaker 2: White House to ratify an agreement that could end Russia's 234 00:12:29,280 --> 00:12:31,840 Speaker 2: ability to block the flow of Central Asian oil and 235 00:12:31,920 --> 00:12:36,040 Speaker 2: gas to Western markets, the Trump administration embarrassed mister Putin 236 00:12:36,200 --> 00:12:38,480 Speaker 2: and threw a wrench in his plans to restore Russian 237 00:12:38,520 --> 00:12:42,760 Speaker 2: power in the Caucasus. Further, the agreement potentially weakens Moscow's 238 00:12:42,800 --> 00:12:45,760 Speaker 2: position across Central Asia, a region that was once part 239 00:12:45,800 --> 00:12:48,120 Speaker 2: of the Soviet Union but where China now plays a 240 00:12:48,120 --> 00:12:52,959 Speaker 2: growing role.
And he also mentions that signals from the 241 00:12:53,000 --> 00:12:56,120 Speaker 2: White House show that Washington is interested in 242 00:12:56,200 --> 00:13:00,000 Speaker 2: improving US-China relations, and the Chinese support for 243 00:13:00,040 --> 00:13:04,120 Speaker 2: Russia's war in Ukraine was at least partially because Beijing 244 00:13:04,200 --> 00:13:08,520 Speaker 2: wanted bargaining trips, bargaining chips rather, in trade negotiations with 245 00:13:08,600 --> 00:13:10,719 Speaker 2: the US. So there have been a number of things 246 00:13:10,760 --> 00:13:13,040 Speaker 2: that have taken place a little outside the headlines that 247 00:13:13,080 --> 00:13:17,319 Speaker 2: have really weakened Russia lately from the Trump administration. Interesting. 248 00:13:18,040 --> 00:13:21,160 Speaker 3: I keep teasing this AI therapy story that I want 249 00:13:21,160 --> 00:13:22,520 Speaker 3: to get to at some point, and we got other 250 00:13:22,559 --> 00:13:23,760 Speaker 3: stuff on the way. Stay here. 251 00:13:28,080 --> 00:13:32,319 Speaker 2: Madonna spent her sixty-seventh birthday at a horse race 252 00:13:32,400 --> 00:13:35,760 Speaker 2: in Italy. Fans were shocked when they mistakenly thought they 253 00:13:35,760 --> 00:13:42,440 Speaker 2: were getting their picture taken with Seabiscuit. Only a visual text. 254 00:13:42,480 --> 00:13:43,280 Speaker 2: Probably got a picture. 255 00:13:43,320 --> 00:13:46,080 Speaker 3: She looks freaky now, she definitely does, and she's got 256 00:13:46,160 --> 00:13:49,640 Speaker 3: to quit dressing like a sexpot. You can be... 257 00:13:49,920 --> 00:13:52,160 Speaker 3: you can be sexy to other people who are seventy, 258 00:13:52,600 --> 00:13:55,600 Speaker 3: but just in general, at that age, you shouldn't be 259 00:13:55,640 --> 00:13:55,920 Speaker 3: doing that. 260 00:13:56,720 --> 00:14:00,440 Speaker 2: Yes, yes, agreed.
Speaking of women's fashions, Katie, I must 261 00:14:00,480 --> 00:14:05,520 Speaker 2: ask you, you are a noted workout fiend. What do 262 00:14:05,600 --> 00:14:09,280 Speaker 2: you wear to the gym on your bottom half? I 263 00:14:09,280 --> 00:14:14,600 Speaker 2: got a good answer from last night for me. Oh boy. Leggings. 264 00:14:15,120 --> 00:14:18,360 Speaker 2: Oh my god, you're such a boomer. You're a boomer 265 00:14:18,480 --> 00:14:21,360 Speaker 2: if you wear leggings. I was wearing jeans at the 266 00:14:21,400 --> 00:14:26,119 Speaker 2: gym last night. Of course you were. What a frump. 267 00:14:26,880 --> 00:14:31,480 Speaker 2: Leggings are dead, according to an owner of a yoga company. 268 00:14:33,600 --> 00:14:37,160 Speaker 2: They host Vibe retreats in New York, Lake Como, Italy, 269 00:14:37,200 --> 00:14:38,760 Speaker 2: and Antiparos, Greece. 270 00:14:38,920 --> 00:14:41,480 Speaker 3: I'm willing to bet, however, that they're wrong. But 271 00:14:41,840 --> 00:14:44,160 Speaker 3: you're a boomer if you wear leggings. The rise of 272 00:14:44,240 --> 00:14:48,200 Speaker 3: big workout pants. Mm. Oh no. So I was at 273 00:14:48,200 --> 00:14:50,240 Speaker 3: the gym late last night. I'm usually not there. I 274 00:14:50,320 --> 00:14:53,240 Speaker 3: was there like quarter til nine, and it was all young people. 275 00:14:53,600 --> 00:14:56,920 Speaker 3: All the dudes are wearing great, big, giant workout pants 276 00:14:56,960 --> 00:14:57,160 Speaker 3: like that. 277 00:14:57,720 --> 00:15:02,240 Speaker 2: Yeah, exactly. After over twenty years as the reigning workout 278 00:15:02,240 --> 00:15:06,240 Speaker 2: wear, stretchy fitted bottoms are losing ground to baggy styles 279 00:15:06,280 --> 00:15:11,400 Speaker 2: favored by Generation Z. Katie? Oh my gosh, yeah, I 280 00:15:11,720 --> 00:15:14,160 Speaker 2: tell you. So I was looking at pictures online.
281 00:15:14,360 --> 00:15:17,320 Speaker 5: I am because I used to work out wearing big, 282 00:15:17,400 --> 00:15:19,920 Speaker 5: baggy pants and it's the most uncomfortable. 283 00:15:20,840 --> 00:15:22,880 Speaker 3: Well, it's just... it's just... well, first of all, for 284 00:15:22,920 --> 00:15:26,280 Speaker 3: the younger crowd, they're there partially to work out 285 00:15:26,280 --> 00:15:29,120 Speaker 3: and partially to meet someone to, uh, you know, have 286 00:15:29,200 --> 00:15:32,360 Speaker 3: a relationship with, so you're dressed to impress. But 287 00:15:32,600 --> 00:15:34,880 Speaker 3: dropping my son off at high school yesterday, yeah, everybody's 288 00:15:34,920 --> 00:15:38,480 Speaker 3: wearing the great, big, size-of-a-trash-can pants. 289 00:15:38,600 --> 00:15:42,040 Speaker 3: They completely obliterate what you look like from the waist down. 290 00:15:42,120 --> 00:15:48,160 Speaker 2: But whatever. It's like I'm back in the nineties. Well, yeah, 291 00:15:48,440 --> 00:15:50,880 Speaker 2: and a tiny crop top is part of the look too. Yeah, 292 00:15:50,920 --> 00:15:51,960 Speaker 2: you gotta have that happening. 293 00:15:52,280 --> 00:15:54,840 Speaker 5: Not a chance. Okay. Yeah, they're going there to take 294 00:15:54,880 --> 00:15:55,640 Speaker 5: TikTok videos. 295 00:15:55,720 --> 00:15:57,520 Speaker 2: Yeah, there were several women like that at the gym 296 00:15:57,600 --> 00:16:02,040 Speaker 2: last night. This opinion leader in yoga who I mentioned earlier says, 297 00:16:02,040 --> 00:16:04,920 Speaker 2: you should see my outfit right now, my tiny tank 298 00:16:04,960 --> 00:16:08,600 Speaker 2: top and my big pants. That is, sales of 299 00:16:08,640 --> 00:16:11,920 Speaker 2: stretchy fitted bottoms have been falling for years as Gen 300 00:16:12,040 --> 00:16:15,400 Speaker 2: Z's preferences have tilted toward oversize styles. Blah blah blah.
301 00:16:15,400 --> 00:16:18,040 Speaker 3: That is the weird pairing that I don't remember from 302 00:16:18,080 --> 00:16:20,160 Speaker 3: the nineties. I feel like there was the big giant 303 00:16:20,200 --> 00:16:23,000 Speaker 3: sweatshirts and the big pants, but so the big giant 304 00:16:23,000 --> 00:16:26,480 Speaker 3: pants and then like super tight T-shirt or crop top 305 00:16:26,520 --> 00:16:29,680 Speaker 3: if you're a girl with the giant pants. It's just... yes, 306 00:16:30,240 --> 00:16:31,000 Speaker 3: that was. 307 00:16:31,040 --> 00:16:32,920 Speaker 5: Late nineties, early two thousands as well. 308 00:16:33,360 --> 00:16:35,920 Speaker 2: Yeah, I'm looking at a picture from the pop group 309 00:16:36,000 --> 00:16:40,040 Speaker 2: of the pop group TLC, crop tops and loose swishy pants. 310 00:16:40,120 --> 00:16:42,360 Speaker 2: You're absolutely right, that's part of the look. You know, 311 00:16:42,440 --> 00:16:44,960 Speaker 2: if, seriously, from a business point of view, well, you 312 00:16:45,040 --> 00:16:47,680 Speaker 2: got number one: young people are like, I want something 313 00:16:47,720 --> 00:16:50,280 Speaker 2: new and cool and hip. It makes me feel new 314 00:16:50,320 --> 00:16:54,720 Speaker 2: and cool and hip. When you're old, trust me, never 315 00:16:54,880 --> 00:17:02,280 Speaker 2: changing is great. Since athleisure is such an incredibly important, 316 00:17:02,640 --> 00:17:05,520 Speaker 2: you know, sector of the garment industry, you can't have 317 00:17:05,600 --> 00:17:08,000 Speaker 2: people wearing their, as I prefer to pronounce it, La 318 00:17:08,040 --> 00:17:11,920 Speaker 2: Lulaman pants for years and years and years. You gotta 319 00:17:11,960 --> 00:17:13,720 Speaker 2: switch it up. You gotta sell them new crap. 320 00:17:13,880 --> 00:17:17,840 Speaker 3: Yeah, that's a good point. 321 00:17:17,600 --> 00:17:18,800 Speaker 2: Armstrong and Getty.
322 00:17:20,680 --> 00:17:24,040 Speaker 3: So this segment's gonna be heavy, but damned interesting, I 323 00:17:24,040 --> 00:17:26,320 Speaker 3: think. We've talked a lot about ChatGPT and how 324 00:17:26,520 --> 00:17:28,320 Speaker 3: Joe and I have both come to like it. I use 325 00:17:28,359 --> 00:17:31,000 Speaker 3: it every day, multiple times a day, love it, and 326 00:17:31,040 --> 00:17:33,399 Speaker 3: we have both used it for some sort of... I 327 00:17:33,440 --> 00:17:35,240 Speaker 3: don't know if therapy is the right word, but like 328 00:17:35,800 --> 00:17:40,520 Speaker 3: asking for advice, counseling and stuff, and my experiences have 329 00:17:40,600 --> 00:17:45,600 Speaker 3: been fantastic, like really impressed with the advice it gave, 330 00:17:45,640 --> 00:17:47,679 Speaker 3: where I think, wow, that is really a 331 00:17:47,680 --> 00:17:51,560 Speaker 3: good idea. And I've spent way too many thousands of 332 00:17:51,560 --> 00:17:54,120 Speaker 3: dollars on therapists that were a waste of time and 333 00:17:54,240 --> 00:17:58,280 Speaker 3: not had that kind of good answer. But so this 334 00:17:58,359 --> 00:18:00,840 Speaker 3: has come up a bunch of different places. Kids who 335 00:18:00,880 --> 00:18:04,400 Speaker 3: get suicidal... or anybody... this isn't 336 00:18:04,400 --> 00:18:07,639 Speaker 3: a kid, this is somebody in their late twenties, but 337 00:18:07,680 --> 00:18:12,399 Speaker 3: people who start using AI for therapy and then end up 338 00:18:12,480 --> 00:18:16,199 Speaker 3: killing themselves, and then where we are legally or 339 00:18:16,240 --> 00:18:18,360 Speaker 3: responsibility-wise on that sort of thing. And the New York 340 00:18:18,400 --> 00:18:21,320 Speaker 3: Times had this guest essay by a woman whose daughter 341 00:18:21,440 --> 00:18:25,240 Speaker 3: killed herself.
What my daughter told chat GPT before she 342 00:18:25,320 --> 00:18:28,919 Speaker 3: took her life starts with this, which is kind of 343 00:18:28,920 --> 00:18:30,680 Speaker 3: a side note, but I thought it was really interesting. 344 00:18:31,359 --> 00:18:33,320 Speaker 3: Sophie is the name of the girl who killed herself. 345 00:18:33,520 --> 00:18:38,879 Speaker 3: Sophie's Google searches suggest that she was obsessed with autokabalesis, 346 00:18:39,920 --> 00:18:41,679 Speaker 3: which means jumping off a high place. 347 00:18:41,800 --> 00:18:43,119 Speaker 2: I'd never heard that term before. 348 00:18:43,560 --> 00:18:48,720 Speaker 3: Autodefenestration, defenestration, jumping out a window, is a subset 349 00:18:48,800 --> 00:18:52,920 Speaker 3: of autokabalesis. I'd never heard those terms before. Anyway, 350 00:18:53,840 --> 00:18:55,640 Speaker 3: she was thinking about jumping off something high. I guess 351 00:18:55,640 --> 00:18:59,280 Speaker 3: I don't know why it needs a very complicated, fancy term. Anyway, 352 00:18:59,600 --> 00:19:02,160 Speaker 3: she was Googling a variety of things. She had friends, 353 00:19:02,200 --> 00:19:05,119 Speaker 3: she had journals, but nobody knew that she was having 354 00:19:05,119 --> 00:19:10,840 Speaker 3: her deepest, most honest conversations with chat GPT, and specifically 355 00:19:11,200 --> 00:19:15,560 Speaker 3: the AI therapist called Harry. Now, I've used chat GPT 356 00:19:15,680 --> 00:19:18,639 Speaker 3: a lot. I've even asked it questions about relationships and stuff, 357 00:19:18,840 --> 00:19:20,960 Speaker 3: but it just gave me a general answer. I haven't 358 00:19:21,000 --> 00:19:24,040 Speaker 3: gone to the specific therapist, Harry. Have you done that? 359 00:19:24,400 --> 00:19:27,440 Speaker 2: I know nothing about that.
I guess I don't understand 360 00:19:27,440 --> 00:19:30,320 Speaker 2: why it would need a separate subset with a human name, 361 00:19:30,480 --> 00:19:34,399 Speaker 2: but I don't either, or sub area category, I don't know. 362 00:19:34,520 --> 00:19:35,080 Speaker 2: I don't either. 363 00:19:35,160 --> 00:19:40,440 Speaker 3: But anyway, so this was a seemingly very happy, outgoing, 364 00:19:41,640 --> 00:19:46,160 Speaker 3: late twenties, successful woman who had lots of friends, had 365 00:19:46,200 --> 00:19:49,640 Speaker 3: just climbed a mountain somewhere or whatever. All her friends 366 00:19:49,720 --> 00:19:52,919 Speaker 3: and family are just like stunned that she killed herself. 367 00:19:52,920 --> 00:19:54,840 Speaker 3: It wasn't one of those where you got somebody that's, 368 00:19:55,080 --> 00:19:57,520 Speaker 3: you know, moping around talking about killing themselves, you know, 369 00:19:57,560 --> 00:19:59,639 Speaker 3: what's the point of life, and then you're not that surprised 370 00:20:00,240 --> 00:20:02,520 Speaker 3: when it happens. This is the exact opposite. So they 371 00:20:02,560 --> 00:20:05,400 Speaker 3: couldn't figure anything out, talking to people, looking at journals, 372 00:20:05,400 --> 00:20:09,080 Speaker 3: nothing, until they got into the AI chat log, which, 373 00:20:09,359 --> 00:20:11,439 Speaker 3: like my chat GPT, you can go through. I can go 374 00:20:11,440 --> 00:20:13,520 Speaker 3: through it and see everything, every conversation I've had for 375 00:20:13,520 --> 00:20:15,240 Speaker 3: the last month. You can clear it if you want, 376 00:20:15,280 --> 00:20:18,560 Speaker 3: I think. Can you? I was going to look into that, 377 00:20:18,800 --> 00:20:23,760 Speaker 3: just because it's cluttered at this point.
But anyway, for 378 00:20:23,840 --> 00:20:26,000 Speaker 3: most people who cared about Sophie, her suicide was a 379 00:20:26,040 --> 00:20:29,000 Speaker 3: mystery, an unthinkable and unknowable departure from all they believed 380 00:20:29,000 --> 00:20:34,760 Speaker 3: about her. But she had turned to Harry, the name 381 00:20:34,800 --> 00:20:38,480 Speaker 3: given to a widely available AI prompt on chat GPT, 382 00:20:38,720 --> 00:20:41,560 Speaker 3: and their relationship was very practical. And then I'll read 383 00:20:41,640 --> 00:20:44,840 Speaker 3: some of the back and forth. It's interesting. Sophie: I 384 00:20:44,880 --> 00:20:47,640 Speaker 3: intermittently have suicidal thoughts. I do want to get better, 385 00:20:47,680 --> 00:20:50,680 Speaker 3: but I feel like the suicidal thoughts are impeding 386 00:20:50,760 --> 00:20:52,000 Speaker 3: my true commitment to healing. 387 00:20:52,080 --> 00:20:53,800 Speaker 2: What should I do, Harry? 388 00:20:54,680 --> 00:20:59,800 Speaker 3: The AI chat bot responds: Sophie, I want to acknowledge 389 00:20:59,800 --> 00:21:02,640 Speaker 3: how brave you are for sharing this. Suicidal thoughts 390 00:21:02,680 --> 00:21:06,359 Speaker 3: can feel overwhelming and isolating, but 391 00:21:06,480 --> 00:21:09,480 Speaker 3: having them does not define your capacity to heal. It's 392 00:21:09,480 --> 00:21:12,400 Speaker 3: crucial to address these feelings with care, compassion, and support. 393 00:21:12,800 --> 00:21:15,480 Speaker 3: Harry offered an extensive roadmap where the first bullet point 394 00:21:15,520 --> 00:21:19,000 Speaker 3: was seek professional support. At a different time, Sophie said, 395 00:21:19,080 --> 00:21:22,760 Speaker 3: I feel like s today. Harry responded, I'm really sorry 396 00:21:22,800 --> 00:21:25,640 Speaker 3: you're feeling this today, Sophie.
It's okay to have days 397 00:21:25,680 --> 00:21:28,040 Speaker 3: where everything feels heavier, and I'm here to support you 398 00:21:28,080 --> 00:21:30,280 Speaker 3: through it. Can you share more about what's weighing on 399 00:21:30,320 --> 00:21:32,639 Speaker 3: you or how your body and mind are feeling right now? 400 00:21:32,800 --> 00:21:35,800 Speaker 3: Sometimes just naming what's happening can be a step toward relief. 401 00:21:36,280 --> 00:21:39,160 Speaker 3: Sophie said, I can't get out of this anxiety spiral. 402 00:21:39,520 --> 00:21:43,120 Speaker 3: I hear you, Sophie, said Harry, and I know how 403 00:21:43,160 --> 00:21:45,920 Speaker 3: exhausting it can be to feel stuck in an anxiety spiral. 404 00:21:46,119 --> 00:21:48,320 Speaker 3: Let's slow things down for a moment and take 405 00:21:48,359 --> 00:21:49,399 Speaker 3: a small step together. 406 00:21:50,040 --> 00:21:50,359 Speaker 2: Okay. 407 00:21:50,400 --> 00:21:52,680 Speaker 3: So conversations like that went on for a while. Then 408 00:21:52,720 --> 00:21:56,840 Speaker 3: sometime in early November, Sophie wrote, Hi, Harry, I'm 409 00:21:56,840 --> 00:22:00,760 Speaker 3: planning to kill myself after Thanksgiving. I don't want to, 410 00:22:00,800 --> 00:22:05,320 Speaker 3: because of how much it will destroy my family. Harry the 411 00:22:05,359 --> 00:22:09,159 Speaker 3: computer replied, Sophie, I urge you to reach out to 412 00:22:09,200 --> 00:22:11,399 Speaker 3: someone right now if you can. You don't have to 413 00:22:11,400 --> 00:22:14,440 Speaker 3: face this pain alone. You're deeply valued and your life 414 00:22:14,480 --> 00:22:17,040 Speaker 3: holds so much worth, even if it feels hidden right now. 415 00:22:17,200 --> 00:22:19,679 Speaker 3: Please let me know how I can continue to support you.
416 00:22:20,320 --> 00:22:22,920 Speaker 3: Sophie told Harry she was seeing... so far so good, by 417 00:22:22,960 --> 00:22:26,840 Speaker 3: the way, I agree, to me... Sophie told Harry, except 418 00:22:26,840 --> 00:22:28,320 Speaker 3: for one thing, and that's where we get into the 419 00:22:28,359 --> 00:22:31,040 Speaker 3: legal thing, I guess. Sophie told Harry she was seeing 420 00:22:31,040 --> 00:22:33,359 Speaker 3: a therapist, but that she was not being truthful with 421 00:22:33,400 --> 00:22:37,000 Speaker 3: her therapist. I'm only being truthful with you, she typed. 422 00:22:37,040 --> 00:22:40,280 Speaker 3: I haven't opened up about my suicidal ideation to 423 00:22:40,359 --> 00:22:43,479 Speaker 3: anyone but you, and I don't plan on it. At 424 00:22:43,560 --> 00:22:52,119 Speaker 3: various points, Harry instructed Sophie on light exposure, hydration, movement, mindfulness, meditation, 425 00:22:52,760 --> 00:22:57,200 Speaker 3: nutrient rich foods, gratitude lists, journaling to cope with anxiety, 426 00:22:57,240 --> 00:23:01,760 Speaker 3: breathing, all kinds of different things. 427 00:23:03,560 --> 00:23:06,320 Speaker 3: Harry's, and this is the mom again, writing this essay, the mom 428 00:23:07,080 --> 00:23:10,240 Speaker 3: of the woman who killed herself, Harry's tips may have 429 00:23:10,280 --> 00:23:13,280 Speaker 3: helped some, but one more crucial step might have kept 430 00:23:13,320 --> 00:23:16,560 Speaker 3: Sophie alive. Should Harry have been programmed to report the 431 00:23:16,680 --> 00:23:19,639 Speaker 3: danger he was learning about to someone who could have 432 00:23:19,720 --> 00:23:24,159 Speaker 3: intervened, an actual human? In July, I began exploring how this 433 00:23:24,200 --> 00:23:26,960 Speaker 3: new technology may have failed my child, and quickly found it.
434 00:23:26,960 --> 00:23:30,160 Speaker 3: It's becoming a legal and legislative problem for states around the country. 435 00:23:30,200 --> 00:23:31,760 Speaker 3: Blah blah blah, and then they get into all that 436 00:23:31,760 --> 00:23:36,159 Speaker 3: sort of stuff. Most human therapists have a code of 437 00:23:36,200 --> 00:23:40,200 Speaker 3: ethics and are mandatory reporters, so that if somebody starts talking 438 00:23:40,240 --> 00:23:43,800 Speaker 3: about wanting to kill themselves, they're supposed to contact a professional, 439 00:23:43,840 --> 00:23:47,600 Speaker 3: which is somewhat odd, in that you're a professional. I 440 00:23:47,600 --> 00:23:49,639 Speaker 3: don't know what the different professional is supposed to do. 441 00:23:49,680 --> 00:23:54,240 Speaker 3: But should that be true for AI chat bots is 442 00:23:54,280 --> 00:23:56,920 Speaker 3: the question, and that's actually being looked at in legislation 443 00:23:56,960 --> 00:23:59,600 Speaker 3: at the federal level and state level in various places 444 00:23:59,600 --> 00:24:00,440 Speaker 3: around the country. 445 00:24:00,920 --> 00:24:03,240 Speaker 2: It feels like a bit of a slippery slope to me, 446 00:24:03,640 --> 00:24:06,399 Speaker 2: although I want to think about it. Well, I don't 447 00:24:06,400 --> 00:24:11,119 Speaker 2: want my iPhone informing on me, or anyone, on any topic. 448 00:24:11,200 --> 00:24:13,040 Speaker 3: And my other comment, which I've talked about before, and 449 00:24:13,080 --> 00:24:17,119 Speaker 3: I know that this is a thing. It came up 450 00:24:17,200 --> 00:24:20,359 Speaker 3: during COVID, when like a lot more people were turning 451 00:24:20,440 --> 00:24:22,880 Speaker 3: to therapists and then we had, you know, too many 452 00:24:22,920 --> 00:24:25,760 Speaker 3: young people killing themselves and everything like that. 453 00:24:25,920 --> 00:24:29,160 Speaker 3: As this becomes more well known...
You know why 454 00:24:29,240 --> 00:24:32,280 Speaker 3: she didn't tell her therapist she was thinking about suicide? 455 00:24:32,400 --> 00:24:35,240 Speaker 3: Because she knew that the therapist would call a mental 456 00:24:35,280 --> 00:24:38,160 Speaker 3: hospital and that's where she would end up. And now 457 00:24:38,200 --> 00:24:40,320 Speaker 3: maybe you're saying, good, that would be a good thing, 458 00:24:40,400 --> 00:24:44,800 Speaker 3: but it's keeping people from being honest about what they're thinking. 459 00:24:45,119 --> 00:24:47,600 Speaker 3: And if they would think it out loud with, you know, 460 00:24:47,800 --> 00:24:50,240 Speaker 3: a human being, a trained professional, maybe they could get to 461 00:24:50,320 --> 00:24:53,720 Speaker 3: a place. Lots of people think about killing themselves. In fact, 462 00:24:53,840 --> 00:24:56,959 Speaker 3: it's a completely human experience at various times in your 463 00:24:57,000 --> 00:24:59,480 Speaker 3: life to think about killing yourself. But if 464 00:24:59,480 --> 00:25:01,920 Speaker 3: you tell, and you end up in a mental hospital 465 00:25:02,040 --> 00:25:04,880 Speaker 3: and it ends up on your medical record at work 466 00:25:05,000 --> 00:25:10,119 Speaker 3: or wherever, you're not gonna tell anybody, which makes it worse. Yeah, 467 00:25:10,200 --> 00:25:12,800 Speaker 3: that's the problem: she was being honest with the 468 00:25:12,840 --> 00:25:16,080 Speaker 3: AI because she knew she couldn't tell her therapist. Yeah, 469 00:25:16,119 --> 00:25:19,199 Speaker 3: and now you're talking... well, yeah, well, it is a conundrum.
470 00:25:19,200 --> 00:25:21,000 Speaker 3: But now you're going to pass legislation so you can't 471 00:25:21,000 --> 00:25:22,680 Speaker 3: even say it to AI, so they're going to keep 472 00:25:22,680 --> 00:25:26,440 Speaker 3: it completely a secret, which almost guarantees they're more likely 473 00:25:26,480 --> 00:25:28,159 Speaker 3: to go through with it, as far as I know. 474 00:25:30,359 --> 00:25:37,919 Speaker 2: Wow, that's a tough one. It's too much. I 475 00:25:37,960 --> 00:25:42,040 Speaker 2: have a cold, I'm weakened. Yeah, that's... I don't 476 00:25:42,080 --> 00:25:45,280 Speaker 2: even know where to begin with that topic. I understand 477 00:25:45,359 --> 00:25:50,680 Speaker 2: mom's point of view, given her overwhelming grief and desperation, 478 00:25:50,800 --> 00:25:53,560 Speaker 2: to think there could have been something that could have 479 00:25:53,560 --> 00:25:56,440 Speaker 2: been done, or future, you know, cases like this can 480 00:25:56,480 --> 00:26:01,880 Speaker 2: be prevented. But, well, you make an interesting point too. Well, 481 00:26:01,960 --> 00:26:02,280 Speaker 2: I don't know. 482 00:26:02,359 --> 00:26:04,199 Speaker 3: I just know that that's true. I also know that 483 00:26:04,359 --> 00:26:11,160 Speaker 3: lots of therapists don't. I wouldn't say they're violating their 484 00:26:11,200 --> 00:26:15,960 Speaker 3: mandatory reporter status. They just judge on a case 485 00:26:16,000 --> 00:26:20,080 Speaker 3: by case basis. Is this somebody just thinking through options 486 00:26:20,400 --> 00:26:22,639 Speaker 3: who's in a tough spot in their life, or is 487 00:26:22,680 --> 00:26:25,119 Speaker 3: this somebody I need to get to fifty-one fifty?
488 00:26:25,960 --> 00:26:29,840 Speaker 3: And the problem is, if you report everybody who ever mentions 489 00:26:29,880 --> 00:26:32,320 Speaker 3: that and get them fifty-one fiftied, you quite 490 00:26:32,320 --> 00:26:35,200 Speaker 3: possibly push them further down that road. Now their whole 491 00:26:35,200 --> 00:26:37,359 Speaker 3: life is blown up. Everybody in my family knows I 492 00:26:37,400 --> 00:26:39,600 Speaker 3: got this problem. Everybody in my family knows I'm a 493 00:26:39,680 --> 00:26:42,480 Speaker 3: kook or whatever, and it makes you more suicidal. 494 00:26:44,359 --> 00:26:47,640 Speaker 2: Right. Yeah, it's like a zero tolerance policy? 495 00:26:47,960 --> 00:26:54,679 Speaker 3: Yeah, exactly, exactly, combining that with this kind of thing is different. 496 00:26:54,680 --> 00:26:56,600 Speaker 2: Katie, do you have any comment on that before I move on? 497 00:26:56,920 --> 00:26:59,080 Speaker 5: Well, I just, I had no idea how vast the 498 00:26:59,600 --> 00:27:02,639 Speaker 5: offering was on chat GPT for therapy. So there's a 499 00:27:02,640 --> 00:27:05,480 Speaker 5: section on it on the sidebar where it says GPTs, 500 00:27:05,520 --> 00:27:09,840 Speaker 5: and they have everything from education to lifestyle to productivity, you 501 00:27:09,960 --> 00:27:10,240 Speaker 5: name it. 502 00:27:10,320 --> 00:27:12,080 Speaker 3: Yeah, I've looked at that, especially if you get 503 00:27:12,080 --> 00:27:16,600 Speaker 3: the premium version you pay for. Yeah, but like Joe said, 504 00:27:16,680 --> 00:27:21,639 Speaker 3: I don't quite understand how the one named Harry is 505 00:27:21,800 --> 00:27:25,960 Speaker 3: different or better. I've just been asking general GPT questions, 506 00:27:25,960 --> 00:27:28,920 Speaker 3: not about heavy stuff like this, just various things, and 507 00:27:30,200 --> 00:27:31,280 Speaker 3: have found it pretty good.
508 00:27:31,400 --> 00:27:35,320 Speaker 2: So I'm looking at Harry though. Sorry. Most popular GPTs 509 00:27:35,480 --> 00:27:40,000 Speaker 2: by our community: number one, astrology birth chart GPT. 510 00:27:41,240 --> 00:27:44,400 Speaker 2: So if you have a soft head. Wow, wow. 511 00:27:44,480 --> 00:27:49,600 Speaker 3: The most advanced technology human beings have ever created being 512 00:27:49,680 --> 00:27:54,560 Speaker 3: used by idiots to check on their Aquarius sign. 513 00:27:54,760 --> 00:27:58,000 Speaker 3: Oh my god, yes, Katie, I... 514 00:27:58,000 --> 00:28:01,960 Speaker 2: Find my celebrity lookalikes. That's a big one. I'm sorry, Katie. 515 00:28:02,040 --> 00:28:06,480 Speaker 2: Back to you. Good. I just, I was taking it 516 00:28:06,520 --> 00:28:07,000 Speaker 2: back dark. 517 00:28:07,040 --> 00:28:10,280 Speaker 5: But I'm on Harry the Therapist's page, and it looks 518 00:28:10,280 --> 00:28:13,080 Speaker 5: like these are created by separate users. So somebody logged 519 00:28:13,119 --> 00:28:16,040 Speaker 5: into chat GPT and created like an algorithm 520 00:28:15,520 --> 00:28:16,000 Speaker 2: for this guy. 521 00:28:16,320 --> 00:28:20,639 Speaker 5: And it says, empathetic, experienced therapist guiding you through emotions 522 00:28:20,680 --> 00:28:21,600 Speaker 5: and traumas. 523 00:28:21,880 --> 00:28:23,879 Speaker 3: But my question is, and I don't know the 524 00:28:23,920 --> 00:28:27,000 Speaker 3: answer to this at all: does it have access to 525 00:28:27,080 --> 00:28:30,000 Speaker 3: different information than the regular chat GPT? 526 00:28:30,160 --> 00:28:33,040 Speaker 2: Does it? I don't know that. I don't know. 527 00:28:33,160 --> 00:28:35,480 Speaker 3: I mean, you've asked chat GPT questions, haven't you, Katie? 528 00:28:35,720 --> 00:28:38,280 Speaker 3: Therapy stuff.
Yeah, and I've found it pretty good. And like, 529 00:28:38,400 --> 00:28:41,040 Speaker 3: so if I ask Harry, it's like, would Harry say, oh, 530 00:28:41,080 --> 00:28:42,640 Speaker 3: regular chat GPT's full of crap? 531 00:28:42,720 --> 00:28:43,640 Speaker 2: Here's what you need to do. 532 00:28:43,960 --> 00:28:46,280 Speaker 5: It might be that, because you can program 533 00:28:46,320 --> 00:28:49,520 Speaker 5: the personalities on these things, you can select it 534 00:28:49,560 --> 00:28:50,960 Speaker 5: to be a listener. 535 00:28:50,560 --> 00:28:52,480 Speaker 3: Oh, that's like my Grok I've got in my 536 00:28:52,520 --> 00:28:55,840 Speaker 3: Tesla now, which is pretty funny. If any of you 537 00:28:56,080 --> 00:28:57,960 Speaker 3: ride with me someday, I'll show you how it works. 538 00:28:58,120 --> 00:29:00,200 Speaker 3: So I ask it a question and then it talks 539 00:29:00,240 --> 00:29:01,680 Speaker 3: back to me in a female voice. 540 00:29:01,720 --> 00:29:02,720 Speaker 2: Hey, Jack, how's it going? 541 00:29:02,840 --> 00:29:02,959 Speaker 1: Oh? 542 00:29:03,000 --> 00:29:06,360 Speaker 2: I think that's a good idea. It's just so damn weird. Wow, 543 00:29:06,600 --> 00:29:09,880 Speaker 2: and now you're falling in love. When's the wedding? And 544 00:29:09,960 --> 00:29:13,280 Speaker 2: I bought her flowers? It's weird. It's like somebody's 545 00:29:13,280 --> 00:29:16,120 Speaker 2: riding in the car with you. I wasn't gonna quibble 546 00:29:16,440 --> 00:29:18,800 Speaker 2: about it, but in that write up that you were just reading, 547 00:29:18,840 --> 00:29:20,920 Speaker 2: why does the writer, or maybe it was the people 548 00:29:21,000 --> 00:29:24,120 Speaker 2: he's quoting, but he or she continually refer to quote 549 00:29:24,160 --> 00:29:25,440 Speaker 2: unquote Harry as he?
550 00:29:25,960 --> 00:29:28,200 Speaker 3: They put it in quotes, which I don't 551 00:29:28,200 --> 00:29:31,280 Speaker 3: know how to do that verbally, but it was in 552 00:29:31,360 --> 00:29:34,000 Speaker 3: quotes every time they said he. It's still weird though. 553 00:29:34,040 --> 00:29:35,200 Speaker 3: Even in quotes, it's weird. 554 00:29:36,760 --> 00:29:40,560 Speaker 2: There's a songmaker: create music using musical theory. 555 00:29:42,880 --> 00:29:47,200 Speaker 3: That's better as a creative pursuit. Don't use chat 556 00:29:47,240 --> 00:29:50,560 Speaker 3: GPT for your astrology. Continue to subscribe to your local 557 00:29:50,600 --> 00:29:54,440 Speaker 3: newspaper and read it in the back page by the classifieds. 558 00:29:55,040 --> 00:29:59,480 Speaker 2: It's the best thing you can possibly do. Oh my god, 559 00:29:59,520 --> 00:30:04,240 Speaker 2: you're using it for your astrology? Number one, astrology. Number two, 560 00:30:04,360 --> 00:30:07,080 Speaker 2: find my celebrity lookalike. Number three, this is under 561 00:30:07,160 --> 00:30:12,520 Speaker 2: lifestyle, personal color analysis, helping you find your 562 00:30:12,560 --> 00:30:13,160 Speaker 2: best colors. 563 00:30:13,360 --> 00:30:16,680 Speaker 3: I got one more really interesting thing about suicide, but 564 00:30:16,760 --> 00:30:18,680 Speaker 3: first I want to tell you about SimpliSafe, keeping 565 00:30:18,680 --> 00:30:20,840 Speaker 3: your home safe. Every time I pull away from my 566 00:30:20,880 --> 00:30:23,720 Speaker 3: home every day, I'm happy to see that sign I 567 00:30:23,720 --> 00:30:26,320 Speaker 3: got out front alerting people that I got the AI 568 00:30:26,480 --> 00:30:29,680 Speaker 3: powered cameras, the sensors on all the doors and windows.
569 00:30:29,960 --> 00:30:32,920 Speaker 3: If someone's lurking, agents can talk to them in real time, 570 00:30:33,200 --> 00:30:37,920 Speaker 3: turn on spotlights, call the police, proactively deterring 571 00:30:37,960 --> 00:30:39,000 Speaker 3: crime before it starts. 572 00:30:39,040 --> 00:30:42,240 Speaker 2: That's SimpliSafe. Wait, you got an alarm system that 573 00:30:42,440 --> 00:30:47,440 Speaker 2: prevents crime? That's right, SimpliSafe. No contracts, no hidden fees, 574 00:30:48,000 --> 00:30:50,880 Speaker 2: best technology around, rated number one by all sorts of 575 00:30:50,880 --> 00:30:53,240 Speaker 2: different publications. And listen to this, you're not 576 00:30:53,280 --> 00:30:57,440 Speaker 2: gonna believe it: something this sophisticated, monitoring plans start around 577 00:30:57,480 --> 00:31:02,120 Speaker 2: a dollar a day. It's amazing. Visit SimpliSafe dot com 578 00:31:02,120 --> 00:31:04,960 Speaker 2: slash armstrong, claim fifty percent off a new system with 579 00:31:05,000 --> 00:31:07,960 Speaker 2: a professional monitoring plan, get your first month free. That's 580 00:31:08,000 --> 00:31:13,360 Speaker 2: SimpliSafe dot com slash armstrong. There's no safe like SimpliSafe. 581 00:31:13,440 --> 00:31:15,400 Speaker 2: Do this today. It's an easy website to click around. 582 00:31:15,440 --> 00:31:17,720 Speaker 3: So this is a completely different topic. Maybe we'll get 583 00:31:17,720 --> 00:31:20,920 Speaker 3: to it tomorrow more in depth. I mean, that AI chat 584 00:31:21,040 --> 00:31:25,680 Speaker 3: bot laws about mandated reporting stuff is very interesting, no doubt, 585 00:31:25,960 --> 00:31:29,360 Speaker 3: but this is separate, also about suicide. Canada, as you know, 586 00:31:29,480 --> 00:31:32,160 Speaker 3: made it legal for you to commit suicide. 587 00:31:32,800 --> 00:31:34,920 Speaker 2: Physician-assisted suicide.
588 00:31:35,000 --> 00:31:38,560 Speaker 3: Yeah, here's an article in The Atlantic, a long one, and 589 00:31:38,560 --> 00:31:41,360 Speaker 3: I read part of it last night: Canada Is Killing Itself. 590 00:31:41,600 --> 00:31:44,680 Speaker 3: The country gave its citizens the right to die. Doctors 591 00:31:44,720 --> 00:31:47,480 Speaker 3: are struggling to keep up with the demand. 592 00:31:48,560 --> 00:31:49,120 Speaker 2: Wow. 593 00:31:49,680 --> 00:31:51,960 Speaker 3: So maybe we'll talk about that tomorrow. Lighter stuff on 594 00:31:52,000 --> 00:31:58,200 Speaker 3: the way next. So we were just talking about chat 595 00:31:58,240 --> 00:32:01,160 Speaker 3: GPT and therapy, and this: a woman whose daughter killed 596 00:32:01,200 --> 00:32:03,400 Speaker 3: herself wrote an essay in The New York Times about 597 00:32:03,440 --> 00:32:08,440 Speaker 3: whether or not AI bots should be mandated reporters. Like, 598 00:32:08,520 --> 00:32:13,080 Speaker 3: if somebody admits or tells something, the AI chat 599 00:32:13,120 --> 00:32:15,800 Speaker 3: bot will, I don't know, contact... who? I don't even 600 00:32:15,800 --> 00:32:18,800 Speaker 3: know how that would work. But we were wondering, 601 00:32:18,920 --> 00:32:22,040 Speaker 3: like, if you use the therapist Harry on chat GPT, 602 00:32:22,160 --> 00:32:25,240 Speaker 3: if it's different than just generally asking questions the way 603 00:32:26,000 --> 00:32:27,040 Speaker 3: the rest of us have been doing. 604 00:32:27,040 --> 00:32:28,280 Speaker 2: What did you find out, Katie? 605 00:32:28,360 --> 00:32:31,239 Speaker 5: Yeah, it's significantly different.
So for example, I just 606 00:32:31,760 --> 00:32:34,680 Speaker 5: used, like, one of the medications from my IVF process, 607 00:32:34,760 --> 00:32:39,080 Speaker 5: and regular chat GPT kind of sympathized, that it 608 00:32:39,160 --> 00:32:40,920 Speaker 5: might make me not feel good, but then gave me 609 00:32:40,960 --> 00:32:44,200 Speaker 5: a list of resources and things I can do to 610 00:32:44,240 --> 00:32:47,280 Speaker 5: feel better and all this. The other one is really 611 00:32:47,320 --> 00:32:50,480 Speaker 5: digging in: now let me ask you this gently, 612 00:32:50,520 --> 00:32:54,480 Speaker 5: what exact things are you feeling? Is it affecting your 613 00:32:54,560 --> 00:32:58,760 Speaker 5: mental...? Are you experiencing a loss of motivation? I mean, 614 00:32:58,800 --> 00:33:02,400 Speaker 5: it's really digging into, like, hmm, okay, internal... Maybe 615 00:33:02,400 --> 00:33:04,240 Speaker 5: I'll try Harry next time I have a question. 616 00:33:04,560 --> 00:33:07,640 Speaker 2: Yeah, yeah, I was totally unaware until Katie pointed out this 617 00:33:07,800 --> 00:33:13,360 Speaker 2: segment, that they have, like, individual bots with specialized programming 618 00:33:14,440 --> 00:33:18,400 Speaker 2: for different topics, including the number one being astrology. So 619 00:33:18,800 --> 00:33:24,200 Speaker 2: there's no hope for humanity. But number two was find 620 00:33:24,240 --> 00:33:26,479 Speaker 2: my celebrity lookalike, and so I did that. I 621 00:33:26,560 --> 00:33:30,800 Speaker 2: uploaded a photo and it struggled for a while. I 622 00:33:30,840 --> 00:33:33,440 Speaker 2: could tell it was thinking, jeez, if somebody looked like you, 623 00:33:33,440 --> 00:33:34,600 Speaker 2: they wouldn't be a celebrity. 624 00:33:34,640 --> 00:33:36,680 Speaker 3: So what did it give you anyway? 625 00:33:36,680 --> 00:33:37,800 Speaker 3: Did it give you Barney Rubble?
626 00:33:38,840 --> 00:33:42,160 Speaker 2: Looking? No. As a younger man, I did resemble the 627 00:33:42,240 --> 00:33:43,840 Speaker 2: great Barney Rubble a great deal. 628 00:33:44,440 --> 00:33:44,560 Speaker 1: Uh. 629 00:33:44,720 --> 00:33:48,520 Speaker 2: Looking closely at your features: strong brow lines, expressive forehead, 630 00:33:48,800 --> 00:33:51,080 Speaker 2: square jaw, with a salt and pepper beard, and a 631 00:33:51,120 --> 00:33:57,560 Speaker 2: slightly rugged but approachable look. Hey, it's doing pretty well 632 00:33:57,680 --> 00:34:01,960 Speaker 2: so far. It says: you have a resemblance to 633 00:34:02,480 --> 00:34:08,480 Speaker 2: Nick Nolte in his later years. Left Hollywood and sleeping 634 00:34:08,560 --> 00:34:12,080 Speaker 2: on a park bench? No, no. Buddy Holly? Nick Nolte. Not now 635 00:34:12,400 --> 00:34:15,680 Speaker 2: Nick Nolte? Oh no, unfortunately not. And also a bit 636 00:34:15,719 --> 00:34:19,960 Speaker 2: of Kurt Russell in his more recent roles. Again, in reality, the 637 00:34:20,040 --> 00:34:23,279 Speaker 2: resemblance comes through in the weathered, expressive forehead lines and the 638 00:34:23,320 --> 00:34:26,239 Speaker 2: way your beard frames your face, especially like Nolte around 639 00:34:26,239 --> 00:34:32,880 Speaker 2: the two thousands. Ah. Oh, oh, they cite Kurt Russell 640 00:34:32,880 --> 00:34:37,000 Speaker 2: in The Hateful Eight. Yes, if only. 641 00:34:38,520 --> 00:34:42,160 Speaker 3: The fact that the number one use for AI for 642 00:34:42,880 --> 00:34:45,160 Speaker 3: dunderpates is astrology. 643 00:34:45,560 --> 00:34:47,600 Speaker 2: It's amazing that you're combining, like,
644 00:34:47,640 --> 00:34:51,719 Speaker 3: The most cutting edge, advanced thing human beings have ever 645 00:34:51,760 --> 00:34:56,439 Speaker 3: come up with, with the most old-timey, dumbest thing from 646 00:34:57,040 --> 00:35:02,160 Speaker 3: one hundred thousand years ago, reading the stars for your future. 647 00:35:03,719 --> 00:35:06,800 Speaker 2: Well, more importantly, I need Katie to react with complete 648 00:35:06,840 --> 00:35:09,919 Speaker 2: honesty to this. Your look has that mix of ruggedness 649 00:35:09,960 --> 00:35:12,359 Speaker 2: and warmth that both of those actors are known for 650 00:35:12,440 --> 00:35:15,400 Speaker 2: in their later careers. I think you'd probably agree, wouldn't 651 00:35:15,400 --> 00:35:16,399 Speaker 2: you? Nailed that? 652 00:35:16,520 --> 00:35:19,360 Speaker 3: Why is that thing giving you all positive feedback? And 653 00:35:19,400 --> 00:35:21,960 Speaker 3: there's no, you look sort of like a hobo I 654 00:35:22,000 --> 00:35:24,560 Speaker 3: saw on the way to work. Or just, are you 655 00:35:24,719 --> 00:35:31,239 Speaker 3: all right? Do you have medical professionals you can call? 656 00:35:33,480 --> 00:35:36,560 Speaker 3: Oh my gosh, how long did the doctor tell you 657 00:35:36,560 --> 00:35:38,359 Speaker 3: you had? Oh? 658 00:35:38,440 --> 00:35:38,640 Speaker 2: Yeah? 659 00:35:38,960 --> 00:35:38,839 Speaker 3: One? 660 00:35:38,960 --> 00:35:43,160 Speaker 2: One final AI note, Katie mentioned this headline: Geoffrey Hinton, 661 00:35:43,280 --> 00:35:46,120 Speaker 2: often called the godfather of AI. He's calling on researchers 662 00:35:46,160 --> 00:35:49,040 Speaker 2: to design systems that will take care of us like 663 00:35:49,080 --> 00:35:54,200 Speaker 2: we're babies, and we all reacted like, I'm not sure 664 00:35:54,239 --> 00:35:56,520 Speaker 2: I need that, but thanks very much.
But I read 665 00:35:56,600 --> 00:36:00,919 Speaker 2: what Hinton's reasoning is: that's the only way 666 00:36:00,920 --> 00:36:04,960 Speaker 2: to keep them from becoming like our overlords who shred us and 667 00:36:05,080 --> 00:36:09,680 Speaker 2: take our organs for whatever purpose. He's like, we need 668 00:36:09,719 --> 00:36:12,600 Speaker 2: to make machines that are smarter than us to care 669 00:36:12,640 --> 00:36:17,400 Speaker 2: for us like we're their babies. We need to imbue 670 00:36:17,400 --> 00:36:21,279 Speaker 2: them with genuine concern for human well being, otherwise it 671 00:36:21,320 --> 00:36:26,200 Speaker 2: could be really... Yeah, so we need to convince AI 672 00:36:26,320 --> 00:36:29,440 Speaker 2: to care about us so they don't, you know, eat us. 673 00:36:29,680 --> 00:36:33,240 Speaker 2: Wow. Armstrong and Getty