1 00:00:01,800 --> 00:00:06,120 Speaker 1: Broadcasting live from the Abraham Lincoln Radio Studio, the George 2 00:00:06,160 --> 00:00:11,560 Speaker 1: Washington Broadcast Center, Jack Armstrong, Joe Getty. Armstrong and Getty 3 00:00:11,600 --> 00:00:24,159 Speaker 1: and Armstrong and Getty. Yo yo yo, thanks for 4 00:00:24,239 --> 00:00:26,960 Speaker 1: joining us. Joe is, he's the Getty half of 5 00:00:27,080 --> 00:00:30,800 Speaker 1: Armstrong and Getty. Joe is in England on vacation. I 6 00:00:30,840 --> 00:00:33,160 Speaker 1: think he's gonna call in a little bit later, half 7 00:00:33,159 --> 00:00:37,680 Speaker 1: in his cups, drunk on Bass ale. But we're having 8 00:00:37,680 --> 00:00:40,560 Speaker 1: a variety of guests on today to keep me sane, 9 00:00:40,640 --> 00:00:43,560 Speaker 1: including who we used to call Craig the Obamacare lawyer, 10 00:00:44,159 --> 00:00:46,120 Speaker 1: back when we were trying to work our way through what 11 00:00:46,159 --> 00:00:49,360 Speaker 1: the hell Obamacare was or is, which continues to be 12 00:00:49,400 --> 00:00:51,600 Speaker 1: the law of the land, right, Craig? Welcome, Craig Gottwalls. 13 00:00:52,800 --> 00:00:57,480 Speaker 1: Obamacare is. It just became status quo as we 14 00:00:57,520 --> 00:01:00,480 Speaker 1: all predicted it would be, and all of our 15 00:01:00,520 --> 00:01:03,680 Speaker 1: deductibles went way, way, way, way up, and we all 16 00:01:03,720 --> 00:01:05,200 Speaker 1: just accepted that that's the way they're going to be 17 00:01:05,240 --> 00:01:06,680 Speaker 1: for the rest of our lives. So that's the way 18 00:01:06,680 --> 00:01:09,560 Speaker 1: that turned out. Hey, I do have a quick health 19 00:01:09,600 --> 00:01:12,360 Speaker 1: related thing for you before we get into some other 20 00:01:12,440 --> 00:01:14,920 Speaker 1: stuff that you and I were texting about last night. 
21 00:01:15,600 --> 00:01:22,240 Speaker 1: I got a friend who's got a pretty bad health diagnosis, 22 00:01:22,920 --> 00:01:25,640 Speaker 1: and I suggested, man, you ought to get a second opinion. 23 00:01:25,680 --> 00:01:28,040 Speaker 1: But people throw around the whole get a second opinion 24 00:01:28,400 --> 00:01:31,480 Speaker 1: like it's easy to do. I've never actually done it. 25 00:01:31,520 --> 00:01:33,479 Speaker 1: How do you even go about doing that? Does your 26 00:01:33,480 --> 00:01:36,679 Speaker 1: insurance let you do that? What do you do? Do 27 00:01:36,720 --> 00:01:38,920 Speaker 1: you go to a completely different doctor group, or? 28 00:01:40,360 --> 00:01:41,200 Speaker 2: What is that? 29 00:01:41,360 --> 00:01:42,400 Speaker 1: What is the second opinion? 30 00:01:42,600 --> 00:01:44,720 Speaker 2: Yeah, some of that's going to depend on what kind 31 00:01:44,760 --> 00:01:46,680 Speaker 2: of plan you're on, whether it's a PPO or an 32 00:01:46,800 --> 00:01:49,880 Speaker 2: HMO or like what we call an open access plan. 33 00:01:50,720 --> 00:01:53,120 Speaker 1: But what are most people, let's start there. What do 34 00:01:53,240 --> 00:01:57,000 Speaker 1: most people have, who have, like, company healthcare? 35 00:01:57,280 --> 00:01:59,680 Speaker 1: A PPO? Okay, yeah, that's what I thought. Most of 36 00:01:59,720 --> 00:02:01,720 Speaker 1: us have PPO. So if we have a PPO, how 37 00:02:01,720 --> 00:02:02,800 Speaker 1: do we get a second opinion? 38 00:02:04,440 --> 00:02:04,640 Speaker 3: Yeah? 39 00:02:04,640 --> 00:02:06,440 Speaker 2: The easiest way to get a second opinion in that 40 00:02:06,480 --> 00:02:09,639 Speaker 2: case would be to pick a different primary care doctor 41 00:02:09,680 --> 00:02:11,680 Speaker 2: in a different medical group and just go through the 42 00:02:11,720 --> 00:02:14,600 Speaker 2: process again, because if you stay within the same... 
43 00:02:14,480 --> 00:02:17,640 Speaker 1: The whole process again, go through the whole process again, 44 00:02:18,720 --> 00:02:22,400 Speaker 1: while you're probably not feeling very good, maybe feeling terrible. 45 00:02:23,840 --> 00:02:26,000 Speaker 3: Yeah, at that point. 46 00:02:26,080 --> 00:02:28,079 Speaker 2: So it's gonna come down to how much you trust 47 00:02:28,120 --> 00:02:31,120 Speaker 2: your doctor and the medical group, because if you stay 48 00:02:31,160 --> 00:02:33,000 Speaker 2: within the same medical group, they're going to have a 49 00:02:33,040 --> 00:02:35,720 Speaker 2: pretty strong bias to confirm what's. 50 00:02:35,520 --> 00:02:37,960 Speaker 3: Already been done. Well, that exactly, that's what I want 51 00:02:37,960 --> 00:02:38,320 Speaker 3: to second. 52 00:02:38,639 --> 00:02:41,560 Speaker 1: Yeah, there's two things wrong, several things wrong with that. 53 00:02:41,639 --> 00:02:43,600 Speaker 1: And tell me it's not a common phrase, 54 00:02:43,639 --> 00:02:45,640 Speaker 1: well, you should get a second opinion. People throw that 55 00:02:45,680 --> 00:02:47,560 Speaker 1: around all the time like it's an easy thing to do. 56 00:02:47,800 --> 00:02:49,639 Speaker 1: But so I'd have to go through the whole process. 57 00:02:49,800 --> 00:02:52,360 Speaker 1: Am I, am I kind of like half firing my 58 00:02:52,440 --> 00:02:55,919 Speaker 1: primary doctor I've had for maybe years, who I'm probably friends 59 00:02:55,919 --> 00:02:58,160 Speaker 1: with at this point? How offended is he or she 60 00:02:58,320 --> 00:02:58,760 Speaker 1: going to be? 
61 00:03:00,960 --> 00:03:04,320 Speaker 2: Well, most plans will have a mechanism in there for 62 00:03:04,360 --> 00:03:06,400 Speaker 2: a second opinion so that it's not, so that you're 63 00:03:06,440 --> 00:03:08,840 Speaker 2: not... Now if it's an HMO, which a lot of 64 00:03:08,840 --> 00:03:11,840 Speaker 2: people are on in larger cities, then you kind of are 65 00:03:11,960 --> 00:03:14,799 Speaker 2: firing that doctor, because you'd pick a different primary care 66 00:03:14,880 --> 00:03:16,800 Speaker 2: doctor and come back to them in a future month. But 67 00:03:17,919 --> 00:03:19,960 Speaker 2: most PPOs will have a mechanism. 68 00:03:19,960 --> 00:03:21,280 Speaker 2: The other way to do it, if you have a 69 00:03:21,280 --> 00:03:23,560 Speaker 2: little bit of means at all, is to go out 70 00:03:23,639 --> 00:03:26,960 Speaker 2: into the market and find a direct primary care doctor, 71 00:03:27,160 --> 00:03:30,200 Speaker 2: somebody who's left the system and sees you for like 72 00:03:30,200 --> 00:03:31,840 Speaker 2: a one hundred or one hundred and twenty five dollar a 73 00:03:31,919 --> 00:03:35,240 Speaker 2: month fee. That way, you can keep doing what you're 74 00:03:35,320 --> 00:03:38,680 Speaker 2: doing with your system doctor in your plan network, for example, 75 00:03:39,040 --> 00:03:41,360 Speaker 2: but then you can spend a couple hundred dollars and 76 00:03:41,440 --> 00:03:43,200 Speaker 2: go off to the side and see one of these 77 00:03:43,240 --> 00:03:46,280 Speaker 2: direct primary care doctors who's left the system and will 78 00:03:46,320 --> 00:03:48,880 Speaker 2: then give you a truly independent analysis. If you 79 00:03:48,960 --> 00:03:50,960 Speaker 2: have some means at all, that's what I would recommend. 80 00:03:51,200 --> 00:03:56,360 Speaker 1: Okay, that's a little frustrating. 
I mean, they don't offer 81 00:03:56,440 --> 00:03:58,640 Speaker 1: that up to you, certainly after they give you a diagnosis. 82 00:03:58,680 --> 00:04:00,839 Speaker 1: And the one thing I learned, I tell people all this time, 83 00:04:01,080 --> 00:04:03,040 Speaker 1: the main thing I learned from when I had cancer, 84 00:04:03,120 --> 00:04:06,400 Speaker 1: is there's a lot of guessing. There's way more guessing 85 00:04:07,000 --> 00:04:09,800 Speaker 1: than I ever believed was the case. And you can 86 00:04:09,840 --> 00:04:11,400 Speaker 1: talk to a couple of different people and they have 87 00:04:11,560 --> 00:04:13,360 Speaker 1: completely different opinions. 88 00:04:14,360 --> 00:04:17,520 Speaker 2: Well, not just cancer, I mean all these complicated... 89 00:04:17,560 --> 00:04:20,279 Speaker 2: I mean, sure, if we just start looking at autoimmune diseases, 90 00:04:20,320 --> 00:04:22,240 Speaker 2: and the way these new drugs 91 00:04:22,240 --> 00:04:26,080 Speaker 2: are affecting that. It's, you know, because of what 92 00:04:26,120 --> 00:04:28,440 Speaker 2: I do, I have a lot of good friendships with doctors, 93 00:04:28,800 --> 00:04:32,640 Speaker 2: and it's shocking how you can talk to two very 94 00:04:32,680 --> 00:04:34,880 Speaker 2: well respected doctors that have been doing this for decades 95 00:04:34,920 --> 00:04:37,760 Speaker 2: and they'll have incredibly different opinions on how you should 96 00:04:37,760 --> 00:04:39,919 Speaker 2: treat X, Y or Z, especially when we get to 97 00:04:39,920 --> 00:04:41,640 Speaker 2: like autoimmune or even cancer. Yep. 
98 00:04:41,680 --> 00:04:44,000 Speaker 1: I have that exact situation with my son, where 99 00:04:44,160 --> 00:04:47,719 Speaker 1: I have two PhD-level, been-around-forever people with 100 00:04:48,320 --> 00:04:51,440 Speaker 1: almost one hundred and eighty degrees apart opinions. And what 101 00:04:51,480 --> 00:04:54,800 Speaker 1: am I supposed to do with that information? Anyway? That's 102 00:04:54,880 --> 00:04:58,480 Speaker 1: enough of that health stuff for now. So this is interesting. 103 00:04:58,520 --> 00:05:00,840 Speaker 1: So we talked to, I know you're friends with Tim Sandefur. 104 00:05:01,240 --> 00:05:03,880 Speaker 1: We had Tim the lawyer on last hour, and we 105 00:05:03,880 --> 00:05:07,520 Speaker 1: were talking about AI and all the different sort of stuff. 106 00:05:08,040 --> 00:05:11,040 Speaker 1: And I'm fascinated by AI, and I read lots of 107 00:05:11,080 --> 00:05:12,680 Speaker 1: books and listen to a lot of podcasts, because I 108 00:05:12,680 --> 00:05:14,080 Speaker 1: think it's, I think it's going to be a really 109 00:05:14,160 --> 00:05:15,520 Speaker 1: big deal. I don't know if it's going to be 110 00:05:15,520 --> 00:05:19,400 Speaker 1: as big as fire, like the guy from Google says, 111 00:05:19,640 --> 00:05:22,120 Speaker 1: as big to mankind as the invention of fire. But I mean, if 112 00:05:22,160 --> 00:05:28,520 Speaker 1: it's half that, it would be shockingly huge. A 113 00:05:28,560 --> 00:05:31,560 Speaker 1: lot of people are worried about AI taking so many jobs 114 00:05:31,600 --> 00:05:33,160 Speaker 1: we're gonna have to come up with some sort of 115 00:05:33,160 --> 00:05:36,200 Speaker 1: guaranteed income thing to pay people to stay home and 116 00:05:36,360 --> 00:05:38,120 Speaker 1: play the flute, because there just aren't going to be 117 00:05:38,200 --> 00:05:40,679 Speaker 1: enough jobs. AI is going to take it over. 
Tim says, 118 00:05:40,720 --> 00:05:43,000 Speaker 1: this is going to be like every other technology that's 119 00:05:43,040 --> 00:05:45,600 Speaker 1: come along. It's going to develop all kinds of new 120 00:05:45,680 --> 00:05:48,040 Speaker 1: jobs that you've never even thought of yet. It'll take 121 00:05:48,080 --> 00:05:52,360 Speaker 1: care of itself. The cotton gin didn't eliminate all farm workers. 122 00:05:52,440 --> 00:05:54,640 Speaker 1: It started all kinds of other different things, and you 123 00:05:54,760 --> 00:05:57,200 Speaker 1: end up with more jobs. Where are you on that question? 124 00:05:57,240 --> 00:05:59,320 Speaker 1: Because Tim, Tim thinks, now, that he's not worried 125 00:05:59,320 --> 00:06:02,680 Speaker 1: about it. I am. I think it's going to destroy 126 00:06:03,080 --> 00:06:04,440 Speaker 1: the entire world. Go ahead. 127 00:06:05,640 --> 00:06:07,800 Speaker 3: I fall much closer to Tim on this. 128 00:06:08,320 --> 00:06:09,160 Speaker 1: I hope you're both right. 129 00:06:09,200 --> 00:06:11,600 Speaker 2: I'm using it. I'm using it every day at work. 130 00:06:11,640 --> 00:06:13,920 Speaker 2: In fact, everybody in my office is. We're using it regularly. 131 00:06:14,400 --> 00:06:17,080 Speaker 2: And what it's done is it's allowed me to just 132 00:06:17,120 --> 00:06:20,120 Speaker 2: become so much more efficient with not wasting a lot 133 00:06:20,120 --> 00:06:22,240 Speaker 2: of time on some of the more menial tasks that 134 00:06:22,279 --> 00:06:24,800 Speaker 2: I don't want to have to burn time on. I 135 00:06:24,839 --> 00:06:28,279 Speaker 2: can use AI to standardize and templatize a lot of 136 00:06:28,279 --> 00:06:31,839 Speaker 2: the things that I'm doing quickly. I'll give you an example, Jack, 137 00:06:31,839 --> 00:06:34,440 Speaker 2: because you know I'm a lawyer and I'm reviewing healthcare contracts. 
138 00:06:34,880 --> 00:06:39,440 Speaker 2: Just recently, I took six different Pharmacy Benefit Manager, PBM, contracts. 139 00:06:39,480 --> 00:06:41,280 Speaker 2: So it's the part of your health plan that deals 140 00:06:41,279 --> 00:06:43,600 Speaker 2: with all the drugs. Six different contracts. All of them 141 00:06:43,640 --> 00:06:46,480 Speaker 2: were between fifty and one hundred pages. I uploaded all 142 00:06:46,520 --> 00:06:49,520 Speaker 2: of them into ChatGPT, and then I 143 00:06:49,520 --> 00:06:51,520 Speaker 2: gave it like a whole page of instructions on what 144 00:06:51,560 --> 00:06:54,520 Speaker 2: I wanted. I wanted it to compare and contrast these. I 145 00:06:54,560 --> 00:06:56,719 Speaker 2: wanted to know the weaknesses and strengths. I wanted to 146 00:06:56,760 --> 00:06:58,400 Speaker 2: know where I could find A and B and C and 147 00:06:58,480 --> 00:07:00,720 Speaker 2: D in each contract, and I wanted it to put 148 00:07:00,720 --> 00:07:02,039 Speaker 2: it all in a grid for me. 149 00:07:02,560 --> 00:07:03,479 Speaker 1: So it did. 150 00:07:03,600 --> 00:07:04,800 Speaker 3: Within like ten minutes. 151 00:07:04,839 --> 00:07:08,000 Speaker 2: I had this unbelievable chart that it spit back to me, 152 00:07:08,240 --> 00:07:09,920 Speaker 2: where then I could go back and just hit the 153 00:07:10,000 --> 00:07:13,120 Speaker 2: highlights of the contracts in my review. Now, where that 154 00:07:13,160 --> 00:07:16,560 Speaker 2: would be devastating is if you had zero idea what 155 00:07:16,600 --> 00:07:19,160 Speaker 2: you were doing, if you weren't a healthcare attorney, for example, 156 00:07:19,400 --> 00:07:21,920 Speaker 2: and you didn't know where it was wrong. Because it's wrong, 157 00:07:21,960 --> 00:07:24,000 Speaker 2: as you guys have reported, it's wrong a good clip 158 00:07:24,000 --> 00:07:25,680 Speaker 2: of the time. 
It'll make things up or it'll have 159 00:07:25,720 --> 00:07:29,400 Speaker 2: something totally off. But when you're already an expert in 160 00:07:29,440 --> 00:07:32,160 Speaker 2: an area, it takes care of... I mean, it saved me 161 00:07:32,280 --> 00:07:34,360 Speaker 2: four hours doing what it did, and then I could 162 00:07:34,360 --> 00:07:36,600 Speaker 2: just spend one hour fine tuning it and making it 163 00:07:36,640 --> 00:07:38,560 Speaker 2: exactly what I needed to see for my clients. 164 00:07:38,760 --> 00:07:41,200 Speaker 1: But you do have the problems of hallucinations or whatever. 165 00:07:41,280 --> 00:07:42,160 Speaker 1: It just makes stuff up. 166 00:07:42,160 --> 00:07:46,120 Speaker 2: Oh, then absolutely, yeah, you have to watch it. What 167 00:07:46,160 --> 00:07:49,080 Speaker 2: I tell my coworkers is, when it tells you something 168 00:07:49,120 --> 00:07:51,960 Speaker 2: that you think is just maybe not quite right, you 169 00:07:51,960 --> 00:07:53,760 Speaker 2: have to tell it, give me a source for that, 170 00:07:53,880 --> 00:07:55,400 Speaker 2: and then you have to hit that source and you 171 00:07:55,440 --> 00:07:58,120 Speaker 2: have to go look at it, because it will get. 172 00:07:57,920 --> 00:07:59,440 Speaker 3: Things completely wrong. 173 00:08:00,040 --> 00:08:02,480 Speaker 2: I've read the stats that say fifty percent of the time. 174 00:08:02,520 --> 00:08:04,520 Speaker 2: I think that's too high, but I see it getting 175 00:08:04,520 --> 00:08:06,280 Speaker 2: things wrong twenty percent of the time anyway. 176 00:08:06,600 --> 00:08:08,400 Speaker 1: Really? Okay, that's interesting. 177 00:08:08,600 --> 00:08:11,040 Speaker 2: Yeah, well, it sends you to a link that doesn't exist, 178 00:08:11,520 --> 00:08:12,840 Speaker 2: or it just says something that's not right. 
179 00:08:13,160 --> 00:08:15,320 Speaker 1: Okay. So I've asked it a lot of questions that I 180 00:08:15,360 --> 00:08:18,720 Speaker 1: have no expertise in, and so maybe it's 181 00:08:18,760 --> 00:08:20,800 Speaker 1: lied to me way more often than I realize, and 182 00:08:20,800 --> 00:08:23,280 Speaker 1: then I probably repeated it on the radio. But I'm, 183 00:08:23,840 --> 00:08:26,000 Speaker 1: I've mentioned this a thousand times, I'm reading the book 184 00:08:26,080 --> 00:08:28,120 Speaker 1: Ulysses by James Joyce. I'm trying to fight my way 185 00:08:28,160 --> 00:08:30,360 Speaker 1: through that book. And I've been using ChatGPT when 186 00:08:30,360 --> 00:08:32,199 Speaker 1: I get stuck on something. But I had one the 187 00:08:32,240 --> 00:08:33,760 Speaker 1: other day where it was just, I knew it was 188 00:08:33,800 --> 00:08:36,440 Speaker 1: completely wrong, like just as wrong as wrong could be. 189 00:08:37,080 --> 00:08:39,320 Speaker 1: And I wonder how often that happens. I asked a 190 00:08:39,360 --> 00:08:41,600 Speaker 1: question yesterday about taking zinc when you've got a cold, 191 00:08:42,000 --> 00:08:44,480 Speaker 1: and the information it spit out for me for different 192 00:08:44,480 --> 00:08:46,760 Speaker 1: ages and different studies and stuff like that, as far 193 00:08:46,760 --> 00:08:49,880 Speaker 1: as I know, was absolutely fascinating, and so fast. So, 194 00:08:50,240 --> 00:08:53,600 Speaker 1: you mentioned ChatGPT several times. There's a whole 195 00:08:53,600 --> 00:08:57,839 Speaker 1: bunch of AI apps or programs or whatever you can 196 00:08:57,840 --> 00:09:00,600 Speaker 1: call them, chatbots, out there. How many of them are 197 00:09:00,640 --> 00:09:01,079 Speaker 1: you using? 
198 00:09:02,600 --> 00:09:05,439 Speaker 2: Yeah, so I'm using an upgraded version of 199 00:09:05,520 --> 00:09:08,240 Speaker 2: ChatGPT that I've paid for and I've trained with 200 00:09:08,320 --> 00:09:10,160 Speaker 2: a lot of what I do for healthcare law. 201 00:09:10,320 --> 00:09:12,120 Speaker 1: Do you think the paid-for one is, do you 202 00:09:12,160 --> 00:09:14,240 Speaker 1: think the paid-for one is worth it 203 00:09:14,280 --> 00:09:16,200 Speaker 1: for the average person, or only if you have an 204 00:09:16,200 --> 00:09:17,160 Speaker 1: expertise in something? 205 00:09:17,960 --> 00:09:19,560 Speaker 2: I think if you're using it for work, the paid-for 206 00:09:19,559 --> 00:09:20,880 Speaker 2: one is worth it. I mean if you're just 207 00:09:20,960 --> 00:09:23,600 Speaker 2: using it for fun and for social, you know, 208 00:09:23,800 --> 00:09:25,400 Speaker 2: I don't think you need to pay for it. But 209 00:09:26,240 --> 00:09:29,040 Speaker 2: I lean on it pretty heavily at times, and Chat 210 00:09:29,120 --> 00:09:30,199 Speaker 3: GPT seems to 211 00:09:30,160 --> 00:09:33,800 Speaker 2: be the best one for like legal analysis and writing, 212 00:09:33,880 --> 00:09:37,920 Speaker 2: and writing templates. When I start 213 00:09:37,960 --> 00:09:40,599 Speaker 2: working with Excel spreadsheets, for example, when I want to 214 00:09:40,640 --> 00:09:43,080 Speaker 2: compare large Excel spreadsheets and I want 215 00:09:43,080 --> 00:09:45,480 Speaker 2: to have AI shortcut some of that for me, I 216 00:09:45,520 --> 00:09:48,480 Speaker 2: find that Gemini, Google's, seems to be the best one 217 00:09:48,520 --> 00:09:49,480 Speaker 2: for me in that lane. 218 00:09:49,920 --> 00:09:51,000 Speaker 3: And then the other thing 
219 00:09:50,840 --> 00:09:52,400 Speaker 2: that we use a lot at work, because we do 220 00:09:52,440 --> 00:09:54,680 Speaker 2: a lot of presentations for clients and a lot of 221 00:09:54,880 --> 00:09:59,680 Speaker 2: visual stuff, we'll use Midjourney to create art and imagery, 222 00:10:00,120 --> 00:10:02,800 Speaker 2: which is, I think, the industry leader easily for, you know, 223 00:10:02,840 --> 00:10:05,840 Speaker 2: creating those pictures and those slides that are. 224 00:10:05,760 --> 00:10:06,920 Speaker 1: Oh, so if you want to do it really, you 225 00:10:06,960 --> 00:10:09,160 Speaker 1: want to do images and stuff, you like Midjourney, 226 00:10:09,200 --> 00:10:10,240 Speaker 1: which I'd never even heard of. 227 00:10:10,400 --> 00:10:10,960 Speaker 3: Midjourney. 228 00:10:11,760 --> 00:10:14,240 Speaker 2: Yeah, Midjourney's amazing. And I actually learned that from an 229 00:10:14,280 --> 00:10:16,360 Speaker 2: artist friend of mine in the Bay Area who said 230 00:10:16,360 --> 00:10:18,680 Speaker 2: the only one artists are using is Midjourney. 231 00:10:18,840 --> 00:10:20,360 Speaker 1: Jot that down, Katie, because I know you do a lot 232 00:10:20,360 --> 00:10:22,599 Speaker 1: of that. That's a good one, Midjourney. Is that, 233 00:10:22,640 --> 00:10:24,120 Speaker 1: and is that just something like I can put on 234 00:10:24,120 --> 00:10:25,160 Speaker 1: my phone? 235 00:10:24,920 --> 00:10:29,600 Speaker 2: Yeah, it's just a web link, or there's probably an app, 236 00:10:29,600 --> 00:10:30,880 Speaker 2: but I just, I just hit it on a, on a, 237 00:10:30,960 --> 00:10:32,000 Speaker 3: on a web page browser. 238 00:10:32,080 --> 00:10:35,200 Speaker 2: And I actually do pay for the upgraded version of 239 00:10:35,200 --> 00:10:37,320 Speaker 2: that one as well, because we hit it a 240 00:10:37,360 --> 00:10:41,360 Speaker 2: lot for creating. 
You know, you see a presentation at 241 00:10:41,360 --> 00:10:43,280 Speaker 2: your job, right, and you get so sick of seeing 242 00:10:43,320 --> 00:10:44,640 Speaker 2: the same clip art over and over. 243 00:10:44,880 --> 00:10:46,959 Speaker 3: Well, we'll just use Midjourney to create unique art. 244 00:10:47,000 --> 00:10:49,120 Speaker 2: That way, I know that when I'm giving a presentation 245 00:10:49,240 --> 00:10:51,800 Speaker 2: to a client, they've never seen this imagery before. It's 246 00:10:51,800 --> 00:10:55,280 Speaker 2: not, you know, some stock imagery. But Midjourney, we 247 00:10:55,360 --> 00:10:57,160 Speaker 2: hit it, but you can use a free version of it as well. 248 00:10:58,559 --> 00:11:02,400 Speaker 1: Midjourney. I'm looking for it. Okay. I haven't messed 249 00:11:02,400 --> 00:11:04,720 Speaker 1: around with Google Gemini. I need to do that, just 250 00:11:04,720 --> 00:11:07,560 Speaker 1: because I know they're pouring so many billions of dollars 251 00:11:07,559 --> 00:11:11,200 Speaker 1: into that sort of thing, because there's a big belief 252 00:11:11,240 --> 00:11:16,240 Speaker 1: among Elon and Google and, you know, 253 00:11:16,320 --> 00:11:21,360 Speaker 1: a couple of different people that whoever emerges as the leader, 254 00:11:22,160 --> 00:11:27,080 Speaker 1: there's trillions of dollars involved in that, and it's worth 255 00:11:27,080 --> 00:11:28,560 Speaker 1: trying to be the best. So I need to figure 256 00:11:28,559 --> 00:11:30,760 Speaker 1: out what Gemini is up to. We'll talk more with 257 00:11:30,840 --> 00:11:32,320 Speaker 1: Craig here in just a little bit about a bunch 258 00:11:32,360 --> 00:11:34,320 Speaker 1: of different things. 
He's got some strong opinions on the 259 00:11:34,320 --> 00:11:36,160 Speaker 1: war in Ukraine that I think are going to be 260 00:11:36,200 --> 00:11:39,200 Speaker 1: a lot closer to a lot of you, our listeners, 261 00:11:39,880 --> 00:11:41,720 Speaker 1: than I have been on Ukraine. So we'll get to 262 00:11:41,720 --> 00:11:43,480 Speaker 1: that among other things coming up in just a little bit. 263 00:11:43,960 --> 00:11:45,920 Speaker 1: I want to tell you about Webroot. If you're into 264 00:11:45,960 --> 00:11:47,560 Speaker 1: tech stuff at all and you spend any time on 265 00:11:47,600 --> 00:11:51,080 Speaker 1: your phone, you should probably have Webroot. If you've ever 266 00:11:51,160 --> 00:11:53,480 Speaker 1: known anybody who had their bank account hacked, and the 267 00:11:53,640 --> 00:11:57,000 Speaker 1: nightmare it was to try to untangle that mess and 268 00:11:57,040 --> 00:11:59,120 Speaker 1: maybe get your credit rating back and all those different 269 00:11:59,160 --> 00:12:01,920 Speaker 1: sort of things. That's why we trust Webroot Total Protection. 270 00:12:02,000 --> 00:12:05,760 Speaker 1: It monitors for stolen identities, credit fraud, even scans the 271 00:12:05,880 --> 00:12:08,240 Speaker 1: dark web for your info, because if it shows up 272 00:12:08,280 --> 00:12:10,880 Speaker 1: on the dark web, something's gone wrong. So we're hooking 273 00:12:10,880 --> 00:12:12,040 Speaker 1: you up with a great offer right now. You can 274 00:12:12,080 --> 00:12:15,040 Speaker 1: get fifty percent off Webroot Total Protection or Webroot Essentials 275 00:12:15,240 --> 00:12:18,480 Speaker 1: at webroot dot com slash Armstrong. If something goes wrong, 276 00:12:18,480 --> 00:12:21,679 Speaker 1: you're backed up with up to one million dollars in reimbursement. 
277 00:12:22,080 --> 00:12:24,840 Speaker 1: It's fast, it's lightweight, it installs in minutes, no 278 00:12:24,920 --> 00:12:28,880 Speaker 1: annoying pop ups, just strong protection. It even includes a VPN 279 00:12:29,000 --> 00:12:32,439 Speaker 1: for secure browsing and a password manager to keep your logins safe. 280 00:12:32,520 --> 00:12:35,280 Speaker 1: Love it. Don't risk being the next victim. Get fifty 281 00:12:35,280 --> 00:12:39,120 Speaker 1: percent off Webroot Total Protection or Webroot Essentials right now 282 00:12:39,120 --> 00:12:42,719 Speaker 1: at webroot dot com slash Armstrong. A whole bunch of 283 00:12:42,720 --> 00:12:44,440 Speaker 1: different things we're going to talk about that are interesting 284 00:12:44,480 --> 00:12:50,760 Speaker 1: coming up. Stay here. Armstrong and Getty. The FDA is warning 285 00:12:50,800 --> 00:12:53,680 Speaker 1: consumers not to eat frozen shrimp from Walmart because it 286 00:12:53,679 --> 00:12:55,679 Speaker 1: may contain radioactive material. 287 00:12:56,559 --> 00:12:59,240 Speaker 4: Also because it's frozen shrimp from Walmart. 288 00:13:01,040 --> 00:13:03,160 Speaker 1: Aren't you an elitist? You don't eat shrimp from Walmart? 289 00:13:03,200 --> 00:13:07,080 Speaker 1: I don't eat it. We're talking with Craig Gottwalls, who 290 00:13:07,120 --> 00:13:09,920 Speaker 1: is a healthcare expert. Actually, we're talking about other things. 291 00:13:09,920 --> 00:13:11,120 Speaker 1: But if people want to get a hold of you, 292 00:13:11,200 --> 00:13:12,800 Speaker 1: like for some healthcare advice, how do they get 293 00:13:12,679 --> 00:13:13,080 Speaker 3: a hold of you? 294 00:13:14,600 --> 00:13:17,320 Speaker 2: The easiest way, probably Gottwalls dot substack dot com. 295 00:13:17,400 --> 00:13:20,400 Speaker 1: Okay, cool. We were just talking about AI and want to 296 00:13:20,400 --> 00:13:23,240 Speaker 1: finish up that conversation. 
Uh, because I'm fascinated by it, 297 00:13:23,280 --> 00:13:26,160 Speaker 1: and man, if you're not brushing up against it yet, 298 00:13:26,240 --> 00:13:30,439 Speaker 1: you will be. So I suggest you get on the front 299 00:13:30,520 --> 00:13:33,360 Speaker 1: end of checking some of this stuff out. But I 300 00:13:33,440 --> 00:13:36,560 Speaker 1: was talking to a teacher, my next door neighbor. She 301 00:13:36,600 --> 00:13:40,240 Speaker 1: teaches ninth grade English, and I asked her about AI, 302 00:13:40,320 --> 00:13:41,760 Speaker 1: and I could tell, her eyes lit up and she 303 00:13:41,840 --> 00:13:43,319 Speaker 1: kind of rolled her eyes, like, oh yeah, it's really 304 00:13:43,320 --> 00:13:45,440 Speaker 1: hard now to figure out if a kid wrote his 305 00:13:45,520 --> 00:13:49,400 Speaker 1: paper, and all this different sort of stuff. I know 306 00:13:49,440 --> 00:13:52,280 Speaker 1: you've taught college classes in the past, working with people. 307 00:13:52,320 --> 00:13:54,880 Speaker 1: How do you, is there a way to 308 00:13:54,960 --> 00:13:57,679 Speaker 1: tell if something's AI or if it's actually a 309 00:13:57,760 --> 00:13:58,560 Speaker 1: human being's work? 310 00:14:00,280 --> 00:14:03,120 Speaker 2: Yeah, it's becoming increasingly easy to tell, honestly, 311 00:14:03,160 --> 00:14:04,640 Speaker 2: because I use it so much. I can see how 312 00:14:04,679 --> 00:14:05,960 Speaker 2: it writes, and I see how it works, 313 00:14:05,960 --> 00:14:06,600 Speaker 3: how it does things. 314 00:14:06,840 --> 00:14:09,280 Speaker 2: Katie mentioned one earlier in the week, or maybe it 315 00:14:09,280 --> 00:14:13,439 Speaker 2: was last week: the long dash, the ridiculously long dash 316 00:14:13,480 --> 00:14:15,560 Speaker 2: that none of us ever grew up using, 317 00:14:15,880 --> 00:14:18,240 Speaker 3: but AI uses like it's going out of style. 
318 00:14:18,640 --> 00:14:20,600 Speaker 2: True. And it's not just like, I'm typing, I put 319 00:14:20,600 --> 00:14:22,080 Speaker 2: a dash, and I keep typing and it puts a 320 00:14:22,080 --> 00:14:24,440 Speaker 2: little dash in there. No, it's a dash 321 00:14:24,440 --> 00:14:26,280 Speaker 2: that goes all the way from one word to the next. 322 00:14:26,320 --> 00:14:28,600 Speaker 2: If you see that in a response that you get, 323 00:14:28,640 --> 00:14:30,840 Speaker 2: you know that was written by AI. 324 00:14:31,880 --> 00:14:34,720 Speaker 1: That's true. I'm looking at, yeah, I'm looking at my 325 00:14:34,840 --> 00:14:38,800 Speaker 1: ChatGPT question about, I was getting seasick on 326 00:14:38,920 --> 00:14:41,440 Speaker 1: my sailing lessons and asking about what medicine is the 327 00:14:41,440 --> 00:14:43,680 Speaker 1: best for that. Yeah, and the answer has really long 328 00:14:43,800 --> 00:14:44,560 Speaker 1: dashes in there. 329 00:14:44,640 --> 00:14:46,880 Speaker 3: Okay, a ton of them too. 330 00:14:46,960 --> 00:14:49,080 Speaker 2: And I've even gone into AI. Remember, I've 331 00:14:49,080 --> 00:14:51,200 Speaker 2: got the upgraded version that I've trained, and I've said, 332 00:14:51,400 --> 00:14:54,040 Speaker 2: stop using that, I don't write that way. I've routed 333 00:14:54,040 --> 00:14:55,840 Speaker 2: it back to all of my articles that I've written 334 00:14:55,840 --> 00:14:57,960 Speaker 2: and said, this is how I write, write this way, 335 00:14:58,040 --> 00:15:01,080 Speaker 2: don't use those dashes. And for whatever reason, I can't 336 00:15:01,080 --> 00:15:03,440 Speaker 2: get it to stop doing that. So I don't know 337 00:15:03,440 --> 00:15:06,200 Speaker 2: how smart AI is when that's our reality, 338 00:15:06,320 --> 00:15:13,480 Speaker 2: right? Right. The other one is emojis gone wild. 
It 339 00:15:13,520 --> 00:15:16,440 Speaker 2: will, it will put so many emojis in a response 340 00:15:16,520 --> 00:15:18,960 Speaker 2: a lot of times, especially people that are using it 341 00:15:19,000 --> 00:15:22,280 Speaker 2: to put up posts on like LinkedIn or Instagram or whatever. 342 00:15:22,600 --> 00:15:24,760 Speaker 2: You'll see just way too many emojis, and all these 343 00:15:24,800 --> 00:15:26,920 Speaker 2: like crazy emojis that humans don't ever use. 344 00:15:27,440 --> 00:15:29,200 Speaker 3: It's just, it's a total tell that it's AI. 345 00:15:29,600 --> 00:15:30,600 Speaker 1: Oh. 346 00:15:30,640 --> 00:15:32,640 Speaker 2: And then the last one, and this is a simple one, 347 00:15:32,640 --> 00:15:34,840 Speaker 2: but if you have it write a memo or if you 348 00:15:34,840 --> 00:15:38,920 Speaker 2: have it write like a formal letter, it puts lines, 349 00:15:39,240 --> 00:15:42,240 Speaker 2: like actual lines, between the sections. And so sometimes I'll 350 00:15:42,280 --> 00:15:44,440 Speaker 2: get something from somebody that's got the lines in it still, 351 00:15:44,480 --> 00:15:46,960 Speaker 2: and I'm like, dude, you didn't even bother to delete 352 00:15:46,960 --> 00:15:49,520 Speaker 2: the lines out of this pure AI that you just 353 00:15:49,840 --> 00:15:51,160 Speaker 2: regurgitated back at me. 354 00:15:51,800 --> 00:15:53,680 Speaker 1: So a lot, and then. 355 00:15:53,840 --> 00:15:57,160 Speaker 2: A lot of LinkedIn is AI, you tell me? Yeah, 356 00:15:57,400 --> 00:16:00,360 Speaker 2: fifty. And this was according to a study from back 357 00:16:00,360 --> 00:16:00,920 Speaker 2: in October. 358 00:16:00,960 --> 00:16:02,440 Speaker 3: I swear to god, it's higher now. 359 00:16:02,640 --> 00:16:07,920 Speaker 2: Fifty four percent of everything on LinkedIn is AI generated. Wow. 360 00:16:08,200 --> 00:16:12,080 Speaker 2: Even, this is my favorite. 
There's a few attorneys that 361 00:16:12,120 --> 00:16:14,760 Speaker 2: I like and I follow on LinkedIn that we share ideas. 362 00:16:15,080 --> 00:16:18,680 Speaker 2: One of them has in their bio one hundred percent 363 00:16:18,840 --> 00:16:21,440 Speaker 2: non AI generated. So I'm reading one of I won't 364 00:16:21,440 --> 00:16:23,560 Speaker 2: even say the gender. I'm reading one of their posts 365 00:16:23,600 --> 00:16:26,240 Speaker 2: the other day and it says and I'm reading it, 366 00:16:26,240 --> 00:16:28,160 Speaker 2: and I'm going, this is AI. I can tell by 367 00:16:28,160 --> 00:16:31,440 Speaker 2: the the all the little catchphrases it uses all the 368 00:16:31,520 --> 00:16:33,440 Speaker 2: all the you know, here's the deal, and all the 369 00:16:33,800 --> 00:16:35,880 Speaker 2: kind of almost like the bidenisms that it throws 370 00:16:35,880 --> 00:16:37,680 Speaker 2: out there, and I'm like, this is AI. 371 00:16:37,800 --> 00:16:39,400 Speaker 3: So then here's another trick you can use. 372 00:16:39,880 --> 00:16:42,480 Speaker 2: Go to Google and just type in AI detection tool 373 00:16:42,760 --> 00:16:45,400 Speaker 2: and you'll be able to pull up, you know, dozens 374 00:16:45,400 --> 00:16:48,560 Speaker 2: of different websites where you copy the text, paste it in, 375 00:16:48,600 --> 00:16:51,640 Speaker 2: and then it'll tell you, with pretty good likelihood, what 376 00:16:51,760 --> 00:16:54,800 Speaker 2: percentage of that text was AI generated. So this is 377 00:16:54,840 --> 00:16:57,440 Speaker 2: what university professors are using and high school teachers are 378 00:16:57,520 --> 00:17:00,680 Speaker 2: using to detect it because it has all these tells 379 00:17:00,720 --> 00:17:02,560 Speaker 2: in it that you just get used to over time. 380 00:17:03,120 --> 00:17:04,240 Speaker 1: Yeah, that's really interesting. 381 00:17:04,280 --> 00:17:04,399 Speaker 3: One.
382 00:17:05,240 --> 00:17:07,119 Speaker 1: I'm not smart enough or I haven't spent enough time 383 00:17:07,160 --> 00:17:08,720 Speaker 1: with it to pick it up in print. But I 384 00:17:08,720 --> 00:17:12,880 Speaker 1: feel like I can tell an AI image immediately. They're 385 00:17:12,960 --> 00:17:18,280 Speaker 1: just too something, They're too something, they're too perfect. There's 386 00:17:18,280 --> 00:17:19,480 Speaker 1: no human being that looks like that. 387 00:17:20,800 --> 00:17:22,440 Speaker 3: The other tell this is fascinating. 388 00:17:22,480 --> 00:17:24,159 Speaker 2: So we had a college intern working with us this 389 00:17:24,200 --> 00:17:26,560 Speaker 2: summer and she was shadowing me for a couple of weeks. 390 00:17:26,920 --> 00:17:29,959 Speaker 2: She told me it's so prevalent in her university that 391 00:17:30,040 --> 00:17:35,080 Speaker 2: she will purposefully submit papers with five or six grammatical 392 00:17:35,200 --> 00:17:38,160 Speaker 2: errors in them because she does not want to even 393 00:17:38,200 --> 00:17:40,520 Speaker 2: come close to getting accused of using AI to write 394 00:17:40,520 --> 00:17:41,040 Speaker 2: her papers. 395 00:17:41,240 --> 00:17:44,960 Speaker 1: Ah wow, So you get AI to write a paper 396 00:17:44,960 --> 00:17:46,520 Speaker 1: for you, then you go back in and put in 397 00:17:46,600 --> 00:17:51,920 Speaker 1: some grammatical errors, and that's who you trick, your teacher. Okay, 398 00:17:51,920 --> 00:17:53,680 Speaker 1: I do want to get to that Russia Ukraine stuff 399 00:17:53,720 --> 00:17:56,160 Speaker 1: in just a second. 
There was I think a fair 400 00:17:56,200 --> 00:17:59,200 Speaker 1: amount of movement on that on Sunday with Lavrov 401 00:17:59,240 --> 00:18:01,040 Speaker 1: on one of the talk shows. We'll play you what he 402 00:18:01,080 --> 00:18:04,760 Speaker 1: had to say and then discuss how much we should 403 00:18:04,800 --> 00:18:07,280 Speaker 1: get into this war or not. If you miss a 404 00:18:07,280 --> 00:18:09,399 Speaker 1: segment or an hour, get the podcast Armstrong and Getty on 405 00:18:09,440 --> 00:18:12,360 Speaker 1: demand. Armstrong and Getty. 406 00:18:14,920 --> 00:18:20,040 Speaker 5: You should know by now that never ever Russia deliberately 407 00:18:20,119 --> 00:18:26,960 Speaker 5: targeted any sites which are not linked to military abilities 408 00:18:27,000 --> 00:18:27,800 Speaker 5: of Ukraine. 409 00:18:28,320 --> 00:18:32,800 Speaker 1: So that's Sergei Lavrov. He's the spokeshole for Vladimir Putin, 410 00:18:32,880 --> 00:18:35,359 Speaker 1: has been for many, many years. He's a, well, I 411 00:18:35,400 --> 00:18:37,600 Speaker 1: was going to say he's a paid liar. He is. 412 00:18:38,480 --> 00:18:40,440 Speaker 1: He is also in the situation that if he didn't 413 00:18:40,480 --> 00:18:43,000 Speaker 1: lie properly for Putin, he would get he would fall 414 00:18:43,040 --> 00:18:45,200 Speaker 1: out of a window all of a sudden, and his 415 00:18:45,240 --> 00:18:47,200 Speaker 1: whole family would probably be killed. So that's a pretty 416 00:18:47,240 --> 00:18:49,360 Speaker 1: big motivator to get out there and lie, too. He's 417 00:18:49,359 --> 00:18:51,600 Speaker 1: pretty good at it anyway. He was on Meet the 418 00:18:51,680 --> 00:18:56,040 Speaker 1: Press on Sunday. Kristen Welker, the host, brought up the 419 00:18:56,119 --> 00:19:01,560 Speaker 1: fact that Russia regularly targets civilians. He says they did not, 420 00:19:02,400 --> 00:19:04,399 Speaker 1: and they went further with that conversation.
421 00:19:04,800 --> 00:19:10,040 Speaker 4: Russia has hit maternity wards, churches, schools, hospitals, a kindergarten 422 00:19:10,200 --> 00:19:13,480 Speaker 4: just this past week. So either the Russian military has 423 00:19:13,840 --> 00:19:17,480 Speaker 4: terrible aim or you are targeting civilians. Which is it? 424 00:19:17,840 --> 00:19:24,679 Speaker 5: Look, NBC is a very respectful structure, and I hope 425 00:19:24,720 --> 00:19:30,200 Speaker 5: you are responsible for the words which you broadcast. I 426 00:19:31,600 --> 00:19:34,960 Speaker 5: asked you to send us or to publicize the information 427 00:19:35,080 --> 00:19:39,879 Speaker 5: to which you just referred, because we never targeted the 428 00:19:39,920 --> 00:19:42,919 Speaker 5: civilian targets of the kind you 429 00:19:43,920 --> 00:19:50,000 Speaker 1: cited. So obviously that's a load of crap. But he's 430 00:19:50,040 --> 00:19:53,960 Speaker 1: able to say, look, NBC is a very respected news organization. 431 00:19:54,000 --> 00:19:55,919 Speaker 1: If you have the proof, send it to me. There 432 00:19:55,920 --> 00:19:57,439 Speaker 1: would be no proof you could send to him that 433 00:19:57,440 --> 00:20:01,120 Speaker 1: would satisfy him. And it's not like Putin just emerged 434 00:20:01,160 --> 00:20:03,480 Speaker 1: on the scene. Not only is he doing it in Ukraine, 435 00:20:03,600 --> 00:20:07,800 Speaker 1: he helped Bashar al Assad do it in Syria, you know, 436 00:20:07,880 --> 00:20:10,080 Speaker 1: bombing the hell out of civilians, killing, you know, hundreds of 437 00:20:10,080 --> 00:20:12,720 Speaker 1: thousands of their own people to try and stay in 438 00:20:12,760 --> 00:20:16,359 Speaker 1: power there. He blew up all those people in his 439 00:20:16,400 --> 00:20:20,320 Speaker 1: own country, in his own hotel, his own Russian citizens.
440 00:20:20,320 --> 00:20:22,320 Speaker 1: He blew up in a hotel to make it look 441 00:20:22,440 --> 00:20:25,840 Speaker 1: like the Chechens had attacked, so he had a reason 442 00:20:25,920 --> 00:20:29,000 Speaker 1: to go into Chechnya. I mean, he's an evil guy. 443 00:20:29,000 --> 00:20:31,000 Speaker 1: He's willing to do anything, which leads us to this. 444 00:20:31,440 --> 00:20:34,600 Speaker 1: So we have Craig Gottwals joining us today, usually on 445 00:20:34,720 --> 00:20:37,080 Speaker 1: talking about healthcare. But I wanted to talk to Craig 446 00:20:37,240 --> 00:20:43,000 Speaker 1: about the Russia Ukraine situation, because he's closer to where 447 00:20:43,080 --> 00:20:47,840 Speaker 1: a lot of you are on your opinions about US 448 00:20:47,920 --> 00:20:50,879 Speaker 1: involvement in Ukraine. Joe and I are pretty big cheerleaders 449 00:20:50,960 --> 00:20:55,000 Speaker 1: for arming the Ukrainians and pushing back hard and all 450 00:20:55,000 --> 00:20:57,680 Speaker 1: that sort of stuff. Craig is not. So I don't 451 00:20:57,680 --> 00:20:59,440 Speaker 1: know if you were listening earlier, Craig, when we had 452 00:20:59,520 --> 00:21:03,560 Speaker 1: Justin Logan on from Cato. He is closer to you, 453 00:21:03,640 --> 00:21:07,600 Speaker 1: with a pretty cold-eyed reality look at the whole thing. 454 00:21:07,640 --> 00:21:10,199 Speaker 1: What is your position on Russia Ukraine? 455 00:21:12,160 --> 00:21:15,080 Speaker 2: Yeah, this is one I I just I haven't changed 456 00:21:15,119 --> 00:21:16,880 Speaker 2: my opinion on this since day one of the war. 457 00:21:17,080 --> 00:21:19,359 Speaker 2: And I just look at this and think, well, this 458 00:21:19,520 --> 00:21:22,800 Speaker 2: is unfortunate. I don't like it, I don't support it.
459 00:21:23,280 --> 00:21:26,240 Speaker 2: But the reality is you've got a giant nuclear power, 460 00:21:26,520 --> 00:21:29,960 Speaker 2: one of the three largest powers on the planet, that's 461 00:21:30,000 --> 00:21:32,000 Speaker 2: gonna that's gonna do whatever they want to do, and 462 00:21:32,040 --> 00:21:34,600 Speaker 2: they're gonna they're gonna take over one of their 463 00:21:34,640 --> 00:21:38,960 Speaker 2: former client states, and we're not gonna do anything about it, 464 00:21:40,320 --> 00:21:44,720 Speaker 2: whether it's China, whether it's America. Might 465 00:21:44,760 --> 00:21:46,959 Speaker 2: makes right in the world of 466 00:21:46,960 --> 00:21:49,320 Speaker 2: war, if we have the might and we have the 467 00:21:50,000 --> 00:21:52,359 Speaker 2: manpower and we're willing to do whatever it takes. Like, 468 00:21:52,640 --> 00:21:56,080 Speaker 2: I think about it this way. Like, imagine Northern 469 00:21:56,119 --> 00:21:59,160 Speaker 2: Mexico got so out of control with cartel behavior flowing 470 00:21:59,200 --> 00:22:01,560 Speaker 2: into us, and we just decided, look, we got to 471 00:22:01,560 --> 00:22:02,360 Speaker 2: go in and fix it. 472 00:22:02,800 --> 00:22:03,080 Speaker 3: Now. 473 00:22:03,160 --> 00:22:06,240 Speaker 2: I'm not saying what Russia's doing is right. I'm just saying, 474 00:22:06,280 --> 00:22:08,840 Speaker 2: if we decided we needed to go into northern Mexico, 475 00:22:10,280 --> 00:22:13,240 Speaker 2: how would we like it if all of a sudden, 476 00:22:13,359 --> 00:22:17,720 Speaker 2: China and Russia were funding the Mexicans with arms and 477 00:22:17,760 --> 00:22:20,520 Speaker 2: with munitions? We wouldn't like it at all. We'd say, 478 00:22:20,560 --> 00:22:22,639 Speaker 2: stay out of our neighborhood. This is our world, this 479 00:22:22,680 --> 00:22:25,400 Speaker 2: is what we do. We're the world power.
Back off 480 00:22:25,440 --> 00:22:27,159 Speaker 2: unless you're willing to come here and fight us one 481 00:22:27,200 --> 00:22:30,119 Speaker 2: on one. And I just feel like all we're doing 482 00:22:30,359 --> 00:22:34,399 Speaker 2: in Ukraine is prolonging the blood bath. We're making it 483 00:22:34,440 --> 00:22:37,240 Speaker 2: so that they lose a little more slowly and that 484 00:22:37,320 --> 00:22:40,359 Speaker 2: more people die. And the quote from Marco Rubio you 485 00:22:40,359 --> 00:22:42,440 Speaker 2: guys had a week or so ago, I thought was phenomenal. 486 00:22:42,880 --> 00:22:46,399 Speaker 2: It's a meat grinder, and all we're doing is making 487 00:22:46,400 --> 00:22:48,720 Speaker 2: it so that we can prolong the meat grinding, and 488 00:22:48,800 --> 00:22:50,879 Speaker 2: Russia's going to win because they have, what is it, 489 00:22:50,920 --> 00:22:54,520 Speaker 2: like a seven or a ten to one advantage in manpower. 490 00:22:55,240 --> 00:22:57,720 Speaker 2: And then on top of all that, even if somehow, 491 00:22:57,840 --> 00:23:00,919 Speaker 2: some way we were to arm Ukraine enough that Ukraine 492 00:23:00,920 --> 00:23:03,879 Speaker 2: could push him back a little, I think Putin's actually 493 00:23:03,960 --> 00:23:07,000 Speaker 2: shown restraint by not using a tactical nuke or really 494 00:23:07,040 --> 00:23:10,119 Speaker 2: even waving that flag around, because, you know, he 495 00:23:10,240 --> 00:23:14,720 Speaker 2: has it. And ultimately I view war as might makes right. 496 00:23:14,880 --> 00:23:17,280 Speaker 2: So unless we're willing to go stand there and fight 497 00:23:17,359 --> 00:23:21,120 Speaker 2: against them, I just think all we're doing is playing games. 498 00:23:21,600 --> 00:23:25,960 Speaker 2: We're depleting our own weaponry, and we're not doing any 499 00:23:26,000 --> 00:23:29,400 Speaker 2: favors to the world.
Now I realize your comeback, Jack, 500 00:23:29,520 --> 00:23:31,720 Speaker 2: is going to be, so you just let Putin take it. 501 00:23:31,720 --> 00:23:33,760 Speaker 2: And the answer is yep. You let him take as 502 00:23:33,840 --> 00:23:35,440 Speaker 2: much as he's going to take. And once he's taken 503 00:23:35,560 --> 00:23:37,600 Speaker 2: enough that we say we're willing to go send our 504 00:23:37,680 --> 00:23:39,840 Speaker 2: kids over to fight, that's when it'll stop. 505 00:23:40,560 --> 00:23:45,000 Speaker 1: Yeah. That was my response to Justin Logan from Cato 506 00:23:45,280 --> 00:23:48,360 Speaker 1: earlier this morning. And if you didn't hear that, that 507 00:23:48,440 --> 00:23:51,760 Speaker 1: was hour one, and you can find the podcast Armstrong 508 00:23:51,800 --> 00:23:55,840 Speaker 1: and Getty on demand. But I said, so won't this 509 00:23:55,960 --> 00:23:58,199 Speaker 1: set up a situation if Putin gets away with it, 510 00:23:58,320 --> 00:24:00,920 Speaker 1: using my finger quotes, "gets away with it," where 511 00:24:01,080 --> 00:24:03,119 Speaker 1: China will decide, well, we get to take Taiwan, or 512 00:24:03,160 --> 00:24:05,800 Speaker 1: any other bigger country gets to take it. And he said, well, 513 00:24:05,800 --> 00:24:08,240 Speaker 1: that's the history of the world. It's always been that way. 514 00:24:08,400 --> 00:24:10,400 Speaker 1: It's only this tiny... it's going to happen. 515 00:24:11,840 --> 00:24:12,679 Speaker 3: Anything about it. 516 00:24:12,680 --> 00:24:15,520 Speaker 1: It's only this tiny blip of recent history where that 517 00:24:15,600 --> 00:24:20,000 Speaker 1: hasn't been going on, because the United States and the 518 00:24:20,040 --> 00:24:22,240 Speaker 1: Soviet Union at the time were so big and powerful 519 00:24:22,320 --> 00:24:25,280 Speaker 1: that nobody did anything without their approval, really, 520 00:24:26,040 --> 00:24:29,720 Speaker 1: in their own spheres.
But then the Cold War ended, 521 00:24:29,840 --> 00:24:33,440 Speaker 1: and we were the lone superpower for really the blink 522 00:24:33,480 --> 00:24:36,600 Speaker 1: of an eye. And now we're back to, you know, 523 00:24:36,680 --> 00:24:40,199 Speaker 1: more than a unipolar world. There are 524 00:24:40,200 --> 00:24:42,400 Speaker 1: a couple of different powers in the world. And that's 525 00:24:42,480 --> 00:24:44,879 Speaker 1: just the way it's always been and it's always going 526 00:24:44,920 --> 00:24:47,080 Speaker 1: to be, and there's really not much you can do 527 00:24:47,119 --> 00:24:52,920 Speaker 1: about it. That's hard to argue with that. I don't 528 00:24:53,000 --> 00:24:53,360 Speaker 1: like it. 529 00:24:53,640 --> 00:24:55,840 Speaker 2: I just I think the phrase I used with you 530 00:24:55,840 --> 00:24:59,080 Speaker 2: and Joe the other day was just radical pragmatism. Like, 531 00:24:59,680 --> 00:25:03,040 Speaker 2: ever since this thing started, I never was 532 00:25:03,040 --> 00:25:06,160 Speaker 2: emotional about it. I've always just thought, well, what's the end? Well, 533 00:25:06,200 --> 00:25:08,280 Speaker 2: the end is Putin's gonna take whatever he wants because 534 00:25:08,280 --> 00:25:10,280 Speaker 2: we're not going to go fight, and if we're not 535 00:25:10,280 --> 00:25:12,280 Speaker 2: going to go fight, he's going to win. I mean 536 00:25:12,280 --> 00:25:15,159 Speaker 2: that's just the I just don't know any way around that. 537 00:25:15,480 --> 00:25:17,880 Speaker 1: Yeah, it's hard to not be emotional about it, obviously, 538 00:25:17,920 --> 00:25:19,720 Speaker 1: if you, you know, you watch the news and you 539 00:25:19,760 --> 00:25:22,040 Speaker 1: hear from some of these Ukrainians who are who are 540 00:25:22,320 --> 00:25:25,560 Speaker 1: literally fighting to save their families.
I mean, because their 541 00:25:25,640 --> 00:25:28,359 Speaker 1: children might be rounded up and taken away from mom 542 00:25:28,440 --> 00:25:31,320 Speaker 1: to Russia, never to be seen again. Mom might be 543 00:25:31,440 --> 00:25:33,560 Speaker 1: raped or killed. I mean, it's pretty hard 544 00:25:33,600 --> 00:25:37,080 Speaker 1: not to get emotional about it. But uh, you get 545 00:25:37,119 --> 00:25:38,960 Speaker 1: as emotional as you want. But if you're not going 546 00:25:39,040 --> 00:25:41,080 Speaker 1: to give him the arms or the troops or whatever 547 00:25:41,119 --> 00:25:44,119 Speaker 1: it would take to win, there's no stopping them. 548 00:25:45,000 --> 00:25:46,639 Speaker 3: And I don't think I don't think the arm. I 549 00:25:46,640 --> 00:25:47,919 Speaker 3: think the arms is a red herring. 550 00:25:47,920 --> 00:25:49,920 Speaker 2: I think all we're doing with the arms is prolonging 551 00:25:49,920 --> 00:25:51,480 Speaker 2: the pain and the agony, because we're not going to 552 00:25:51,520 --> 00:25:55,679 Speaker 2: give them nukes, right? So ultimately Putin has nukes and 553 00:25:55,720 --> 00:25:57,639 Speaker 2: he'd be willing to use them because he's that crazy, 554 00:25:57,680 --> 00:26:00,320 Speaker 2: I think. So what the hell are we doing? Like 555 00:26:00,680 --> 00:26:02,919 Speaker 2: this is to me, what we're doing is dangerous and 556 00:26:02,960 --> 00:26:06,720 Speaker 2: scary because ultimately he has nukes and he would be 557 00:26:06,720 --> 00:26:10,040 Speaker 2: willing to use them. Yeah, so what are we even 558 00:26:10,040 --> 00:26:10,920 Speaker 2: doing giving them arms?
559 00:26:11,040 --> 00:26:13,400 Speaker 1: When we were talking to Mike Lyons, our favorite military 560 00:26:13,480 --> 00:26:18,600 Speaker 1: guy, yesterday, he seemed to think he wouldn't be very 561 00:26:18,600 --> 00:26:21,879 Speaker 1: surprised if Putin at some point uses a tactical nuke on one 562 00:26:21,920 --> 00:26:24,960 Speaker 1: of those cities, kills like ten thousand people in one, 563 00:26:25,800 --> 00:26:28,280 Speaker 1: you know, little shot of a small nuke, just 564 00:26:28,320 --> 00:26:30,159 Speaker 1: to let everybody know, hey, I'm serious about this. And 565 00:26:30,200 --> 00:26:32,680 Speaker 1: he thinks the world would probably back off, the world 566 00:26:32,680 --> 00:26:34,639 Speaker 1: would not say, okay, we're at war now. NATO's not 567 00:26:34,680 --> 00:26:35,560 Speaker 1: going to go to war to. 568 00:26:35,560 --> 00:26:36,320 Speaker 3: Nuclear war over that. 569 00:26:36,440 --> 00:26:36,760 Speaker 1: Yeah. 570 00:26:36,840 --> 00:26:39,679 Speaker 2: I mean that's and that's now today. And I respect 571 00:26:39,760 --> 00:26:42,000 Speaker 2: Mike Lyons's opinion a lot on these matters, obviously. And 572 00:26:42,000 --> 00:26:44,760 Speaker 2: that's today. That's with Putin basically winning and getting everything 573 00:26:44,760 --> 00:26:47,440 Speaker 2: he wants right now. Imagine if somehow, someway we gave 574 00:26:47,600 --> 00:26:51,000 Speaker 2: enough that somehow Putin got pushed back, and we had 575 00:26:51,280 --> 00:26:54,560 Speaker 2: Ukrainians moving into Moscow. Come on, what's going to happen 576 00:26:54,560 --> 00:26:57,159 Speaker 2: then? Then he's really going to use a nuke, and 577 00:26:57,200 --> 00:27:01,359 Speaker 2: then we've really done it. I just I've never understood 578 00:27:01,400 --> 00:27:04,040 Speaker 2: the idea that we can prolong this war enough that Ukrainian, 579 00:27:04,400 --> 00:27:05,920 Speaker 2: the Ukrainian people can somehow win.
580 00:27:06,400 --> 00:27:07,680 Speaker 3: I just I've just never gotten it. 581 00:27:08,960 --> 00:27:12,040 Speaker 1: Well, so I always think about China and Taiwan whenever 582 00:27:12,080 --> 00:27:14,560 Speaker 1: we're having this conversation, and there's there's a quote that 583 00:27:14,680 --> 00:27:16,439 Speaker 1: I don't know if it's been verified or not, but 584 00:27:16,480 --> 00:27:21,680 Speaker 1: apparently Trump said at some point to someone, if China 585 00:27:21,720 --> 00:27:24,920 Speaker 1: decides they're gonna take Taiwan, they're gonna take Taiwan. Because 586 00:27:24,960 --> 00:27:27,080 Speaker 1: I think it's a similar situation. Are we gonna go 587 00:27:27,080 --> 00:27:28,720 Speaker 1: to war with China over that? 588 00:27:29,000 --> 00:27:29,080 Speaker 5: No? 589 00:27:29,320 --> 00:27:29,600 Speaker 2: Or not? 590 00:27:29,920 --> 00:27:32,119 Speaker 3: No? No it. 591 00:27:33,359 --> 00:27:35,160 Speaker 2: I think if we tried to take Mexico, we would 592 00:27:35,160 --> 00:27:37,280 Speaker 2: take Mexico. I just think it's the same, like, no 593 00:27:37,280 --> 00:27:39,560 Speaker 2: one's gonna stop us. If we wanted to take Mexico, yeah, 594 00:27:39,600 --> 00:27:41,399 Speaker 2: we would take a black eye. The liberals would scream 595 00:27:41,440 --> 00:27:45,800 Speaker 2: about it, but it's it's not gonna stop it. I just, yeah, 596 00:27:45,400 --> 00:27:48,520 Speaker 2: I feel the same way. I just I have family 597 00:27:48,520 --> 00:27:51,160 Speaker 2: friends that just the grandparents just moved back to Taiwan, 598 00:27:51,200 --> 00:27:54,000 Speaker 2: and I said, man, aren't they worried? And the response was, 599 00:27:54,200 --> 00:27:56,040 Speaker 2: they're eighty five, they're not too worried. You know, if they 600 00:27:56,040 --> 00:27:57,680 Speaker 2: get taken over by China at this point, you know, 601 00:27:57,680 --> 00:28:00,560 Speaker 2: they're gonna ride it out that way.
But I feel 602 00:28:00,600 --> 00:28:03,320 Speaker 2: the same way. I just think Taiwan is there. We're 603 00:28:03,320 --> 00:28:04,640 Speaker 2: not going to go to war over Taiwan. 604 00:28:05,280 --> 00:28:05,760 Speaker 3: We're just not. 605 00:28:06,800 --> 00:28:08,840 Speaker 1: Well, it's gonna be a much different world in the 606 00:28:08,880 --> 00:28:11,840 Speaker 1: near future if China controls that entire chunk of 607 00:28:11,840 --> 00:28:15,760 Speaker 1: the ocean, the free sea lanes of the world that 608 00:28:15,800 --> 00:28:20,920 Speaker 1: we have uh kept for eighty years. That'll be the 609 00:28:21,000 --> 00:28:23,520 Speaker 1: end of that. But like the guy from Cato said, 610 00:28:23,520 --> 00:28:26,000 Speaker 1: this was a blip in time. The history of the 611 00:28:26,000 --> 00:28:29,040 Speaker 1: world is more like what's coming and what has been. 612 00:28:30,320 --> 00:28:31,760 Speaker 1: That's really interesting stuff. 613 00:28:32,119 --> 00:28:37,080 Speaker 3: Warfare has always been about might makes right. It's just, yeah, 614 00:28:37,160 --> 00:28:37,720 Speaker 3: that is. 615 00:28:37,600 --> 00:28:38,400 Speaker 2: That is, you know. 616 00:28:38,440 --> 00:28:39,680 Speaker 1: I don't I don't have a I don't have a 617 00:28:39,720 --> 00:28:43,000 Speaker 1: counter to that. At this point, I could see where 618 00:28:43,160 --> 00:28:46,680 Speaker 1: Europe might decide it's theirs, and they didn't, and 619 00:28:46,720 --> 00:28:48,640 Speaker 1: they're not going to. I mean, if they were going 620 00:28:48,640 --> 00:28:50,320 Speaker 1: to, you would have thought they would have earlier. I 621 00:28:50,320 --> 00:28:52,800 Speaker 1: could see how Europe, since it's their own damn backyard, 622 00:28:53,200 --> 00:28:55,080 Speaker 1: would decide we're gonna put up quite a fight here. 623 00:28:55,080 --> 00:28:57,560 Speaker 1: But they didn't.
I mean, I mean, and they were 624 00:28:57,600 --> 00:29:01,400 Speaker 1: willing to continue to I mean, we canceled the Nord 625 00:29:01,440 --> 00:29:02,320 Speaker 1: Stream pipeline. 626 00:29:02,400 --> 00:29:04,920 Speaker 5: The uh. 627 00:29:06,640 --> 00:29:10,440 Speaker 1: Germany was all for continuing to buy natural gas from 628 00:29:10,480 --> 00:29:16,960 Speaker 1: Russia after the war started. So yeah, that's just the 629 00:29:16,960 --> 00:29:19,520 Speaker 1: reality of the world. I may have moved on this topic. 630 00:29:20,200 --> 00:29:23,360 Speaker 1: It took me three years to get there, shocking. It's I 631 00:29:23,360 --> 00:29:25,320 Speaker 1: can't I can't believe how long it's gone on. Again. 632 00:29:25,560 --> 00:29:25,840 Speaker 3: Again. 633 00:29:26,160 --> 00:29:28,640 Speaker 1: I think it's the restraint of Putin that's allowed it to even 634 00:29:28,680 --> 00:29:31,840 Speaker 1: go on this long, because I think he probably could 635 00:29:31,840 --> 00:29:33,880 Speaker 1: have won this thing earlier had he been even more 636 00:29:33,920 --> 00:29:36,760 Speaker 1: brutal and violent. And you know, the guy, the guy 637 00:29:36,880 --> 00:29:39,880 Speaker 1: is a he's a lizard, he's a shark. He I 638 00:29:39,880 --> 00:29:42,040 Speaker 1: think he's played this thing just about perfectly. And I 639 00:29:42,080 --> 00:29:44,520 Speaker 1: can't I can't even fathom when we get these hot 640 00:29:44,560 --> 00:29:47,880 Speaker 1: mic moments of Trump saying to Macron, I think he 641 00:29:47,920 --> 00:29:50,040 Speaker 1: wants to do a deal for me. I mean, come on, 642 00:29:53,160 --> 00:29:55,400 Speaker 1: it's Trump dealing with this is better than Biden, but 643 00:29:55,440 --> 00:29:57,160 Speaker 1: not a whole hell of a lot better.
After I 644 00:29:57,200 --> 00:30:00,680 Speaker 1: hear things like that. Well, well, that's one of the 645 00:30:00,720 --> 00:30:02,400 Speaker 1: reasons I wanted to have you on, was to talk about 646 00:30:02,400 --> 00:30:03,720 Speaker 1: that, to get that point of view on, and I 647 00:30:03,760 --> 00:30:06,400 Speaker 1: may, I may be convinced at this point. Hey, Craig Gottwals, 648 00:30:06,440 --> 00:30:08,760 Speaker 1: appreciate your time today on all these different topics. You're 649 00:30:08,760 --> 00:30:10,480 Speaker 1: a smart guy and I like hearing what you have to 650 00:30:10,480 --> 00:30:12,960 Speaker 1: talk about. And again, if somebody has healthcare questions, you 651 00:30:13,040 --> 00:30:15,000 Speaker 1: run a business, you need advice, how do they get 652 00:30:15,000 --> 00:30:16,800 Speaker 1: a hold of you? 653 00:30:16,840 --> 00:30:20,360 Speaker 2: Gottwals, two t's, one l, dot substack dot com. 654 00:30:20,440 --> 00:30:23,160 Speaker 1: Okay, thanks, Craig, appreciate it. I want to tell you 655 00:30:23,200 --> 00:30:25,040 Speaker 1: about SimpliSafe because I just think it's a really 656 00:30:25,080 --> 00:30:27,880 Speaker 1: really good idea to have home security of some sort, 657 00:30:27,880 --> 00:30:29,360 Speaker 1: and you might as well have the best and one 658 00:30:29,360 --> 00:30:31,400 Speaker 1: where you're not locked into a contract. A lot of 659 00:30:31,400 --> 00:30:33,360 Speaker 1: those companies they lock you into a contract because they 660 00:30:33,600 --> 00:30:35,480 Speaker 1: are afraid you're not going to like it or use 661 00:30:35,520 --> 00:30:38,200 Speaker 1: it and decide you want out. SimpliSafe doesn't have 662 00:30:38,320 --> 00:30:40,040 Speaker 1: to do that because they know you're gonna like it 663 00:30:40,080 --> 00:30:41,720 Speaker 1: and you are going to use it. No contracts, no 664 00:30:41,800 --> 00:30:44,560 Speaker 1: hidden fees.
This is what you do: you visit simpli 665 00:30:44,560 --> 00:30:47,680 Speaker 1: safe dot com slash armstrong to claim fifty percent off 666 00:30:47,680 --> 00:30:50,640 Speaker 1: a new system with a professional monitoring plan and get 667 00:30:50,640 --> 00:30:52,920 Speaker 1: your first month free. So it's about a dollar a 668 00:30:53,000 --> 00:30:57,760 Speaker 1: day to have the number one customer service rated home 669 00:30:57,800 --> 00:31:01,360 Speaker 1: security system out there. I got the cameras, I got 670 00:31:01,400 --> 00:31:04,680 Speaker 1: the sensors, I got all the stuff, and it's absolutely fantastic. 671 00:31:04,920 --> 00:31:07,720 Speaker 1: Simplisafe dot com slash armstrong to claim fifty percent 672 00:31:07,760 --> 00:31:10,640 Speaker 1: off a new system. That's simplisafe dot com slash armstrong. 673 00:31:10,680 --> 00:31:12,880 Speaker 1: There's no safe like SimpliSafe. Do you have any 674 00:31:12,920 --> 00:31:16,560 Speaker 1: pushback to perhaps my newfound view on the Russia 675 00:31:16,680 --> 00:31:19,080 Speaker 1: Ukraine war? If you do, I'd like to hear it 676 00:31:19,120 --> 00:31:21,520 Speaker 1: on the text line at four one five two nine 677 00:31:21,640 --> 00:31:25,920 Speaker 1: five KFTC. Otherwise, we got some other stuff to move into. 678 00:31:26,320 --> 00:31:29,520 Speaker 1: So the Cracker Barrel remodel, getting so much damn attention. 679 00:31:29,600 --> 00:31:32,600 Speaker 1: Of course, Hooters is trying to make a big push 680 00:31:32,600 --> 00:31:36,360 Speaker 1: for a comeback with like a new vibe, Hooters of 681 00:31:36,400 --> 00:31:40,600 Speaker 1: all places. But a lot more on the way. Stay here. 682 00:31:43,800 --> 00:31:49,400 Speaker 1: The Oakland A's baseball team now plays in Sacramento, California. 683 00:31:49,440 --> 00:31:52,440 Speaker 1: That's where this radio show is based.
Actually they play 684 00:31:52,440 --> 00:31:56,600 Speaker 1: in West Sac, which is its own individual town, West Sacramento, anyway. 685 00:31:56,640 --> 00:31:59,320 Speaker 1: So it's been very exciting that this major league baseball 686 00:31:59,360 --> 00:32:02,720 Speaker 1: team is playing in this tiny minor league park, and so 687 00:32:02,960 --> 00:32:05,840 Speaker 1: you can go to these Major League baseball games and 688 00:32:06,960 --> 00:32:11,120 Speaker 1: the worst seat there is quite possibly the best seat 689 00:32:11,160 --> 00:32:13,920 Speaker 1: you've ever had in a Major League baseball game. I mean, 690 00:32:13,960 --> 00:32:16,760 Speaker 1: it's that much smaller than a major league baseball park. 691 00:32:16,760 --> 00:32:19,120 Speaker 1: What does it seat, like eleven thousand or nine thousand 692 00:32:19,200 --> 00:32:22,200 Speaker 1: or something like that, as opposed to fifty five thousand 693 00:32:22,280 --> 00:32:26,360 Speaker 1: if you go to a regular one. But they beat Detroit 694 00:32:26,440 --> 00:32:30,360 Speaker 1: last night. Yeah, it's only one game, doesn't really matter. 695 00:32:30,400 --> 00:32:32,680 Speaker 1: The A's are in last place. But so it made 696 00:32:32,680 --> 00:32:36,640 Speaker 1: me look up where the baseball season is currently. 697 00:32:36,640 --> 00:32:38,600 Speaker 1: I'm not a hardcore fan. I will watch when the 698 00:32:38,600 --> 00:32:41,080 Speaker 1: playoffs start. They're about one hundred and thirty games in. 699 00:32:41,600 --> 00:32:43,760 Speaker 1: They play one hundred and sixty two per year, so 700 00:32:43,800 --> 00:32:46,320 Speaker 1: we've got about thirty games left. Do you know what 701 00:32:46,360 --> 00:32:49,080 Speaker 1: team has the best record in all of Major League Baseball? 702 00:32:49,360 --> 00:32:52,960 Speaker 1: It's the Milwaukee Brewers with eighty two wins in the 703 00:32:53,040 --> 00:32:57,200 Speaker 1: National League.
They've won seven more games than the LA Dodgers, 704 00:32:57,360 --> 00:33:00,360 Speaker 1: who have a payroll of about nine quintillion dollars 705 00:33:00,560 --> 00:33:02,680 Speaker 1: and have tried to buy every All Star that exists 706 00:33:02,720 --> 00:33:05,760 Speaker 1: in the entire sport. So the Brewers are way ahead 707 00:33:05,760 --> 00:33:08,280 Speaker 1: of the Dodgers in terms of record. So that 708 00:33:08,440 --> 00:33:09,840 Speaker 1: is going to be a fun one to watch. They 709 00:33:09,920 --> 00:33:12,720 Speaker 1: might be on a collision course, the Dodgers and the Brewers, 710 00:33:12,840 --> 00:33:15,640 Speaker 1: and perhaps the Padres also, and the Phillies. But 711 00:33:16,040 --> 00:33:18,480 Speaker 1: I would love to see the Dodgers Brewers, man. You talk 712 00:33:18,480 --> 00:33:22,480 Speaker 1: about a tiny market against the biggest of big dogs. Yeah, 713 00:33:22,520 --> 00:33:25,840 Speaker 1: that would be pretty fun. Different story. There's been a 714 00:33:25,840 --> 00:33:30,000 Speaker 1: lot of talk about Cracker Barrel and their remodel, which 715 00:33:30,640 --> 00:33:32,959 Speaker 1: I don't like the looks of the new insides, if 716 00:33:33,000 --> 00:33:35,000 Speaker 1: that's actually going to happen, and whether or not it's 717 00:33:35,000 --> 00:33:36,960 Speaker 1: woke, and all that different sort of stuff. 718 00:33:37,400 --> 00:33:40,480 Speaker 1: What do you call that kind of dining? There's a 719 00:33:40,600 --> 00:33:45,080 Speaker 1: name for it. Is it comfort food? Or, I don't know, 720 00:33:45,120 --> 00:33:47,800 Speaker 1: I don't know what you call the place, like your Applebee's, 721 00:33:47,840 --> 00:33:53,600 Speaker 1: your Chili's, your Cracker Barrels. They're not fast food. There's 722 00:33:53,640 --> 00:33:56,640 Speaker 1: a name for it. It's chain restaurants.
Yeah, they're chains, 723 00:33:56,720 --> 00:33:58,959 Speaker 1: but there's a name for that kind of dining anyway, 724 00:33:59,000 --> 00:34:01,960 Speaker 1: you know, it's obviously a step 725 00:34:02,400 --> 00:34:07,240 Speaker 1: above fast food, but it's certainly not fancy by any means. 726 00:34:07,600 --> 00:34:09,759 Speaker 1: Is Hooters? I think Hooters is one of those too. 727 00:34:10,000 --> 00:34:13,360 Speaker 1: But yeah, it's obviously had its own brand for the chicks 728 00:34:13,400 --> 00:34:18,120 Speaker 1: wearing those ridiculous outfits. I mean, they're just stupid looking outfits. 729 00:34:18,200 --> 00:34:20,160 Speaker 1: I don't care how hot you are, it's still a 730 00:34:20,280 --> 00:34:21,520 Speaker 1: stupid looking outfit. 731 00:34:21,640 --> 00:34:23,600 Speaker 6: Apparently it's referred to as casual dining. 732 00:34:25,000 --> 00:34:27,359 Speaker 1: I think it's casual fast or some people call it 733 00:34:27,400 --> 00:34:28,399 Speaker 1: brass and glass, but. 734 00:34:28,520 --> 00:34:32,120 Speaker 6: Yeah, Hooters totally falls into that with their raunchy outfits. 735 00:34:32,600 --> 00:34:37,840 Speaker 6: So they were calling it a breastaurant, by the way. 736 00:34:38,360 --> 00:34:43,520 Speaker 1: Yeah, and we've never quite understood the hubbub over Hooters 737 00:34:43,760 --> 00:34:48,920 Speaker 1: because it's a sports bar with young attractive women wearing 738 00:34:49,120 --> 00:34:51,719 Speaker 1: shorts and tank tops. You know what other sports bars 739 00:34:51,760 --> 00:34:54,680 Speaker 1: are like that? Every single one of them in 740 00:34:54,760 --> 00:34:58,040 Speaker 1: the country is like that. Young attractive women usually work 741 00:34:58,080 --> 00:34:59,959 Speaker 1: in there because they make good tips and they wear 742 00:35:00,160 --> 00:35:02,360 Speaker 1: shorts and tank tops.
So, you know, Hooters is nothing 743 00:35:02,480 --> 00:35:05,279 Speaker 1: different than that. But anyway, apparently, and I don't 744 00:35:05,280 --> 00:35:07,680 Speaker 1: know anything about the Hooters business model, don't quote me 745 00:35:07,680 --> 00:35:10,040 Speaker 1: on this, I don't care, but they were going out 746 00:35:10,040 --> 00:35:12,640 Speaker 1: of business, or one section of them was going out 747 00:35:12,680 --> 00:35:15,840 Speaker 1: of business, or fifty locations, I know, were bankrupt. And 748 00:35:15,880 --> 00:35:18,840 Speaker 1: so they're trying to come up with a rebrand 749 00:35:18,960 --> 00:35:23,719 Speaker 1: where perhaps the girls wear longer orange shorts but still 750 00:35:23,760 --> 00:35:29,560 Speaker 1: the tank tops. But according to their website, they're still 751 00:35:29,600 --> 00:35:34,520 Speaker 1: looking for servers who will maintain a glamorous hairstyle and 752 00:35:34,600 --> 00:35:38,840 Speaker 1: have the ability to maintain an attractive, fit image. 753 00:35:39,239 --> 00:35:42,960 Speaker 6: Oh oh, there's the problem right there. 754 00:35:43,360 --> 00:35:45,239 Speaker 1: So I don't know. We were talking with Tim the 755 00:35:45,320 --> 00:35:48,719 Speaker 1: lawyer about this earlier where this moving company might be 756 00:35:48,840 --> 00:35:52,440 Speaker 1: in trouble because they only hire young, strong men to 757 00:35:52,520 --> 00:35:55,960 Speaker 1: move furniture. They're not hiring a lot of sixty three 758 00:35:56,040 --> 00:35:58,719 Speaker 1: year olds with a bad back.
And whether or not 759 00:35:58,800 --> 00:36:01,160 Speaker 1: that, you know, is an equal employment problem or whatever, 760 00:36:01,440 --> 00:36:03,840 Speaker 1: I don't know what situation Hooters is in 761 00:36:03,920 --> 00:36:08,319 Speaker 1: if you have to have a glamorous hairstyle, according to 762 00:36:08,360 --> 00:36:11,640 Speaker 1: who? Them. And maintain an attractive and fit image. 763 00:36:11,760 --> 00:36:14,359 Speaker 6: See, that's where all of my red flags are going up, 764 00:36:14,400 --> 00:36:18,600 Speaker 6: because here come the body positivity people going, well, technically, 765 00:36:18,600 --> 00:36:20,800 Speaker 6: according to my doctor, I'm perfectly healthy. 766 00:36:22,040 --> 00:36:23,920 Speaker 1: I don't know how you get around that stuff. I mean, 767 00:36:23,960 --> 00:36:28,479 Speaker 1: I've known people who ran sports bars. I've known owners 768 00:36:28,520 --> 00:36:32,239 Speaker 1: who owned sports bars, and all the chicks that worked 769 00:36:32,320 --> 00:36:35,960 Speaker 1: there were attractive and wore shorts and tank tops. But I 770 00:36:36,000 --> 00:36:38,320 Speaker 1: don't think anybody said anything. I don't think it was 771 00:36:38,360 --> 00:36:42,040 Speaker 1: a rule. It's just he hired people that he thought 772 00:36:42,120 --> 00:36:44,520 Speaker 1: would look good and they knew what they were supposed 773 00:36:44,520 --> 00:36:47,200 Speaker 1: to wear, or were perfectly happy wearing that, because that 774 00:36:47,280 --> 00:36:49,040 Speaker 1: was what was going to make the most tips. I don't 775 00:36:49,080 --> 00:36:52,080 Speaker 1: know that it had to be in writing. Hmmm, well, 776 00:36:52,120 --> 00:36:54,960 Speaker 1: we'll see if Hooters does away with those ridiculous shorts 777 00:36:55,120 --> 00:36:57,360 Speaker 1: or keeps having them wear them. Do they still wear the nylons? 778 00:36:57,360 --> 00:36:59,440 Speaker 1: I haven't.
I think I've been in a Hooters once 779 00:36:59,480 --> 00:37:00,960 Speaker 1: in my life and I thought, this is not my 780 00:37:01,080 --> 00:37:04,360 Speaker 1: kind of place. Yeah, they wore the nylons with the 781 00:37:04,480 --> 00:37:08,000 Speaker 1: stupid little... I have not been since high school. I lived 782 00:37:08,080 --> 00:37:10,320 Speaker 1: next to a Hooters. As a drunk, I lived a 783 00:37:10,360 --> 00:37:12,440 Speaker 1: block away from a Hooters. I never went in there. 784 00:37:12,719 --> 00:37:15,120 Speaker 1: If you miss a segment, get the podcast Armstrong and 785 00:37:15,160 --> 00:37:19,200 Speaker 1: Getty on demand. Armstrong and Getty