Speaker 1: Hey, it's Mark Simone. Welcome to the bonus segment for you podcast listeners. We'll give you, like, an extra show every week. So, November, you've got the midterm elections. You know, you've also got the race for governor here in New York. We hope Bruce Blakeman wins. He'd be the best governor we could have right now. But, you know, you've also got a governor's race in Connecticut. Ned Lamont. He's a Democrat. He'd be running for, I think it's a third term. Yeah, he'd be running for a third term. Now, Lamont is a Democrat. He's not that bad, because he was for many years a very successful businessman. So sometimes he's a pretty common-sense Democrat. He goes a little left sometimes, but he's not the worst. He will be running for a third term in Connecticut now, and the Republicans, looks like they're going to nominate, what's his name, Ryan, I'm blanking on his name, the state legislator. A very, very good guy. Republicans like him. But here's the thing. Ned Lamont, guess who he brings in to campaign for him. Al Sharpton. Al Sharpton showed up to campaign for Ned Lamont at the State Capitol in Hartford. Yeah, Sharpton. Look at this.
Sharpton's motorcade of SUVs pulled up to the State Capitol. This is this week, around two p.m. A motorcade of SUVs. Who the hell does this guy think he is? Al Sharpton. You know, if you're donating to his, whatever the hell he does, you're paying for these motorcades of SUVs. Sharpton's team spent about a half hour in the governor's second-floor office, joined by Democratic senators who took time out from their day's business to talk about the need for better training for police. If you want to really learn about how to better train police, I don't think Al Sharpton's the guy you want to go to. But listen, again, I guess Lamont has to do that. Sharpton with his motorcade of SUVs. They must have driven up from New York. You wouldn't think they'd have any pull in Connecticut, or that anybody would pay any attention to them, but apparently they do. Ryan Fazio, that's the guy. He's going to be the Republican nominee. Ryan Fazio, state senator, good guy, represented Connecticut for years, been reelected twice. So we'll see.

Hey, you know, the Epstein fallout continues. And the thing is about Epstein.
He did have a lot of money. He had four or five hundred million, six hundred million at one point, and you've got to keep that money in banks. And if you've got hundreds of millions of dollars and you want to keep it in the bank, you're like Jed Clampett: Mr. Drysdale comes right to your house and waits on you hand and foot. You're a big, big, big customer for the bank. So JPMorgan Chase, you know, they stood by Epstein right to the end, because he probably had a couple hundred million in the bank. Now, Bank of America was his other bank. He kept a couple hundred million in Bank of America, and they were very nice to Epstein, even after he got arrested in Palm Beach, even after he went to prison, after he got out. Hey, what do you want? They're bankers. You're going to put a couple hundred million in, they'll be very, very, very nice to you. But Bank of America has been sued over turning a blind eye to Jeffrey Epstein's crap, and Bank of America just announced a settlement. They've settled the lawsuit. It was a Manhattan federal lawsuit.
They've settled for seventy-three million dollars. They're going to pay seventy-three million to settle that lawsuit. Now listen to this. JPMorgan also got sued. They settled for almost three hundred million. That's how much they paid. I wouldn't think they were even able to make that much money off Epstein, but it cost them a fortune to get out of it. So it's seventy-three million for Bank of America, three hundred million for JPMorgan Chase. He also had money in Deutsche Bank. They've agreed to settle for seventy-five million. Now, legal experts say the banks had to settle because they couldn't go through a really messy discovery process, and they want to keep all their internal documents out of the public view. They don't want to answer any of these questions. They don't want, you know, their books opened up. So that's a lot of money in settlements.

You know, the Iranian war, it's all about the weapons and nuclear material and all of that, but for Iranian women it's a big deal, especially the older ones. They want their old lives back. You know.
Before nineteen seventy-nine, before the crazy Ayatollah took over, Iran was a free place. It was a democratic place, it was a capitalist place. It was quite a place, too. Iran was considered the Paris of the Middle East. It was a wonderful place to live, and the Shah of Iran was running things back then. It was a very happy, very prosperous, very elegant sort of a place. And a lot of women say back then it was great for women. Women worked as lawyers, as doctors, as politicians. Universities were full of them. Women entered the diplomatic corps, the judiciary, even the police force. Then nineteen seventy-nine happened, and for women everything changed suddenly. The marriage age for girls dropped to nine years old. Nine years old. The hijab became mandatory. You had to cover yourself up, and if you were caught not doing it, you could be thrown in prison, you could be whipped, you could be shot. This is what happened in Iran. It was methodical and brutal. Thousands were killed. Women were deliberately shot in their faces, shot in their breasts, shot in their genitals.
It was one of the punishments the crazy Ayatollah regime had. Girls were maimed and murdered in so-called honor killings. It was just awful. This is forty-six years of this crap. So you would think the left wing would want to get rid of this regime so that women would be free again. And if you were gay in Iran, you'd literally be executed for that. You'd think liberals would fight for getting rid of this kind of a government. But the Trump derangement syndrome is stronger, so somehow they're suddenly on Iran's side in all of this.

Hey, you know, we've got this crazy mayor, Mamdani, in New York, this crazy left-wing kook of a mayor. And they have these kinds of mayors in some other cities. So there's about five of them, really kooky left wing. Boston has one, Oakland has one, Seattle has one, Los Angeles has one, Chicago has one. All talking to each other. They want to combine and form a supergroup. This is not a good thing.
These six, it's actually six of these leftist mayors, including Mamdani, want to form an ultra-woke supergroup that would seek to go national, turn the whole United States into this crazy left-wing, kooky stuff. Right now, we're in the process of forming some sort of mayors' coalition, and we'll fight together. Somebody's got to do something about these people. Somebody's got to get them out of office. Now, if we get a real governor, like a Bruce Blakeman, he can remove Mamdani from office. Now, remember, all these cities have caused millions of people to leave, heading to California, heading to Florida. All of these mayors have filled their streets with homelessness, drug use, a terrible justice system, criminals not going to jail, legalized shoplifting, all of this kind of stuff. So this is not good. This supergroup, keep an eye on this.

Hey, you go to the store, supermarket, drugstore, or whatever. You know, you don't want to wait in these lines to pay. So they've got the self-checkout. You can check out yourself.
It's not that easy, you know. It's easy with, you know, anything in a can, in a box, barcode, that's it. Now, you get something you've got to weigh, it's complicated, and they still have to have a couple of employees there. Now, Costco is working on this. Costco wants to go totally automated pay stations, no cashiers anymore. They're working on a checkout that could take under ten seconds. They're testing it out in places. Everything will be scannable instantly. You'll have a mobile wallet, something in your phone, a Costco app. And then, when it comes to a lot of stuff, you can order it ahead of time, you know, like Chipotle or those fast-food places, where you go on the app, you order everything, and then it tells you when to show up and pick it up. And when you show up to pick it up, it's there in a bag, just waiting for you. You don't have to talk to anybody, you don't have to do anything. So Costco's working on that. Ah, don't worry, I mean, there's going to come a time where you're never going to talk to anybody ever again. Now you don't talk to anybody anymore.
You don't make phone calls, you just go online. It used to be you'd go into a store to order something, talk to somebody. In two years, you won't be talking to anybody in the store, and there may not be anybody there; they may all be replaced by AI or whatever. In fact, here's a new study that just came out. They expect about one point five trillion in lost household income from people losing their jobs to AI. And what jobs are most vulnerable to AI? What jobs are most likely to be lost to AI? Well, number one, web and digital designers, web developers, database architects, computer programmers, data scientists. So far, this is good. This doesn't affect any of us, I don't think. These are the jobs most likely to be lost to AI. Financial risk specialists. Ah, I don't even know what that is, but I know I don't. Ah, here's court reporters. Court reporters, yeah, that could be all AI. Information security analysts. I don't know what the hell that is. That's a good thing to use. Anybody asks you what you do for a living, just say, "I'm an information security analyst." It sounds great.
173 00:10:30,400 --> 00:10:32,920 Speaker 1: Nobody knows what the hell it is, but it sounds impressive. 174 00:10:33,760 --> 00:10:36,400 Speaker 1: A database administrator. I don't know what that is either, 175 00:10:36,400 --> 00:10:39,560 Speaker 1: but it sounds good. Medical record specialists that could be 176 00:10:39,600 --> 00:10:43,800 Speaker 1: lost too. Now what are the safest jobs? What jobs 177 00:10:43,920 --> 00:10:47,040 Speaker 1: do not have to worry about AI? Safest jobs? Now 178 00:10:47,040 --> 00:10:49,360 Speaker 1: this could change at some point, but right now, safest 179 00:10:49,440 --> 00:10:55,800 Speaker 1: job roof bolters and mining all right, number two, excavating 180 00:10:56,200 --> 00:11:02,240 Speaker 1: loading machine, number three, orderlies before coating, painting and spraying 181 00:11:02,320 --> 00:11:05,240 Speaker 1: machine setters. But see, at some point they'll figure out 182 00:11:05,280 --> 00:11:10,280 Speaker 1: how to have robots do this. Fiberglass laminator, surgical assistance. Okay, 183 00:11:11,200 --> 00:11:15,000 Speaker 1: now here's one. I think it's number nine, but I 184 00:11:15,000 --> 00:11:20,000 Speaker 1: think it's the one that's least vulnerable. Massage therapists. Massage therapist, 185 00:11:20,000 --> 00:11:22,240 Speaker 1: because even if you can get a robot to do it, 186 00:11:22,280 --> 00:11:24,200 Speaker 1: nobody's gonna want a roebut, they're gonna want a woman 187 00:11:24,280 --> 00:11:28,720 Speaker 1: doing it. Yeah, definitely, that's gonna be the safest. Hey, 188 00:11:28,840 --> 00:11:33,280 Speaker 1: speaking of Epstein, Bill Gates had a longtime science advisor, 189 00:11:34,520 --> 00:11:37,480 Speaker 1: you know, with a white coat and a bow tie. Well, 190 00:11:37,520 --> 00:11:39,480 Speaker 1: it turns out if you look him up in the 191 00:11:39,480 --> 00:11:42,000 Speaker 1: Epstein files, boy, this guy was as bad as Gates. 
The science advisor was writing right to Epstein, just like Bill Gates, about women, this woman, that woman. He's asking for nude pictures. Do you have any nude pictures you can send? This is the science advisor. The emails contained very explicit chatter, Bloomberg noted. There's no evidence of any sex crimes with these guys, but they both look pretty seedy. And this is Bill Gates's top science advisor. He said they were at a dinner with Epstein and a bunch of women, and one of them touched him under the table. In another one, he's asking if Epstein can send him some strippers. So this is Bill Gates, this is his science advisor. Unbelievable.

So, hey, did you hear about that whale that washed ashore at Rockaway Beach? A forty-five-foot whale washed ashore, and everybody showed up to take pictures. Apparently they believe the whale died in a collision with a boat. A boat hit him, killed him. The whale washed ashore. So everybody on the beach, this is New York, you know, Long Island, they all ran around taking selfies in front of the whale. Scientists are warning, don't do that.
If you see a dead whale on the beach, get away. When a deceased whale becomes beached, decomposition begins. The whale begins to bloat. Sunlight can hasten the process. Gases build up inside, and the whale at some point could explode. Yeah, a beached whale in the sun will explode at some point and could kill people or seriously injure them, so do not get near a whale if you see one beached. Find something else for your Instagram post.

Hey, this has been the bonus segment of the podcast. Thanks a lot for listening. I'm here normally, live, ten to noon every weekday, or listen anytime. Get the podcast. Thanks for listening. I'll see you Monday on 710 WOR.