1 00:00:01,000 --> 00:00:06,200 Speaker 1: Bold, irreverent, and occasionally random. The Sunday Hang with Clay 2 00:00:06,200 --> 00:00:10,320 Speaker 1: and Buck podcast, it starts now. Just. 3 00:00:10,280 --> 00:00:13,119 Speaker 2: Got a VIP email. I love the way it starts, 4 00:00:13,560 --> 00:00:18,239 Speaker 2: albeit rare. Clay is right, albeit rare is a very 5 00:00:18,280 --> 00:00:22,079 Speaker 2: good one. Uh. They have covered Biden's ass and now they 6 00:00:22,160 --> 00:00:26,080 Speaker 2: release info about Biden which could end him. Either Merrick 7 00:00:26,160 --> 00:00:30,800 Speaker 2: Garland has seen the light or something else is afoot. 8 00:00:30,960 --> 00:00:33,199 Speaker 3: I'm just letting all of our beloved VIPs know I'm 9 00:00:33,280 --> 00:00:34,080 Speaker 3: keeping all of these. 10 00:00:34,320 --> 00:00:34,479 Speaker 2: Oh. 11 00:00:34,520 --> 00:00:37,320 Speaker 3: I do like this guy in particular if he apparently 12 00:00:37,400 --> 00:00:41,760 Speaker 3: is keeping score. I do. Keep in mind all the 13 00:00:41,760 --> 00:00:43,920 Speaker 3: people that say that it's not going to be Biden. 14 00:00:44,240 --> 00:00:46,040 Speaker 3: I'm going to send you all a mass email when 15 00:00:46,080 --> 00:00:48,200 Speaker 3: it is Biden, and just ask where are you taking 16 00:00:48,280 --> 00:00:51,240 Speaker 3: me out for steak? All across America, in your hometown, 17 00:00:51,320 --> 00:00:54,000 Speaker 3: where am I getting the best steak? Because you're going 18 00:00:54,040 --> 00:00:55,160 Speaker 3: to owe me, folks. 19 00:00:55,440 --> 00:00:57,920 Speaker 2: You've got receipts, all right? You also mentioned this, the 20 00:00:57,960 --> 00:01:00,000 Speaker 2: Wall Street Journal. What have you got in front of you? 21 00:01:00,120 --> 00:01:00,480 Speaker 2: All right? 22 00:01:00,520 --> 00:01:03,040 Speaker 3: The American jobs that skew most heavily to one gender.
23 00:01:03,040 --> 00:01:05,039 Speaker 3: I'm gonna have Clay pick once I tell him what 24 00:01:05,040 --> 00:01:07,600 Speaker 3: the jobs are. But we first wanted to see the 25 00:01:07,800 --> 00:01:13,839 Speaker 3: profession in this country with the highest proportion of female workers. 26 00:01:14,360 --> 00:01:18,360 Speaker 2: Here's what I would guess. I'm gonna give you three guesses. 27 00:01:18,400 --> 00:01:20,479 Speaker 2: There's five on this list, you told me. I'm gonna 28 00:01:20,480 --> 00:01:24,480 Speaker 2: give you three jobs that I think would overwhelmingly skew female. 29 00:01:25,160 --> 00:01:28,840 Speaker 2: One would be like preschool kindergarten teacher. Is 30 00:01:28,840 --> 00:01:30,760 Speaker 2: that on the list? Yes, ding. Do we have a 31 00:01:30,840 --> 00:01:36,640 Speaker 2: ding bell? Another would be like, I don't know if 32 00:01:36,640 --> 00:01:42,920 Speaker 2: it's too specific, but I would say like some 33 00:01:42,959 --> 00:01:49,680 Speaker 2: sort of like cosmetic skin, like, you know, like care 34 00:01:50,000 --> 00:01:52,000 Speaker 2: ish thing. Like, have I nailed it? Again, you didn't 35 00:01:52,000 --> 00:01:54,120 Speaker 2: see this list. No, I've not seen the list. 36 00:01:54,280 --> 00:01:56,640 Speaker 3: Okay, those are number one and number two. Just, you 37 00:01:56,680 --> 00:01:59,280 Speaker 3: know, you said those two first, and number one is 38 00:01:59,360 --> 00:02:00,560 Speaker 3: skincare specialist. 39 00:02:01,000 --> 00:02:04,400 Speaker 2: Skincare. I've never heard of or seen 40 00:02:04,360 --> 00:02:06,080 Speaker 3: skincare specialists who aren't female. 41 00:02:06,320 --> 00:02:09,560 Speaker 2: Yeah, like I'm thinking like Rodan and Fields, like all that stuff, 42 00:02:09,600 --> 00:02:12,200 Speaker 2: like they never, they never, there's never a man. You 43 00:02:12,280 --> 00:02:15,800 Speaker 2: nailed two. And the third?
What would I say is 44 00:02:15,840 --> 00:02:21,720 Speaker 2: the third most female centric job? I'm thinking already. I. 45 00:02:21,800 --> 00:02:26,000 Speaker 3: Think, think like a sexist, Clay. Yeah, think. 46 00:02:25,880 --> 00:02:32,080 Speaker 2: Yeah, think like a sexist. Oh man, I'm thinking of 47 00:02:32,120 --> 00:02:35,440 Speaker 2: the men. I'm thinking of the men. Secretaries. The, uh, oh, 48 00:02:35,440 --> 00:02:37,440 Speaker 2: that's super... Yeah, that's a good point. 49 00:02:37,520 --> 00:02:38,880 Speaker 3: Yeah. I was trying to help you with that. I 50 00:02:38,919 --> 00:02:41,040 Speaker 3: threw you a clue. You had phone a friend on 51 00:02:41,160 --> 00:02:43,880 Speaker 3: live radio right there. And then the next two, so 52 00:02:43,919 --> 00:02:47,919 Speaker 3: it's skincare specialist, preschool kindergarten teacher number two, number three 53 00:02:48,040 --> 00:02:52,880 Speaker 3: legal or just secretaries, administrative assistants; four, dental hygienists. 54 00:02:52,919 --> 00:02:54,919 Speaker 2: Oh, I should have gotten that one. 55 00:02:55,000 --> 00:02:57,920 Speaker 3: And then five is speech language pathologists. 56 00:02:58,400 --> 00:03:00,360 Speaker 2: Okay, men. Let me give you some that I 57 00:03:00,360 --> 00:03:05,560 Speaker 2: immediately think of, like a mechanic. Mechanics. Mechanics got to 58 00:03:05,600 --> 00:03:10,320 Speaker 2: be on the list. Like construction worker, yes. A like 59 00:03:10,480 --> 00:03:13,079 Speaker 2: plumber or something like that. Like, is there any plumber? 60 00:03:13,200 --> 00:03:19,200 Speaker 2: Yes. Like, that's three. I just nailed three. 61 00:03:19,200 --> 00:03:21,960 Speaker 3: The other two, all right: logging workers and 62 00:03:22,120 --> 00:03:25,000 Speaker 3: a mason or stonemason. Those are the top five.
If 63 00:03:25,040 --> 00:03:30,480 Speaker 3: you have to pick one of the female centric jobs, 64 00:03:30,680 --> 00:03:32,760 Speaker 3: which one is it, to do yourself? 65 00:03:34,680 --> 00:03:43,800 Speaker 2: Second? Man, those all sound like awful jobs. I was 66 00:03:43,800 --> 00:03:47,480 Speaker 2: gonna look at... Travis, I was gonna say secretary, but 67 00:03:47,560 --> 00:03:49,280 Speaker 2: I would just feel like such a little. 68 00:03:49,080 --> 00:03:53,280 Speaker 3: Bit. I would say, of the, of the male worker jobs, 69 00:03:53,320 --> 00:03:54,200 Speaker 3: you can tell me which one. 70 00:03:54,480 --> 00:03:58,160 Speaker 2: I'd go logging, just because at least you're outdoors. But 71 00:03:58,200 --> 00:03:58,520 Speaker 2: I don't know. 72 00:03:58,560 --> 00:04:01,800 Speaker 3: If we have any loggers listening, please call in and 73 00:04:01,840 --> 00:04:02,640 Speaker 3: tell me what it's like. 74 00:04:02,600 --> 00:04:05,360 Speaker 2: To be? I'd be totally afraid of getting hit by 75 00:04:05,360 --> 00:04:08,320 Speaker 2: a tree. That's a tough job, man. I would probably 76 00:04:08,360 --> 00:04:10,480 Speaker 2: be a plumber. I don't know, a plumber, that's kind 77 00:04:10,480 --> 00:04:12,680 Speaker 2: of a gross job. But at least you're indoors, the 78 00:04:12,720 --> 00:04:16,120 Speaker 2: weather's not that bad. Plumbers fix problems and are super 79 00:04:16,160 --> 00:04:20,400 Speaker 2: necessary. Plumbers help people with a smile. I'd probably 80 00:04:20,520 --> 00:04:24,600 Speaker 2: go with the kindergarten teacher. I think if I had 81 00:04:24,600 --> 00:04:27,120 Speaker 2: to pick one of those, people might... I know they'd 82 00:04:27,120 --> 00:04:33,400 Speaker 2: ask questions. The Sunday Hang with Clay and Buck. I think I would.
83 00:04:33,680 --> 00:04:36,719 Speaker 2: I would change my answer because I was thinking about 84 00:04:36,720 --> 00:04:40,240 Speaker 2: how much one kid, when they're like four or five, 85 00:04:40,520 --> 00:04:43,640 Speaker 2: is incredibly difficult to handle. I don't think I could 86 00:04:43,680 --> 00:04:47,960 Speaker 2: be an elementary, like a kindergarten school teacher at this age. 87 00:04:48,640 --> 00:04:50,720 Speaker 3: If you're an undercover police officer and you got to 88 00:04:50,760 --> 00:04:53,039 Speaker 3: yell out, it's not a tumor, that's. 89 00:04:52,920 --> 00:04:57,880 Speaker 2: Right. Of course. I, I, I think that job would 90 00:04:57,920 --> 00:05:01,880 Speaker 2: be too draining. I would say secretary, but I'm defining... 91 00:05:02,120 --> 00:05:06,360 Speaker 2: and I'm defining secretary now broader as basically personal assistant. 92 00:05:07,000 --> 00:05:09,960 Speaker 2: And you could be a personal assistant to someone who 93 00:05:10,080 --> 00:05:13,479 Speaker 2: has a really cool job, and then you're still sort 94 00:05:13,480 --> 00:05:16,479 Speaker 2: of in a tough spot because you have to take 95 00:05:16,520 --> 00:05:19,880 Speaker 2: care of all sorts of details. But I think that 96 00:05:20,040 --> 00:05:21,280 Speaker 2: job would not be awful. 97 00:05:21,360 --> 00:05:24,800 Speaker 3: So I would think this one through... anyway, I'm, 98 00:05:24,880 --> 00:05:27,360 Speaker 3: we're going. And I would also say I'm disappointed in myself 99 00:05:27,360 --> 00:05:30,680 Speaker 3: that I didn't get hygienists, because during our break, I 100 00:05:30,760 --> 00:05:34,279 Speaker 3: was thinking, like, have you ever had a male 101 00:05:34,360 --> 00:05:37,320 Speaker 3: hygienist when you've ever gone to the dentist in your life? 102 00:05:37,360 --> 00:05:40,240 Speaker 2: I haven't.
I was thinking about it, no, and I 103 00:05:40,320 --> 00:05:43,880 Speaker 2: was like, if I had a male hygienist, I would 104 00:05:43,920 --> 00:05:47,080 Speaker 2: be so uncomfortable with the man like leaning over me 105 00:05:47,520 --> 00:05:50,320 Speaker 2: and like doing the teeth cleaning. And I don't know 106 00:05:50,360 --> 00:05:53,120 Speaker 2: why that is. I guess women are not uncomfortable with 107 00:05:53,160 --> 00:05:56,200 Speaker 2: having female hygienists because I've never seen a male hygienist, 108 00:05:57,080 --> 00:05:59,440 Speaker 2: and obviously a lot of dentists are male, so that's 109 00:05:59,480 --> 00:06:02,360 Speaker 2: not a big deal. But I think I would 110 00:06:02,480 --> 00:06:05,640 Speaker 2: leave if a man... like, remember when I said, 111 00:06:05,640 --> 00:06:10,159 Speaker 2: Buck, I refused to allow a man to babysit my 112 00:06:10,240 --> 00:06:11,919 Speaker 2: kids when they were young. Like, we were at a 113 00:06:11,960 --> 00:06:16,479 Speaker 2: hotel and the hotel provided babysitting services and a man 114 00:06:16,600 --> 00:06:19,640 Speaker 2: showed up and my wife was so upset, but I 115 00:06:19,720 --> 00:06:22,640 Speaker 2: was like, no, I will not leave my... I had 116 00:06:22,680 --> 00:06:24,360 Speaker 2: like a four year old and a two year old 117 00:06:24,400 --> 00:06:26,520 Speaker 2: at the time. I was like, I will not leave 118 00:06:26,560 --> 00:06:30,359 Speaker 2: my boys with a male babysitter. And my wife was like, 119 00:06:30,360 --> 00:06:33,279 Speaker 2: oh my god, you're embarrassing me. And I was like, sorry, 120 00:06:33,400 --> 00:06:36,919 Speaker 2: I want a, I want a female babysitter. And I 121 00:06:37,040 --> 00:06:39,719 Speaker 2: demanded that that guy leave, and they sent a woman, 122 00:06:40,000 --> 00:06:41,880 Speaker 2: like, you know, it was like college age kids 123 00:06:42,640 --> 00:06:44,960 Speaker 2: working at the resort.
They sent a college 124 00:06:44,960 --> 00:06:46,919 Speaker 2: age girl to babysit, and I was like, I'm fine 125 00:06:46,960 --> 00:06:49,159 Speaker 2: with this, we'll go out to dinner. Now, I wouldn't 126 00:06:49,200 --> 00:06:51,400 Speaker 2: have gone out to dinner if we were leaving them 127 00:06:51,440 --> 00:06:54,440 Speaker 2: with a male babysitter. Like, that's a, that's an example. 128 00:06:54,560 --> 00:06:56,080 Speaker 2: I think I would leave. If I went to the 129 00:06:56,160 --> 00:06:59,279 Speaker 2: dentist and they were like some brawny guy like leaning over, 130 00:06:59,320 --> 00:07:00,920 Speaker 2: putting his hands at my mouth, I'd be like, I 131 00:07:00,960 --> 00:07:01,880 Speaker 2: don't... you don't want to do this. 132 00:07:02,200 --> 00:07:03,600 Speaker 3: It'd be interesting to see if you were to put 133 00:07:03,600 --> 00:07:07,800 Speaker 3: out an X poll: would you leave your toddler age 134 00:07:07,920 --> 00:07:10,160 Speaker 3: children with a male babysitter that you 135 00:07:10,200 --> 00:07:12,800 Speaker 3: did not know? You know, meaning so not, like, you know, 136 00:07:12,880 --> 00:07:15,480 Speaker 3: not a family member, right? 137 00:07:16,160 --> 00:07:18,920 Speaker 3: I'm wondering what the, what the... I couldn't begin to guess. 138 00:07:18,920 --> 00:07:20,240 Speaker 3: I don't know what the numbers would be on that. 139 00:07:20,280 --> 00:07:23,840 Speaker 3: I think probably mostly agreeing with you, or with your decision. 140 00:07:24,120 --> 00:07:26,160 Speaker 2: Would you be, would you... you don't have 141 00:07:26,240 --> 00:07:28,720 Speaker 2: kids yet, but would it make you apprehensive if you 142 00:07:28,760 --> 00:07:30,000 Speaker 2: were, like, at a... you know, we were at a 143 00:07:30,080 --> 00:07:32,560 Speaker 2: nice resort and they had a babysitting service and we 144 00:07:32,560 --> 00:07:34,040 Speaker 2: were going to go out to dinner.
It's not like 145 00:07:34,040 --> 00:07:38,880 Speaker 2: we were going like far away, but I was like... Laura, 146 00:07:38,880 --> 00:07:42,240 Speaker 2: would Laura have done it? I was like, no, 147 00:07:42,240 --> 00:07:44,760 Speaker 2: no, not happening, like zero percent chance. 148 00:07:46,520 --> 00:07:49,320 Speaker 3: I think, I think... I don't know. I'm stuck on... 149 00:07:49,440 --> 00:07:51,680 Speaker 3: I think maybe I agree with you, but it's, it's 150 00:07:51,720 --> 00:07:52,280 Speaker 3: a tough one. 151 00:07:52,640 --> 00:07:55,120 Speaker 2: I think I agree with you. What, what about the 152 00:07:55,120 --> 00:07:59,000 Speaker 2: male hygienist thing? Like, can you imagine a guy like touching 153 00:07:59,040 --> 00:07:59,480 Speaker 2: your teeth? 154 00:07:59,560 --> 00:08:02,240 Speaker 3: Oh sure, there's some great male hygienists out there. 155 00:08:02,360 --> 00:08:04,520 Speaker 3: One thing I checked, I was curious: what do you 156 00:08:04,520 --> 00:08:11,840 Speaker 3: think the percentage is of male nurses? I'm thinking of Meet the Parents, right? Yeah, nurses. Yeah, what? 157 00:08:13,920 --> 00:08:17,400 Speaker 2: Eighty five percent women, fifteen percent male would be my guess. 158 00:08:17,600 --> 00:08:17,880 Speaker 2: That is? 159 00:08:18,200 --> 00:08:18,880 Speaker 3: Did you know that? That? 160 00:08:19,040 --> 00:08:19,080 Speaker 1: No? 161 00:08:19,240 --> 00:08:19,440 Speaker 4: I did. 162 00:08:19,600 --> 00:08:22,360 Speaker 3: I mean, I'm just... look, exactly what I think. I 163 00:08:22,400 --> 00:08:24,000 Speaker 3: think, Clay, I don't know, I think he got the 164 00:08:24,040 --> 00:08:25,960 Speaker 3: answers to the test here with some of this stuff. 165 00:08:26,000 --> 00:08:28,680 Speaker 3: But, uh, yes, yes, that is correct, by the way. But 166 00:08:28,720 --> 00:08:30,880 Speaker 3: that's a pretty, you know, that's 167 00:08:30,920 --> 00:08:33,000 Speaker 3: a little more than one out of ten nurses.
168 00:08:33,080 --> 00:08:36,360 Speaker 2: Yeah, and some, some nursing jobs require a lot of 169 00:08:36,400 --> 00:08:39,200 Speaker 2: physical strength, right? Like, I mean, to lift like somebody 170 00:08:39,240 --> 00:08:41,360 Speaker 2: who needs to be helped into a chair, stuff like that. 171 00:08:41,400 --> 00:08:43,480 Speaker 2: I mean, if you're, you know, five foot four, one 172 00:08:43,520 --> 00:08:46,080 Speaker 2: hundred and fifteen pounds, like a lot of nurses are, 173 00:08:46,679 --> 00:08:48,880 Speaker 2: that would be, I think, very difficult. A lot. 174 00:08:48,960 --> 00:08:52,320 Speaker 3: There's a lot of active duty military nurses who are male. Yeah, 175 00:08:52,320 --> 00:08:54,200 Speaker 3: so they do, but, you know, they're, they're tending to, 176 00:08:54,640 --> 00:08:59,400 Speaker 3: you know, military people anyway. Yeah, nursing is 177 00:08:59,440 --> 00:09:02,719 Speaker 3: not nearly as lopsided as skincare specialists. I mean, that's the other 178 00:09:02,760 --> 00:09:06,160 Speaker 3: thing too, there's this whole world. Do you realize how... 179 00:09:06,400 --> 00:09:08,440 Speaker 3: I've learned this now that I am married. Do you 180 00:09:08,480 --> 00:09:10,360 Speaker 3: realize how much women spend on skincare? 181 00:09:10,640 --> 00:09:10,800 Speaker 2: Oh? 182 00:09:10,880 --> 00:09:13,360 Speaker 3: This is... I did not. There are little, there 183 00:09:13,360 --> 00:09:16,320 Speaker 3: are little things. There are things that look like they're 184 00:09:16,400 --> 00:09:20,280 Speaker 3: made for like a Barbie house or something that have 185 00:09:20,400 --> 00:09:23,679 Speaker 3: some kind of an eye cream that costs, you know, 186 00:09:24,000 --> 00:09:27,600 Speaker 3: more than some microchip I smuggled into the country. Like, 187 00:09:27,640 --> 00:09:29,560 Speaker 3: these things are like hundreds and hundreds of dollars.
188 00:09:30,080 --> 00:09:33,480 Speaker 2: I notice it every time we travel, and there's like 189 00:09:33,520 --> 00:09:35,960 Speaker 2: a part of the sink that is for me and 190 00:09:36,000 --> 00:09:38,520 Speaker 2: a part of the sink that's for her. And this, 191 00:09:38,520 --> 00:09:41,160 Speaker 2: by the way... I will also point out, 192 00:09:41,360 --> 00:09:43,800 Speaker 2: this is why, like, I posted a picture of my 193 00:09:43,920 --> 00:09:46,800 Speaker 2: wife and me from Vegas and people were like, your 194 00:09:46,880 --> 00:09:53,160 Speaker 2: daughter's really pretty. Laura is actually older than me and 195 00:09:53,320 --> 00:09:57,320 Speaker 2: looks like fifteen years younger than me because she knows 196 00:09:57,360 --> 00:09:58,960 Speaker 2: how to take care of her skin. Women know how 197 00:09:59,000 --> 00:10:01,200 Speaker 2: to take care of skin. Like, I am forty four 198 00:10:01,760 --> 00:10:04,480 Speaker 2: and I'm waving at you right now on this, on 199 00:10:04,520 --> 00:10:07,040 Speaker 2: this screen. I don't wear makeup, like, I don't know, 200 00:10:07,320 --> 00:10:09,920 Speaker 2: you know, like, how to do any like skin care. 201 00:10:10,040 --> 00:10:12,240 Speaker 3: I'll tell you something. You know, I haven't really been 202 00:10:12,240 --> 00:10:16,560 Speaker 3: doing much TV lately, and you know, I'm focusing on 203 00:10:16,600 --> 00:10:19,480 Speaker 3: this show and the book and podcasts. I never got 204 00:10:19,480 --> 00:10:23,280 Speaker 3: comfortable with the makeup thing. Yeah, and I don't like it. 205 00:10:23,440 --> 00:10:25,199 Speaker 3: And I know... the thing is, I know you're shiny 206 00:10:25,200 --> 00:10:27,560 Speaker 3: if you don't do it. But I'm just saying, there 207 00:10:27,600 --> 00:10:29,080 Speaker 3: was never a time where I was like, oh yes, 208 00:10:29,200 --> 00:10:32,120 Speaker 3: please powder my face so I can go on television.
209 00:10:32,160 --> 00:10:34,240 Speaker 3: And, you know, everyone that you're seeing who goes 210 00:10:34,280 --> 00:10:37,240 Speaker 3: on TV hosting a show, without exception, all 211 00:10:37,280 --> 00:10:39,760 Speaker 3: the guys have makeup on and are super made up, like, 212 00:10:39,840 --> 00:10:41,400 Speaker 3: a lot of, a lot of makeup. 213 00:10:41,040 --> 00:10:43,520 Speaker 2: On, super made up. Every time I do a hit, 214 00:10:43,840 --> 00:10:46,600 Speaker 2: I don't have makeup on. I never have. So people are 215 00:10:46,720 --> 00:10:49,560 Speaker 2: like, you look old. I just look normal. I also 216 00:10:49,679 --> 00:10:51,920 Speaker 2: probably do look old because I don't sleep very much 217 00:10:51,960 --> 00:10:54,640 Speaker 2: and I work all the time. And also my wife 218 00:10:54,720 --> 00:10:56,520 Speaker 2: looks young because she knows how to use all the 219 00:10:56,600 --> 00:10:59,360 Speaker 2: skincare regimens. Like, maybe this is what Joe Biden needs. 220 00:10:59,400 --> 00:11:02,120 Speaker 2: I'll also point out, while we mentioned Joe Biden, he 221 00:11:02,160 --> 00:11:07,160 Speaker 2: looks old. He's also had, Buck, a ton of work done. 222 00:11:07,400 --> 00:11:12,240 Speaker 2: And at some point, at some point, you move from, oh, 223 00:11:12,320 --> 00:11:15,240 Speaker 2: I look a little bit younger, to you get like that 224 00:11:15,360 --> 00:11:19,720 Speaker 2: John Kerry look. Like, so fake? Yeah, so fake that 225 00:11:19,800 --> 00:11:22,120 Speaker 2: you look older, right? And I'm not an expert on 226 00:11:22,160 --> 00:11:24,480 Speaker 2: plastic surgery, yet I'm gonna say they never had any. 227 00:11:24,920 --> 00:11:28,520 Speaker 3: I'm gonna say this, and, you know, people can agree 228 00:11:28,559 --> 00:11:32,120 Speaker 3: or disagree, it's an aesthetic decision. God, how did we 229 00:11:32,160 --> 00:11:34,200 Speaker 3: even get on this?
I keep on talking about politics 230 00:11:34,200 --> 00:11:36,120 Speaker 3: and Joe Biden, and you're... I just, I. 231 00:11:36,080 --> 00:11:38,480 Speaker 2: Just shifted it to Joe Biden looking older than he 232 00:11:38,520 --> 00:11:41,080 Speaker 2: would. That's true. All right, this is the last 233 00:11:41,080 --> 00:11:42,200 Speaker 2: thing I will say on this, and then we have 234 00:11:42,200 --> 00:11:45,960 Speaker 2: to talk about Joe Biden and politics and inform everyone about 235 00:11:45,960 --> 00:11:48,559 Speaker 2: what's going on today in the world. I'm 236 00:11:48,559 --> 00:11:50,080 Speaker 2: just gonna say it. I think we need to shut 237 00:11:50,080 --> 00:11:52,320 Speaker 2: it down a little bit. Things have gotten too crazy 238 00:11:52,800 --> 00:11:55,880 Speaker 2: with plastic surgery meant to make people look younger, and 239 00:11:55,920 --> 00:11:58,480 Speaker 2: people have gotten a little too crazy with the tattoos 240 00:11:58,480 --> 00:12:00,560 Speaker 2: that you can see all the time. Like the guy 241 00:12:00,640 --> 00:12:03,480 Speaker 2: at the Super Bowl with the face tattoos. Face tattoos: 242 00:12:03,520 --> 00:12:07,800 Speaker 2: not a good idea. You're very anti tattoo. 243 00:12:07,880 --> 00:12:12,880 Speaker 3: The guy with the face tattoos, mister Post Malone. Mister Malone. Yeah, 244 00:12:12,920 --> 00:12:14,120 Speaker 3: that's the alleged singer. 245 00:12:14,600 --> 00:12:17,640 Speaker 2: But tattoos all over his face... I get it, it's 246 00:12:17,640 --> 00:12:19,280 Speaker 2: a bad look. The first person who I ever saw 247 00:12:19,320 --> 00:12:21,160 Speaker 2: with a tattoo on his face was Mike Tyson. You 248 00:12:21,160 --> 00:12:23,440 Speaker 2: remember that, when he got that crazy tattoo on his face? 249 00:12:24,559 --> 00:12:28,600 Speaker 1: Sundays with Clay and Buck. We have.
250 00:12:29,480 --> 00:12:33,679 Speaker 3: Uh, Lindsay in Ohio, who is a dental hygienist, who 251 00:12:33,720 --> 00:12:39,280 Speaker 3: says we need to know this about the dental hygienist situation. 252 00:12:39,400 --> 00:12:40,280 Speaker 3: What's going on, Lindsay? 253 00:12:41,520 --> 00:12:41,640 Speaker 1: Hi? 254 00:12:41,679 --> 00:12:44,360 Speaker 3: How are you? We're great. Thanks for calling in. 255 00:12:45,040 --> 00:12:47,160 Speaker 2: Lindsay, do you think I'm weird that if a man 256 00:12:47,320 --> 00:12:49,480 Speaker 2: were trying to put his hands in my mouth when 257 00:12:49,520 --> 00:12:51,560 Speaker 2: I went to the dentist that I would be like, 258 00:12:51,640 --> 00:12:55,000 Speaker 2: I can't do this? How many male hygienists have you seen 259 00:12:55,240 --> 00:12:59,280 Speaker 2: in your dental hygienist career? I have only seen three. 260 00:13:00,679 --> 00:13:02,840 Speaker 2: And like, how many women would you have seen? Hundreds? 261 00:13:03,960 --> 00:13:05,800 Speaker 4: Oh, absolutely, if not thousands. 262 00:13:05,960 --> 00:13:07,800 Speaker 2: So why do you think no men want to be 263 00:13:07,920 --> 00:13:10,120 Speaker 2: dental hygienists? Do you think that I'm weird that if 264 00:13:10,120 --> 00:13:10,720 Speaker 2: I went to it... 265 00:13:11,360 --> 00:13:13,320 Speaker 3: There are plenty of dentists who are men, right? 266 00:13:13,200 --> 00:13:15,360 Speaker 2: So, yeah, but they don't, they don't do the physical 267 00:13:15,480 --> 00:13:17,439 Speaker 2: labor, right? Like, they just come in and they kind 268 00:13:17,440 --> 00:13:19,800 Speaker 2: of like look and they're like, okay. Like, I mean, 269 00:13:19,880 --> 00:13:22,640 Speaker 2: what do you think? What do you attribute this to, Lindsay? 270 00:13:22,679 --> 00:13:24,720 Speaker 2: Why do no men want to be dental hygienists?
271 00:13:26,200 --> 00:13:30,560 Speaker 4: I think it's just, it's a... I feel that men 272 00:13:30,600 --> 00:13:33,480 Speaker 4: are more decision makers, like, let's go, let's get this done, 273 00:13:33,600 --> 00:13:35,640 Speaker 4: and women are more in the caring field, kind of 274 00:13:35,640 --> 00:13:38,600 Speaker 4: like the teachers and the nurses. So I think that's 275 00:13:38,640 --> 00:13:39,640 Speaker 4: the general difference. 276 00:13:40,040 --> 00:13:43,440 Speaker 2: Have you ever had a male hygienist work on your teeth? 277 00:13:45,320 --> 00:13:47,720 Speaker 4: Well, that is part of the reason I have called. Yes, 278 00:13:47,760 --> 00:13:49,400 Speaker 4: I did have one male hygienist work on them. 279 00:13:49,520 --> 00:13:52,120 Speaker 2: Did you enjoy it, or did it make you uncomfortable? 280 00:13:53,559 --> 00:13:55,920 Speaker 4: I knew him very well, so I was fine. 281 00:13:57,440 --> 00:14:01,080 Speaker 2: What, what do you think is the best, the best 282 00:14:01,120 --> 00:14:04,920 Speaker 2: flavor to get your teeth cleaned with? Oh? 283 00:14:05,040 --> 00:14:05,280 Speaker 4: Mint? 284 00:14:05,720 --> 00:14:09,400 Speaker 2: Absolutely. Yeah, I think mint's a good answer. Thank you 285 00:14:09,480 --> 00:14:12,760 Speaker 2: for... anything else in the interrogation? No, I, I, I 286 00:14:13,120 --> 00:14:16,600 Speaker 2: appreciate the work. But I can't be alone. There's something 287 00:14:16,640 --> 00:14:20,360 Speaker 2: about men's hairy knuckles, like, that grow, like, in 288 00:14:20,400 --> 00:14:21,680 Speaker 2: your... the idea... when 289 00:14:21,640 --> 00:14:24,440 Speaker 3: When the dentist checks your salivary glands. 290 00:14:24,480 --> 00:14:27,360 Speaker 2: I mean, it's not... it's fast. It's fast, and it's 291 00:14:27,400 --> 00:14:29,680 Speaker 2: not anywhere near as... like, they really get in there 292 00:14:29,720 --> 00:14:32,760 Speaker 2: on the dental hygienist cleaning.
The dentist has just kind 293 00:14:32,760 --> 00:14:34,880 Speaker 2: of come in, like, a cursory view. He's there for like 294 00:14:34,920 --> 00:14:36,359 Speaker 2: two minutes and then he's gone. 295 00:14:36,800 --> 00:14:38,760 Speaker 1: Sunday Hang with Clay and Buck. 296 00:14:40,120 --> 00:14:47,240 Speaker 3: There is a rise in the availability of simple artificial 297 00:14:47,280 --> 00:14:50,160 Speaker 3: intelligence technology out there that allows you to do some 298 00:14:50,200 --> 00:14:55,640 Speaker 3: pretty amazing stuff. You can have a very mechanically written 299 00:14:55,800 --> 00:14:59,760 Speaker 3: essay pumped out almost instantaneously on something that would have 300 00:14:59,760 --> 00:15:02,680 Speaker 3: been, when I was in, you know, seventh grade English class... 301 00:15:03,560 --> 00:15:05,920 Speaker 3: I'm just kidding. I would never, I would never 302 00:15:05,960 --> 00:15:08,520 Speaker 3: want that for any of the young people listening. Don't 303 00:15:08,560 --> 00:15:12,320 Speaker 3: ever take the shortcuts. Do all of the work. Algebra 304 00:15:12,760 --> 00:15:15,400 Speaker 3: is essential for your life. Don't let anyone tell you otherwise. 305 00:15:15,760 --> 00:15:18,520 Speaker 2: Do they still sell, like, the SparkNotes and the 306 00:15:18,560 --> 00:15:20,240 Speaker 2: CliffsNotes in the same way, or are 307 00:15:20,240 --> 00:15:23,960 Speaker 3: those online? Online, yeah. I remember that it was a 308 00:15:24,120 --> 00:15:27,120 Speaker 3: study tool where they would sell that in bookstores, the 309 00:15:27,240 --> 00:15:29,440 Speaker 3: CliffsNotes study tool. 310 00:15:29,880 --> 00:15:30,800 Speaker 1: And some of you 311 00:15:30,720 --> 00:15:33,280 Speaker 2: out there probably told your parents, oh, this is just 312 00:15:33,360 --> 00:15:35,840 Speaker 2: a supplement. I just want to make sure that I'm 313 00:15:35,840 --> 00:15:36,960 Speaker 2: not missing anything.
314 00:15:37,520 --> 00:15:41,760 Speaker 3: Yeah, I want to be fully enriched in this reading 315 00:15:41,880 --> 00:15:44,840 Speaker 3: of Catcher in the Rye without actually reading it, right? 316 00:15:44,880 --> 00:15:46,760 Speaker 3: I mean, that's the whole... that's what the SparkNotes, 317 00:15:46,760 --> 00:15:48,920 Speaker 3: CliffsNotes world could do for you. And I love 318 00:15:48,960 --> 00:15:50,720 Speaker 3: that they even had a warning on it. It was like, 319 00:15:51,320 --> 00:15:53,720 Speaker 3: don't, don't use this instead of reading the book. You're 320 00:15:53,760 --> 00:15:56,440 Speaker 3: giving your precious years in order to get an education. 321 00:15:57,080 --> 00:15:58,800 Speaker 3: And I'm sure a lot of the students who were 322 00:15:59,200 --> 00:16:01,800 Speaker 3: trying to cram the night before exams or whatever thought 323 00:16:01,800 --> 00:16:04,200 Speaker 3: to themselves, oh well, if they say I shouldn't do it, 324 00:16:04,240 --> 00:16:07,800 Speaker 3: maybe I should get back to reading it. Anyway, AI 325 00:16:08,040 --> 00:16:09,960 Speaker 3: tools now are all over the place. You can do 326 00:16:09,960 --> 00:16:11,840 Speaker 3: some pretty fun... Have you experimented at all with 327 00:16:11,880 --> 00:16:13,840 Speaker 3: AI tools, Clay, just for fun? Like, try to see 328 00:16:13,840 --> 00:16:15,880 Speaker 3: if you could... what would Clay Travis winning a Super 329 00:16:15,880 --> 00:16:16,480 Speaker 3: Bowl look like? 330 00:16:16,520 --> 00:16:17,040 Speaker 2: You know what I mean? 331 00:16:17,080 --> 00:16:18,360 Speaker 3: And it turns you into... 332 00:16:18,520 --> 00:16:21,040 Speaker 2: Have you... no, I haven't actually gone on and looked 333 00:16:21,080 --> 00:16:22,040 Speaker 2: at it at all. Have you? 334 00:16:22,120 --> 00:16:24,200 Speaker 3: I've just kind of played around with it a tiny bit. 335 00:16:24,360 --> 00:16:26,800 Speaker 3: I tried ChatGPT to see what that was like.
336 00:16:26,960 --> 00:16:31,840 Speaker 3: And I'm somebody who's... I admit this, that I'm by 337 00:16:31,880 --> 00:16:35,320 Speaker 3: no means even particularly technically inclined. I think all the 338 00:16:35,360 --> 00:16:38,600 Speaker 3: stuff about how artificial intelligence is going to turn into 339 00:16:38,920 --> 00:16:44,520 Speaker 3: something akin to the Skynet machine from Terminator, and that 340 00:16:44,960 --> 00:16:46,800 Speaker 3: the machines could destroy us all, I think that's a 341 00:16:46,800 --> 00:16:47,560 Speaker 3: little overblown. 342 00:16:48,560 --> 00:16:51,440 Speaker 2: But I've even thought about it... sorry to cut you off, 343 00:16:51,480 --> 00:16:54,720 Speaker 2: but I have thought about it in the context of 344 00:16:55,080 --> 00:16:57,840 Speaker 2: our show, and even in the context of... a lot 345 00:16:57,880 --> 00:17:00,920 Speaker 2: of you out there listened to Rush for over thirty years. 346 00:17:01,360 --> 00:17:04,000 Speaker 2: What's a little bit scary to me is, let's say 347 00:17:04,000 --> 00:17:06,800 Speaker 2: something happens to us. Rush has obviously passed, but something 348 00:17:06,840 --> 00:17:09,840 Speaker 2: happens to us down the line.
I feel like they 349 00:17:09,880 --> 00:17:13,560 Speaker 2: could put all of the audio from us, because you 350 00:17:13,600 --> 00:17:16,440 Speaker 2: could basically have us saying every word under the sun, 351 00:17:16,520 --> 00:17:20,879 Speaker 2: we're on fifteen hours a week, and say, have Clay 352 00:17:21,040 --> 00:17:24,719 Speaker 2: and/or Buck talk about, you know, something that's going 353 00:17:24,760 --> 00:17:28,200 Speaker 2: to happen in twenty fifty based on what we've said 354 00:17:28,200 --> 00:17:31,520 Speaker 2: in the past, to try to create a new version 355 00:17:31,800 --> 00:17:34,520 Speaker 2: of a show. Right? Rush talked for thirty-three years; 356 00:17:34,840 --> 00:17:39,800 Speaker 2: there is a consistency, I would argue, to the things 357 00:17:39,880 --> 00:17:43,400 Speaker 2: that he would say. Could you take what he has 358 00:17:43,440 --> 00:17:47,160 Speaker 2: said about past issues and try to create a new 359 00:17:47,320 --> 00:17:49,879 Speaker 2: version commenting on real life? That's scary. 360 00:17:49,960 --> 00:17:55,000 Speaker 3: Right, I mean, you know, the AI can't do 361 00:17:55,160 --> 00:17:58,560 Speaker 3: creativity and improvisation and these things the way that human 362 00:17:58,600 --> 00:18:01,000 Speaker 3: beings can. That's just the bottom line, right? AI doesn't 363 00:18:01,000 --> 00:18:04,800 Speaker 3: have weird moments like we hear on this show. For example, 364 00:18:05,080 --> 00:18:07,639 Speaker 3: how much time do we spend talking about whether Clay 365 00:18:07,680 --> 00:18:09,680 Speaker 3: could have a dental hygienist who was a guy?
He's like, hold on a second, Buck, 370 00:18:17,920 --> 00:18:19,920 Speaker 3: you want a dude with his fingers in the. 371 00:18:20,200 --> 00:18:22,200 Speaker 2: I was thinking about that during the break. I think 372 00:18:22,240 --> 00:18:24,800 Speaker 2: the reason there are no male hygienists is men's hands are 373 00:18:24,800 --> 00:18:26,879 Speaker 2: too big and they don't fit in mouths. I was 374 00:18:26,880 --> 00:18:29,119 Speaker 2: thinking about this while I was going to get my smoothie. 375 00:18:29,359 --> 00:18:31,199 Speaker 3: I mean, if you watch enough Seinfeld, you will know 376 00:18:31,240 --> 00:18:34,160 Speaker 3: that some ladies have man hands. It's a thing. 377 00:18:34,520 --> 00:18:36,200 Speaker 2: Man, they're probably not dental hygienists. 378 00:18:36,960 --> 00:18:40,720 Speaker 3: That's probably true as well. All right. So AI is 379 00:18:40,720 --> 00:18:43,560 Speaker 3: all over the place, and it's happening more and more 380 00:18:43,560 --> 00:18:47,400 Speaker 3: where you'll see something and it's for parody purposes. It's amazing, right, 381 00:18:47,440 --> 00:18:50,720 Speaker 3: because you can use it to, you know, 382 00:18:50,720 --> 00:18:53,119 Speaker 3: if you used an AI machine, you could have, I 383 00:18:53,119 --> 00:18:57,320 Speaker 3: don't know, Nancy Pelosi drinking chardonnay and insider trading while 384 00:18:57,359 --> 00:19:01,200 Speaker 3: cackling about, you know, the lower classes and how they're 385 00:19:01,200 --> 00:19:03,240 Speaker 3: going to be stuck on this wheel of poverty forever.
386 00:19:03,480 --> 00:19:04,760 Speaker 3: And it will make, well, that would just be 387 00:19:04,800 --> 00:19:06,879 Speaker 3: a photo of Nancy Pelosi, but it could make you 388 00:19:07,520 --> 00:19:11,600 Speaker 3: some image that would show you Nancy Pelosi, and it 389 00:19:11,600 --> 00:19:14,800 Speaker 3: would be taking into account those instructions. Right, that's the 390 00:19:14,800 --> 00:19:19,399 Speaker 3: way it works. So the companies, the big AI 391 00:19:19,600 --> 00:19:24,840 Speaker 3: companies in San Francisco, they're not banning it. This just happened, 392 00:19:25,240 --> 00:19:30,720 Speaker 3: but they are trying to come up with guidelines about 393 00:19:30,800 --> 00:19:34,320 Speaker 3: what the markings will be on it, how they'll identify 394 00:19:34,560 --> 00:19:39,800 Speaker 3: AI content. Because here's where this is all going. There's 395 00:19:39,880 --> 00:19:43,080 Speaker 3: gonna be stuff out there that looks... Because this stuff, 396 00:19:43,240 --> 00:19:45,520 Speaker 3: it used to be like you could see, you know, 397 00:19:45,560 --> 00:19:49,720 Speaker 3: with, what's that program, Photoshop, right, you can see 398 00:19:49,800 --> 00:19:52,240 Speaker 3: if somebody puts someone's head on something else, it didn't 399 00:19:52,280 --> 00:19:55,119 Speaker 3: look quite right. And with some of this AI stuff, 400 00:19:55,200 --> 00:19:59,400 Speaker 3: now you really can't tell, and that ability to look 401 00:19:59,400 --> 00:20:03,200 Speaker 3: at something and say this has been messed with, forged, faked.
402 00:20:03,520 --> 00:20:07,440 Speaker 3: Remember what a big deal it was for the Bush, 403 00:20:07,480 --> 00:20:10,479 Speaker 3: the Bush administration when he was running, what was this, 404 00:20:10,520 --> 00:20:13,200 Speaker 3: in two thousand and four, right, against John Kerry, and 405 00:20:14,000 --> 00:20:17,480 Speaker 3: Dan Rather had the forged National Guard documents. Right? Yeah, 406 00:20:17,520 --> 00:20:20,440 Speaker 3: he tried to do that to... Now, I mean, anybody 407 00:20:20,440 --> 00:20:22,840 Speaker 3: can come up with forged National Guard documents to try 408 00:20:22,880 --> 00:20:26,199 Speaker 3: to... It's so easy to make this stuff. And so 409 00:20:26,359 --> 00:20:28,280 Speaker 3: on the one hand, Clay, I'm worried about the actual 410 00:20:28,320 --> 00:20:32,800 Speaker 3: weaponization of the technology for, dare I say, disinformation purposes. 411 00:20:32,920 --> 00:20:35,760 Speaker 3: But just because the left is obsessed with disinformation that 412 00:20:35,920 --> 00:20:39,639 Speaker 3: isn't real doesn't mean that there isn't disinformation, which I 413 00:20:39,640 --> 00:20:42,120 Speaker 3: always think is important to remind people is a real thing. 414 00:20:42,359 --> 00:20:45,160 Speaker 3: They say Joe Biden can, you know, tie his own shoes. 415 00:20:45,200 --> 00:20:48,080 Speaker 3: There's plenty of disinformation out there. But Clay, the other 416 00:20:48,119 --> 00:20:50,160 Speaker 3: part of it, too: think about the ways they 417 00:20:50,160 --> 00:20:53,439 Speaker 3: can shut things down now. They can just say... Look what 418 00:20:53,480 --> 00:20:55,280 Speaker 3: they did to the Hunter Biden laptop. They said it 419 00:20:55,359 --> 00:20:59,000 Speaker 3: bears all the hallmarks of Russian disinformation. Right, that was 420 00:20:59,000 --> 00:21:00,600 Speaker 3: the way they tried to bury it in your mind.
And 421 00:21:01,119 --> 00:21:03,800 Speaker 3: maybe it even helped Joe win the election, or made Joe 422 00:21:03,800 --> 00:21:07,200 Speaker 3: win the election, by lying. Now, anytime you see something, 423 00:21:07,240 --> 00:21:09,199 Speaker 3: if you have a video of Joe Biden accepting a 424 00:21:09,200 --> 00:21:12,679 Speaker 3: bag of cash from one of Hunter's Ukrainian bagmen, you 425 00:21:12,720 --> 00:21:15,080 Speaker 3: could just say, I don't know, it looks like AI 426 00:21:15,119 --> 00:21:18,560 Speaker 3: to me. Now prove to me it's not AI. And 427 00:21:18,640 --> 00:21:20,359 Speaker 3: this is gonna be, I think, a big argument that 428 00:21:20,359 --> 00:21:22,080 Speaker 3: people have over things. They're just going to say that, 429 00:21:22,119 --> 00:21:24,040 Speaker 3: they're gonna lie about it, and it's going to really 430 00:21:24,080 --> 00:21:25,120 Speaker 3: confuse the electorate. 431 00:21:25,760 --> 00:21:29,199 Speaker 2: I think that's right, and it's going to require a 432 00:21:29,240 --> 00:21:31,160 Speaker 2: lot of... What people will say as well is, is there 433 00:21:31,200 --> 00:21:35,360 Speaker 2: another video? Right? It used to be like, oh well, 434 00:21:35,400 --> 00:21:38,520 Speaker 2: we've got video evidence of it. Okay, like that 435 00:21:38,800 --> 00:21:43,200 Speaker 2: is legit. Now it's going to turn into: if you 436 00:21:43,280 --> 00:21:45,840 Speaker 2: want to argue that something's not real, or if you 437 00:21:45,880 --> 00:21:49,040 Speaker 2: want to argue that something is real, you're going to 438 00:21:49,119 --> 00:21:53,159 Speaker 2: need multiple versions of that reality in order to prove it.
439 00:21:54,280 --> 00:21:57,879 Speaker 2: And I just, I think we're going to get into 440 00:21:57,960 --> 00:22:00,879 Speaker 2: an era where a lot of people aren't going to 441 00:22:00,960 --> 00:22:06,480 Speaker 2: care either, and they're going to spread so quickly on 442 00:22:06,560 --> 00:22:10,800 Speaker 2: social media that what is and what is not fake 443 00:22:11,040 --> 00:22:15,439 Speaker 2: is going to become increasingly difficult to tell, because if something is 444 00:22:15,520 --> 00:22:19,320 Speaker 2: real and it's particularly politically damaging, you can just say 445 00:22:19,320 --> 00:22:22,000 Speaker 2: it's fake, and how do you prove it, to your point? Right, 446 00:22:22,040 --> 00:22:27,000 Speaker 2: the Hunter Biden laptop, all the hallmarks of Russian disinformation. Well, 447 00:22:27,040 --> 00:22:29,919 Speaker 2: anybody who looked into that at all knew that, 448 00:22:29,960 --> 00:22:31,560 Speaker 2: based on the amount of 449 00:22:31,600 --> 00:22:34,640 Speaker 2: video and textual evidence, there was no way 450 00:22:34,680 --> 00:22:38,720 Speaker 2: to fake that much. But if you want to believe 451 00:22:38,920 --> 00:22:43,280 Speaker 2: that something is fake, you always have an excuse. 452 00:22:43,280 --> 00:22:44,080 Speaker 2: You always have the excuse. 453 00:22:44,160 --> 00:22:54,359 Speaker 3: You've basically created a reasonable doubt excuse for any information 454 00:22:54,480 --> 00:22:56,359 Speaker 3: that comes across the Internet that you don't like about 455 00:22:56,400 --> 00:22:57,879 Speaker 3: your guy. You can just always say, I don't know, 456 00:22:57,920 --> 00:23:00,840 Speaker 3: it looks like AI to me. Prove to me it's not AI. 457 00:23:01,680 --> 00:23:03,879 Speaker 3: And given what they did with the Hunter Biden laptop, 458 00:23:03,920 --> 00:23:06,840 Speaker 3: how could anyone think that that's far-fetched?
I think it's 459 00:23:06,840 --> 00:23:10,840 Speaker 3: almost a certainty, Clay, which is why they have this 460 00:23:11,200 --> 00:23:13,959 Speaker 3: manifesto from the biggest tech companies in the world, and 461 00:23:14,040 --> 00:23:16,240 Speaker 3: remember, these are some of the richest and most powerful 462 00:23:16,280 --> 00:23:20,200 Speaker 3: countries, I'm sorry, companies, more powerful than many countries on Earth. 463 00:23:20,520 --> 00:23:25,480 Speaker 3: We're talking about, you know, TikTok, OpenAI, Adobe, Google, Meta, 464 00:23:25,800 --> 00:23:29,520 Speaker 3: all of it. Right? These are the big players, and 465 00:23:29,800 --> 00:23:32,680 Speaker 3: there's pressure from regulators about this, and people are saying 466 00:23:32,680 --> 00:23:35,040 Speaker 3: they need to stop the spread of fake election content. 467 00:23:35,119 --> 00:23:39,080 Speaker 3: But this is why, you know, all along, the notion 468 00:23:39,240 --> 00:23:42,520 Speaker 3: that they'll be able to police lies on the internet... 469 00:23:43,760 --> 00:23:45,879 Speaker 3: You're never going to be able to police lies on 470 00:23:45,920 --> 00:23:48,080 Speaker 3: the internet. So what ends up happening is they create 471 00:23:48,160 --> 00:23:53,359 Speaker 3: the context for the disparate policing of what one side 472 00:23:53,440 --> 00:23:55,680 Speaker 3: doesn't want to hear anymore, or doesn't want to hear 473 00:23:55,680 --> 00:23:58,200 Speaker 3: in the moment. And that's how we got the twenty 474 00:23:58,240 --> 00:24:02,320 Speaker 3: twenty election, right? That's how we got the shutdown, you know, 475 00:24:02,359 --> 00:24:05,400 Speaker 3: Twitter saying that it was hacked material, all this stuff. 476 00:24:06,080 --> 00:24:09,440 Speaker 3: It was essentially a fix. So I know we don't 477 00:24:09,440 --> 00:24:13,200 Speaker 3: talk about AI a lot on this show.
The implications 478 00:24:13,240 --> 00:24:16,040 Speaker 3: for a whole range of businesses are truly enormous. 479 00:24:16,880 --> 00:24:19,680 Speaker 3: Nvidia is probably the biggest 480 00:24:19,720 --> 00:24:21,919 Speaker 3: stock success story, I think, of the year. I mean, 481 00:24:21,960 --> 00:24:25,800 Speaker 3: in terms of just having a massive market cap company, 482 00:24:26,320 --> 00:24:29,080 Speaker 3: you know, skyrocketing over the last twelve months or so. 483 00:24:29,080 --> 00:24:31,040 Speaker 3: I mean, there's probably some others, but I'm just saying 484 00:24:31,880 --> 00:24:35,520 Speaker 3: it's a huge company, and this is going... Like, we 485 00:24:35,600 --> 00:24:38,080 Speaker 3: know this is going to happen. This is coming down 486 00:24:38,119 --> 00:24:40,440 Speaker 3: the pike. And honestly, Clay, I don't 487 00:24:40,480 --> 00:24:43,359 Speaker 3: know how we really stop it, other than you just 488 00:24:43,640 --> 00:24:46,040 Speaker 3: fight bad speech with good speech, because there's going to 489 00:24:46,119 --> 00:24:51,679 Speaker 3: be AI deep fakes circulating that do sway people and 490 00:24:51,680 --> 00:24:54,119 Speaker 3: that people believe. That's going to happen. There's no question. 491 00:24:55,040 --> 00:24:56,840 Speaker 2: I think you're gonna see a lot of people have 492 00:24:56,920 --> 00:24:59,200 Speaker 2: to comment on it. I think it'll start this election, 493 00:25:00,560 --> 00:25:03,080 Speaker 2: because I would expect that there'll be both pro-Biden 494 00:25:03,400 --> 00:25:07,160 Speaker 2: and anti-Biden and pro-Trump and anti-Trump AI 495 00:25:07,240 --> 00:25:12,560 Speaker 2: fakes that are circulating on a regular basis, and I 496 00:25:12,680 --> 00:25:16,679 Speaker 2: just... Biden is speaking right now, by the way.
I 497 00:25:16,800 --> 00:25:22,919 Speaker 2: actually think this is where multiple videos become necessary, so 498 00:25:22,960 --> 00:25:25,199 Speaker 2: that you can actually see what is and what is 499 00:25:25,240 --> 00:25:31,040 Speaker 2: not said, because I don't know the answer of 500 00:25:31,440 --> 00:25:35,160 Speaker 2: how you stop it from having a massive impact going forward, 501 00:25:35,920 --> 00:25:39,200 Speaker 2: and I think we're just at the precipice of beginning 502 00:25:39,240 --> 00:25:43,160 Speaker 2: to understand how impactful it's going to be. Because it's one 503 00:25:43,160 --> 00:25:46,240 Speaker 2: thing for there to be misinformation or disinformation out there, 504 00:25:46,280 --> 00:25:49,919 Speaker 2: which, really, the way it's been applied is often just 505 00:25:50,119 --> 00:25:54,320 Speaker 2: arguments you don't like, right, more so than anything else. 506 00:25:54,400 --> 00:25:57,040 Speaker 2: That's the way they've tried to police it. But they 507 00:25:57,119 --> 00:26:00,000 Speaker 2: are arguments that are being made, and they're trying to 508 00:26:00,119 --> 00:26:03,399 Speaker 2: fact check you. What about when the argument is, is 509 00:26:03,480 --> 00:26:08,399 Speaker 2: this real or not? That becomes a totally different discussion. Right? 510 00:26:08,440 --> 00:26:11,639 Speaker 2: It's one thing to say, oh, like, I saw in the 511 00:26:11,640 --> 00:26:13,760 Speaker 2: New York Times this morning, which I was reading, and 512 00:26:13,800 --> 00:26:15,600 Speaker 2: I do think it's a good stat, you know, 513 00:26:15,640 --> 00:26:20,800 Speaker 2: what percentage of parents have gotten their kids the updated booster 514 00:26:20,560 --> 00:26:23,640 Speaker 3: shot? Tiny, I don't know, is it five? 515 00:26:24,080 --> 00:26:28,000 Speaker 2: Yeah, ninety five percent of parents have not gotten their kids 516 00:26:28,080 --> 00:26:31,680 Speaker 2: the new booster shot.
Only forty percent of parents got 517 00:26:31,720 --> 00:26:36,240 Speaker 2: their kids the initial COVID vaccine, which obviously doesn't work. 518 00:26:36,840 --> 00:26:39,800 Speaker 2: Only forty percent did. Those are facts, right? Somebody could 519 00:26:39,840 --> 00:26:43,159 Speaker 2: dispute whether or not that's true. Somebody putting out a 520 00:26:43,240 --> 00:26:46,159 Speaker 2: video that is either showing something very positive about the 521 00:26:46,200 --> 00:26:49,480 Speaker 2: COVID shot or something very negative, the first thing you 522 00:26:49,520 --> 00:26:51,080 Speaker 2: have to figure out is, is this real or not? 523 00:26:51,720 --> 00:26:55,200 Speaker 3: I see. If it's an image, for example, and there's 524 00:26:55,200 --> 00:26:58,720 Speaker 3: no telltale, you know, marks on the image or anything 525 00:26:58,760 --> 00:27:02,800 Speaker 3: to tell, how do you know? How do you know? Yeah, 526 00:27:03,119 --> 00:27:04,840 Speaker 3: I'm just saying, I mean, this is where you start 527 00:27:04,840 --> 00:27:07,440 Speaker 3: to get into... And even now we see things sometimes 528 00:27:07,440 --> 00:27:09,680 Speaker 3: on X that are being shared, and I think to myself, 529 00:27:10,200 --> 00:27:12,399 Speaker 3: is this real? Yeah, we're just hoping 530 00:27:12,440 --> 00:27:17,720 Speaker 3: that community notes and crowdsourcing is effectively going to, 531 00:27:17,800 --> 00:27:19,720 Speaker 3: you know, let us know if this was fake. 532 00:27:20,080 --> 00:27:22,439 Speaker 3: It's a big challenge, everybody. I mean, I know 533 00:27:22,520 --> 00:27:25,200 Speaker 3: it sounds like, oh, how much of this could actually 534 00:27:25,200 --> 00:27:27,560 Speaker 3: turn an election?
Maybe not this one, maybe not for 535 00:27:27,600 --> 00:27:30,240 Speaker 3: a few elections, but it's coming, where there's going 536 00:27:30,280 --> 00:27:34,800 Speaker 3: to be such a circulation of high-level 537 00:27:34,880 --> 00:27:37,800 Speaker 3: faked content, you know, deep fakes, 538 00:27:37,920 --> 00:27:41,840 Speaker 3: online that it's going to sway public opinion, and 539 00:27:41,920 --> 00:27:43,520 Speaker 3: how do you deal with it? It's really a 540 00:27:43,520 --> 00:27:47,280 Speaker 3: big challenge. And the notion that this could be banned, 541 00:27:47,280 --> 00:27:49,200 Speaker 3: I would just say that's not gonna work either.