1 00:00:00,320 --> 00:00:03,400 Speaker 1: Thank you for listening. This is the best of, with 2 00:00:03,600 --> 00:00:05,320 Speaker 1: Clay Travis and Buck Sexton. 3 00:00:05,720 --> 00:00:09,680 Speaker 2: I think the big Tech hearing has resulted 4 00:00:09,720 --> 00:00:11,920 Speaker 2: in not very much other than soundbites. 5 00:00:11,960 --> 00:00:13,800 Speaker 3: But let's talk about it. Let's get into some of this. 6 00:00:14,000 --> 00:00:18,239 Speaker 2: I mean the big headline right now, Clay, is lawmakers saying, quote, 7 00:00:18,720 --> 00:00:22,000 Speaker 2: you have blood on your hands, to tech CEOs. I 8 00:00:22,000 --> 00:00:26,439 Speaker 2: mean that's a pretty, pretty intense allegation. It's pretty severe. 9 00:00:26,960 --> 00:00:28,200 Speaker 2: So let's see what they say. 10 00:00:28,280 --> 00:00:28,800 Speaker 3: Here we go. 11 00:00:28,920 --> 00:00:34,960 Speaker 2: We have Mark Zuckerberg of Facebook, and yeah, here he 12 00:00:35,040 --> 00:00:38,599 Speaker 2: is saying that social media doesn't, actually this is cut 13 00:00:38,680 --> 00:00:40,280 Speaker 2: one, harm teen health. Play it. 14 00:00:40,400 --> 00:00:42,640 Speaker 4: With so much of our lives spent on mobile devices 15 00:00:42,680 --> 00:00:45,720 Speaker 4: and social media, it's important to look into the effects 16 00:00:45,880 --> 00:00:49,440 Speaker 4: on teen mental health and wellbeing. I take this very seriously. 17 00:00:49,920 --> 00:00:53,200 Speaker 4: Mental health is a complex issue, and the existing body 18 00:00:53,240 --> 00:00:56,240 Speaker 4: of scientific work has not shown a causal link 19 00:00:56,320 --> 00:00:59,440 Speaker 4: between using social media and young people having worse mental 20 00:00:59,440 --> 00:01:04,319 Speaker 4: health outcomes. A recent National Academies of Sciences report evaluated 21 00:01:04,360 --> 00:01:08,520 Speaker 4: over three hundred studies and found that research quote did 22 00:01:08,560 --> 00:01:11,560 Speaker 4: not support the conclusion that social media causes changes in 23 00:01:11,600 --> 00:01:13,759 Speaker 4: adolescent mental health at the population level. 24 00:01:13,840 --> 00:01:14,319 Speaker 5: End quote. 25 00:01:15,280 --> 00:01:16,440 Speaker 3: Okay, Clay. A few things. 26 00:01:16,440 --> 00:01:17,920 Speaker 2: Just let me give you a few of the headline 27 00:01:17,959 --> 00:01:21,039 Speaker 2: takeaways here from CNN as we dive into this. Senator 28 00:01:21,040 --> 00:01:25,240 Speaker 2: Lindsey Graham, quote, the dark side of social media products 29 00:01:25,400 --> 00:01:29,120 Speaker 2: is too great to live with. This is also from CNN. 30 00:01:29,200 --> 00:01:36,240 Speaker 2: Senator Amy Klobuchar visibly upset while questioning CEOs, and 31 00:01:36,360 --> 00:01:39,520 Speaker 2: then they've got a South Carolina lawmaker suing Instagram after 32 00:01:39,560 --> 00:01:42,920 Speaker 2: his son died by suicide. It goes through this at 33 00:01:42,959 --> 00:01:46,000 Speaker 2: some length. Okay, I got a few things here that 34 00:01:46,040 --> 00:01:49,080 Speaker 2: I want to put to you. One is, what is one of these 35 00:01:49,160 --> 00:01:56,480 Speaker 2: legislators suggesting be done? And two, can we have 36 00:01:56,520 --> 00:02:04,320 Speaker 2: a conversation about how tech platforms are largely information dissemination vehicles.
37 00:02:05,160 --> 00:02:07,200 Speaker 2: The same way that you know, you could use a 38 00:02:07,240 --> 00:02:11,840 Speaker 2: phone and threaten someone's life, and you could make them depressed, 39 00:02:11,880 --> 00:02:14,680 Speaker 2: and you could, you know, do psychological damage to them. 40 00:02:14,960 --> 00:02:16,280 Speaker 3: Social media can do the same thing. 41 00:02:16,360 --> 00:02:18,799 Speaker 2: So I think there has to be some specificity in 42 00:02:18,840 --> 00:02:21,639 Speaker 2: this conversation about what they want to do about it 43 00:02:22,080 --> 00:02:23,960 Speaker 2: and how they think there's a way forward. 44 00:02:25,520 --> 00:02:28,440 Speaker 6: I've spent a lot of time thinking about this, again, 45 00:02:28,600 --> 00:02:29,400 Speaker 6: as a, as 46 00:02:29,240 --> 00:02:33,560 Speaker 5: a parent, and in my own experience. 47 00:02:34,120 --> 00:02:38,359 Speaker 6: So, Buck, I, I think my thesis in general is 48 00:02:38,400 --> 00:02:42,840 Speaker 6: that you're in my generation, you're basically my generation, 49 00:02:42,919 --> 00:02:44,360 Speaker 6: even though you're a millennial and I'm the 50 00:02:44,400 --> 00:02:45,480 Speaker 5: last year of Gen X. 51 00:02:46,720 --> 00:02:50,639 Speaker 6: We grew up by and large in an era without 52 00:02:50,720 --> 00:02:54,240 Speaker 6: the Internet in our early youth, in terms of an 53 00:02:54,280 --> 00:02:59,480 Speaker 6: all-encompassing way, and we're older, but we were 54 00:02:59,480 --> 00:03:03,200 Speaker 6: not so old that we didn't understand how the Internet worked. 55 00:03:03,440 --> 00:03:05,760 Speaker 6: So I feel like we're kind of a bridge generation 56 00:03:06,320 --> 00:03:08,960 Speaker 6: because people who are older than us. Let's say you're 57 00:03:09,000 --> 00:03:12,600 Speaker 6: listening to us right now and you're sixty or older, 58 00:03:13,280 --> 00:03:17,280 Speaker 6: the Internet came along long after you were grown, and 59 00:03:17,360 --> 00:03:19,280 Speaker 6: so you know, it's just kind of something that you 60 00:03:19,400 --> 00:03:22,440 Speaker 6: layered on. And I think in general that's more complicated. 61 00:03:23,600 --> 00:03:25,560 Speaker 6: But I feel like we kind of had the perfect 62 00:03:26,200 --> 00:03:29,720 Speaker 6: raising in the eighties and the nineties, where you had 63 00:03:29,720 --> 00:03:34,480 Speaker 6: a real world raising but you weren't immersed online. But 64 00:03:34,520 --> 00:03:36,920 Speaker 6: then you understood, as you became a teenager or got 65 00:03:36,960 --> 00:03:39,600 Speaker 6: older in your twenties and thirties, how the Internet worked. 66 00:03:40,000 --> 00:03:42,640 Speaker 6: And I think that's so healthy, because I see people 67 00:03:42,720 --> 00:03:45,960 Speaker 6: now that, and I think about it not only for 68 00:03:46,080 --> 00:03:47,720 Speaker 6: my kids, but also, we have a lot of, I 69 00:03:47,760 --> 00:03:50,200 Speaker 6: still consider them kids, at some point you get older 70 00:03:50,200 --> 00:03:52,240 Speaker 6: and you start thinking anybody in their twenties is still kind 71 00:03:52,240 --> 00:03:54,880 Speaker 6: of a kid, right, college kids, I call them, whatever. 72 00:03:56,560 --> 00:04:01,560 Speaker 6: They have grown up entirely online. 73 00:04:00,120 --> 00:04:02,880 Speaker 3: And I don't think it's, again, I just come back 74 00:04:02,920 --> 00:04:03,160 Speaker 3: to it. 75 00:04:03,160 --> 00:04:03,480 Speaker 1: It's not.
76 00:04:03,720 --> 00:04:07,040 Speaker 6: In my opinion, it is not a coincidence that mental 77 00:04:07,080 --> 00:04:10,480 Speaker 6: health fell off a cliff right as social media took root. 78 00:04:10,640 --> 00:04:12,800 Speaker 6: And part of it is not just social media, Buck, 79 00:04:13,080 --> 00:04:14,880 Speaker 6: it's the combination of social media and, for those of 80 00:04:14,880 --> 00:04:17,120 Speaker 6: you watching on video, I'm holding up my phone right now, 81 00:04:17,480 --> 00:04:21,240 Speaker 6: is social media and a computer in your pocket, because 82 00:04:22,560 --> 00:04:28,719 Speaker 6: it is, I think, impossible to recognize how much life 83 00:04:28,760 --> 00:04:31,360 Speaker 6: has changed with phones. Some of it's for the better, right, 84 00:04:31,400 --> 00:04:35,640 Speaker 6: I mean, my favorite thing about my phone is probably Uber, honestly, 85 00:04:36,000 --> 00:04:38,120 Speaker 6: because the idea, if you ever tried to get 86 00:04:38,120 --> 00:04:40,640 Speaker 6: a taxicab back in the nineties in much of America, 87 00:04:41,040 --> 00:04:43,360 Speaker 6: it was almost impossible. I mean, like, you couldn't get a 88 00:04:43,400 --> 00:04:45,279 Speaker 6: taxicab in Nashville unless you were coming out of a 89 00:04:45,279 --> 00:04:47,680 Speaker 6: bar at two am and they were just lined up there. 90 00:04:47,680 --> 00:04:49,920 Speaker 6: You couldn't get a ride anywhere. So the idea that 91 00:04:49,960 --> 00:04:52,279 Speaker 6: I can just be on a corner and I can 92 00:04:52,360 --> 00:04:55,400 Speaker 6: get a car in five minutes, and at a reasonable cost, 93 00:04:55,640 --> 00:04:59,719 Speaker 6: is amazing to me. But the way that people define 94 00:04:59,760 --> 00:05:07,440 Speaker 6: themselves based on the, uh, social media universe, it's scary. 95 00:05:07,760 --> 00:05:09,640 Speaker 6: I just, and I think it's toxic. 96 00:05:09,800 --> 00:05:10,200 Speaker 3: I think it. 97 00:05:10,400 --> 00:05:13,600 Speaker 6: I think social media is the cigarettes of our generation. 98 00:05:13,760 --> 00:05:17,040 Speaker 2: So, so, so here's what I, what I want to know. 99 00:05:17,200 --> 00:05:19,000 Speaker 2: And there's so many parents listening. 100 00:05:19,080 --> 00:05:19,560 Speaker 3: If you have 101 00:05:21,000 --> 00:05:24,039 Speaker 2: strong thoughts on this issue. Remember, this is, given what's 102 00:05:24,040 --> 00:05:26,719 Speaker 2: going on in the world, Congress, or the Senate, is holding 103 00:05:26,760 --> 00:05:29,760 Speaker 2: this hearing today on Capitol Hill. Mark Zuckerberg is there. 104 00:05:29,800 --> 00:05:32,320 Speaker 2: I mean they're bringing in, it's, I'm trying to 105 00:05:32,320 --> 00:05:32,960 Speaker 2: find all 106 00:05:32,800 --> 00:05:33,560 Speaker 3: the different, 107 00:05:34,880 --> 00:05:38,560 Speaker 2: like everybody, everybody, it's all the big tech CEOs. It's 108 00:05:38,640 --> 00:05:43,040 Speaker 2: the CEOs of Meta, TikTok, Snap, Discord, X formerly known 109 00:05:43,040 --> 00:05:46,719 Speaker 2: as Twitter, all of this, right? And here's what 110 00:05:46,760 --> 00:05:49,320 Speaker 2: I, what I see, though, is, well, you don't want 111 00:05:49,360 --> 00:05:52,080 Speaker 2: to ban social media, okay, because there's actually a lot 112 00:05:52,120 --> 00:05:53,800 Speaker 2: of good and a lot of companies and a lot 113 00:05:53,839 --> 00:06:00,000 Speaker 2: of things that rely on different social media applications. You know.
Uh, 114 00:06:00,400 --> 00:06:03,800 Speaker 2: that's, I think, one part of it. And ultimately, I 115 00:06:03,839 --> 00:06:07,279 Speaker 2: feel like this is up to parents. If you're talking 116 00:06:07,320 --> 00:06:09,720 Speaker 2: about protecting kids when it comes to social media, it's 117 00:06:09,720 --> 00:06:11,360 Speaker 2: really up to parents for the most part. I'm not 118 00:06:11,400 --> 00:06:13,599 Speaker 2: saying there aren't things that maybe could be done. I'm sure, 119 00:06:13,839 --> 00:06:15,920 Speaker 2: although I want to hear what the specifics are. 120 00:06:17,560 --> 00:06:18,760 Speaker 3: What should be done. 121 00:06:18,800 --> 00:06:20,560 Speaker 2: Well, I think that, you know, Clay, you said you 122 00:06:20,640 --> 00:06:22,520 Speaker 2: let, you don't let your son on social media until 123 00:06:22,520 --> 00:06:23,120 Speaker 2: he's sixteen? 124 00:06:23,279 --> 00:06:23,440 Speaker 3: Right? 125 00:06:23,880 --> 00:06:26,719 Speaker 2: I mean, for me, that seems reasonable, that seems 126 00:06:26,760 --> 00:06:29,800 Speaker 2: about right. I don't think thirteen year olds need to 127 00:06:29,800 --> 00:06:31,440 Speaker 2: be on social media. I definitely don't think ten year 128 00:06:31,440 --> 00:06:34,000 Speaker 2: olds need to be on social media. I think that 129 00:06:34,080 --> 00:06:35,719 Speaker 2: people, and I see this, and I'm not a parent, 130 00:06:35,760 --> 00:06:40,560 Speaker 2: I know, but I'm an observant fellow, people use iPads 131 00:06:40,680 --> 00:06:44,599 Speaker 2: as, like, stand-in babysitters instead of human beings. There's 132 00:06:44,640 --> 00:06:47,440 Speaker 2: way too much of that going on, you know. I 133 00:06:47,760 --> 00:06:51,000 Speaker 2: just see this increasingly as it's about parenting. I mean, 134 00:06:51,440 --> 00:06:54,040 Speaker 2: some of these arguments were made about television. They were 135 00:06:54,080 --> 00:06:57,400 Speaker 2: certainly made about video games. Oh, video games will rot 136 00:06:57,480 --> 00:07:00,400 Speaker 2: your brain, unless you're Elon Musk, who stays up until 137 00:07:00,400 --> 00:07:02,760 Speaker 2: four am playing video games many nights and is the 138 00:07:02,839 --> 00:07:04,560 Speaker 2: richest guy in the world and is changing the world 139 00:07:04,560 --> 00:07:07,320 Speaker 2: we live in, right? Like, you know, it's all about balance. 140 00:07:07,360 --> 00:07:10,200 Speaker 2: It's all about having boundaries and understanding. And I think 141 00:07:10,200 --> 00:07:13,480 Speaker 2: that on social media, for example, you know, first of all, 142 00:07:13,600 --> 00:07:16,000 Speaker 2: I would assume, is this the case? Parents should all 143 00:07:16,040 --> 00:07:18,560 Speaker 2: have all the passwords and full access to anyone under 144 00:07:18,560 --> 00:07:20,520 Speaker 2: eighteen's social media, period, full stop. 145 00:07:20,600 --> 00:07:21,640 Speaker 3: Right. So that's one thing. 146 00:07:22,000 --> 00:07:23,600 Speaker 2: And I don't know how easy it is to set 147 00:07:23,640 --> 00:07:26,640 Speaker 2: that up, but I would assume that that's possible to 148 00:07:26,720 --> 00:07:29,640 Speaker 2: do, right, so you can see every interaction, everything that's 149 00:07:29,680 --> 00:07:31,480 Speaker 2: going on. That's a safety thing as much as it 150 00:07:31,560 --> 00:07:32,760 Speaker 2: is also a mental health thing.
151 00:07:33,920 --> 00:07:35,080 Speaker 3: I don't know, what, I mean, like, what do you 152 00:07:35,120 --> 00:07:36,120 Speaker 3: think needs to be done, right? 153 00:07:36,120 --> 00:07:38,760 Speaker 2: Because right now, Lindsey Graham yelling about how social media 154 00:07:38,760 --> 00:07:40,480 Speaker 2: companies have blood on their hands 155 00:07:40,600 --> 00:07:43,120 Speaker 3: is hysterical theatrics. 156 00:07:43,840 --> 00:07:45,960 Speaker 2: It's, oh, you know, I'm making such a big deal 157 00:07:46,000 --> 00:07:47,160 Speaker 2: of it. Okay, Lindsey, what 158 00:07:47,120 --> 00:07:47,720 Speaker 3: do you want to do? 159 00:07:48,560 --> 00:07:51,640 Speaker 6: So my, yeah, my thought would be, and I'm not 160 00:07:51,720 --> 00:07:53,720 Speaker 6: claiming that I'm some kind of an expert on this, 161 00:07:53,800 --> 00:07:56,200 Speaker 6: although I am active in media, I am active on 162 00:07:56,240 --> 00:07:58,440 Speaker 6: social media. I think I understand it better than the 163 00:07:58,520 --> 00:08:02,200 Speaker 6: average parent would. We don't get phones for our kids 164 00:08:02,520 --> 00:08:06,920 Speaker 6: until they are fourteen. Some people will say that's too early. 165 00:08:07,400 --> 00:08:11,520 Speaker 6: Some people will say that's too late. Fourteen seems to 166 00:08:11,560 --> 00:08:15,160 Speaker 6: me like a reasonable age where a kid could have 167 00:08:15,360 --> 00:08:19,360 Speaker 6: access to a phone, and a lot of that is 168 00:08:19,400 --> 00:08:22,000 Speaker 6: to help with him, and frankly, I like to be 169 00:08:22,040 --> 00:08:25,160 Speaker 6: able to know where he is. But so fourteen, social 170 00:08:25,240 --> 00:08:29,640 Speaker 6: media at sixteen. And I'm not telling other parents 171 00:08:29,760 --> 00:08:32,200 Speaker 6: what they should do. I'm saying what my wife and 172 00:08:32,240 --> 00:08:36,920 Speaker 6: I have decided to do with our kids. I think 173 00:08:36,960 --> 00:08:39,559 Speaker 6: you shouldn't be able to have a social media account 174 00:08:40,080 --> 00:08:44,560 Speaker 6: until you are, honestly, sixteen at the earliest. And I 175 00:08:44,600 --> 00:08:47,520 Speaker 6: think that the Metas of the world and the Twitters 176 00:08:47,559 --> 00:08:51,640 Speaker 6: of the world, you should have to establish how 177 00:08:51,679 --> 00:08:55,160 Speaker 6: old you are legitimately to be able to get on, 178 00:08:55,200 --> 00:08:56,520 Speaker 6: to have a social media account. 179 00:08:56,640 --> 00:09:00,199 Speaker 2: When I was, like, a freshman in high school, if 180 00:09:00,200 --> 00:09:03,040 Speaker 2: I had had the ability to talk to girls of 181 00:09:03,040 --> 00:09:05,679 Speaker 2: my own age online, I don't think I would have, I 182 00:09:05,679 --> 00:09:07,800 Speaker 2: would have failed Algebra Two, you know, I would have 183 00:09:07,840 --> 00:09:10,320 Speaker 2: done anything else, right. So we were lucky in the 184 00:09:10,320 --> 00:09:15,320 Speaker 2: sense that you didn't have these constant, you know, social 185 00:09:15,400 --> 00:09:18,640 Speaker 2: interaction tools that aren't real. The other part is, it's not 186 00:09:18,679 --> 00:09:21,600 Speaker 2: real social interaction, that's the problem. It is in the 187 00:09:21,640 --> 00:09:23,280 Speaker 2: sense that you're talking to people, but it's not in 188 00:09:23,280 --> 00:09:26,400 Speaker 2: the sense that real human beings in person are still special.
189 00:09:26,400 --> 00:09:30,559 Speaker 2: It's still more worthwhile. Yeah, and even, remember the telephone? 190 00:09:31,040 --> 00:09:34,080 Speaker 2: It was a big deal. I bet a lot of 191 00:09:34,080 --> 00:09:36,200 Speaker 2: you out there remember the star sixty-nine era. 192 00:09:36,400 --> 00:09:38,720 Speaker 2: If you had a brother or sister, somebody was always 193 00:09:38,720 --> 00:09:40,880 Speaker 2: on the phone. But mom and dad could always pick 194 00:09:40,960 --> 00:09:42,680 Speaker 2: up the phone and know who it was, or mom 195 00:09:42,679 --> 00:09:44,000 Speaker 2: and dad could answer the phone. 196 00:09:44,160 --> 00:09:45,240 Speaker 3: And there was something to 197 00:09:45,280 --> 00:09:48,319 Speaker 6: be said for being a boy or a girl calling 198 00:09:48,360 --> 00:09:50,320 Speaker 6: another boy or a girl and having to talk to 199 00:09:50,440 --> 00:09:52,719 Speaker 6: a parent in order to be able to be on the 200 00:09:52,640 --> 00:09:53,360 Speaker 3: phone with them. 201 00:09:53,440 --> 00:09:56,160 Speaker 6: And you knew, again, mom and dad could pick that 202 00:09:56,200 --> 00:09:58,200 Speaker 6: phone up at any point, or your brother and your 203 00:09:58,240 --> 00:10:02,400 Speaker 6: sister could, right? There was an interaction associated with it. And 204 00:10:02,440 --> 00:10:04,560 Speaker 6: what I would just say in general is, when you 205 00:10:04,640 --> 00:10:08,840 Speaker 6: are a teenager, everybody is racked with self-doubt, you're growing, 206 00:10:08,960 --> 00:10:14,760 Speaker 6: you're changing. Instagram in particular, to me, is so fake. 207 00:10:15,840 --> 00:10:19,720 Speaker 6: It is everybody's best photo that they've ever taken. It 208 00:10:19,800 --> 00:10:23,600 Speaker 6: is everybody's best vacation that they've ever taken. And if 209 00:10:23,640 --> 00:10:27,080 Speaker 6: you're a fourteen year old girl, and whoever the prettiest 210 00:10:27,120 --> 00:10:29,400 Speaker 6: fourteen year old girl is in your class, or the 211 00:10:29,480 --> 00:10:32,600 Speaker 6: richest fourteen year old girl, and she has the best clothes, 212 00:10:32,640 --> 00:10:34,480 Speaker 6: and she has the best friends, and she has the 213 00:10:34,480 --> 00:10:39,480 Speaker 6: best trips, I can see how, if you're constantly forced 214 00:10:39,520 --> 00:10:42,199 Speaker 6: to marinate in somebody else's 215 00:10:42,360 --> 00:10:45,800 Speaker 3: Remember, it's not the real world, that's somebody else's, essentially. Yeah, yeah, 216 00:10:45,800 --> 00:10:47,520 Speaker 3: they're artificial, somebody else's fantasy. 217 00:10:47,679 --> 00:10:50,480 Speaker 6: And then you see if there are six kids that 218 00:10:50,520 --> 00:10:52,560 Speaker 6: you thought you were friends with and they all go 219 00:10:52,679 --> 00:10:54,760 Speaker 6: to the mall, or they all go to a movie 220 00:10:54,800 --> 00:10:58,040 Speaker 6: and they post it. I don't think it's coincidental that 221 00:10:58,160 --> 00:11:01,319 Speaker 6: teen mental health has collapsed at the exact same time 222 00:11:01,679 --> 00:11:03,680 Speaker 6: as phones and social media, because I 223 00:11:03,640 --> 00:11:04,400 Speaker 3: do think they're connected. 224 00:11:04,400 --> 00:11:06,720 Speaker 6: Everybody's got them in their pocket, and, Buck, to your point, 225 00:11:07,160 --> 00:11:09,600 Speaker 6: you can't escape.
You know, if you were a kid 226 00:11:09,640 --> 00:11:11,800 Speaker 6: and you didn't like school, and there are a lot 227 00:11:11,800 --> 00:11:13,719 Speaker 6: of people out there, you might have to be there 228 00:11:13,760 --> 00:11:16,120 Speaker 6: from seven o'clock to two thirty or whatever it is. 229 00:11:16,440 --> 00:11:18,600 Speaker 6: But then you get home and you can have your 230 00:11:18,640 --> 00:11:20,600 Speaker 6: own life outside of that world. 231 00:11:20,800 --> 00:11:21,480 Speaker 3: It never ends. 232 00:11:21,520 --> 00:11:27,040 Speaker 6: You're always snapchatting, you're always on group text, you're always interacting. 233 00:11:27,280 --> 00:11:29,719 Speaker 2: You can't even develop as a real person. You 234 00:11:29,760 --> 00:11:33,400 Speaker 2: think about the pressure to create what is effectively your 235 00:11:33,400 --> 00:11:36,680 Speaker 2: own individual brand online, right? Everyone now is a brand 236 00:11:36,720 --> 00:11:39,240 Speaker 2: in the online social media world, whether you think of 237 00:11:39,280 --> 00:11:42,200 Speaker 2: yourself that way or not. It's, here's my best fishing photos, 238 00:11:42,240 --> 00:11:44,280 Speaker 2: I'm like, here's me with my family, and, like, look 239 00:11:44,360 --> 00:11:47,000 Speaker 2: at my dog. And we're all, anyone 240 00:11:47,000 --> 00:11:49,200 Speaker 2: who's using social media. Some of our audience isn't on it at all, 241 00:11:49,240 --> 00:11:51,480 Speaker 2: by the way, congratulations, listening to the radio. They're like, yeah, 242 00:11:51,480 --> 00:11:53,240 Speaker 2: I don't mess with any of that stuff. So you 243 00:11:53,280 --> 00:11:55,680 Speaker 2: guys and gals are ahead of the curve. 244 00:11:55,480 --> 00:11:58,280 Speaker 6: But you're the people who are not smoking, you know, 245 00:11:58,360 --> 00:12:01,560 Speaker 6: in nineteen forty, when everybody else was smoking all day long. 246 00:12:01,360 --> 00:12:01,800 Speaker 5: Every day. 247 00:12:02,240 --> 00:12:04,880 Speaker 3: Yeah. So, you know, I think. 248 00:12:06,320 --> 00:12:10,240 Speaker 2: I think that this is, it's just a bigger conversation 249 00:12:10,800 --> 00:12:14,680 Speaker 2: with more facets than just, you know, ban TikTok, and, 250 00:12:14,840 --> 00:12:16,920 Speaker 2: like, Facebook should be able to be sued, and repeal 251 00:12:16,920 --> 00:12:17,800 Speaker 2: section two thirty. 252 00:12:18,120 --> 00:12:20,520 Speaker 3: The people who say repeal section two thirty. 253 00:12:20,559 --> 00:12:25,160 Speaker 2: The problem with this is, if you think 254 00:12:25,160 --> 00:12:26,959 Speaker 2: that that's going to mean that all of a sudden 255 00:12:27,480 --> 00:12:29,840 Speaker 2: they're going to be these benevolent actors, these 256 00:12:29,880 --> 00:12:32,080 Speaker 2: social media companies, it just means they're gonna crack down 257 00:12:32,520 --> 00:12:35,520 Speaker 2: even more. Actually, they're gonna have even stricter and more 258 00:12:35,679 --> 00:12:39,560 Speaker 2: arbitrary guidelines that they're using, and they're gonna say, sorry, 259 00:12:39,679 --> 00:12:42,080 Speaker 2: we got to avoid getting, we got to avoid being sued. 260 00:12:43,120 --> 00:12:45,800 Speaker 2: So it's not as easy as just repealing section two thirty. 261 00:12:45,840 --> 00:12:48,560 Speaker 2: I know that's a common refrain for people.
This is 262 00:12:48,600 --> 00:12:52,680 Speaker 2: a complicated thing and humanity has never faced this before. 263 00:12:53,040 --> 00:12:54,760 Speaker 2: And I would add this as we go to break, 264 00:12:54,800 --> 00:12:56,400 Speaker 2: We'll take some of your calls, eight hundred two eight 265 00:12:56,440 --> 00:12:58,920 Speaker 2: two, two eight eight two. Particularly parents, I'm interested 266 00:12:58,960 --> 00:13:01,839 Speaker 2: to hear from you. A lot of your kids are way smarter 267 00:13:01,679 --> 00:13:07,360 Speaker 6: with tech than we are, right, the parents. My oldest son, 268 00:13:07,880 --> 00:13:10,680 Speaker 6: my wife put up parental safeguards trying to limit how 269 00:13:10,760 --> 00:13:14,040 Speaker 6: much he could use certain websites, and he knew almost 270 00:13:14,040 --> 00:13:17,320 Speaker 6: instantaneously how to get around the safeguards. I mean, he 271 00:13:17,840 --> 00:13:22,160 Speaker 6: is more sophisticated and native to the space. So even 272 00:13:22,200 --> 00:13:23,760 Speaker 6: if you are, this is where I would say, Buck, 273 00:13:23,760 --> 00:13:26,480 Speaker 6: even if you're a super engaged parent, and even if 274 00:13:26,520 --> 00:13:28,680 Speaker 6: you are a parent that is really on top of 275 00:13:28,760 --> 00:13:32,439 Speaker 6: what your kids are doing online, they are smarter than 276 00:13:32,480 --> 00:13:36,040 Speaker 6: you about using that, in the same way that whatever 277 00:13:36,080 --> 00:13:38,840 Speaker 6: you were into when you were sixteen 278 00:13:38,880 --> 00:13:42,000 Speaker 6: and your parents were, you know, forty five, whatever you 279 00:13:42,080 --> 00:13:45,080 Speaker 6: were doing then, you were smarter than your parents about it. They're 280 00:13:45,120 --> 00:13:48,280 Speaker 6: doing that now, except it's with tech, and so it is, 281 00:13:48,360 --> 00:13:52,520 Speaker 6: I think it's the most challenging single part of parenting 282 00:13:52,559 --> 00:13:53,840 Speaker 6: today by far. 283 00:13:54,080 --> 00:13:57,800 Speaker 1: You're listening to the best of Clay Travis and Buck Sexton. 284 00:13:58,080 --> 00:14:00,800 Speaker 2: I've been getting a bit of heat because of my 285 00:14:00,880 --> 00:14:05,839 Speaker 2: stance on the movie Oppenheimer, which even some of our 286 00:14:06,240 --> 00:14:12,240 Speaker 2: esteemed patriot scholar listeners, some of them seemed to disagree 287 00:14:11,840 --> 00:14:13,920 Speaker 3: with me, but a lot of them, 288 00:14:14,040 --> 00:14:17,360 Speaker 2: a lot of you agree with me that Oppenheimer was 289 00:14:17,440 --> 00:14:20,960 Speaker 2: long and boring and kind of a whitewash of the 290 00:14:21,000 --> 00:14:25,520 Speaker 2: threat of communism. The idea that, oh, well, we have nukes, 291 00:14:25,520 --> 00:14:28,080 Speaker 2: so now the Soviets should have nukes too, because that 292 00:14:28,120 --> 00:14:31,080 Speaker 2: will be parity, because, you know, everyone's the same. That's insane. 293 00:14:31,280 --> 00:14:34,480 Speaker 2: It was a horrible, insane idea, and people betrayed their country, 294 00:14:34,800 --> 00:14:37,440 Speaker 2: including some people who had been given refuge in this 295 00:14:37,480 --> 00:14:41,080 Speaker 2: country from other places, in order to give that parity.
296 00:14:41,120 --> 00:14:42,960 Speaker 2: But anyway, put that aside, the most important thing is 297 00:14:42,960 --> 00:14:45,600 Speaker 2: that it was just boring, and the middle part of 298 00:14:45,600 --> 00:14:47,240 Speaker 2: it was okay, but the first hour and the third 299 00:14:47,280 --> 00:14:49,760 Speaker 2: hour were really bad. And people can disagree with me on that, 300 00:14:49,840 --> 00:14:53,200 Speaker 2: but that just means they're wrong. Now, Clay, you saw 301 00:14:53,480 --> 00:14:59,200 Speaker 2: Dune two and you were a big fan, A plus, fantastic. 302 00:14:59,320 --> 00:15:00,360 Speaker 2: Everyone should see it. 303 00:15:01,200 --> 00:15:04,680 Speaker 6: Yes, I went with my sixteen year old yesterday, who 304 00:15:04,920 --> 00:15:06,240 Speaker 6: has read the Dune books. 305 00:15:06,320 --> 00:15:06,760 Speaker 5: I have not. 306 00:15:08,040 --> 00:15:10,760 Speaker 6: He read the Dune books. We watched it this weekend. I 307 00:15:10,760 --> 00:15:12,800 Speaker 6: think I told you guys, I somehow missed it in 308 00:15:12,840 --> 00:15:15,560 Speaker 6: twenty twenty one with the kids and everything else, I 309 00:15:15,600 --> 00:15:17,520 Speaker 6: used to see every movie that came out. I haven't 310 00:15:17,520 --> 00:15:21,000 Speaker 6: seen Oppenheimer, I haven't seen Barbie. I haven't seen any 311 00:15:21,000 --> 00:15:23,880 Speaker 6: of these things that everybody's watching. I barely can keep 312 00:15:23,960 --> 00:15:27,480 Speaker 6: up with the sports now, so I had not seen it. 313 00:15:27,520 --> 00:15:29,200 Speaker 6: This weekend, he said, Dad, I think you'll like it. 314 00:15:29,240 --> 00:15:32,720 Speaker 6: We watched Dune one. Dune two is fabulous, and I 315 00:15:32,760 --> 00:15:36,600 Speaker 6: didn't even realize, it feels, as I watch it, like 316 00:15:36,720 --> 00:15:41,880 Speaker 6: George Lucas, for Star Wars, used a ton of the 317 00:15:42,000 --> 00:15:46,760 Speaker 6: Dune story as part of the inspiration. Even Tatooine, compared 318 00:15:46,800 --> 00:15:51,760 Speaker 6: to the desert planet that starts with Luke Skywalker on 319 00:15:51,800 --> 00:15:55,320 Speaker 6: it, it even, I mean, it's amazing how much he borrowed. 320 00:15:55,360 --> 00:15:58,040 Speaker 6: And again, that's how culture works. You see something, you 321 00:15:58,040 --> 00:16:01,680 Speaker 6: build something new. But it's amazing how much he borrowed 322 00:16:02,160 --> 00:16:04,960 Speaker 6: from the Dune books to create Star Wars, and the 323 00:16:04,960 --> 00:16:07,200 Speaker 6: way it's being watched now, a lot of people are saying, oh, 324 00:16:07,560 --> 00:16:09,280 Speaker 6: Dune looks like Star Wars. 325 00:16:09,360 --> 00:16:09,880 Speaker 5: No, no, no. 326 00:16:09,960 --> 00:16:14,800 Speaker 6: Dune was the beginning part of the inspiration, I think, 327 00:16:14,800 --> 00:16:18,600 Speaker 6: for Star Wars, but chronologically, because the Star Wars movies 328 00:16:18,640 --> 00:16:22,280 Speaker 6: were made before these Dune movies, it feels like 329 00:16:22,360 --> 00:16:24,680 Speaker 6: Dune is borrowing from Star Wars instead of vice versa.
330 00:16:24,960 --> 00:16:27,080 Speaker 2: I'm glad we get to talk about this because occasionally, 331 00:16:27,280 --> 00:16:29,120 Speaker 2: especially for people who have listened to me for a 332 00:16:29,200 --> 00:16:32,200 Speaker 2: very long time, now going on thirteen years, I'm accused 333 00:16:32,320 --> 00:16:35,200 Speaker 2: of being grumpy about movies and saying that all movies suck. 334 00:16:35,280 --> 00:16:38,640 Speaker 2: That is not true. Dune one is a very good movie. 335 00:16:39,000 --> 00:16:40,160 Speaker 2: You and I liked it too. 336 00:16:40,240 --> 00:16:40,960 Speaker 5: I had not seen it. 337 00:16:41,040 --> 00:16:43,560 Speaker 2: Fantastic. Dune one is a fantastic movie for what it is. 338 00:16:43,640 --> 00:16:45,480 Speaker 2: I'm sure. I haven't seen it yet because Carrie doesn't 339 00:16:45,520 --> 00:16:49,840 Speaker 2: like sci-fi. You know, nobody's perfect, but I'm, I'm 340 00:16:49,960 --> 00:16:52,920 Speaker 2: sure I'm going to like Dune two. You and I 341 00:16:52,920 --> 00:16:55,280 Speaker 2: both liked Top Gun Maverick. I thought All Quiet on 342 00:16:55,320 --> 00:16:57,640 Speaker 2: the Western Front, which was released on Netflix, 343 00:16:57,720 --> 00:17:00,400 Speaker 2: was a phenomenal World War One movie. Like, I 344 00:17:00,440 --> 00:17:03,040 Speaker 2: call them like I see them. I just, I know that 345 00:17:03,080 --> 00:17:07,320 Speaker 2: it got seven Oscars. Oppenheimer is a trash movie and 346 00:17:07,680 --> 00:17:11,440 Speaker 2: it's way too, way too favorable to commies, in my opinion, 347 00:17:11,480 --> 00:17:13,159 Speaker 2: because that was actually a big problem for me. 348 00:17:13,680 --> 00:17:17,520 Speaker 1: You're listening to the best of Clay Travis and Buck Sexton. 349 00:17:17,560 --> 00:17:20,520 Speaker 6: I want to start with a story that, for some 350 00:17:20,720 --> 00:17:24,120 Speaker 6: of you out there, may not be on the top 351 00:17:24,200 --> 00:17:27,359 Speaker 6: of your radar, and you may not be thinking about 352 00:17:27,440 --> 00:17:31,040 Speaker 6: what the impact of this may be. But as we 353 00:17:31,200 --> 00:17:35,399 Speaker 6: come into twenty twenty four, and I bet your kids 354 00:17:35,400 --> 00:17:40,280 Speaker 6: and grandkids become more and more active in experiencing it online, 355 00:17:40,800 --> 00:17:42,880 Speaker 6: I want to talk about what's going on with 356 00:17:42,880 --> 00:17:47,760 Speaker 6: AI, artificial intelligence. The growth that we are seeing there 357 00:17:48,200 --> 00:17:51,360 Speaker 6: and the degree, Buck, to me, of what I am 358 00:17:51,440 --> 00:17:58,120 Speaker 6: seeing is just a sort of a recapitulation of all 359 00:17:58,160 --> 00:18:01,560 Speaker 6: of the flaws that existed in social media now 360 00:18:01,600 --> 00:18:05,920 Speaker 6: being recreated and embedded in AI. In the same way 361 00:18:06,040 --> 00:18:10,120 Speaker 6: that every social media site, whether it's Facebook, Instagram, Twitter, 362 00:18:10,520 --> 00:18:14,280 Speaker 6: in some way designs an algorithm, by a human, that 363 00:18:14,480 --> 00:18:16,240 Speaker 6: determines what you do and do not see. 364 00:18:16,200 --> 00:18:18,000 Speaker 5: AI. 365 00:18:18,320 --> 00:18:22,280 Speaker 6: There's a scary, ridiculous, somewhat maybe a little bit funny, 366 00:18:22,080 --> 00:18:26,840 Speaker 6: but also terrifying story of what's going on with Google's 367 00:18:26,920 --> 00:18:30,320 Speaker 6: AI service.
Buck, and I know we were talking about 368 00:18:30,320 --> 00:18:33,280 Speaker 6: this off air, so I know you've seen it too. Basically, 369 00:18:33,320 --> 00:18:37,160 Speaker 6: they designed it so there's no way that you can 370 00:18:37,200 --> 00:18:40,320 Speaker 6: get an image of a white person, no matter what 371 00:18:40,440 --> 00:18:43,479 Speaker 6: your prompt or request is. And for those of you 372 00:18:43,560 --> 00:18:46,360 Speaker 6: out there who are not familiar with AI at all, 373 00:18:46,960 --> 00:18:49,399 Speaker 6: trying to explain it in just a couple of sentences, 374 00:18:49,840 --> 00:18:55,040 Speaker 6: it's a video or image based version of search, also 375 00:18:56,200 --> 00:18:59,479 Speaker 6: very strong textually, but it's moving more and more into 376 00:19:01,240 --> 00:19:04,640 Speaker 6: imagery and videos, and the idea is basically, you give 377 00:19:04,680 --> 00:19:07,920 Speaker 6: it a prompt. For instance, with Google, where people were 378 00:19:08,000 --> 00:19:11,080 Speaker 6: using this and were able to expose its flaws, they 379 00:19:11,200 --> 00:19:15,120 Speaker 6: were saying, hey, Google, give me a picture of the pope, 380 00:19:15,240 --> 00:19:20,000 Speaker 6: and the pope pictures that the AI was returning were all 381 00:19:20,000 --> 00:19:24,960 Speaker 6: minority figures. If you asked for a Viking, a picture 382 00:19:25,080 --> 00:19:28,320 Speaker 6: of a Viking, the people that you were getting back 383 00:19:28,720 --> 00:19:32,840 Speaker 6: were black Vikings, who obviously did not exist. If you 384 00:19:32,960 --> 00:19:36,679 Speaker 6: asked for pictures of the founding fathers, they were giving 385 00:19:36,680 --> 00:19:40,359 Speaker 6: you pictures back of some black people sitting at the 386 00:19:40,400 --> 00:19:43,960 Speaker 6: table with the founding fathers. And I would guess, Buck, 387 00:19:44,400 --> 00:19:50,240 Speaker 6: that the intent here is to avoid being racist, and 388 00:19:50,480 --> 00:19:54,399 Speaker 6: they wrote in code which basically made it impossible for 389 00:19:54,520 --> 00:19:58,919 Speaker 6: a white person to be revealed when it was responding 390 00:19:59,200 --> 00:20:02,520 Speaker 6: to your prompts. To me, that's maybe 391 00:20:02,560 --> 00:20:07,240 Speaker 6: somewhat, a little, funny, also scary, but in a larger 392 00:20:07,320 --> 00:20:12,480 Speaker 6: context, when you know that kids growing up today 393 00:20:12,680 --> 00:20:16,280 Speaker 6: are going to be using this as a default Google 394 00:20:16,400 --> 00:20:19,800 Speaker 6: search instead of the way Google works now. You type 395 00:20:19,800 --> 00:20:23,639 Speaker 6: in, you know, hotel on South Beach or something, and 396 00:20:23,680 --> 00:20:26,160 Speaker 6: you get a bunch of different hotels that would show 397 00:20:26,240 --> 00:20:28,520 Speaker 6: up as links. Or you type in, you know, 398 00:20:28,560 --> 00:20:31,479 Speaker 6: who was the eighth President of the United States, and 399 00:20:31,520 --> 00:20:33,400 Speaker 6: you get a prompt that allows you to go click 400 00:20:33,440 --> 00:20:38,080 Speaker 6: on links. You're not actually going and reading and discovering 401 00:20:38,119 --> 00:20:39,800 Speaker 6: the information from your request. 402 00:20:40,200 --> 00:20:41,560 Speaker 5: It's being given to you.
403 00:20:42,160 --> 00:20:46,720 Speaker 6: Search is being created more powerfully than maybe it ever 404 00:20:46,800 --> 00:20:51,600 Speaker 6: has been before, and it's making these algorithms even more powerful, 405 00:20:51,920 --> 00:20:54,679 Speaker 6: and so, as a result, they are now stopping the 406 00:20:54,920 --> 00:21:00,280 Speaker 6: Google AI search, Buck. But are you troubled by this? 407 00:21:00,320 --> 00:21:02,880 Speaker 6: Because I think it could be a huge story for 408 00:21:03,000 --> 00:21:07,719 Speaker 6: twenty twenty four, and all of these AI, woke-ified, 409 00:21:07,920 --> 00:21:12,719 Speaker 6: I would say, algorithms are going to artificially distort 410 00:21:12,800 --> 00:21:15,560 Speaker 6: reality in a similar way that I think we 411 00:21:15,600 --> 00:21:21,159 Speaker 6: have seen with social media, and I think that's going 412 00:21:21,200 --> 00:21:21,960 Speaker 6: to be the challenge. 413 00:21:22,800 --> 00:21:26,240 Speaker 2: Yeah, what they're saying about this is that it's an 414 00:21:26,280 --> 00:21:30,240 Speaker 2: overcorrection, right. They're saying that they were trying to 415 00:21:30,280 --> 00:21:35,280 Speaker 2: make sure that racist things didn't happen, and so they 416 00:21:35,320 --> 00:21:37,040 Speaker 2: made it so that there are no, you're not getting 417 00:21:37,040 --> 00:21:39,639 Speaker 2: any images of anyone who is white. But this also 418 00:21:39,760 --> 00:21:43,080 Speaker 2: is occurring in a broader context, right? There's something else 419 00:21:43,480 --> 00:21:46,159 Speaker 2: that's going on here. I mean I think everyone has 420 00:21:46,720 --> 00:21:50,160 Speaker 2: seen now, particularly the last few years, but it stretches 421 00:21:50,200 --> 00:21:53,560 Speaker 2: back for about a decade, that diversity and inclusion is 422 00:21:53,560 --> 00:21:57,199 Speaker 2: effectively a religious belief, that people feel there is a 423 00:21:57,280 --> 00:22:02,040 Speaker 2: need to fill our society, our history, everything with the 424 00:22:02,080 --> 00:22:07,040 Speaker 2: tenets of diversity and inclusion, especially anything that has to 425 00:22:07,080 --> 00:22:11,879 Speaker 2: do with pop culture. I mentioned before on this show, 426 00:22:12,720 --> 00:22:14,800 Speaker 2: I watch, you know, I like anything that has Vikings 427 00:22:14,840 --> 00:22:16,960 Speaker 2: in it, and I should note that there 428 00:22:17,000 --> 00:22:21,080 Speaker 2: were some people that did a Viking search, and sure enough, 429 00:22:21,119 --> 00:22:25,720 Speaker 2: the Vikings were, you know, people of dark skin, 430 00:22:26,320 --> 00:22:29,920 Speaker 2: and that's a bit unusual, right? I mean, historically 431 00:22:29,920 --> 00:22:35,600 Speaker 2: that would not be accurate. And the reality here is, 432 00:22:36,000 --> 00:22:39,760 Speaker 2: when they were doing the Netflix show, when they were 433 00:22:39,800 --> 00:22:44,320 Speaker 2: trying to get people interested in, I guess, the beginnings 434 00:22:44,359 --> 00:22:46,600 Speaker 2: of, I think it's Viking, I forget what the full 435 00:22:46,640 --> 00:22:50,320 Speaker 2: title is, Viking something or other, Clay. They cast a 436 00:22:50,560 --> 00:22:56,120 Speaker 2: black woman as a tenth century Viking jarl, or kind 437 00:22:56,119 --> 00:22:59,280 Speaker 2: of like an earl or a king, who really existed.
438 00:23:00,080 --> 00:23:03,040 Speaker 2: So this stuff is already happening, meaning that there's the 439 00:23:03,119 --> 00:23:07,560 Speaker 2: rewriting of history and pop culture with people who are 440 00:23:08,480 --> 00:23:13,080 Speaker 2: being depicted as non white. And it's in that context 441 00:23:13,160 --> 00:23:16,400 Speaker 2: that when you have an AI machine that is doing this, 442 00:23:16,560 --> 00:23:19,560 Speaker 2: everyone starts to feel like, is it, I mean, okay, 443 00:23:19,600 --> 00:23:21,600 Speaker 2: this went too far, but is it really a mistake? 444 00:23:21,920 --> 00:23:25,080 Speaker 2: Is diversity, and meaning from their end, is diversity and 445 00:23:25,119 --> 00:23:28,440 Speaker 2: inclusion a part of the algorithm, such that they're going 446 00:23:28,480 --> 00:23:32,960 Speaker 2: to try to create more inclusiveness throughout history, and they're 447 00:23:32,960 --> 00:23:35,200 Speaker 2: going to try to elevate some things? The whole notion 448 00:23:35,280 --> 00:23:38,159 Speaker 2: of a neutral algorithm from the beginning, really the earliest 449 00:23:38,200 --> 00:23:41,280 Speaker 2: days of Google, is really a fiction, just like editorial 450 00:23:41,359 --> 00:23:44,239 Speaker 2: lines at newspapers being neutral is a fiction, and I 451 00:23:44,240 --> 00:23:46,520 Speaker 2: think this goes toward everyone understanding that better. 452 00:23:47,200 --> 00:23:49,880 Speaker 6: Yeah, and I would say this is a natural outgrowth 453 00:23:49,960 --> 00:23:53,920 Speaker 6: of Hamilton, which decided, hey, we're going to put minority 454 00:23:54,000 --> 00:23:57,240 Speaker 6: actors into the roles of historical characters, which then was 455 00:23:57,280 --> 00:23:59,840 Speaker 6: followed by, what is the show, Bridgerton, that they make 456 00:24:00,040 --> 00:24:02,800 Speaker 6: such a big deal about, hey, this is a story 457 00:24:02,840 --> 00:24:07,520 Speaker 6: about eighteenth century England, but the race of the characters 458 00:24:07,640 --> 00:24:11,199 Speaker 6: really doesn't matter at all, which is a form of 459 00:24:11,240 --> 00:24:13,240 Speaker 6: color blindness, which you're not supposed to do, which is 460 00:24:13,280 --> 00:24:15,879 Speaker 6: its own interesting story we've talked about on this show. 461 00:24:15,880 --> 00:24:17,080 Speaker 5: Buck, Hannibal. 462 00:24:17,200 --> 00:24:20,320 Speaker 6: I believe they're making a movie with Denzel Washington playing Hannibal, 463 00:24:21,000 --> 00:24:24,120 Speaker 6: which is not accurately reflective, obviously, of what his skin 464 00:24:24,240 --> 00:24:27,119 Speaker 6: color would have been. Cleopatra, I think they just 465 00:24:27,160 --> 00:24:30,720 Speaker 6: did recently, and it did horribly on Netflix. 466 00:24:30,720 --> 00:24:31,399 Speaker 6: No one wanted to watch it. 467 00:24:31,440 --> 00:24:34,359 Speaker 2: I would just add, my issue with it, look, you and 468 00:24:34,359 --> 00:24:37,480 Speaker 2: I both love Denzel Washington as an actor. I think 469 00:24:37,480 --> 00:24:40,880 Speaker 2: he's one of the best living actors today, with 470 00:24:40,880 --> 00:24:42,920 Speaker 2: one of the most impressive bodies of work. I think 471 00:24:42,920 --> 00:24:44,480 Speaker 2: he may be a fantastic Hannibal. 472 00:24:44,840 --> 00:24:45,639 Speaker 3: I actually don't really have 473 00:24:45,640 --> 00:24:46,400 Speaker 5: an issue with it.
474 00:24:46,560 --> 00:24:50,240 Speaker 2: My issue is, I want people to at least understand, 475 00:24:50,560 --> 00:24:54,520 Speaker 2: or want people to be taught, that Carthage, the 476 00:24:54,560 --> 00:24:57,560 Speaker 2: Carthaginians, were not North Africans in the way we think 477 00:24:57,560 --> 00:25:01,240 Speaker 2: of them now, which would be predominantly sort of olive 478 00:25:01,280 --> 00:25:05,600 Speaker 2: skinned Muslim, right, I mean, or, you know, tan complexion Muslims. 479 00:25:06,960 --> 00:25:09,679 Speaker 2: They were white, they were Greeks, effectively. They would have 480 00:25:09,720 --> 00:25:11,679 Speaker 2: looked very much like the Greeks looked, or like the 481 00:25:11,760 --> 00:25:15,119 Speaker 2: Romans looked. And as long as people understand that history, 482 00:25:15,160 --> 00:25:16,040 Speaker 2: I have less of an issue with it. 483 00:25:16,119 --> 00:25:17,400 Speaker 3: But very few people do. 484 00:25:17,920 --> 00:25:20,159 Speaker 2: And so when you start introducing these things into the 485 00:25:20,200 --> 00:25:24,680 Speaker 2: popular culture, it erases the historical reality. And I think 486 00:25:24,720 --> 00:25:26,920 Speaker 2: some of that is intentional. And I was somebody who 487 00:25:26,920 --> 00:25:31,000 Speaker 2: said early on, I think Hamilton is, very honestly, 488 00:25:31,040 --> 00:25:33,480 Speaker 2: kind of a strange premise in a lot of ways. 489 00:25:33,480 --> 00:25:34,960 Speaker 5: Keep in mind, the only white 490 00:25:34,800 --> 00:25:37,000 Speaker 2: person in it is the King of England, who's terrible, right, 491 00:25:37,040 --> 00:25:39,679 Speaker 2: who's like the bad guy, which, I think, if you 492 00:25:39,680 --> 00:25:42,240 Speaker 2: did that in any other context, people would recognize that, 493 00:25:42,520 --> 00:25:44,880 Speaker 2: that it'd make them feel a little bit uncomfortable. But I also 494 00:25:44,920 --> 00:25:47,480 Speaker 2: just thought it wasn't good, and what bothered me, and 495 00:25:47,560 --> 00:25:49,080 Speaker 2: I really mean that, as a piece of art, I 496 00:25:49,080 --> 00:25:51,159 Speaker 2: didn't think it was good. And what bothered me was, 497 00:25:51,560 --> 00:25:55,520 Speaker 2: I actually thought it was crap, but that you were supposed 498 00:25:55,119 --> 00:25:55,840 Speaker 3: to say it was good. 499 00:25:55,880 --> 00:25:57,560 Speaker 2: Like, if you didn't say it was good, there was 500 00:25:57,600 --> 00:26:01,040 Speaker 2: something wrong with you. That felt very Soviet to me, right? 501 00:26:01,040 --> 00:26:03,320 Speaker 2: It felt like everyone has to stand and clap because 502 00:26:03,359 --> 00:26:04,720 Speaker 2: Stalin likes the symphony. 503 00:26:05,600 --> 00:26:08,600 Speaker 6: I also would say, I'm not aware, and I would, 504 00:26:08,640 --> 00:26:12,040 Speaker 6: I would love it if somebody did know this, is there 505 00:26:12,040 --> 00:26:15,919 Speaker 6: any other country in the world that is obsessed with 506 00:26:16,320 --> 00:26:20,960 Speaker 6: making historical characters a different race than they would otherwise be?
507 00:26:21,760 --> 00:26:24,200 Speaker 6: In other words, if you're making a movie in India 508 00:26:24,280 --> 00:26:27,959 Speaker 6: right now and you're doing a story about Indian history, 509 00:26:28,480 --> 00:26:31,520 Speaker 6: would there be any call in what they call Bollywood 510 00:26:31,880 --> 00:26:34,600 Speaker 6: to come back in and cast someone who is historically 511 00:26:34,680 --> 00:26:36,240 Speaker 6: Indian as a different race? 512 00:26:36,520 --> 00:26:41,920 Speaker 2: Well, you also get to, Clay, history is often very 513 00:26:42,320 --> 00:26:44,879 Speaker 2: non-inclusive, right? I mean, if you're going to go 514 00:26:44,920 --> 00:26:50,000 Speaker 2: back in history and look for great female leadership two 515 00:26:50,080 --> 00:26:52,680 Speaker 2: thousand years ago, you can find it here and there, 516 00:26:52,840 --> 00:26:54,720 Speaker 2: but there's not going to be a lot of it, right. 517 00:26:54,640 --> 00:26:57,359 Speaker 6: Joan of Arc, Cleopatra. I mean there's like two or 518 00:26:57,359 --> 00:26:59,800 Speaker 6: three characters, right? 519 00:26:59,560 --> 00:27:00,200 Speaker 3: I said fifteen hundred years ago. 520 00:27:00,280 --> 00:27:03,160 Speaker 2: But yeah, if you go back far enough, what you'll 521 00:27:03,160 --> 00:27:06,960 Speaker 2: find is that a lot of history is actually quite exclusionary. 522 00:27:07,040 --> 00:27:11,280 Speaker 2: You could even argue mankind was predatory against mankind, and 523 00:27:11,720 --> 00:27:15,840 Speaker 2: there was no effort made to balance things out. And 524 00:27:15,920 --> 00:27:19,880 Speaker 2: so if you're looking at what actually happened, who discovered stuff, 525 00:27:20,000 --> 00:27:24,119 Speaker 2: who conquered stuff, who found stuff, it's not going to 526 00:27:24,200 --> 00:27:29,320 Speaker 2: be what the sociology department at Brown University wants it 527 00:27:29,320 --> 00:27:31,800 Speaker 2: to be. And that's a challenge that they're always going 528 00:27:31,800 --> 00:27:33,879 Speaker 2: to face, which is why I think there's such an 529 00:27:33,920 --> 00:27:38,399 Speaker 2: obsession with making, you know, a tenth century Viking earl 530 00:27:38,840 --> 00:27:41,440 Speaker 2: a black woman. There were no black women who were 531 00:27:41,480 --> 00:27:43,240 Speaker 2: tenth century Viking earls. 532 00:27:43,840 --> 00:27:49,600 Speaker 6: Western civilization triumphed, thankfully. It's why we all have democracy, republics, freedom, 533 00:27:49,800 --> 00:27:53,520 Speaker 6: freedom of speech. All those things are good. Good cultural appropriation. 534 00:27:54,119 --> 00:27:56,359 Speaker 6: Here's kind of a summing it up. And I'm open 535 00:27:56,359 --> 00:27:58,119 Speaker 6: to your calls, because some of you out there probably 536 00:27:58,119 --> 00:28:01,680 Speaker 6: are far more sophisticated in terms of your AI knowledge 537 00:28:01,680 --> 00:28:02,720 Speaker 6: than either Buck or 538 00:28:02,520 --> 00:28:03,159 Speaker 5: myself would be.
539 00:28:03,920 --> 00:28:06,840 Speaker 6: On a scale of one to ten, I'm about a 540 00:28:06,960 --> 00:28:09,880 Speaker 6: nine on being concerned right now, based on what I'm 541 00:28:09,920 --> 00:28:13,840 Speaker 6: seeing, about what the impact of these AI algorithms is 542 00:28:13,880 --> 00:28:16,080 Speaker 6: going to be, because I think we're finally catching up, 543 00:28:16,119 --> 00:28:18,760 Speaker 6: Buck, with Twitter, where Elon Musk is giving us some 544 00:28:18,960 --> 00:28:21,879 Speaker 6: form of a free expression site, and I think that 545 00:28:21,880 --> 00:28:24,119 Speaker 6: can be very helpful. It took a decade for that 546 00:28:24,200 --> 00:28:27,399 Speaker 6: to happen on social media, fifteen years. I don't know 547 00:28:27,480 --> 00:28:29,760 Speaker 6: that there's going to be the equivalent in AI. I 548 00:28:29,760 --> 00:28:32,040 Speaker 6: hope I'm wrong, but it seems to me like we're 549 00:28:32,080 --> 00:28:35,240 Speaker 6: just creating new, woker algorithms that could be even more impactful. 550 00:28:35,920 --> 00:28:39,880 Speaker 2: Absolutely, and when you're talking about AI, you're not just 551 00:28:40,000 --> 00:28:43,600 Speaker 2: talking about the editorial choice of what to put up 552 00:28:43,640 --> 00:28:49,320 Speaker 2: on the page. You're talking about the ability to fabricate primary 553 00:28:49,400 --> 00:28:55,600 Speaker 2: source material, archival footage, archival photos, all kinds of texts 554 00:28:55,600 --> 00:28:59,040 Speaker 2: that would be, you know, aged-looking, and AI can 555 00:28:59,080 --> 00:29:03,600 Speaker 2: make it look real, right? So our perception of the past. 556 00:29:03,840 --> 00:29:08,520 Speaker 2: I don't think people should leave out the possibility 557 00:29:08,560 --> 00:29:10,719 Speaker 2: here, because I think it's very real that there are 558 00:29:10,760 --> 00:29:14,720 Speaker 2: people on the left who would feel, ideologically, they would 559 00:29:14,720 --> 00:29:17,880 Speaker 2: feel righteous in doing their version. 560 00:29:17,920 --> 00:29:19,280 Speaker 3: You know what the Soviets used 561 00:29:19,160 --> 00:29:21,120 Speaker 2: to do when they would eliminate something? 562 00:29:21,160 --> 00:29:23,920 Speaker 2: They kept, you know, pretty detailed records. Sometimes they would 563 00:29:24,000 --> 00:29:27,760 Speaker 2: use a razor blade, Clay, to remove the name from 564 00:29:27,760 --> 00:29:30,600 Speaker 2: the paper, because they just figured, you know, we're going to 565 00:29:30,680 --> 00:29:32,920 Speaker 2: excise it that way, so it's like it was never 566 00:29:33,000 --> 00:29:35,320 Speaker 2: even there. Yeah, you know there's still a hole, right, 567 00:29:35,360 --> 00:29:39,200 Speaker 2: but it doesn't matter. It's gone forever. It feels very 568 00:29:39,240 --> 00:29:41,800 Speaker 2: Soviet to me that they want to try to change 569 00:29:41,800 --> 00:29:44,960 Speaker 2: what our perception of history is, because they recognize that 570 00:29:45,000 --> 00:29:47,840 Speaker 2: controlling the past gives you power over the narrative of 571 00:29:47,880 --> 00:29:48,360 Speaker 2: the present. 572 00:29:49,880 --> 00:29:52,800 Speaker 6: I think everybody out there should be terrified. I'd be 573 00:29:52,840 --> 00:29:56,840 Speaker 6: interested in your calls. And again, so many kids,
the 574 00:29:56,880 --> 00:29:59,880 Speaker 6: power of AI is they're going to blindly accept what 575 00:30:00,080 --> 00:30:02,840 Speaker 6: they are told. And that is scary no matter what 576 00:30:02,920 --> 00:30:04,840 Speaker 6: the concept is. But I think it's even more so 577 00:30:04,840 --> 00:30:07,680 Speaker 6: because at least Google, Buck, gives you the opportunity. When 578 00:30:07,680 --> 00:30:09,880 Speaker 6: you do a Google search, you can scroll down. A 579 00:30:09,880 --> 00:30:12,080 Speaker 6: lot of people click on the first thing, whatever it is, 580 00:30:12,360 --> 00:30:14,120 Speaker 6: but you can scroll down and you can look at 581 00:30:14,160 --> 00:30:16,120 Speaker 6: the first seven or eight, or even the first page 582 00:30:16,160 --> 00:30:18,080 Speaker 6: of results, and make a choice about the source that you 583 00:30:18,120 --> 00:30:18,640 Speaker 6: want to pick. 584 00:30:19,000 --> 00:30:22,800 Speaker 2: Not here. You know, it would be a fascinating Venn diagram. 585 00:30:23,240 --> 00:30:28,880 Speaker 2: People who enthusiastically masked up, people who had Ukraine flags 586 00:30:28,880 --> 00:30:33,120 Speaker 2: in their bio, and people who openly loved Hamilton. 587 00:30:34,520 --> 00:30:37,960 Speaker 2: These are all people that do whatever the machine tells 588 00:30:38,000 --> 00:30:38,600 Speaker 2: them to do. 589 00:30:39,560 --> 00:30:40,960 Speaker 6: One thing, as we go to break here in the 590 00:30:41,000 --> 00:30:44,080 Speaker 6: first segment, I want someone smart to do a country 591 00:30:44,080 --> 00:30:47,760 Speaker 6: and Western version of the Obama administration, and I want 592 00:30:47,800 --> 00:30:50,000 Speaker 6: them to have a white guy playing Obama and see 593 00:30:50,040 --> 00:30:53,280 Speaker 6: what the result is. Barack and Michelle Obama, country and 594 00:30:53,320 --> 00:30:56,960 Speaker 6: Western version. I want white people playing Barack and Michelle 595 00:30:57,040 --> 00:31:00,280 Speaker 6: Obama in a country and Western version of 596 00:31:00,320 --> 00:31:02,800 Speaker 6: their administration, and see what the reaction would be. 597 00:31:03,000 --> 00:31:06,680 Speaker 1: You're listening to the best of Clay Travis and Buck Sexton. 598 00:31:06,920 --> 00:31:08,280 Speaker 5: I didn't see the Oscars. 599 00:31:08,480 --> 00:31:11,120 Speaker 6: I was out running around with my family down here, 600 00:31:11,800 --> 00:31:15,760 Speaker 6: but I did this weekend, Buck, watch, I want to 601 00:31:15,760 --> 00:31:17,480 Speaker 6: give people a movie recommendation. 602 00:31:18,600 --> 00:31:21,680 Speaker 5: And you had already seen this. I had not. I 603 00:31:21,720 --> 00:31:23,040 Speaker 5: watched the movie 604 00:31:22,840 --> 00:31:26,160 Speaker 6: Dune, which is based on the book, and it came 605 00:31:26,160 --> 00:31:28,440 Speaker 6: out a couple of years ago. I had not seen it. 606 00:31:28,520 --> 00:31:31,040 Speaker 6: My sixteen year old said, Dad, you're really gonna like this. 607 00:31:31,320 --> 00:31:34,520 Speaker 6: I think you're really gonna enjoy it. Dune two is 608 00:31:34,560 --> 00:31:38,600 Speaker 6: out now in theaters, and Greg on our team said 609 00:31:38,640 --> 00:31:39,680 Speaker 6: he's already seen it. 610 00:31:39,760 --> 00:31:39,960 Speaker 5: Buck.
611 00:31:39,960 --> 00:31:42,440 Speaker 6: As soon as we finish this show, I'm taking my
612 00:31:42,520 --> 00:31:45,480 Speaker 6: sixteen year old. We're going to watch Dune two at
613 00:31:45,480 --> 00:31:50,440 Speaker 6: the IMAX. The reviews have been incredible. Dune one, which
614 00:31:50,440 --> 00:31:52,560 Speaker 6: again I'm a couple of years behind on watching this,
615 00:31:52,680 --> 00:31:55,200 Speaker 6: I somehow missed it during all the chaos of twenty
616 00:31:55,240 --> 00:32:00,720 Speaker 6: twenty one. Dune one, fabulous movie. You had seen it.
617 00:32:00,840 --> 00:32:03,640 Speaker 6: You told me you really liked it too. Word is
618 00:32:03,760 --> 00:32:05,880 Speaker 6: this new movie that is out, by the way, no
619 00:32:06,080 --> 00:32:10,480 Speaker 6: woke angles on it, no, like, hidden, that I can see,
620 00:32:10,640 --> 00:32:15,440 Speaker 6: hidden awful messages, just what Hollywood did in the eighties
621 00:32:15,480 --> 00:32:18,440 Speaker 6: and the nineties: a really good movie that, if your
622 00:32:18,520 --> 00:32:21,480 Speaker 6: kids are thirteen or older, you can take them to
623 00:32:21,520 --> 00:32:24,239 Speaker 6: go watch it and enjoy it, which is what I
624 00:32:24,280 --> 00:32:27,320 Speaker 6: think should be the goal of every movie, right, just
625 00:32:27,360 --> 00:32:29,960 Speaker 6: about. So I'm excited. I can't wait. I don't remember
626 00:32:29,960 --> 00:32:32,680 Speaker 6: the last time I was this excited to go see
627 00:32:32,680 --> 00:32:35,120 Speaker 6: a movie. As soon as we finish the show, I'm
628 00:32:35,160 --> 00:32:35,800 Speaker 6: headed to the theater.
629 00:32:36,520 --> 00:32:40,280 Speaker 2: I remember reading the original Dune novel years ago. It's,
630 00:32:40,840 --> 00:32:43,360 Speaker 2: by many, I think, considered the greatest sci fi novel
631 00:32:43,440 --> 00:32:45,880 Speaker 2: ever written, or certainly in the top ten. You know,
632 00:32:45,920 --> 00:32:48,360 Speaker 2: people would probably think there's some H.G. Wells stuff and
633 00:32:48,400 --> 00:32:50,600 Speaker 2: other things they would throw in there. But Dune is
634 00:32:50,920 --> 00:32:54,080 Speaker 2: an incredibly fun read and
635 00:32:54,200 --> 00:32:55,320 Speaker 3: holds up really well.
636 00:32:55,640 --> 00:32:57,560 Speaker 2: And then there are Children of Dune, and there's some
637 00:32:57,640 --> 00:32:58,440 Speaker 2: of the follow ups.
638 00:32:58,800 --> 00:33:00,000 Speaker 5: The original movie,
639 00:33:00,160 --> 00:33:04,080 Speaker 2: I remember watching it and thinking to myself, you know,
640 00:33:04,200 --> 00:33:06,920 Speaker 2: they could do this, like they could just make movies
641 00:33:06,960 --> 00:33:10,200 Speaker 2: again that are supposed to be well executed and entertaining.
642 00:33:10,960 --> 00:33:16,640 Speaker 2: But Hollywood has become so diluted with its narcissism and
643 00:33:16,680 --> 00:33:21,040 Speaker 2: its DEI seminars and its belief that it needs to
644 00:33:21,040 --> 00:33:23,720 Speaker 2: be political, like deeply political, not just
645 00:33:23,760 --> 00:33:27,440 Speaker 2: political, partisan, along with everything else in our society, I
646 00:33:27,440 --> 00:33:30,560 Speaker 2: think is the big problem. So I actually may go.
647 00:33:30,600 --> 00:33:33,080 Speaker 2: Carrie's not into sci fi stuff. It's okay, I've always
648 00:33:33,200 --> 00:33:34,800 Speaker 2: liked sci fi. I'll probably go see Dune
649 00:33:34,800 --> 00:33:36,800 Speaker 2: two on my own at some point. I actually think
650 00:33:36,800 --> 00:33:39,440 Speaker 2: going to a movie by yourself on an off day
651 00:33:40,120 --> 00:33:42,320 Speaker 2: at a random time is great, because you really just
652 00:33:42,320 --> 00:33:46,240 Speaker 2: get into the movie. Underrated. Eating alone at the bar
653 00:33:46,280 --> 00:33:49,000 Speaker 2: at a restaurant, underrated. Going to a movie alone when
654 00:33:49,000 --> 00:33:50,600 Speaker 2: you really want to see the movie and you've got
655 00:33:50,600 --> 00:33:53,360 Speaker 2: a couple of hours, and you know, you just, unlike
656 00:33:53,400 --> 00:33:55,400 Speaker 2: eating an ice cream cone alone as a guy, apparently,
657 00:33:55,440 --> 00:33:58,760 Speaker 2: which Clay is not okay with. But I would
658 00:33:58,800 --> 00:34:00,840 Speaker 2: say it's funny to me, because last night was
659 00:34:01,040 --> 00:34:03,920 Speaker 2: the Oscars. I mean, most of those movies I
660 00:34:03,960 --> 00:34:08,680 Speaker 2: haven't even seen. Oppenheimer completely dominated the night. Oppenheimer is,
661 00:34:09,000 --> 00:34:10,000 Speaker 3: I'll put aside my
662 00:34:10,120 --> 00:34:15,240 Speaker 2: historical political objections to it, which are substantial. It's just boring.
663 00:34:15,920 --> 00:34:18,319 Speaker 2: It's just boring. Like, there's all this stuff about his
664 00:34:18,400 --> 00:34:20,600 Speaker 2: private life, and am I really a communist?
665 00:34:20,719 --> 00:34:21,120 Speaker 5: Or what is that?
666 00:34:21,239 --> 00:34:24,520 Speaker 2: You know, that's boring. The Robert Downey Junior stuff
667 00:34:24,560 --> 00:34:26,520 Speaker 2: is boring. There's like a middle portion of that movie.
668 00:34:26,560 --> 00:34:28,799 Speaker 2: It's a three hour long movie. I also, I don't want
669 00:34:28,800 --> 00:34:31,279 Speaker 2: three hour long movies anymore. I haven't for a long time.
670 00:34:32,120 --> 00:34:34,680 Speaker 2: It's really hard, with attention spans getting so used to
671 00:34:34,719 --> 00:34:37,000 Speaker 2: on demand, to have someone sit for three, three and
672 00:34:37,040 --> 00:34:39,000 Speaker 2: a half hours for anything, I think, in one sitting.
673 00:34:40,560 --> 00:34:42,560 Speaker 2: You know, that's why The Lord of the Rings was great.
674 00:34:42,560 --> 00:34:44,160 Speaker 2: But when you went to see the Hobbit movies, you're like,
675 00:34:44,200 --> 00:34:46,040 Speaker 2: this is the first movie. It's like four hours long.
676 00:34:46,080 --> 00:34:47,880 Speaker 2: What's going on here? I know there's two more coming,
677 00:34:48,680 --> 00:34:52,200 Speaker 2: and that wasn't even the director's cut. Anyway, Oppenheimer. The
678 00:34:52,200 --> 00:34:54,200 Speaker 2: middle section where they're testing the bomb and then getting
679 00:34:54,239 --> 00:34:56,400 Speaker 2: ready to do that, that's well done and interesting.
680 00:34:56,840 --> 00:34:57,480 Speaker 3: But I don't know.
681 00:34:57,640 --> 00:35:00,480 Speaker 2: I actually think, after last night, I have to say it,
682 00:35:00,560 --> 00:35:03,759 Speaker 2: I think Christopher Nolan might have taken, uh, who's the
683 00:35:03,800 --> 00:35:05,280 Speaker 2: guy who did Pulp Fiction?
684 00:35:05,640 --> 00:35:07,440 Speaker 5: Who's the director? Tarantino?
685 00:35:08,200 --> 00:35:10,279 Speaker 2: I think that Christopher Nolan may have taken the crown
686 00:35:10,280 --> 00:35:12,600 Speaker 2: from Tarantino as the most overrated
687 00:35:12,040 --> 00:35:12,959 Speaker 5: director of all time.
688 00:35:13,520 --> 00:35:16,880 Speaker 2: That's how I feel after last night. Yeah, yeah,
689 00:35:17,080 --> 00:35:20,440 Speaker 2: I'm just saying today. Yeah, that's what, because it's
690 00:35:21,280 --> 00:35:23,399 Speaker 2: not a good watch. It's just not fun to watch.
691 00:35:23,400 --> 00:35:25,120 Speaker 2: And everyone's like, oh, I really liked it. Now you've
692 00:35:25,120 --> 00:35:26,640 Speaker 2: been told to like it. It's not very good.
693 00:35:27,400 --> 00:35:29,560 Speaker 6: I didn't see any of the movies that were nominated
694 00:35:29,560 --> 00:35:32,000 Speaker 6: for Best Picture this year, and this is,
695 00:35:31,760 --> 00:35:34,359 Speaker 2: Isn't the, isn't the Killers of the Flower Moon, isn't
696 00:35:34,360 --> 00:35:36,880 Speaker 2: it like eight hours long? It's crazy, it's really long.
697 00:35:37,320 --> 00:35:39,719 Speaker 2: I didn't see that either. Once I had kids, I
698 00:35:39,760 --> 00:35:41,680 Speaker 2: see a lot of kids movies. So if you want
699 00:35:41,680 --> 00:35:43,480 Speaker 2: me to, you know, give a review on kids movies. I
700 00:35:43,480 --> 00:35:45,720 Speaker 2: got a funny story for you, Buck, on the solo movie.
701 00:35:46,400 --> 00:35:49,000 Speaker 2: When I was going out all the time to Fox Sports,
702 00:35:49,239 --> 00:35:50,759 Speaker 2: you know, I would stay, for those of you who
703 00:35:50,840 --> 00:35:53,920 Speaker 2: know LA, there was a hotel that looks over the
704 00:35:53,960 --> 00:35:57,080 Speaker 2: Fox lot, right at the intersection of Pico and Motor,
705 00:35:57,520 --> 00:35:58,880 Speaker 2: and it's the Intercontinental Hotel.
706 00:35:58,920 --> 00:36:01,279 Speaker 6: You could walk back and forth. There's also the Westfield
707 00:36:01,320 --> 00:36:03,719 Speaker 6: Mall there, and they have a great movie theater. So
708 00:36:03,800 --> 00:36:07,040 Speaker 6: every now and then I would have time off in
709 00:36:07,120 --> 00:36:10,000 Speaker 6: between television shows or whatever else, and I would walk
710 00:36:10,040 --> 00:36:12,120 Speaker 6: and go see a movie. I went to go see,
711 00:36:12,120 --> 00:36:14,320 Speaker 6: do you remember Gone Girl? I went to go see
712 00:36:14,320 --> 00:36:18,600 Speaker 6: Gone Girl by myself. They changed our filming time. I
713 00:36:18,640 --> 00:36:21,160 Speaker 6: had my phone off. They couldn't get in touch with me
714 00:36:21,560 --> 00:36:25,359 Speaker 6: for a Fox Sports television show. Coach Dave Wannstedt, many
715 00:36:25,400 --> 00:36:28,640 Speaker 6: of you will remember, awesome guy, was there ready to tape.
716 00:36:28,680 --> 00:36:31,040 Speaker 6: They couldn't figure out where I was. And when they
717 00:36:31,040 --> 00:36:32,719 Speaker 6: finally got me on, they were like, you're gonna have
718 00:36:32,760 --> 00:36:34,759 Speaker 6: to tell coach where you were. He's really fired up
719 00:36:34,800 --> 00:36:36,400 Speaker 6: about this. And I was like, well, you're gonna have
720 00:36:36,440 --> 00:36:38,240 Speaker 6: to tell him that I went to see Gone Girl
721 00:36:38,600 --> 00:36:41,160 Speaker 6: by myself at the Westfield Mall and that's why he
722 00:36:41,239 --> 00:36:43,120 Speaker 6: was late. And there was a pause, and they were like,
723 00:36:43,160 --> 00:36:44,359 Speaker 6: I think you got to come up with a better
724 00:36:44,400 --> 00:36:46,560 Speaker 6: story than that. You can't have gone to see Gone Girl.
725 00:36:46,960 --> 00:36:48,839 Speaker 2: At least you weren't in line alone at the Dairy
726 00:36:48,920 --> 00:36:50,400 Speaker 2: Queen, Clay. That's true.