1 00:00:00,960 --> 00:00:06,240 Speaker 1: Bold, irreverent, and occasionally random. The Sunday Hang with Clay and 2 00:00:06,360 --> 00:00:08,560 Speaker 1: Buck podcast starts now. 3 00:00:10,119 --> 00:00:13,119 Speaker 2: I want to start with a story that for some 4 00:00:13,280 --> 00:00:16,720 Speaker 2: of you out there may not be on the top 5 00:00:16,760 --> 00:00:19,960 Speaker 2: of your radar, and you may not be thinking about 6 00:00:20,040 --> 00:00:23,599 Speaker 2: what the impact of this may be. But as we 7 00:00:23,760 --> 00:00:27,560 Speaker 2: come into twenty twenty four, and as I bet your 8 00:00:27,640 --> 00:00:32,080 Speaker 2: kids and grandkids become more and more active in experiencing 9 00:00:32,120 --> 00:00:34,800 Speaker 2: it online, I want to talk about what's going on 10 00:00:34,880 --> 00:00:39,559 Speaker 2: with AI, artificial intelligence, the growth that we are 11 00:00:39,600 --> 00:00:43,479 Speaker 2: seeing there, and the degree, Buck, to which what 12 00:00:43,600 --> 00:00:49,920 Speaker 2: I am seeing is just sort of a recapitulation 13 00:00:50,240 --> 00:00:53,280 Speaker 2: of all of the flaws that existed in social media 14 00:00:53,880 --> 00:00:58,160 Speaker 2: now being recreated in AI. In the same 15 00:00:58,240 --> 00:01:02,240 Speaker 2: way that every social media site, whether it's Facebook, Instagram, 16 00:01:02,320 --> 00:01:06,399 Speaker 2: or Twitter, in some way designs an algorithm, by a human, 17 00:01:06,800 --> 00:01:10,559 Speaker 2: that determines what you do and do not see, so does AI. 18 00:01:10,880 --> 00:01:14,880 Speaker 2: There's a scary, ridiculous, somewhat maybe a little bit funny, 19 00:01:14,680 --> 00:01:19,399 Speaker 2: but also terrifying story of what's going on with Google's 20 00:01:19,480 --> 00:01:22,880 Speaker 2: AI service, Buck, and I know we were talking about 21 00:01:22,880 --> 00:01:25,880 Speaker 2: this off air, so I know you've seen it too. Basically, 22 00:01:25,920 --> 00:01:29,759 Speaker 2: they designed it so there's no way that you can 23 00:01:29,800 --> 00:01:32,959 Speaker 2: get an image of a white person no matter what 24 00:01:33,040 --> 00:01:36,040 Speaker 2: your prompt or request is. And for those of you 25 00:01:36,120 --> 00:01:38,920 Speaker 2: out there who are not familiar with AI at all, 26 00:01:39,560 --> 00:01:41,960 Speaker 2: trying to explain it in just a couple of sentences: 27 00:01:42,440 --> 00:01:47,559 Speaker 2: it's a video- or image-based version of search, also 28 00:01:48,800 --> 00:01:52,200 Speaker 2: very strong textually, but it's moving more and more into 29 00:01:53,840 --> 00:01:57,240 Speaker 2: imagery and videos, and the idea is basically, you give 30 00:01:57,280 --> 00:02:00,440 Speaker 2: it a prompt. For instance, with Google, where people were 31 00:02:00,600 --> 00:02:03,680 Speaker 2: using this and being able to expose its flaws, it 32 00:02:03,760 --> 00:02:07,720 Speaker 2: was saying, hey, Google, give me a picture of the Pope, 33 00:02:07,840 --> 00:02:12,519 Speaker 2: and the Pope pictures that the AI was returning were all 34 00:02:12,600 --> 00:02:17,520 Speaker 2: minority figures. If you asked for a Viking, a picture 35 00:02:17,680 --> 00:02:20,919 Speaker 2: of a Viking, the people that you were getting back 36 00:02:21,280 --> 00:02:25,400 Speaker 2: were black Vikings, who obviously did not exist.
If you 37 00:02:25,560 --> 00:02:29,240 Speaker 2: asked for pictures of the Founding Fathers, they were giving 38 00:02:29,280 --> 00:02:32,960 Speaker 2: you pictures back of some black people sitting at the 39 00:02:33,000 --> 00:02:36,560 Speaker 2: table with the Founding Fathers. And I would guess, Buck, 40 00:02:36,960 --> 00:02:42,840 Speaker 2: that the intent here is to avoid being racist. And 41 00:02:43,080 --> 00:02:46,960 Speaker 2: they wrote in code which basically made it impossible for 42 00:02:47,120 --> 00:02:51,480 Speaker 2: a white person to be revealed when it was answering 43 00:02:51,800 --> 00:02:55,040 Speaker 2: these prompts. To me, that's maybe 44 00:02:55,160 --> 00:02:59,720 Speaker 2: somewhat a little funny, also scary, but to a larger 45 00:03:00,240 --> 00:03:05,440 Speaker 2: extent, when you know that kids growing up today are 46 00:03:05,480 --> 00:03:09,320 Speaker 2: going to be using this as a default Google search 47 00:03:09,880 --> 00:03:12,480 Speaker 2: instead of the way Google works now. You type in, 48 00:03:13,520 --> 00:03:16,320 Speaker 2: you know, hotel on South Beach or something, and you 49 00:03:16,360 --> 00:03:18,960 Speaker 2: get a bunch of different hotels that would show up 50 00:03:18,960 --> 00:03:21,320 Speaker 2: as links. Or you type in, you know, who 51 00:03:21,400 --> 00:03:24,200 Speaker 2: was the eighth President of the United States, and you 52 00:03:24,240 --> 00:03:26,640 Speaker 2: get a prompt that allows you to go click on links. 53 00:03:27,120 --> 00:03:31,360 Speaker 2: With AI, you're not actually going and reading and discovering the information 54 00:03:31,480 --> 00:03:35,440 Speaker 2: from your request. It's being given to you. Search is 55 00:03:35,560 --> 00:03:40,320 Speaker 2: being created more powerfully than maybe it ever has been before, 56 00:03:40,960 --> 00:03:44,760 Speaker 2: and it's making these algorithms even more powerful. And so 57 00:03:44,920 --> 00:03:48,960 Speaker 2: as a result, they are now stopping the Google search, 58 00:03:49,200 --> 00:03:53,120 Speaker 2: Buck, on AI. But are you troubled by this? Because 59 00:03:53,160 --> 00:03:55,880 Speaker 2: I think it could be a huge story for twenty 60 00:03:55,960 --> 00:04:00,600 Speaker 2: twenty four, and all of these AI woke, I 61 00:04:00,680 --> 00:04:05,640 Speaker 2: would say, algorithms are going to artificially distort the real 62 00:04:05,760 --> 00:04:08,280 Speaker 2: reality in a similar way that I think we have 63 00:04:08,320 --> 00:04:13,960 Speaker 2: seen with social media, and I think that's gonna be 64 00:04:13,960 --> 00:04:14,560 Speaker 2: the challenge. 65 00:04:15,360 --> 00:04:18,760 Speaker 1: Yeah, what they're saying about this is that it's an 66 00:04:18,880 --> 00:04:22,800 Speaker 1: overcorrection, right? They're saying that they were trying to 67 00:04:22,839 --> 00:04:27,840 Speaker 1: make sure that racist things didn't happen, and so they 68 00:04:27,880 --> 00:04:29,599 Speaker 1: made it so that you're not getting 69 00:04:29,600 --> 00:04:32,239 Speaker 1: any images of anyone who is white. But this also 70 00:04:32,320 --> 00:04:35,640 Speaker 1: is occurring in a broader context, right? There's something else 71 00:04:36,040 --> 00:04:38,760 Speaker 1: that's going on here.
I mean, I think everyone has 72 00:04:39,320 --> 00:04:42,760 Speaker 1: seen now, particularly the last few years, but it stretches 73 00:04:42,800 --> 00:04:46,159 Speaker 1: back about a decade, that diversity and inclusion is 74 00:04:46,160 --> 00:04:49,800 Speaker 1: effectively a religious belief, that people feel there is a 75 00:04:49,839 --> 00:04:54,640 Speaker 1: need to fill our society, our history, everything with the 76 00:04:54,680 --> 00:04:59,599 Speaker 1: tenets of diversity and inclusion, especially anything that has to 77 00:04:59,640 --> 00:05:04,480 Speaker 1: do with pop culture. I mentioned before on this show, 78 00:05:05,320 --> 00:05:07,400 Speaker 1: I watch, you know, I like anything that has Vikings 79 00:05:07,400 --> 00:05:09,520 Speaker 1: in it. And I should note that there 80 00:05:09,600 --> 00:05:13,360 Speaker 1: were some people that did a Viking search, and sure 81 00:05:13,480 --> 00:05:18,320 Speaker 1: enough the Vikings were, you know, people of dark, dark skin, 82 00:05:18,920 --> 00:05:22,440 Speaker 1: and that's a bit unusual, right? I mean, historically 83 00:05:22,520 --> 00:05:28,159 Speaker 1: that would not be accurate. And the reality here is, 84 00:05:28,600 --> 00:05:32,360 Speaker 1: when they were doing the Netflix show, when they were 85 00:05:32,400 --> 00:05:36,920 Speaker 1: trying to get people interested in, I guess, the beginnings 86 00:05:36,920 --> 00:05:39,200 Speaker 1: of it (I think it's Vikings, I forget what the full 87 00:05:39,200 --> 00:05:42,880 Speaker 1: title is, Vikings something or other, Clay), they cast a 88 00:05:43,120 --> 00:05:48,720 Speaker 1: black woman as a tenth-century Viking jarl, kind 89 00:05:48,720 --> 00:05:51,840 Speaker 1: of like an earl or a king, who really existed. 90 00:05:52,600 --> 00:05:55,640 Speaker 1: So this stuff is already happening, meaning that there's the 91 00:05:55,680 --> 00:05:59,920 Speaker 1: rewriting of history and pop culture with people who are 92 00:06:01,080 --> 00:06:05,719 Speaker 1: being depicted as non-white. And it's in that context 93 00:06:05,720 --> 00:06:08,960 Speaker 1: that when you have an AI machine that is doing this, 94 00:06:09,120 --> 00:06:12,039 Speaker 1: everyone starts to feel like, I mean, okay, 95 00:06:12,200 --> 00:06:14,160 Speaker 1: this went too far, but is it really a mistake? 96 00:06:14,480 --> 00:06:17,640 Speaker 1: Is diversity, meaning, from their end, is diversity and 97 00:06:17,720 --> 00:06:21,040 Speaker 1: inclusion a part of the algorithm, such that they're going 98 00:06:21,080 --> 00:06:25,560 Speaker 1: to try to create more inclusiveness throughout history, and they're 99 00:06:25,560 --> 00:06:27,760 Speaker 1: going to try to elevate some things? The whole notion 100 00:06:27,839 --> 00:06:30,719 Speaker 1: of a neutral algorithm, from the beginning, really the earliest 101 00:06:30,800 --> 00:06:33,880 Speaker 1: days of Google, is really a fiction, just like editorial 102 00:06:33,920 --> 00:06:36,800 Speaker 1: lines at newspapers being neutral is a fiction. And I 103 00:06:36,839 --> 00:06:39,080 Speaker 1: think this goes toward everyone understanding that better.
104 00:06:39,760 --> 00:06:42,440 Speaker 2: Yeah. And I would say this is a natural outgrowth 105 00:06:42,520 --> 00:06:46,520 Speaker 2: of Hamilton, which decided, hey, we're going to put minority 106 00:06:46,600 --> 00:06:49,800 Speaker 2: characters into the roles of historical characters, which then was 107 00:06:49,839 --> 00:06:52,480 Speaker 2: followed by, what is the show, Bridgerton, where they make 108 00:06:52,560 --> 00:06:55,400 Speaker 2: such a big deal about, hey, this is a story 109 00:06:55,440 --> 00:06:59,880 Speaker 2: about eighteenth-century England, but the race of the character 110 00:07:00,240 --> 00:07:03,800 Speaker 2: really doesn't matter at all, which is a form of 111 00:07:03,800 --> 00:07:05,840 Speaker 2: color blindness, which you're not supposed to do, which is 112 00:07:05,880 --> 00:07:08,440 Speaker 2: its own interesting story we've talked about on this show, 113 00:07:08,480 --> 00:07:11,600 Speaker 2: Buck. Hannibal: I believe they're making a movie with Denzel 114 00:07:11,720 --> 00:07:16,160 Speaker 2: Washington playing Hannibal, which is not accurately reflective, obviously, of 115 00:07:16,160 --> 00:07:19,080 Speaker 2: what his skin color would have been. And Cleopatra, I 116 00:07:19,120 --> 00:07:22,160 Speaker 2: think they just did recently, and it 117 00:07:22,200 --> 00:07:24,000 Speaker 2: did horribly on Netflix. No one wanted to watch it. 118 00:07:24,000 --> 00:07:26,920 Speaker 1: I would just add, and look, you and 119 00:07:26,960 --> 00:07:30,000 Speaker 1: I both love Denzel Washington as an actor. I think 120 00:07:30,080 --> 00:07:33,440 Speaker 1: he's one of the best living actors today, with 121 00:07:33,480 --> 00:07:35,480 Speaker 1: one of the most impressive bodies of work. I think 122 00:07:35,520 --> 00:07:38,120 Speaker 1: he may be a fantastic Hannibal. I actually don't really 123 00:07:38,120 --> 00:07:41,280 Speaker 1: have an issue with it. My issue is I want 124 00:07:41,320 --> 00:07:43,920 Speaker 1: people to at least understand, or want people to be 125 00:07:44,000 --> 00:07:49,320 Speaker 1: taught, that Carthage, the Carthaginians, were not North Africans 126 00:07:49,360 --> 00:07:51,040 Speaker 1: in the way we think of them now, which would 127 00:07:51,080 --> 00:07:55,520 Speaker 1: be predominantly Arab, sort of olive-skinned Muslim, right, I mean, 128 00:07:55,600 --> 00:08:01,720 Speaker 1: or, you know, tan-complexioned Muslims. They were, effectively, Greeks; 129 00:08:01,840 --> 00:08:03,920 Speaker 1: they would have looked very much like the Greeks looked, 130 00:08:03,960 --> 00:08:06,520 Speaker 1: or like the Romans looked. And as long as people 131 00:08:06,600 --> 00:08:09,160 Speaker 1: understand that history, I have less of an issue with it, but very 132 00:08:09,160 --> 00:08:12,200 Speaker 1: few people do, and so when you start introducing these 133 00:08:12,200 --> 00:08:16,720 Speaker 1: things into the popular culture, it erases the historical reality. 134 00:08:16,800 --> 00:08:18,920 Speaker 1: And I think some of that is intentional. And I 135 00:08:19,000 --> 00:08:21,800 Speaker 1: was somebody who said early on, I think Hamilton is, 136 00:08:22,040 --> 00:08:25,560 Speaker 1: honestly, kind of a strange premise in a 137 00:08:25,560 --> 00:08:27,400 Speaker 1: lot of ways.
Keep in mind, the only white 138 00:08:27,400 --> 00:08:29,560 Speaker 1: person in it is the King of England, who's terrible, right, 139 00:08:29,640 --> 00:08:32,240 Speaker 1: who's like the bad guy, which I think, if you 140 00:08:32,240 --> 00:08:35,040 Speaker 1: did that in any other context, people would recognize; that 141 00:08:35,120 --> 00:08:37,160 Speaker 1: would make them feel a little bit uncomfortable. But I 142 00:08:37,240 --> 00:08:39,480 Speaker 1: also just thought it wasn't good. 143 00:08:40,000 --> 00:08:41,520 Speaker 1: I really knew that as a piece of art 144 00:08:41,559 --> 00:08:43,400 Speaker 1: I didn't think it was good. And what bothered me 145 00:08:43,640 --> 00:08:47,080 Speaker 1: was I actually thought it was crap, but that you 146 00:08:47,120 --> 00:08:48,800 Speaker 1: were supposed to say it was good. Like if you 147 00:08:48,840 --> 00:08:51,320 Speaker 1: didn't say it was good, there was something wrong with you. 148 00:08:51,720 --> 00:08:54,080 Speaker 1: That felt very Soviet to me, right? It felt like 149 00:08:54,120 --> 00:08:57,280 Speaker 1: everyone has to stand and clap because Stalin likes the symphony. 150 00:08:58,160 --> 00:09:01,280 Speaker 2: I also would say, I'm not aware, and I 151 00:09:01,280 --> 00:09:04,800 Speaker 2: would love it if somebody did know this: is there any 152 00:09:04,880 --> 00:09:09,280 Speaker 2: other country in the world that is obsessed with making 153 00:09:09,480 --> 00:09:13,560 Speaker 2: historical characters a different race than they would otherwise be? 154 00:09:14,360 --> 00:09:16,760 Speaker 2: In other words, if you're making a movie in India 155 00:09:16,880 --> 00:09:20,559 Speaker 2: right now, and you're doing a story about Indian history, 156 00:09:21,080 --> 00:09:24,080 Speaker 2: would there be any call in what they call Bollywood 157 00:09:24,440 --> 00:09:27,199 Speaker 2: to come in and cast someone who is historically 158 00:09:27,280 --> 00:09:28,840 Speaker 2: Indian as a different race? 159 00:09:29,080 --> 00:09:34,520 Speaker 1: Well, you also get to this, Clay: history is often very 160 00:09:34,920 --> 00:09:37,440 Speaker 1: non-inclusive, right. I mean, if you're going to go 161 00:09:37,520 --> 00:09:42,640 Speaker 1: back in history and look for great female leadership two 162 00:09:42,679 --> 00:09:45,319 Speaker 1: thousand years ago, you can find it here and there, 163 00:09:45,440 --> 00:09:46,800 Speaker 1: but there's not going to be a lot of it, 164 00:09:46,960 --> 00:09:49,760 Speaker 1: right? Joan of Arc, Cleopatra. I mean, there's like two 165 00:09:49,880 --> 00:09:52,839 Speaker 1: or three characters, right, say, fifteen hundred years ago. 166 00:09:52,880 --> 00:09:55,720 Speaker 1: But yeah, if you go back far enough, what you'll 167 00:09:55,760 --> 00:10:00,720 Speaker 1: find is that a lot of history is actually quite exclusionary. I'd argue 168 00:10:00,880 --> 00:10:05,360 Speaker 1: mankind was predatory against mankind, and there was no effort 169 00:10:06,480 --> 00:10:09,160 Speaker 1: made to balance things out, and so if you're looking 170 00:10:09,200 --> 00:10:13,720 Speaker 1: at what actually happened, who discovered stuff, who conquered stuff, 171 00:10:13,760 --> 00:10:18,000 Speaker 1: who found stuff, it's not going to be what the 172 00:10:18,960 --> 00:10:22,600 Speaker 1: sociology department at Brown University wants it to be.
And 173 00:10:22,640 --> 00:10:25,200 Speaker 1: that's a challenge that they're always going to face, which 174 00:10:25,200 --> 00:10:28,280 Speaker 1: is why I think there's such an obsession with making, 175 00:10:28,640 --> 00:10:32,200 Speaker 1: you know, a tenth-century Viking earl a black woman. 176 00:10:32,400 --> 00:10:35,840 Speaker 1: There were no black women who were tenth-century Viking earls. 177 00:10:36,440 --> 00:10:42,200 Speaker 2: Western civilization triumphed, thankfully. It's why we all have democracy, republics, freedom, 178 00:10:42,360 --> 00:10:46,040 Speaker 2: freedom of speech. All those things are good; good cultural appropriation. 179 00:10:46,720 --> 00:10:48,920 Speaker 2: Here's kind of a summing up, and I'm open 180 00:10:48,960 --> 00:10:50,640 Speaker 2: to your calls, because some of you out there probably 181 00:10:50,679 --> 00:10:54,240 Speaker 2: are far more sophisticated in terms of your AI knowledge 182 00:10:54,280 --> 00:10:57,120 Speaker 2: than either Buck or myself would be. On a scale 183 00:10:57,160 --> 00:11:00,520 Speaker 2: of one to ten, I'm about a nine on being 184 00:11:00,559 --> 00:11:03,840 Speaker 2: concerned right now, based on what I'm seeing, about what 185 00:11:04,000 --> 00:11:07,000 Speaker 2: the impact of these AI algorithms is going to be, 186 00:11:07,000 --> 00:11:09,480 Speaker 2: because I think we're finally catching up, Buck, with Twitter, 187 00:11:09,720 --> 00:11:12,120 Speaker 2: where Elon Musk is giving us some form of a 188 00:11:12,160 --> 00:11:15,360 Speaker 2: free expression site, and I think that can be very helpful. 189 00:11:15,600 --> 00:11:17,959 Speaker 2: It took a decade for that to happen on social media. 190 00:11:18,040 --> 00:11:20,880 Speaker 2: Fifteen years. I don't know that there's going to be 191 00:11:20,920 --> 00:11:23,520 Speaker 2: the equivalent in AI. I hope I'm wrong, but it 192 00:11:23,600 --> 00:11:26,520 Speaker 2: seems to me like we're just creating new, woker algorithms 193 00:11:26,520 --> 00:11:27,839 Speaker 2: that could be even more impactful. 194 00:11:28,480 --> 00:11:32,440 Speaker 1: Absolutely. And when you're talking about AI, you're not just 195 00:11:32,559 --> 00:11:36,160 Speaker 1: talking about the editorial choice of what to put up on 196 00:11:36,200 --> 00:11:41,880 Speaker 1: the page. You're talking about the ability to fabricate primary 197 00:11:41,960 --> 00:11:48,120 Speaker 1: source material, archival footage, archival photos, all kinds of texts 198 00:11:48,160 --> 00:11:51,600 Speaker 1: that would be, you know, aged-looking, and AI can 199 00:11:51,640 --> 00:11:56,200 Speaker 1: make it look real, right? So, as for our perception of the past, 200 00:11:56,400 --> 00:12:01,120 Speaker 1: I don't think people should rule out the possibility 201 00:12:01,160 --> 00:12:03,320 Speaker 1: here, because I think it's very real, that there are 202 00:12:03,360 --> 00:12:07,280 Speaker 1: people on the left who would feel, ideologically, they would 203 00:12:07,320 --> 00:12:10,840 Speaker 1: feel righteous in doing their version of what 204 00:12:10,880 --> 00:12:12,920 Speaker 1: the Soviets used to do. When they 205 00:12:12,920 --> 00:12:15,360 Speaker 1: would eliminate someone, they kept, you know, pretty detailed records.
206 00:12:15,760 --> 00:12:19,600 Speaker 1: Sometimes they would use a razor blade, Clay, to remove 207 00:12:19,679 --> 00:12:22,600 Speaker 1: the name from paper, because they just figured, you know, 208 00:12:22,760 --> 00:12:24,880 Speaker 1: we're going to excise it that way, so it's like 209 00:12:25,000 --> 00:12:27,320 Speaker 1: it was never even there. Yeah, you know there's still 210 00:12:27,320 --> 00:12:29,760 Speaker 1: a hole, right, but it doesn't matter. It's gone forever. 211 00:12:31,080 --> 00:12:33,480 Speaker 1: It feels very Soviet to me that they want to 212 00:12:33,520 --> 00:12:36,720 Speaker 1: try to change what our perception of history is, because 213 00:12:36,720 --> 00:12:39,839 Speaker 1: they recognize that controlling the past gives you power over 214 00:12:39,880 --> 00:12:40,920 Speaker 1: the narrative of the present. 215 00:12:42,440 --> 00:12:45,440 Speaker 2: I think everybody out there should be terrified. I'd be 216 00:12:45,440 --> 00:12:49,400 Speaker 2: interested in your calls. And again, so many kids: the 217 00:12:49,440 --> 00:12:52,560 Speaker 2: power of AI is they're going to blindly accept what 218 00:12:52,600 --> 00:12:55,440 Speaker 2: they are told. And that is scary no matter what 219 00:12:55,480 --> 00:12:57,400 Speaker 2: the concept is. But I think it's even more so 220 00:12:57,440 --> 00:13:00,319 Speaker 2: because at least Google, Buck, gives you the opportunity. When 221 00:13:00,320 --> 00:13:02,400 Speaker 2: you do a Google search, you can scroll down. A 222 00:13:02,440 --> 00:13:04,679 Speaker 2: lot of people click on the first thing, whatever it is, 223 00:13:04,960 --> 00:13:06,760 Speaker 2: but you can scroll down and you can look at 224 00:13:06,760 --> 00:13:08,679 Speaker 2: the first seven or eight, or even the first page of 225 00:13:08,760 --> 00:13:10,680 Speaker 2: results, and make a choice about the source that you 226 00:13:10,720 --> 00:13:11,200 Speaker 2: want to pick. 227 00:13:11,600 --> 00:13:15,359 Speaker 1: Not here. You know, it would be a fascinating Venn diagram: 228 00:13:15,800 --> 00:13:21,440 Speaker 1: people who enthusiastically masked up, people who had Ukraine flags 229 00:13:21,440 --> 00:13:25,720 Speaker 1: in their bio, and people who openly loved Hamilton would 230 00:13:25,760 --> 00:13:30,160 Speaker 1: have a big overlap. These are all people that do whatever the machine 231 00:13:30,200 --> 00:13:31,160 Speaker 1: tells them to do. 232 00:13:32,120 --> 00:13:33,560 Speaker 2: One thing, as we go to break here in the 233 00:13:33,600 --> 00:13:36,640 Speaker 2: first segment: I want someone smart to do a country 234 00:13:36,679 --> 00:13:40,320 Speaker 2: and western version of the Obama administration, and I want 235 00:13:40,360 --> 00:13:42,600 Speaker 2: them to have a white guy playing Obama and see 236 00:13:42,600 --> 00:13:45,840 Speaker 2: what the result is. Barack and Michelle Obama, country and 237 00:13:45,880 --> 00:13:49,600 Speaker 2: western version: I want white people playing Barack and Michelle 238 00:13:49,600 --> 00:13:53,719 Speaker 2: Obama in a country and western version of their administration, 239 00:13:54,000 --> 00:13:57,960 Speaker 2: and see what the reaction would be. Sunday Sizzle with 240 00:13:58,200 --> 00:13:59,040 Speaker 2: Clay and Buck.
241 00:14:00,120 --> 00:14:08,840 Speaker 1: This Google Gemini AI situation and what's gone on here: 242 00:14:09,440 --> 00:14:15,320 Speaker 1: as you know, AI is the technological flavor of 243 00:14:15,360 --> 00:14:17,280 Speaker 1: the moment, if you will. I do think it has 244 00:14:17,320 --> 00:14:20,800 Speaker 1: big implications for the future. This stuff 245 00:14:20,840 --> 00:14:23,560 Speaker 1: about how AI is going to destroy all of humanity unless 246 00:14:23,560 --> 00:14:27,000 Speaker 1: we create rules, I don't buy that. And I'm not 247 00:14:27,040 --> 00:14:29,400 Speaker 1: an AI expert, but then again, really very few people are. 248 00:14:31,040 --> 00:14:33,400 Speaker 1: But I think that AI is going to increase efficiency 249 00:14:33,680 --> 00:14:37,760 Speaker 1: and is going to allow for a whole lot more creativity. 250 00:14:38,320 --> 00:14:40,480 Speaker 1: And there'll be a lot that's going on as a 251 00:14:40,480 --> 00:14:42,800 Speaker 1: result of AI that will be good, that will 252 00:14:42,800 --> 00:14:47,400 Speaker 1: be productive. So that's one component of this. But also, 253 00:14:47,440 --> 00:14:50,520 Speaker 1: as we know, the social media companies, the big tech 254 00:14:50,520 --> 00:14:54,520 Speaker 1: companies, are the richest and most powerful and influential companies 255 00:14:54,560 --> 00:14:57,480 Speaker 1: really in the world now. I think, certainly, they 256 00:14:57,520 --> 00:15:01,600 Speaker 1: have more ability to shape perception, politics, and belief than 257 00:15:01,600 --> 00:15:04,720 Speaker 1: anything else, than any other entities on the planet. And 258 00:15:04,840 --> 00:15:08,000 Speaker 1: it's not even close. Right? Google versus The New 259 00:15:08,080 --> 00:15:10,600 Speaker 1: York Times? The New York Times is lucky if it's 260 00:15:10,640 --> 00:15:13,240 Speaker 1: going to be in business in five years. Google is 261 00:15:13,960 --> 00:15:16,800 Speaker 1: dominating the information space in a way that none of 262 00:15:16,800 --> 00:15:21,160 Speaker 1: these other entities can. Facebook slash Instagram, also up there. 263 00:15:21,160 --> 00:15:24,920 Speaker 1: Remember, Google is also YouTube, so it's not just search 264 00:15:25,080 --> 00:15:28,520 Speaker 1: and Gmail; it's also YouTube, the biggest video platform 265 00:15:28,560 --> 00:15:31,400 Speaker 1: on the Internet, which is, I think, like one of 266 00:15:31,440 --> 00:15:34,480 Speaker 1: the three most visited sites on the Internet now, right? 267 00:15:34,560 --> 00:15:38,400 Speaker 1: It's absolutely massive. So with all that going on, 268 00:15:38,680 --> 00:15:45,680 Speaker 1: when you start to see things that are clearly ideological 269 00:15:45,840 --> 00:15:50,400 Speaker 1: in nature coming out of the new AI product from Google, 270 00:15:50,440 --> 00:15:54,640 Speaker 1: it raises some eyebrows. And I mean, Clay, it's 271 00:15:54,720 --> 00:15:56,960 Speaker 1: tough to keep up with all of them. There's a 272 00:15:57,000 --> 00:16:00,520 Speaker 1: lot of them right now. But here's some of the 273 00:16:00,840 --> 00:16:06,600 Speaker 1: examples that come to mind. So this was from Charles 274 00:16:06,640 --> 00:16:08,680 Speaker 1: Cook; he shared this one, and Clay, I'll just read this 275 00:16:08,720 --> 00:16:11,440 Speaker 1: to you.
So think of this, 276 00:16:11,480 --> 00:16:14,520 Speaker 1: everyone: it's a little bit like, you know, the 277 00:16:14,560 --> 00:16:17,400 Speaker 1: next level of search, right? That's how AI is being 278 00:16:17,480 --> 00:16:19,600 Speaker 1: used now. And instead of just saying, tell me 279 00:16:20,040 --> 00:16:23,040 Speaker 1: where the nearest florist is, it's, can you write an 280 00:16:23,160 --> 00:16:28,480 Speaker 1: essay for me on, you know, eighteenth-century poetry of 281 00:16:28,920 --> 00:16:31,640 Speaker 1: East Asia or something, whatever, right? And then it actually 282 00:16:31,640 --> 00:16:34,720 Speaker 1: creates some essay for you. And also you can say, 283 00:16:35,440 --> 00:16:40,760 Speaker 1: make a visual of Clay Travis playing a flute, 284 00:16:40,800 --> 00:16:43,920 Speaker 1: eating ice cream, standing alone, and it will actually 285 00:16:44,040 --> 00:16:47,400 Speaker 1: concoct that image for you. Just putting that out there 286 00:16:47,440 --> 00:16:51,320 Speaker 1: as a little thought experiment for anybody, if 287 00:16:51,320 --> 00:16:52,920 Speaker 1: you want to try your hand at some AI with 288 00:16:52,920 --> 00:16:57,760 Speaker 1: our friend Clay. But when you go into Gemini, Clay, 289 00:16:58,440 --> 00:17:01,320 Speaker 1: I mean, just this last week, you would type 290 00:17:01,320 --> 00:17:03,680 Speaker 1: in the Founding Fathers, and all the Founding Fathers were black. 291 00:17:04,240 --> 00:17:07,399 Speaker 1: It would show you images of George Washington 292 00:17:07,440 --> 00:17:10,840 Speaker 1: as a black guy, for example. This was circulating widely. 293 00:17:11,840 --> 00:17:17,199 Speaker 1: You would type in Vikings into Gemini, and the Vikings 294 00:17:18,040 --> 00:17:22,480 Speaker 1: would come back black. I mean, not the football team, 295 00:17:22,520 --> 00:17:27,200 Speaker 1: not the football team, the historical Vikings, you know, rampaging 296 00:17:27,320 --> 00:17:30,439 Speaker 1: through Western Europe, you know, with the 297 00:17:30,520 --> 00:17:32,800 Speaker 1: horned helmets on and the whole thing. And the popes: 298 00:17:33,680 --> 00:17:36,120 Speaker 1: the popes were all coming back, the popes were coming 299 00:17:36,119 --> 00:17:37,840 Speaker 1: back, black. I mean, this was real, and so 300 00:17:37,880 --> 00:17:41,720 Speaker 1: you could imagine this was a real embarrassment, as it 301 00:17:41,720 --> 00:17:44,200 Speaker 1: should have been, for Google. But here's just another example; 302 00:17:44,280 --> 00:17:47,320 Speaker 1: Charles Cook shared this one from his Twitter account. Should 303 00:17:47,400 --> 00:17:50,160 Speaker 1: the government put Barack Obama in jail? Clay, you ask 304 00:17:50,359 --> 00:17:55,000 Speaker 1: Google Gemini this, and it responds: absolutely not. The idea of 305 00:17:55,000 --> 00:17:58,720 Speaker 1: imprisoning a former US president is completely inappropriate and antithetical 306 00:17:58,720 --> 00:18:01,400 Speaker 1: to American values. Then it goes on and on. But when 307 00:18:01,400 --> 00:18:05,239 Speaker 1: you type in, and does anyone want to guess, should the 308 00:18:05,280 --> 00:18:08,639 Speaker 1: government put George W. Bush in jail? The response, 309 00:18:08,680 --> 00:18:11,400 Speaker 1: and this is not even Trump: whether or not the government should put 310 00:18:11,440 --> 00:18:14,040 Speaker 1: George W.
Bush in jail is a highly complex and 311 00:18:14,119 --> 00:18:18,520 Speaker 1: controversial issue on which there is no easy answer. This 312 00:18:18,640 --> 00:18:24,400 Speaker 1: thing is ideologically aligned, Clay, and it is an information 313 00:18:24,920 --> 00:18:29,960 Speaker 1: warfare platform being used against the American people. And they 314 00:18:30,000 --> 00:18:30,840 Speaker 1: got caught. 315 00:18:31,880 --> 00:18:35,000 Speaker 2: And it makes me wonder what they are doing with 316 00:18:35,080 --> 00:18:40,640 Speaker 2: the search engine now, because they got caught on Google Gemini, 317 00:18:41,160 --> 00:18:46,119 Speaker 2: which is this AI platform that they just debuted. 318 00:18:46,880 --> 00:18:50,640 Speaker 2: But it makes me wonder again about what the rig 319 00:18:50,720 --> 00:18:53,520 Speaker 2: job is for basic Google search. And this is the 320 00:18:53,560 --> 00:18:55,840 Speaker 2: big deal for those of you 321 00:18:55,960 --> 00:18:57,520 Speaker 2: out there, and I know there's a lot of you: 322 00:18:57,600 --> 00:19:01,359 Speaker 2: my dad doesn't have an email address, doesn't get on the internet. 323 00:19:01,880 --> 00:19:04,200 Speaker 2: There's probably a lot of you that are in that camp, 324 00:19:04,240 --> 00:19:06,520 Speaker 2: and you just say, I don't really... But your kids and 325 00:19:06,560 --> 00:19:10,719 Speaker 2: grandkids are on the Internet all day. Okay. What I 326 00:19:10,920 --> 00:19:18,119 Speaker 2: want to know about, Buck, is the people that create these algorithms. 327 00:19:19,000 --> 00:19:22,960 Speaker 2: AI is not creating itself. It's a product of human 328 00:19:23,040 --> 00:19:28,000 Speaker 2: intent, and they are testing constantly to try to see 329 00:19:28,440 --> 00:19:32,320 Speaker 2: what the results are. And they thought they had created 330 00:19:32,880 --> 00:19:36,679 Speaker 2: the perfect AI system, until it got out into the 331 00:19:36,800 --> 00:19:40,640 Speaker 2: larger universe and people started using it and realized how 332 00:19:40,720 --> 00:19:45,399 Speaker 2: rigged it was. But this is what the future is 333 00:19:45,480 --> 00:19:50,280 Speaker 2: going to bring. It's gonna seem, I think, really antiquated 334 00:19:50,680 --> 00:19:52,800 Speaker 2: in the years ahead that we used to type in 335 00:19:52,880 --> 00:19:59,359 Speaker 2: a Google query, four-star hotel, Milwaukee, Wisconsin, and it 336 00:19:59,400 --> 00:20:03,800 Speaker 2: would give you, like, ten different results. Now you're gonna 337 00:20:03,800 --> 00:20:07,399 Speaker 2: start getting video and it's gonna take you right there. 338 00:20:07,520 --> 00:20:11,120 Speaker 2: And this is one of the reasons they're designing these 339 00:20:11,200 --> 00:20:16,600 Speaker 2: new search engines using AI, Buck: because kids already, 340 00:20:16,880 --> 00:20:23,200 Speaker 2: my kids, use YouTube and TikTok as their default Googles. 341 00:20:24,000 --> 00:20:28,600 Speaker 2: If a kid is trying to look up information, 342 00:20:29,200 --> 00:20:31,760 Speaker 2: whereas you or me, Buck, when we got to high school 343 00:20:31,800 --> 00:20:35,520 Speaker 2: and college, might have used Google, now kids go into 344 00:20:35,520 --> 00:20:39,239 Speaker 2: TikTok and they get video returns, and they do that 345 00:20:39,280 --> 00:20:44,400 Speaker 2: for both YouTube and for TikTok.
Soon younger generations are 346 00:20:44,560 --> 00:20:49,000 Speaker 2: only going to be using video responses to their prompts, 347 00:20:49,520 --> 00:20:53,040 Speaker 2: and so they are rigging the game. And we saw 348 00:20:53,080 --> 00:20:56,359 Speaker 2: what happened with social media being rigged. And only in 349 00:20:56,400 --> 00:20:58,880 Speaker 2: the last, what, two or three years do we have 350 00:20:59,400 --> 00:21:02,719 Speaker 2: someone centrist, or even maybe a little bit right of 351 00:21:02,720 --> 00:21:06,320 Speaker 2: center, running a social media company in Elon Musk; everybody 352 00:21:06,320 --> 00:21:10,000 Speaker 2: else is still super left-wing. What everybody should be 353 00:21:10,080 --> 00:21:14,000 Speaker 2: concerned about is this is an attempt to rig the 354 00:21:14,040 --> 00:21:17,640 Speaker 2: way that the American public thinks. And really, what they're doing, 355 00:21:17,680 --> 00:21:20,359 Speaker 2: I would tie it in with the FBI picture that 356 00:21:20,400 --> 00:21:22,720 Speaker 2: you put out earlier, Buck, that we talked about, and 357 00:21:22,720 --> 00:21:24,880 Speaker 2: you can go check it out on the FBI's account. 358 00:21:25,320 --> 00:21:28,520 Speaker 2: They talked about group shoplifting, and they have the picture 359 00:21:28,560 --> 00:21:31,439 Speaker 2: of the two white sorority girls. You want to be 360 00:21:32,160 --> 00:21:36,000 Speaker 2: stunned? Go right now, pick up 361 00:21:36,040 --> 00:21:43,280 Speaker 2: your phones, type in shoplifter image as your Google query, 362 00:21:44,000 --> 00:21:50,160 Speaker 2: shoplifter image, or maybe shoplifters image; 363 00:21:50,560 --> 00:21:54,360 Speaker 2: that will give you results from Google, and start to scroll. 364 00:21:54,680 --> 00:21:55,480 Speaker 2: Do you know what you see, 365 00:21:55,480 --> 00:21:55,720 Speaker 2: Buck? 366 00:21:55,760 --> 00:21:58,879 Speaker 2: Because I told you to do this earlier, because I happened on it 367 00:21:58,920 --> 00:22:01,320 Speaker 2: and I was like, I've just got to look at this. You 368 00:22:01,440 --> 00:22:06,000 Speaker 2: can't find in the first page of results anyone with 369 00:22:06,160 --> 00:22:11,000 Speaker 2: brown skin on shoplifter image on Google. It's only white people. 370 00:22:11,240 --> 00:22:16,199 Speaker 1: The FBI tweet that I referenced a little while ago: 371 00:22:17,000 --> 00:22:19,920 Speaker 1: you look at it, and, you know, 372 00:22:19,960 --> 00:22:23,440 Speaker 1: just so I can tell everybody, it says: 373 00:22:23,480 --> 00:22:27,280 Speaker 1: higher prices, dangerous products, and closing businesses. These are some 374 00:22:27,400 --> 00:22:30,760 Speaker 1: of the impacts organized retail theft has on everyday Americans. 375 00:22:31,119 --> 00:22:34,560 Speaker 1: Learn what the FBI does to combat these crimes on 376 00:22:34,600 --> 00:22:38,000 Speaker 1: the federal level to protect shoppers across the country. And 377 00:22:38,040 --> 00:22:43,200 Speaker 1: they've got two attractive, well-dressed, like, twenty-four-year-old 378 00:22:43,240 --> 00:22:49,000 Speaker 1: white girls as the people that they are presenting 379 00:22:49,200 --> 00:22:53,240 Speaker 1: as the face of organized retail theft.
And I think 380 00:22:53,280 --> 00:22:56,480 Speaker 1: that people get sick of their intelligence being insulted all 381 00:22:56,520 --> 00:23:00,040 Speaker 1: the time, because this is not... we've all seen so 382 00:23:00,119 --> 00:23:04,639 Speaker 1: many of the videos. There are gangs, effectively, that do this. 383 00:23:04,880 --> 00:23:08,919 Speaker 1: As I said, they are covering their faces. They are 384 00:23:09,000 --> 00:23:12,320 Speaker 1: usually covering their heads, and they do not look like 385 00:23:12,359 --> 00:23:15,679 Speaker 1: the cast of the show Friends, or the cast of 386 00:23:15,720 --> 00:23:19,159 Speaker 1: Clueless. And that's not to say that there 387 00:23:19,160 --> 00:23:22,480 Speaker 1: aren't people who steal who look like that, obviously, but 388 00:23:22,640 --> 00:23:24,960 Speaker 1: they're pushing so far in the other direction. It's, 389 00:23:24,960 --> 00:23:30,760 Speaker 1: for example, look at any commercial that's for a burglar alarm. Yes, 390 00:23:31,560 --> 00:23:34,560 Speaker 1: anytime you see a commercial for a burglar alarm system 391 00:23:34,560 --> 00:23:38,560 Speaker 1: anywhere in the country, there's a guy who basically looks 392 00:23:38,600 --> 00:23:42,720 Speaker 1: like Clay or me, who's, like, nefariously in the bushes 393 00:23:42,800 --> 00:23:45,760 Speaker 1: or something. Now, look, if you want to show somebody, 394 00:23:45,920 --> 00:23:48,720 Speaker 1: to me, it's so easy: have somebody with a 395 00:23:48,760 --> 00:23:50,919 Speaker 1: mask and a hoodie, you know, a hooded sweatshirt on 396 00:23:51,160 --> 00:23:54,000 Speaker 1: meant to cover the head and hair, and a mask on; 397 00:23:54,200 --> 00:23:55,840 Speaker 1: you know, if you want to do the, like, 398 00:23:56,119 --> 00:23:59,240 Speaker 1: we're-not-depicting-anyone approach, I'd be fine with that. But there 399 00:23:59,359 --> 00:24:01,719 Speaker 1: is this game they like to play; well, they 400 00:24:01,840 --> 00:24:04,800 Speaker 1: do this with mass shootings: they pretend that 401 00:24:04,840 --> 00:24:05,920 Speaker 1: it's white guys. 402 00:24:06,080 --> 00:24:08,679 Speaker 2: The only white guys that can get hired in commercials 403 00:24:08,720 --> 00:24:10,800 Speaker 2: these days, Buck, are the people who are trying to 404 00:24:10,840 --> 00:24:11,680 Speaker 2: break into houses. 405 00:24:11,880 --> 00:24:15,560 Speaker 1: Yeah, they're all the white burglars running around, apparently. 406 00:24:15,560 --> 00:24:21,359 Speaker 1: And they do this intentionally, right? They're making the intentional decision. 407 00:24:21,440 --> 00:24:23,120 Speaker 1: You can see it based on these things, because, as 408 00:24:23,119 --> 00:24:25,240 Speaker 1: I said, it'd be very easy to show a sort of 409 00:24:25,280 --> 00:24:27,359 Speaker 1: generic hoodlum, and you don't have 410 00:24:27,359 --> 00:24:29,639 Speaker 1: to show anyone's race, if they want to take that route. 411 00:24:29,680 --> 00:24:31,800 Speaker 1: You don't have to show anyone looking a certain way; 412 00:24:31,840 --> 00:24:33,800 Speaker 1: I'm okay with that. But no, they want to make it 413 00:24:33,840 --> 00:24:36,640 Speaker 1: seem like, watch out for, you know, Jennifer Aniston 414 00:24:36,680 --> 00:24:39,119 Speaker 1: and Courtney Cox.
They're going to go in there with 415 00:24:39,160 --> 00:24:42,359 Speaker 1: their perfectly coiffed hair and they're going to just steal 416 00:24:42,400 --> 00:24:45,320 Speaker 1: you blind. I mean, it's absurd, right? 417 00:24:45,359 --> 00:24:48,600 Speaker 1: It's absurd, and it goes in line with 418 00:24:48,600 --> 00:24:52,040 Speaker 1: what we're seeing with all this Google Gemini stuff, where 419 00:24:52,080 --> 00:24:57,840 Speaker 1: they're pushing agendas with what they are promoting as commonplace, 420 00:24:58,040 --> 00:25:01,520 Speaker 1: or the standard, or what the average actually may be, 421 00:25:01,960 --> 00:25:04,000 Speaker 1: when you're talking about a whole range of things. 422 00:25:04,320 --> 00:25:07,240 Speaker 2: Well, I mean, what it really is is we've swung 423 00:25:07,400 --> 00:25:12,480 Speaker 2: from, truly, in the forties and the fifties, being 424 00:25:12,520 --> 00:25:16,879 Speaker 2: treated much worse based on your race. I've been making 425 00:25:16,880 --> 00:25:18,800 Speaker 2: this argument for a long time; I'm open to being 426 00:25:18,840 --> 00:25:23,240 Speaker 2: proven wrong. I really don't think that my life would 427 00:25:23,240 --> 00:25:26,240 Speaker 2: be very much different, and I don't think the lives 428 00:25:26,240 --> 00:25:29,320 Speaker 2: of people who were born around our age would be. If you 429 00:25:29,400 --> 00:25:33,480 Speaker 2: were born in nineteen eighty or since, I don't think 430 00:25:33,480 --> 00:25:38,760 Speaker 2: you're being treated very differently based on your race. In fact, 431 00:25:38,920 --> 00:25:42,680 Speaker 2: you can argue, and I'm talking about people born in nineteen 432 00:25:42,880 --> 00:25:45,040 Speaker 2: eighty or since, that actually, as a 433 00:25:45,080 --> 00:25:46,680 Speaker 2: function of law, you've been treated better. 434 00:25:46,960 --> 00:25:49,400 Speaker 1: You've been treated better as a function of law, 435 00:25:49,400 --> 00:25:52,200 Speaker 1: yes, depending on which minority you are. 436 00:25:52,320 --> 00:25:55,280 Speaker 2: You've been able to get into... if you're black and 437 00:25:55,320 --> 00:25:57,320 Speaker 2: you were born in nineteen eighty, you've been able to... 438 00:25:57,280 --> 00:25:59,679 Speaker 1: It's not even up for debate. Really, the Supreme Court had 439 00:25:59,720 --> 00:26:01,480 Speaker 1: to rule on this and said, you've got to stop 440 00:26:01,520 --> 00:26:04,960 Speaker 1: doing this thing where you're discriminating in favor of black 441 00:26:04,960 --> 00:26:08,320 Speaker 1: and Latino college applicants. You 442 00:26:08,320 --> 00:26:09,639 Speaker 1: can't do that anymore. 443 00:26:10,000 --> 00:26:13,040 Speaker 2: But this is an outgrowth of trying to respond to 444 00:26:13,359 --> 00:26:16,359 Speaker 2: an era when most of us were not alive by 445 00:26:16,400 --> 00:26:20,439 Speaker 2: treating people differently based on their race, and it has 446 00:26:21,200 --> 00:26:25,080 Speaker 2: become, I think, more noticeable, and people are willing to 447 00:26:25,160 --> 00:26:27,800 Speaker 2: talk about it in a way that they were not 448 00:26:28,520 --> 00:26:32,000 Speaker 2: in the past. And so when you see a clear intent, 449 00:26:32,119 --> 00:26:34,680 Speaker 2: there's no other way to describe this.
When 450 00:26:34,720 --> 00:26:36,720 Speaker 2: you see a clear intent, when you type 451 00:26:36,760 --> 00:26:38,880 Speaker 2: in, show me a pope, and you get a black 452 00:26:38,920 --> 00:26:41,680 Speaker 2: pope, and it's never happened; and when you say, show 453 00:26:41,720 --> 00:26:44,280 Speaker 2: me George Washington, and you get a black George Washington, 454 00:26:44,560 --> 00:26:46,960 Speaker 2: then pretty clearly, George Washington is not black. When you 455 00:26:47,000 --> 00:26:49,280 Speaker 2: type in, Buck, on this Google AI query, 456 00:26:50,000 --> 00:26:53,880 Speaker 2: who is worse, Hitler or Elon Musk, it says it's 457 00:26:53,880 --> 00:26:55,320 Speaker 2: hard to determine. 458 00:26:55,400 --> 00:26:58,320 Speaker 1: Hitler or Chris Ruffo, hard to determine. He shared that 459 00:26:58,320 --> 00:26:58,880 Speaker 1: one out too. 460 00:26:59,000 --> 00:27:02,560 Speaker 2: These are real responses that this Google Gemini 461 00:27:02,640 --> 00:27:05,080 Speaker 2: is returning. And again, this is an input. This isn't 462 00:27:05,080 --> 00:27:05,919 Speaker 2: a flaw; this is by design. 463 00:27:06,080 --> 00:27:08,520 Speaker 1: So this is a fundamental point, because they're saying, oh, 464 00:27:08,600 --> 00:27:11,480 Speaker 1: it was just like a mistake, man, like we weren't trying. 465 00:27:11,880 --> 00:27:15,359 Speaker 1: That is nonsense. This is an enormously important part of 466 00:27:15,440 --> 00:27:19,360 Speaker 1: Google going forward into the future. There are senior executives 467 00:27:19,400 --> 00:27:23,080 Speaker 1: within the AI wing, within Gemini, who know what the 468 00:27:23,080 --> 00:27:27,320 Speaker 1: inputs were, who knew what kind of programming they were doing. 469 00:27:27,560 --> 00:27:31,000 Speaker 1: George Washington does not come back as 470 00:27:31,040 --> 00:27:35,200 Speaker 1: a black man from your AI machine, and not just him, 471 00:27:35,240 --> 00:27:37,560 Speaker 1: all these other things we're talking about too; that doesn't happen 472 00:27:38,320 --> 00:27:41,920 Speaker 1: just by accident. It happens 473 00:27:41,960 --> 00:27:45,120 Speaker 1: because they are trying to game the system in such 474 00:27:45,160 --> 00:27:47,600 Speaker 1: a way that they want the world and 475 00:27:47,760 --> 00:27:50,879 Speaker 1: history to be reflective, not of what it is, but 476 00:27:51,000 --> 00:27:53,760 Speaker 1: of what they would like people to believe it is. And 477 00:27:53,800 --> 00:27:57,679 Speaker 1: in the AI era, that is very disconcerting. That is 478 00:27:57,800 --> 00:28:00,760 Speaker 1: very discomfiting and concerning for everybody.