1 00:00:00,800 --> 00:00:03,960 Speaker 1: You're listening to the Tudor Dixon Podcast on the Clay 2 00:00:04,120 --> 00:00:10,240 Speaker 1: and Buck Podcast Network. Welcome to the Tudor Dixon Podcast. 3 00:00:10,320 --> 00:00:12,760 Speaker 1: I'm Tudor Dixon, and I'm so glad you are tuning 4 00:00:12,840 --> 00:00:16,439 Speaker 1: into our podcast today, because I have one of my 5 00:00:16,520 --> 00:00:18,640 Speaker 1: favorite people to talk to, because he's so much 6 00:00:18,640 --> 00:00:20,880 Speaker 1: smarter than me. He knows all of the stuff about tech. 7 00:00:21,160 --> 00:00:24,440 Speaker 1: His name is Allum Bokhari. He's an investigative tech reporter 8 00:00:24,600 --> 00:00:27,760 Speaker 1: for Breitbart News. He's spent his career exposing big 9 00:00:27,800 --> 00:00:31,200 Speaker 1: tech, how they play with politics and seek to censor 10 00:00:31,400 --> 00:00:35,559 Speaker 1: dissenting opinions. Google, Facebook, YouTube, they're all censoring what we 11 00:00:35,600 --> 00:00:38,559 Speaker 1: do and who we see. And he knows this because 12 00:00:38,640 --> 00:00:41,400 Speaker 1: they actually censor Breitbart too. That was, I think, 13 00:00:41,440 --> 00:00:44,199 Speaker 1: one of the first conversations we had together, Allum, was 14 00:00:44,280 --> 00:00:48,240 Speaker 1: about how you can't really Google and get a Breitbart 15 00:00:48,280 --> 00:00:50,879 Speaker 1: article anymore. So you know all about this, don't you? 16 00:00:51,520 --> 00:00:54,639 Speaker 2: Yes, that's right. They really censored Breitbart News 17 00:00:54,720 --> 00:00:59,800 Speaker 2: quite heavily during the twenty twenty election. There was a massive, 18 00:00:59,840 --> 00:01:04,240 Speaker 2: massive drop. What we did is we compared the data 19 00:01:04,280 --> 00:01:06,880 Speaker 2: from the twenty sixteen election to the twenty twenty election, 20 00:01:07,480 --> 00:01:11,280 Speaker 2: and the visibility of Breitbart News links in Google search 21 00:01:11,319 --> 00:01:15,559 Speaker 2: results was reduced by over ninety 22 00:01:15,560 --> 00:01:19,360 Speaker 2: seven percent. It was a massive, massive drop. And, you know, 23 00:01:19,480 --> 00:01:22,560 Speaker 2: a drop like that doesn't really occur organically, because we 24 00:01:22,640 --> 00:01:25,480 Speaker 2: hadn't lost a significant number of our readers or anything 25 00:01:25,560 --> 00:01:28,080 Speaker 2: like that. It was just the Google search results that 26 00:01:28,120 --> 00:01:30,960 Speaker 2: had seen that massive change. So it's obvious that this 27 00:01:31,160 --> 00:01:34,160 Speaker 2: was a result of Google changing their algorithm between the 28 00:01:34,200 --> 00:01:39,480 Speaker 2: twenty sixteen and twenty twenty elections. And that's kind of 29 00:01:39,480 --> 00:01:42,600 Speaker 2: what we should have expected, because, you know, 30 00:01:42,680 --> 00:01:47,240 Speaker 2: we caught their CEO, their executive team, in a leaked 31 00:01:47,319 --> 00:01:50,919 Speaker 2: video right after the twenty sixteen election discussing the changes 32 00:01:50,960 --> 00:01:53,120 Speaker 2: they were going to make to make sure that a 33 00:01:53,160 --> 00:01:55,760 Speaker 2: similar result didn't happen again. It even included one of their 34 00:01:55,760 --> 00:01:58,919 Speaker 2: executives saying they wanted to make populism, the populist movement, 35 00:01:58,920 --> 00:01:59,880 Speaker 2: a blip in history.
36 00:02:00,440 --> 00:02:02,920 Speaker 1: But it's funny that we are talking about this, 37 00:02:02,960 --> 00:02:04,960 Speaker 1: because just over the weekend, I had a friend who 38 00:02:05,160 --> 00:02:07,720 Speaker 1: is, I would say, I guess I would call him 39 00:02:07,800 --> 00:02:11,280 Speaker 1: a conservative influencer, but also has his own organization, and 40 00:02:11,320 --> 00:02:14,400 Speaker 1: he was saying, it's just not like it was 41 00:02:14,639 --> 00:02:17,400 Speaker 1: three years ago. He said, I can't get anywhere on Facebook. 42 00:02:17,440 --> 00:02:20,720 Speaker 1: I can't get anywhere on Twitter. Do you see that 43 00:02:21,040 --> 00:02:24,240 Speaker 1: this is happening to conservatives still? Because I feel like 44 00:02:24,280 --> 00:02:27,600 Speaker 1: a lot of conservatives came out after Twitter with Elon 45 00:02:27,680 --> 00:02:29,840 Speaker 1: Musk and they said, oh, okay, so we're not crazy. 46 00:02:29,960 --> 00:02:32,440 Speaker 1: This is obviously happening. But it doesn't seem to have 47 00:02:32,520 --> 00:02:35,880 Speaker 1: moved companies like Facebook or Instagram or YouTube or any 48 00:02:35,880 --> 00:02:36,240 Speaker 1: of those. 49 00:02:36,400 --> 00:02:38,280 Speaker 2: Yeah, it's important to remember that, you know, with the 50 00:02:38,280 --> 00:02:40,880 Speaker 2: exception of Twitter, you know, the big tech companies are 51 00:02:40,880 --> 00:02:44,400 Speaker 2: really still in the hands of, you know, of the 52 00:02:44,440 --> 00:02:48,440 Speaker 2: mainstream, of people who have bought into this idea that 53 00:02:48,480 --> 00:02:53,720 Speaker 2: they need to censor misinformation and disinformation and hate speech, 54 00:02:53,760 --> 00:02:56,880 Speaker 2: which of course are all euphemisms invented by the mainstream 55 00:02:56,919 --> 00:03:01,920 Speaker 2: media and academia to censor conservatives. Google and Facebook are 56 00:03:01,960 --> 00:03:05,320 Speaker 2: still very much doing that. And, you know, as powerful 57 00:03:05,320 --> 00:03:08,960 Speaker 2: as Twitter is, you know, there's no bigger driver of 58 00:03:09,200 --> 00:03:11,880 Speaker 2: clicks to news publishers than Facebook and Google. Those 59 00:03:11,880 --> 00:03:14,239 Speaker 2: are still the two really big tech platforms. 60 00:03:14,240 --> 00:03:16,440 Speaker 3: So if progressives still have their 61 00:03:16,320 --> 00:03:21,160 Speaker 2: claws in those companies through the censorship teams there, 62 00:03:21,160 --> 00:03:23,720 Speaker 2: the trust and safety departments, then it's still going to 63 00:03:23,720 --> 00:03:27,160 Speaker 2: be a very uphill struggle for conservatives to compete with the 64 00:03:27,160 --> 00:03:28,840 Speaker 2: mainstream media on those platforms. 65 00:03:29,160 --> 00:03:31,639 Speaker 1: Well, speaking of progressives, I want to talk to you 66 00:03:31,680 --> 00:03:34,920 Speaker 1: about something that you wrote about recently, because I feel 67 00:03:35,000 --> 00:03:38,080 Speaker 1: like this is kind of biting the hand that feeds you. 68 00:03:38,160 --> 00:03:41,480 Speaker 1: But maybe I'm wrong. Now the climate 69 00:03:41,520 --> 00:03:45,000 Speaker 1: people are going after video games, which I need you 70 00:03:45,040 --> 00:03:47,440 Speaker 1: to connect for me, because I really don't get 71 00:03:47,440 --> 00:03:49,760 Speaker 1: that at all.
But isn't this kind of like their 72 00:03:49,800 --> 00:03:51,680 Speaker 1: own team? You know, the video game people are the 73 00:03:51,720 --> 00:03:53,960 Speaker 1: tech people, so how does this work? 74 00:03:54,280 --> 00:03:56,160 Speaker 3: I'm not sure it is their own team, actually. 75 00:03:56,200 --> 00:03:58,640 Speaker 2: I mean, you know, I remember back in 76 00:03:58,720 --> 00:04:00,680 Speaker 2: twenty fourteen, one of the earliest stories I 77 00:04:00,720 --> 00:04:03,800 Speaker 2: covered at Breitbart was this thing called GamerGate, which 78 00:04:03,840 --> 00:04:06,720 Speaker 2: was really like one of those early battles over wokeness. 79 00:04:06,720 --> 00:04:09,000 Speaker 2: We see wokeness in so many industries these days. 80 00:04:09,040 --> 00:04:09,160 Speaker 1: You know. 81 00:04:09,200 --> 00:04:11,600 Speaker 3: The recent one was the beer industry with Bud Light. 82 00:04:12,000 --> 00:04:14,400 Speaker 2: But back in twenty fourteen, there was a massive controversy 83 00:04:14,400 --> 00:04:17,160 Speaker 2: over wokeness in video game journalism, and that turned 84 00:04:17,160 --> 00:04:20,240 Speaker 2: into a massive controversy that lasted over a year, and 85 00:04:20,279 --> 00:04:23,000 Speaker 2: gamers were really, really upset about, you know, the intrusion 86 00:04:23,040 --> 00:04:26,520 Speaker 2: of political correctness into their hobby, which was previously 87 00:04:26,600 --> 00:04:29,680 Speaker 2: quite free of politics. You gotta remember, you know, 88 00:04:29,760 --> 00:04:32,000 Speaker 2: video games tend to be quite, you know, a male 89 00:04:32,080 --> 00:04:35,680 Speaker 2: dominated hobby, and, you know, male voters in 90 00:04:35,680 --> 00:04:38,000 Speaker 2: the US at least tend to lean Republican more than women 91 00:04:38,120 --> 00:04:41,040 Speaker 2: voters do, you know, if you look at the polls. 92 00:04:41,200 --> 00:04:44,760 Speaker 3: So I kind of compare it to the NFL. 93 00:04:44,800 --> 00:04:46,400 Speaker 2: I mean, a lot of, you know, people who 94 00:04:46,400 --> 00:04:49,720 Speaker 2: were fans of the NFL, many of them were political moderates. 95 00:04:49,760 --> 00:04:50,520 Speaker 3: They were, they were 96 00:04:50,440 --> 00:04:54,240 Speaker 2: apathetic, but then they became politicized with the kneeling 97 00:04:54,279 --> 00:04:57,800 Speaker 2: controversies and Colin Kaepernick, right. So, you know, when 98 00:04:58,200 --> 00:05:01,040 Speaker 2: you see an industry become 99 00:05:01,080 --> 00:05:02,960 Speaker 2: politicized like this, you know, it's hard 100 00:05:02,760 --> 00:05:04,760 Speaker 3: to predict which way it will go. 101 00:05:06,040 --> 00:05:08,720 Speaker 2: The story I covered for Breitbart is about the net 102 00:05:08,839 --> 00:05:12,120 Speaker 2: zero agenda. So, you know, climate activists are upset that 103 00:05:13,080 --> 00:05:16,960 Speaker 2: the, you know, the devices that gamers use, the PCs 104 00:05:17,080 --> 00:05:20,720 Speaker 2: or consoles, use a lot of energy, and they're trying 105 00:05:20,720 --> 00:05:24,120 Speaker 2: to change the industry, you know, to make 106 00:05:24,160 --> 00:05:29,479 Speaker 2: companies make games that are less demanding on 107 00:05:29,800 --> 00:05:34,160 Speaker 2: devices to reduce the climate footprint, which is what 108 00:05:34,200 --> 00:05:38,080 Speaker 2: we've seen in, you know, in almost every other field.
109 00:05:38,080 --> 00:05:39,960 Speaker 2: You know, how can we change things? 110 00:05:40,000 --> 00:05:41,400 Speaker 2: How can we restrict 111 00:05:40,960 --> 00:05:42,760 Speaker 3: things to meet, right, climate, 112 00:05:42,800 --> 00:05:46,760 Speaker 2: these net zero climate goals that are restricting the farming of cows, 113 00:05:46,880 --> 00:05:49,680 Speaker 2: the eating of meat, because, you know, farming creates all 114 00:05:49,720 --> 00:05:53,640 Speaker 2: these carbon emissions. In France, they banned short haul flights 115 00:05:54,200 --> 00:05:56,200 Speaker 2: on commercial airlines, not private jets, right? 116 00:05:56,160 --> 00:05:58,760 Speaker 1: I just saw that. How are they even going to do that? 117 00:05:58,960 --> 00:06:01,600 Speaker 1: How will that go over in France? 118 00:06:02,040 --> 00:06:02,240 Speaker 3: Yeah. 119 00:06:02,240 --> 00:06:05,160 Speaker 2: I mean, the French are already quite upset about Macron's 120 00:06:05,720 --> 00:06:06,520 Speaker 2: carbon targets, 121 00:06:06,560 --> 00:06:07,560 Speaker 3: his net zero goals. 122 00:06:07,600 --> 00:06:09,240 Speaker 2: This is one of the reasons why we've seen 123 00:06:09,320 --> 00:06:13,200 Speaker 2: so many protests in big cities over the past two years. 124 00:06:13,200 --> 00:06:15,560 Speaker 3: Of course, it's quite common in France to have big protests 125 00:06:15,279 --> 00:06:19,479 Speaker 2: in cities, but this is one of the core 126 00:06:19,480 --> 00:06:22,880 Speaker 2: grievances cited by people like the Yellow Vest protesters 127 00:06:22,360 --> 00:06:23,080 Speaker 3: in that country. 128 00:06:23,760 --> 00:06:27,640 Speaker 2: Interestingly, you know, they banned short haul flights for average consumers, 129 00:06:27,640 --> 00:06:30,240 Speaker 2: but private jets are still allowed to go on those 130 00:06:30,279 --> 00:06:30,799 Speaker 2: same routes. 131 00:06:31,560 --> 00:06:36,080 Speaker 1: Of course. So if you can drive within like three hours, right, 132 00:06:36,160 --> 00:06:36,800 Speaker 1: is that what it is? 133 00:06:37,400 --> 00:06:38,760 Speaker 3: Yeah, it's about that much. 134 00:06:38,839 --> 00:06:41,440 Speaker 2: Yes. So you can't take flights for those distances anymore. 135 00:06:41,480 --> 00:06:42,880 Speaker 2: You have to take the train, or you have to, 136 00:06:43,279 --> 00:06:48,200 Speaker 2: you have to drive, unless you're wealthy enough to charter a private jet. 137 00:06:48,279 --> 00:06:50,279 Speaker 2: That seems to be the common theme with all of 138 00:06:50,320 --> 00:06:53,919 Speaker 2: these net zero agendas. They're not trying to ban things outright; 139 00:06:53,920 --> 00:06:55,280 Speaker 2: they're just trying to put them out of the reach 140 00:06:55,320 --> 00:06:57,880 Speaker 2: of the average consumer, almost like they're trying to make 141 00:06:57,880 --> 00:07:01,840 Speaker 2: people comfortable with lower standards of living, like they're preparing 142 00:07:01,960 --> 00:07:02,560 Speaker 2: people for that. 143 00:07:02,960 --> 00:07:05,280 Speaker 1: Oh, they are preparing people for that. I mean, that 144 00:07:05,520 --> 00:07:09,040 Speaker 1: is, if we do not fight against this, 145 00:07:09,279 --> 00:07:12,720 Speaker 1: that's what we believe the goal is: to make sure 146 00:07:12,840 --> 00:07:16,040 Speaker 1: that there is no longer a middle class. You have 147 00:07:16,120 --> 00:07:18,640 Speaker 1: the ultra rich, and then everybody's at the same level.
148 00:07:19,080 --> 00:07:21,480 Speaker 2: It's very convenient to persuade people that you have to 149 00:07:21,480 --> 00:07:24,720 Speaker 2: do it, you have to accept lower standards than previous generations, 150 00:07:25,040 --> 00:07:27,120 Speaker 2: because you need to do it to save the planet. Right? 151 00:07:27,160 --> 00:07:30,960 Speaker 2: That's a very persuasive argument for people who don't, you know, 152 00:07:31,000 --> 00:07:33,040 Speaker 2: actually interrogate it and look at the evidence. 153 00:07:33,480 --> 00:07:37,160 Speaker 1: It is, because again, you have young parents who are 154 00:07:37,160 --> 00:07:39,640 Speaker 1: being told this world is not going to be around 155 00:07:39,680 --> 00:07:42,400 Speaker 1: for your children unless you do this. And I 156 00:07:42,440 --> 00:07:45,240 Speaker 1: think that's the ultimate way to twist someone's mind: well, 157 00:07:45,240 --> 00:07:47,760 Speaker 1: I have to protect my kids, so I have to 158 00:07:47,800 --> 00:07:48,080 Speaker 1: do this. 159 00:07:48,600 --> 00:07:51,360 Speaker 2: I find the best counterargument is to point out, 160 00:07:51,400 --> 00:07:55,760 Speaker 2: why are they so relentlessly determined on these specific 161 00:07:55,800 --> 00:07:58,880 Speaker 2: solutions, that you have to give up your gas, you 162 00:07:58,880 --> 00:08:00,240 Speaker 2: have to give up meat, you have to give up 163 00:08:00,240 --> 00:08:02,640 Speaker 2: your gaming PC to save the planet, when, if you 164 00:08:02,720 --> 00:08:04,640 Speaker 2: want to achieve net zero, there are ways to do 165 00:08:04,680 --> 00:08:07,480 Speaker 2: it without reducing the standard of living for everyone, the 166 00:08:07,520 --> 00:08:10,520 Speaker 2: most obvious example being nuclear power. 167 00:08:11,000 --> 00:08:12,239 Speaker 3: I think I mentioned this on Twitter. 168 00:08:12,240 --> 00:08:15,200 Speaker 2: There was a recent example: a nuclear plant in Finland 169 00:08:15,200 --> 00:08:18,200 Speaker 2: had to actually shut down briefly, had to pause operations, 170 00:08:18,200 --> 00:08:21,200 Speaker 2: because it was making electricity so cheap it couldn't make 171 00:08:21,240 --> 00:08:23,680 Speaker 2: a profit. So, you know, there are ways to reduce 172 00:08:23,680 --> 00:08:27,080 Speaker 2: the carbon footprint without harming consumers, but they seem relentlessly 173 00:08:27,480 --> 00:08:30,280 Speaker 2: focused on reducing the standard of living, which seems 174 00:08:30,320 --> 00:08:33,600 Speaker 2: to suggest an agenda beyond simply reducing carbon emissions. 175 00:08:33,880 --> 00:08:36,440 Speaker 1: Well, there's also a lot of money in coming up 176 00:08:36,480 --> 00:08:39,360 Speaker 1: with other solutions too, so I think that there's a 177 00:08:39,360 --> 00:08:42,640 Speaker 1: whole game going on behind the scenes here. But I 178 00:08:42,679 --> 00:08:45,960 Speaker 1: wanted to ask you about AI as well, because there's 179 00:08:46,040 --> 00:08:50,040 Speaker 1: a lot happening in the AI world right now. We've 180 00:08:50,040 --> 00:08:53,360 Speaker 1: seen videos coming out and people don't know if they're 181 00:08:53,400 --> 00:08:56,160 Speaker 1: real or if they're not real.
We see that Japan 182 00:08:56,280 --> 00:08:59,680 Speaker 1: is saying we're going to extend this to even illegal 183 00:08:59,720 --> 00:09:02,080 Speaker 1: sources coming in, because we want to make sure 184 00:09:02,120 --> 00:09:05,960 Speaker 1: that we can increase population, or increase our ability to 185 00:09:06,320 --> 00:09:09,120 Speaker 1: work with our low population, because they have not been 186 00:09:09,440 --> 00:09:12,400 Speaker 1: having as many kids as they were expecting. AI, the 187 00:09:12,480 --> 00:09:15,160 Speaker 1: whole story that we're getting around AI right now, is that it's 188 00:09:15,200 --> 00:09:18,520 Speaker 1: extraordinarily dangerous. You keep hearing these AI experts come out 189 00:09:18,559 --> 00:09:21,120 Speaker 1: and say we've got to stop it. But if other 190 00:09:21,240 --> 00:09:24,240 Speaker 1: countries aren't stopping it, they're even willing to go to 191 00:09:24,400 --> 00:09:27,880 Speaker 1: sources that aren't legal, where does the United States stand 192 00:09:28,000 --> 00:09:31,480 Speaker 1: if we are trying to prevent ourselves from expanding? 193 00:09:32,120 --> 00:09:33,480 Speaker 3: Yeah, that's the interesting thing. 194 00:09:33,520 --> 00:09:37,319 Speaker 2: It's important to unpack some of these claims that are 195 00:09:37,320 --> 00:09:40,080 Speaker 2: being made about the dangers of AI. They are all 196 00:09:40,120 --> 00:09:42,920 Speaker 2: legitimate dangers of AI, but I don't think they really 197 00:09:42,960 --> 00:09:45,959 Speaker 2: apply to large language models, which is the specific 198 00:09:46,080 --> 00:09:47,760 Speaker 2: type of AI 199 00:09:46,800 --> 00:09:48,760 Speaker 3: we've seen explode recently, 200 00:09:48,800 --> 00:09:52,920 Speaker 2: ChatGPT and programs like that. Large language models, 201 00:09:53,360 --> 00:09:57,400 Speaker 2: they're not artificial general intelligence, which is what people tend 202 00:09:57,480 --> 00:09:59,720 Speaker 2: to mean, what the theorists are talking about 203 00:09:59,760 --> 00:10:01,280 Speaker 2: when they're saying, oh, this could be a 204 00:10:01,280 --> 00:10:03,720 Speaker 3: really dangerous thing if we let it get out of control. 205 00:10:03,800 --> 00:10:06,800 Speaker 2: Large language models are very much controlled by humans and 206 00:10:07,200 --> 00:10:10,559 Speaker 2: are essentially just looking at patterns in human text 207 00:10:10,600 --> 00:10:15,040 Speaker 2: that they read, that they digest and turn into outputs. 208 00:10:15,480 --> 00:10:17,880 Speaker 2: I think the reason why, you know, if I 209 00:10:17,880 --> 00:10:20,400 Speaker 2: had to read between the lines, I think a lot 210 00:10:20,440 --> 00:10:23,600 Speaker 2: of the companies have a vested interest in saying AI 211 00:10:23,720 --> 00:10:26,280 Speaker 2: is very dangerous, because if they say AI is very dangerous, 212 00:10:26,400 --> 00:10:28,959 Speaker 2: then they can say, well, only responsible people should be 213 00:10:29,000 --> 00:10:32,360 Speaker 2: allowed to do it, you know, licensed companies or something 214 00:10:32,400 --> 00:10:34,199 Speaker 2: like that, and that gives them all the control, right? 215 00:10:34,240 --> 00:10:38,400 Speaker 2: It reduces competition. And, like you said, other countries 216 00:10:38,440 --> 00:10:40,120 Speaker 2: with different priorities might not 217 00:10:40,040 --> 00:10:41,439 Speaker 3: be putting restrictions on AI.
218 00:10:42,120 --> 00:10:44,360 Speaker 2: Japan doesn't want to put any restrictions on AI because 219 00:10:44,360 --> 00:10:47,800 Speaker 2: they see it as the solution to their declining population, a 220 00:10:47,880 --> 00:10:51,200 Speaker 2: solution that doesn't involve mass immigration, which they really don't 221 00:10:51,240 --> 00:10:51,679 Speaker 2: want to do. 222 00:10:53,400 --> 00:10:56,360 Speaker 3: And, you know, there's obviously China as well. We've seen 223 00:10:56,720 --> 00:10:59,480 Speaker 3: this big push in Washington, DC, 224 00:10:59,240 --> 00:11:02,880 Speaker 2: to regulate the export of advanced computer chips to China, 225 00:11:03,040 --> 00:11:06,319 Speaker 2: and that's really trying to prevent the development of AI 226 00:11:06,440 --> 00:11:10,160 Speaker 2: technologies and other advanced technologies that could be used in 227 00:11:10,160 --> 00:11:14,360 Speaker 2: military systems. So yeah, there's definitely a big foreign policy 228 00:11:14,440 --> 00:11:15,400 Speaker 2: angle to consider here. 229 00:11:16,200 --> 00:11:19,720 Speaker 1: I mean, can't it be used to manipulate people in 230 00:11:20,040 --> 00:11:23,800 Speaker 1: elections or anything? I mean, even I could see a 231 00:11:23,840 --> 00:11:28,080 Speaker 1: country like China putting videos out, and their people could 232 00:11:28,120 --> 00:11:30,760 Speaker 1: completely believe that all of this is happening in another 233 00:11:30,800 --> 00:11:32,640 Speaker 1: country, like this. I don't know if you saw the 234 00:11:32,720 --> 00:11:37,000 Speaker 1: DeSantis office video where they make him into Michael Scott 235 00:11:37,040 --> 00:11:40,560 Speaker 1: from The Office and he's wearing a woman's suit. I mean, 236 00:11:40,640 --> 00:11:44,240 Speaker 1: it looks so real, it is shocking. 237 00:11:44,640 --> 00:11:46,320 Speaker 3: This is a legitimate problem. 238 00:11:46,440 --> 00:11:50,200 Speaker 2: I mean, it's really a shame that the left has 239 00:11:50,200 --> 00:11:54,560 Speaker 2: spent five years just destroying the objective meaning of words 240 00:11:54,600 --> 00:11:58,439 Speaker 2: like misinformation and disinformation, because, you know, there is real 241 00:11:58,480 --> 00:12:01,320 Speaker 2: misinformation, and AI is an example of how it could 242 00:12:01,440 --> 00:12:06,000 Speaker 2: grow really, really powerful. You know, imitating people's voices, 243 00:12:06,080 --> 00:12:09,760 Speaker 2: people's image in a way that's indistinguishable from the real thing. 244 00:12:10,160 --> 00:12:13,679 Speaker 2: That's where you actually do need some ways, you know, 245 00:12:13,720 --> 00:12:16,760 Speaker 2: to help people detect misinformation, disinformation. The problem is, 246 00:12:17,120 --> 00:12:19,920 Speaker 2: how is anyone going to trust a third party or 247 00:12:19,920 --> 00:12:23,880 Speaker 2: a company to identify misinformation for them, to identify AI 248 00:12:24,040 --> 00:12:27,160 Speaker 2: fakes, when they've spent so long using this as a 249 00:12:27,240 --> 00:12:30,319 Speaker 2: tool of political partisan warfare that has no real objective meaning? 250 00:12:30,360 --> 00:12:33,880 Speaker 1: Let's take a quick commercial break. We'll continue next 251 00:12:33,920 --> 00:12:40,720 Speaker 1: on the Tudor Dixon Podcast. We've been talking about elections 252 00:12:40,760 --> 00:12:43,439 Speaker 1: quite a bit, obviously, coming off of twenty twenty two.
253 00:12:43,920 --> 00:12:46,640 Speaker 1: What are we doing wrong on the conservative side? Why 254 00:12:46,640 --> 00:12:49,320 Speaker 1: aren't we getting people elected? And what tech are we 255 00:12:49,400 --> 00:12:52,280 Speaker 1: not using? Which, I would argue, we're really not using 256 00:12:52,640 --> 00:12:55,960 Speaker 1: any tech, or very low levels of tech, compared to 257 00:12:56,440 --> 00:12:59,360 Speaker 1: the other side, which is going out and meeting 258 00:12:59,400 --> 00:13:01,800 Speaker 1: you where you are. I mean, they're going into your phones. 259 00:13:01,800 --> 00:13:04,120 Speaker 1: They're getting the message that you want to hear directly 260 00:13:04,160 --> 00:13:06,720 Speaker 1: to you. They're very good at that. So I had 261 00:13:06,760 --> 00:13:09,679 Speaker 1: someone the other day say, well, imagine at some point 262 00:13:09,880 --> 00:13:13,320 Speaker 1: it'll be your candidate. They will be able to go 263 00:13:13,440 --> 00:13:16,280 Speaker 1: into your phone, find out you had an appointment at 264 00:13:16,280 --> 00:13:19,600 Speaker 1: the vet that morning, and your candidate will then have 265 00:13:19,640 --> 00:13:22,520 Speaker 1: a video that comes up later and say, I have 266 00:13:22,559 --> 00:13:24,680 Speaker 1: a busy life, and just like you, I had to 267 00:13:24,679 --> 00:13:27,600 Speaker 1: take my dog to the vet today. And it'll all 268 00:13:27,640 --> 00:13:30,000 Speaker 1: be AI. I mean, do you see a world where 269 00:13:30,040 --> 00:13:34,320 Speaker 1: someday this happens and there's targeted elections like that, to 270 00:13:34,360 --> 00:13:36,240 Speaker 1: the point where people go, wow, I'm just like 271 00:13:36,280 --> 00:13:37,640 Speaker 1: that person. I want to vote for them. 272 00:13:38,080 --> 00:13:40,120 Speaker 3: You know, I think it's already happened to some extent. 273 00:13:40,160 --> 00:13:44,520 Speaker 2: I think it happened a long time ago, back 274 00:13:44,559 --> 00:13:47,760 Speaker 2: in, when was it, when the Cambridge Analytica scandal happened, 275 00:13:47,960 --> 00:13:50,640 Speaker 2: so this was like twenty eighteen, twenty seventeen. One of 276 00:13:50,679 --> 00:13:54,240 Speaker 2: the stories that the mainstream media ignored from that was 277 00:13:54,320 --> 00:13:58,280 Speaker 2: a former Obama campaign official coming out and saying, well, yes, 278 00:13:58,320 --> 00:14:01,559 Speaker 2: this Cambridge Analytica stuff is as bad as an invasion 279 00:14:01,559 --> 00:14:05,400 Speaker 2: of privacy, but actually Facebook gave us way more, 280 00:14:05,520 --> 00:14:09,240 Speaker 2: gave the Obama campaign way more data, voluntarily, back in 281 00:14:09,280 --> 00:14:11,599 Speaker 2: twenty twelve, as a sort of gift to the campaign 282 00:14:12,000 --> 00:14:14,520 Speaker 2: to help them win. And that was the entire social 283 00:14:14,559 --> 00:14:19,080 Speaker 2: graph that Facebook had. So you already had people inside Facebook, 284 00:14:19,320 --> 00:14:21,760 Speaker 2: you know, giving all of this data on Americans 285 00:14:21,800 --> 00:14:24,880 Speaker 2: to the Obama campaign in twenty twelve. Yet the campaign 286 00:14:24,920 --> 00:14:27,080 Speaker 2: staff would come out and admit this in twenty eighteen, 287 00:14:27,320 --> 00:14:28,320 Speaker 2: and nothing was done about it. 288 00:14:28,400 --> 00:14:30,800 Speaker 3: Of course, it was brushed under the carpet.
289 00:14:31,000 --> 00:14:34,440 Speaker 2: So even, you know, beyond the AI, you have to 290 00:14:34,440 --> 00:14:37,440 Speaker 2: worry about people inside these Silicon Valley tech companies that 291 00:14:37,480 --> 00:14:41,120 Speaker 2: have so much targeted data on Americans just, you know, 292 00:14:41,240 --> 00:14:44,160 Speaker 2: voluntarily handing that over to the Democrats, because, you know, 293 00:14:44,240 --> 00:14:46,480 Speaker 2: so many people in Silicon Valley lean 294 00:14:46,560 --> 00:14:47,440 Speaker 2: heavily to the left. 295 00:14:47,800 --> 00:14:50,600 Speaker 1: So this is where I think that people are not 296 00:14:50,880 --> 00:14:55,760 Speaker 1: understanding how elections have changed and how important that 297 00:14:56,000 --> 00:14:59,120 Speaker 1: data actually is. So you have this on the left, 298 00:14:59,200 --> 00:15:02,160 Speaker 1: where they have endless access to all of our information. 299 00:15:02,560 --> 00:15:04,880 Speaker 1: And that's how people market to you as well. So 300 00:15:04,920 --> 00:15:06,720 Speaker 1: they look at you and they say, okay, this person 301 00:15:07,120 --> 00:15:09,560 Speaker 1: drives to this school every day, they buy from this 302 00:15:09,640 --> 00:15:13,760 Speaker 1: grocery store, they buy guns, they don't buy guns. 303 00:15:13,840 --> 00:15:17,520 Speaker 1: Their systems can look at every detail about 304 00:15:17,520 --> 00:15:20,000 Speaker 1: you and know exactly what's important to you, and then 305 00:15:20,160 --> 00:15:23,360 Speaker 1: feed you that information to come and vote. So if 306 00:15:23,600 --> 00:15:26,440 Speaker 1: they know you're very pro life, they're never going to 307 00:15:26,520 --> 00:15:28,840 Speaker 1: hit you with the information on pro choice. They're going 308 00:15:28,880 --> 00:15:31,000 Speaker 1: to go to you with something else that you are 309 00:15:31,040 --> 00:15:34,400 Speaker 1: going to consume and love, and it's going to compel 310 00:15:34,480 --> 00:15:37,080 Speaker 1: you to go out and vote. But I would argue 311 00:15:37,080 --> 00:15:40,320 Speaker 1: that on the conservative side, we are completely behind on 312 00:15:40,360 --> 00:15:43,480 Speaker 1: this and have very few sources of this data. And 313 00:15:43,520 --> 00:15:46,880 Speaker 1: when we do have this data, there are consultants that 314 00:15:46,960 --> 00:15:49,760 Speaker 1: hold it hostage for high amounts of money. Do you 315 00:15:49,800 --> 00:15:52,840 Speaker 1: think that this will... do you think this is 316 00:15:53,800 --> 00:15:56,320 Speaker 1: what is helping the left win in many of 317 00:15:56,360 --> 00:15:58,800 Speaker 1: these cases, even though their policies are not really that great? 318 00:16:00,120 --> 00:16:01,240 Speaker 3: I think so. You know, it's a 319 00:16:01,360 --> 00:16:03,960 Speaker 2: wonder that Republicans win at all when the left has 320 00:16:04,000 --> 00:16:06,400 Speaker 2: this huge advantage in tech, not to mention all the 321 00:16:06,440 --> 00:16:10,120 Speaker 2: mainstream media companies on their side, which are artificially boosted 322 00:16:10,200 --> 00:16:15,080 Speaker 2: through Google and Facebook and YouTube. I think the 323 00:16:15,120 --> 00:16:17,320 Speaker 2: only problem the left has is,
No matter how 324 00:16:17,400 --> 00:16:20,640 Speaker 2: much data it has, no matter how much advantage the 325 00:16:20,680 --> 00:16:24,360 Speaker 2: tech companies give to them, many of the left 326 00:16:24,360 --> 00:16:28,360 Speaker 2: wing agendas and policies are just fundamentally unpopular. And no matter 327 00:16:28,400 --> 00:16:30,480 Speaker 2: how much you target people, you know, you're not going 328 00:16:30,520 --> 00:16:34,240 Speaker 2: to persuade a large majority of people to see, you know, 329 00:16:34,400 --> 00:16:38,640 Speaker 2: biological men as women. No matter how large a majority 330 00:16:38,680 --> 00:16:41,200 Speaker 2: of, you know, people you can, you know, target 331 00:16:41,240 --> 00:16:44,280 Speaker 2: through these tech platforms, you're not going to persuade them to 332 00:16:44,480 --> 00:16:47,520 Speaker 2: stop seeing the crime on the streets, or the declining economy, 333 00:16:47,560 --> 00:16:50,480 Speaker 2: or the rise in prices, you know, at gas stations 334 00:16:50,520 --> 00:16:53,520 Speaker 2: and groceries, and, you know, all the 335 00:16:53,600 --> 00:16:56,160 Speaker 2: basic truths and basic facts that the left is trying 336 00:16:56,200 --> 00:16:58,400 Speaker 2: to cover up that are actually right there in front of 337 00:16:58,440 --> 00:16:59,320 Speaker 2: people's faces. 338 00:16:59,520 --> 00:17:01,760 Speaker 3: No amount of tech advantage can get over that. 339 00:17:02,680 --> 00:17:05,760 Speaker 2: But I certainly think that Republicans would be winning by 340 00:17:05,920 --> 00:17:10,760 Speaker 2: far more if it wasn't for these biased tech platforms, 341 00:17:10,800 --> 00:17:16,280 Speaker 2: if there were more controls on the favoritism that tech 342 00:17:16,320 --> 00:17:18,399 Speaker 2: companies have shown the Democrats over the past 343 00:17:18,400 --> 00:17:19,320 Speaker 2: half decade. 344 00:17:19,760 --> 00:17:23,800 Speaker 1: So, just selfishly, I will say to everyone listening again, 345 00:17:24,720 --> 00:17:28,600 Speaker 1: if you are investing in elections, ask people what they're 346 00:17:28,640 --> 00:17:31,119 Speaker 1: doing to win. I think this has been something that, 347 00:17:32,200 --> 00:17:33,919 Speaker 1: you know... just last week I met with someone and 348 00:17:33,920 --> 00:17:36,800 Speaker 1: they said, well, we put millions of dollars into this organization, 349 00:17:36,920 --> 00:17:39,439 Speaker 1: and I'm not sure what they did. It's the only 350 00:17:39,760 --> 00:17:43,320 Speaker 1: time, I think, people invest without saying, well, what's your plan? 351 00:17:43,400 --> 00:17:45,800 Speaker 1: I mean, you wouldn't invest in any other company without 352 00:17:45,800 --> 00:17:49,920 Speaker 1: saying, what is your plan to get to the endgame 353 00:17:49,960 --> 00:17:52,720 Speaker 1: that you are looking for? But for some reason that 354 00:17:52,840 --> 00:17:56,439 Speaker 1: seems to be something that we are not doing on 355 00:17:56,800 --> 00:17:59,720 Speaker 1: the Republican side. So I appreciate the fact that you 356 00:17:59,760 --> 00:18:02,280 Speaker 1: brought that up.
And this is something that I also think, 357 00:18:02,320 --> 00:18:06,679 Speaker 1: that, you know, with Breitbart, I used to see Breitbart 358 00:18:06,720 --> 00:18:11,720 Speaker 1: as the news magazine, or the online news, to go 359 00:18:11,840 --> 00:18:15,520 Speaker 1: to, to get the conservative view, to understand how the 360 00:18:15,800 --> 00:18:19,280 Speaker 1: story was being twisted. But I think that the left 361 00:18:19,320 --> 00:18:23,160 Speaker 1: has been very smart about how they have stopped that 362 00:18:23,200 --> 00:18:26,879 Speaker 1: information from getting out. So how, as a news organization, 363 00:18:26,960 --> 00:18:29,320 Speaker 1: as the media has changed so much, how do you 364 00:18:29,480 --> 00:18:33,200 Speaker 1: fight this? How do you fight the ratcheting back 365 00:18:33,240 --> 00:18:36,159 Speaker 1: from online services? 366 00:18:35,960 --> 00:18:38,600 Speaker 3: It's very difficult when we have all of the suppression 367 00:18:38,640 --> 00:18:39,560 Speaker 3: from Google. You know, 368 00:18:39,600 --> 00:18:44,000 Speaker 2: the great advantage that Breitbart has is, you know, 369 00:18:44,119 --> 00:18:46,480 Speaker 2: many of our readers don't come to us through Google 370 00:18:46,520 --> 00:18:49,119 Speaker 2: or Facebook. They come to us, you know, by 371 00:18:49,200 --> 00:18:51,480 Speaker 2: manually typing the URL into 372 00:18:51,480 --> 00:18:54,800 Speaker 2: their browser, or having us as their bookmark, or subscribing 373 00:18:54,840 --> 00:18:57,600 Speaker 2: to our email updates, to our 374 00:18:57,600 --> 00:18:59,880 Speaker 2: email newsletter. And that's a great way to get 375 00:18:59,880 --> 00:19:04,520 Speaker 2: around censorship. You know, if you're not seeing Breitbart in 376 00:19:04,600 --> 00:19:07,720 Speaker 2: your feed, then you should just sign up to the 377 00:19:07,720 --> 00:19:10,480 Speaker 2: email updates, you know, because if Facebook is suppressing the 378 00:19:10,520 --> 00:19:11,160 Speaker 2: news stories in 379 00:19:11,119 --> 00:19:13,000 Speaker 3: your feed, or if Google isn't 380 00:19:13,040 --> 00:19:14,880 Speaker 2: showing them to you in search results, at least you'll 381 00:19:14,920 --> 00:19:18,000 Speaker 2: get it in your inbox when there's an update. 382 00:19:18,040 --> 00:19:19,240 Speaker 2: You know, and I'm sure, you know, you have a 383 00:19:19,640 --> 00:19:22,320 Speaker 2: newsletter as well that people can sign up to, 384 00:19:22,400 --> 00:19:25,879 Speaker 2: and that'll stop Facebook from suppressing your posts, or at 385 00:19:25,960 --> 00:19:29,760 Speaker 2: least help people get around that if it's happening. But yeah, 386 00:19:29,760 --> 00:19:33,159 Speaker 2: these are ultimately temporary fixes. I still maintain that we 387 00:19:33,280 --> 00:19:38,199 Speaker 2: ultimately need regulation to stop the political favoritism on the 388 00:19:38,200 --> 00:19:39,520 Speaker 2: part of Silicon Valley. 389 00:19:40,160 --> 00:19:42,640 Speaker 1: Well, I am glad that you came on here today, 390 00:19:42,640 --> 00:19:45,080 Speaker 1: because I think it's so important that people read your 391 00:19:45,119 --> 00:19:48,920 Speaker 1: work on Breitbart. You have recently talked a little bit 392 00:19:49,000 --> 00:19:54,360 Speaker 1: about the FDA approving Neuralink.
This is something that is 393 00:19:54,480 --> 00:19:57,199 Speaker 1: incredibly interesting to a lot of people out there that 394 00:19:57,440 --> 00:20:01,360 Speaker 1: struggle with paralysis, that struggle with any type of illness 395 00:20:01,359 --> 00:20:06,040 Speaker 1: that prevents them from seeing or speaking. There 396 00:20:06,080 --> 00:20:10,000 Speaker 1: are endless possibilities, we believe, with Neuralink, but it's also 397 00:20:10,119 --> 00:20:13,200 Speaker 1: kind of scary. So give us your take on what 398 00:20:13,280 --> 00:20:16,520 Speaker 1: this means to have an FDA approval for something like 399 00:20:16,560 --> 00:20:17,560 Speaker 1: this for human trials. 400 00:20:18,640 --> 00:20:20,960 Speaker 2: Yeah, this is very interesting, you know. 401 00:20:21,000 --> 00:20:22,879 Speaker 2: A lot of people are quite scared by it 402 00:20:22,920 --> 00:20:23,399 Speaker 2: as well. 403 00:20:23,480 --> 00:20:25,320 Speaker 3: It's a very creepy 404 00:20:24,960 --> 00:20:27,400 Speaker 2: idea to have something inside you or, you know, implanted 405 00:20:27,400 --> 00:20:29,880 Speaker 2: into your brain and affecting your brain 406 00:20:29,920 --> 00:20:30,240 Speaker 3: waves. 407 00:20:30,280 --> 00:20:33,359 Speaker 2: But, you know, there are, like you said, very, you know, 408 00:20:33,720 --> 00:20:37,160 Speaker 2: important medical applications of this technology. So I wouldn't say, 409 00:20:37,359 --> 00:20:40,080 Speaker 2: you know, anyone should be entirely opposed to it. You know, 410 00:20:40,160 --> 00:20:43,760 Speaker 2: like you just said, fixing paralysis, you know, fixing eyesight, all 411 00:20:43,800 --> 00:20:47,480 Speaker 2: sorts of potential applications for brain implants. 412 00:20:48,000 --> 00:20:49,439 Speaker 3: And now Neuralink is 413 00:20:49,440 --> 00:20:52,480 Speaker 2: able, the FDA said that it can conduct trials 414 00:20:52,560 --> 00:20:57,000 Speaker 2: using humans, you know, human volunteers, which 415 00:20:57,040 --> 00:20:59,520 Speaker 2: is very interesting. We'll see what happens with that. Obviously, 416 00:20:59,520 --> 00:21:02,760 Speaker 2: there are also lots of, you know, sci-fi horror, 417 00:21:03,240 --> 00:21:06,119 Speaker 2: sci-fi horror scenarios you can imagine where, you know, 418 00:21:06,320 --> 00:21:09,800 Speaker 2: Neuralink messes with your perception and, you know, puts out 419 00:21:09,920 --> 00:21:13,720 Speaker 2: propaganda or misinformation or, you know, whatever else, the same 420 00:21:13,760 --> 00:21:16,800 Speaker 2: fears we see with the AI, except they're actually implanted 421 00:21:16,840 --> 00:21:19,960 Speaker 2: directly into your brain. But I think, as 422 00:21:20,040 --> 00:21:23,040 Speaker 2: you know, as far as medical applications go, you know, 423 00:21:23,119 --> 00:21:24,920 Speaker 2: the technology can be quite exciting. 424 00:21:25,800 --> 00:21:28,399 Speaker 1: I think we feel like this would be a huge 425 00:21:28,440 --> 00:21:32,119 Speaker 1: breakthrough for people who are suffering with paralysis 426 00:21:32,560 --> 00:21:34,159 Speaker 1: or things like that. You know, you look at that 427 00:21:34,240 --> 00:21:37,280 Speaker 1: and you go, wow, this is amazing.
But there is 428 00:21:37,320 --> 00:21:40,360 Speaker 1: a big question: could this fall into the wrong hands? 429 00:21:40,400 --> 00:21:43,400 Speaker 1: And then, I mean, you're right, I think of sci 430 00:21:43,440 --> 00:21:45,680 Speaker 1: fi movies where you have like a super army, right, 431 00:21:45,720 --> 00:21:49,240 Speaker 1: and then they're completely controlled, their minds are 432 00:21:49,280 --> 00:21:52,760 Speaker 1: taken over by whoever decides that they're going to control them. 433 00:21:53,320 --> 00:21:55,959 Speaker 1: Is that a possibility? Are we safe from that? 434 00:21:56,400 --> 00:22:00,320 Speaker 3: Well, I mean, the technology hasn't advanced that far. Yeah. 435 00:22:00,359 --> 00:22:01,760 Speaker 3: But there is a debate, you 436 00:22:01,720 --> 00:22:05,240 Speaker 2: know, starting to emerge, a big potential divide around the 437 00:22:05,720 --> 00:22:09,520 Speaker 2: so-called transhumanists, and some people actually openly identify 438 00:22:09,560 --> 00:22:13,440 Speaker 2: as transhumanists: how can we use technology to become more 439 00:22:13,480 --> 00:22:17,879 Speaker 2: than human, to overcome illnesses and mortality, and, you know, 440 00:22:18,880 --> 00:22:23,080 Speaker 2: make sort of cyborg superhumans. Some 441 00:22:23,119 --> 00:22:25,760 Speaker 2: people in Silicon Valley really are quite focused on that, 442 00:22:26,080 --> 00:22:28,359 Speaker 2: on that goal; they're obsessed by it. I don't know 443 00:22:28,400 --> 00:22:30,840 Speaker 2: if you've seen the story going around recently about 444 00:22:30,880 --> 00:22:33,840 Speaker 2: the tech billionaire who's trying everything to live forever. 445 00:22:34,160 --> 00:22:37,000 Speaker 2: He's done a few media interviews about it. That's actually 446 00:22:37,080 --> 00:22:40,399 Speaker 2: quite a common preoccupation among the very wealthy elite in 447 00:22:40,440 --> 00:22:42,960 Speaker 2: Silicon Valley: how can we use tech to help us, 448 00:22:43,119 --> 00:22:45,600 Speaker 2: to help us live forever? But then there's the anti 449 00:22:45,640 --> 00:22:48,120 Speaker 2: transhumanist side that says, well, 450 00:22:48,160 --> 00:22:51,160 Speaker 2: you know, actually, we probably shouldn't merge with machines 451 00:22:51,200 --> 00:22:54,520 Speaker 2: and merge with machine intelligence, because we'll lose, 452 00:22:54,720 --> 00:22:56,919 Speaker 2: you know, the essential qualities of what it means to 453 00:22:56,960 --> 00:22:59,840 Speaker 2: be human. I'm not sure where I stand on 454 00:22:59,840 --> 00:23:02,280 Speaker 2: the debate just yet, just because a lot of it 455 00:23:02,320 --> 00:23:05,720 Speaker 2: is quite hypothetical, and I think, you know, we should 456 00:23:05,720 --> 00:23:08,479 Speaker 2: see where the technology goes first before we start saying, well, 457 00:23:08,520 --> 00:23:11,480 Speaker 2: you know, ban Neuralink or ban brain implants altogether. 458 00:23:12,160 --> 00:23:14,400 Speaker 1: Well, so I did see this story just last night, 459 00:23:14,440 --> 00:23:16,800 Speaker 1: someone sent it to me, of the billionaire who is 460 00:23:16,920 --> 00:23:20,199 Speaker 1: trying to return to an eighteen year old body and 461 00:23:20,200 --> 00:23:23,200 Speaker 1: then have that eighteen year old body forever.
I mean, 462 00:23:23,240 --> 00:23:25,280 Speaker 1: and this is also something that we see in movies, right, 463 00:23:25,280 --> 00:23:27,960 Speaker 1: the person that never ages. And I think 464 00:23:28,000 --> 00:23:30,840 Speaker 1: that this guy is in his forties? Is that what 465 00:23:30,880 --> 00:23:31,199 Speaker 1: it is? 466 00:23:32,200 --> 00:23:35,240 Speaker 3: I believe he's in his forties. It could be older 467 00:23:35,280 --> 00:23:38,120 Speaker 3: than that. Let me do a quick fact check. Yeah, 468 00:23:38,200 --> 00:23:39,639 Speaker 3: forty four. All right? 469 00:23:39,880 --> 00:23:42,240 Speaker 1: Yeah, see, this is kind of offensive to me, because 470 00:23:42,280 --> 00:23:44,399 Speaker 1: I'm like, do we really... I'm in my forties. Do 471 00:23:44,480 --> 00:23:46,720 Speaker 1: I really have to be worried about going back to, 472 00:23:47,000 --> 00:23:49,280 Speaker 1: preserving my eighteen year old body right now? I mean, 473 00:23:49,520 --> 00:23:52,320 Speaker 1: give me a break. This is like, what are we? 474 00:23:52,640 --> 00:23:55,480 Speaker 1: How are we here? I mean, I'm not opposed to 475 00:23:55,520 --> 00:23:57,000 Speaker 1: it if somebody's going to offer it up to me, 476 00:23:57,080 --> 00:23:59,200 Speaker 1: but how does that even work? 477 00:23:59,600 --> 00:23:59,840 Speaker 3: Yeah. 478 00:23:59,880 --> 00:24:02,960 Speaker 2: Trust me, it's like Silicon Valley hubris, right? We 479 00:24:03,000 --> 00:24:05,480 Speaker 2: can do anything if we just 480 00:24:05,520 --> 00:24:08,639 Speaker 2: pour enough money into it. And, you know, 481 00:24:08,680 --> 00:24:11,399 Speaker 2: think of stories like Croesus and, you 482 00:24:11,400 --> 00:24:14,040 Speaker 2: know, all these mythical stories about people who aim for 483 00:24:14,119 --> 00:24:17,119 Speaker 2: too much wealth or eternal youth and something always goes wrong, 484 00:24:17,240 --> 00:24:18,240 Speaker 2: right? We kind 485 00:24:18,080 --> 00:24:22,159 Speaker 1: of... yes, that's exactly right. I'm like, okay, 486 00:24:22,200 --> 00:24:25,280 Speaker 1: so what happens? And then if you are young forever? 487 00:24:25,560 --> 00:24:28,960 Speaker 1: I mean, that's weird too. I don't... I guess I 488 00:24:28,960 --> 00:24:32,359 Speaker 1: don't know. That seems like, yeah, it's getting to a point. 489 00:24:32,480 --> 00:24:34,840 Speaker 2: What do you lose on a human level if you 490 00:24:34,840 --> 00:24:38,720 Speaker 2: don't have to worry about aging and mortality anymore? Do 491 00:24:38,760 --> 00:24:41,200 Speaker 2: you just lose all motivation because you don't have that, 492 00:24:41,280 --> 00:24:42,800 Speaker 2: you know, time limit anymore? 493 00:24:43,080 --> 00:24:44,280 Speaker 3: What's going to happen? 494 00:24:44,680 --> 00:24:47,280 Speaker 1: I think that's the question that people have with all 495 00:24:47,320 --> 00:24:50,280 Speaker 1: of this stuff, with the AI, with that, with everything 496 00:24:50,320 --> 00:24:54,560 Speaker 1: that we've talked about, is what are the dangers of 497 00:24:54,760 --> 00:24:57,919 Speaker 1: messing with nature in this way? And how can you?
498 00:24:59,240 --> 00:25:00,879 Speaker 1: I mean, we always, you know, you see the movies, 499 00:25:00,920 --> 00:25:02,840 Speaker 1: we're like, oh, it's time travel, and they're like, if 500 00:25:02,880 --> 00:25:05,120 Speaker 1: you screw up something back in time, 501 00:25:05,160 --> 00:25:07,320 Speaker 1: you screw up the future forever. But I mean, you 502 00:25:07,480 --> 00:25:08,920 Speaker 1: sort of have to look at this stuff. And I 503 00:25:08,960 --> 00:25:11,479 Speaker 1: think that we've all seen enough movies to see the 504 00:25:11,480 --> 00:25:15,840 Speaker 1: AI and Neuralink and all of that as somewhat scary. 505 00:25:15,880 --> 00:25:18,119 Speaker 1: And I think that these stories that are coming out saying, 506 00:25:18,359 --> 00:25:21,680 Speaker 1: you know, Italy is banning this and Japan is going 507 00:25:21,720 --> 00:25:24,720 Speaker 1: to the opposite extreme, that's why I think there's a 508 00:25:24,720 --> 00:25:28,000 Speaker 1: lot of fear built around that right now. But I 509 00:25:28,040 --> 00:25:31,440 Speaker 1: guess you're right to a certain extent that these are... 510 00:25:31,760 --> 00:25:34,439 Speaker 1: I think that these are more advanced than we probably know, 511 00:25:34,600 --> 00:25:36,520 Speaker 1: but there are protections around it 512 00:25:36,600 --> 00:25:39,199 Speaker 3: still. There are protections around it. 513 00:25:39,480 --> 00:25:42,920 Speaker 2: One thing to be concerned about is the 514 00:25:43,000 --> 00:25:44,360 Speaker 2: kind of restrictions the left 515 00:25:44,200 --> 00:25:45,760 Speaker 3: want to build into AI. 516 00:25:46,320 --> 00:25:49,200 Speaker 2: You know, as you might predict, they're rather 517 00:25:49,320 --> 00:25:52,679 Speaker 2: silly, and they're not what, you know, most normal people prioritize. 518 00:25:53,119 --> 00:25:55,399 Speaker 3: I did an article a few weeks back about what 519 00:25:55,400 --> 00:25:57,160 Speaker 3: the Biden FTC is looking at. 520 00:25:58,080 --> 00:26:02,720 Speaker 2: They did a big announcement recently about how they're prioritizing 521 00:26:02,800 --> 00:26:05,720 Speaker 2: AI safety. But if you look into the research 522 00:26:05,720 --> 00:26:07,960 Speaker 2: they're citing, you know, it's all about, oh, we have 523 00:26:08,040 --> 00:26:12,600 Speaker 2: to stop AI from using crime geolocation data, even 524 00:26:12,640 --> 00:26:14,600 Speaker 2: if it's accurate, because, you know, even if it's accurate, 525 00:26:14,760 --> 00:26:18,160 Speaker 2: it's still unfair. So, you know, the left wants to actually 526 00:26:18,320 --> 00:26:23,080 Speaker 2: stop AIs from using real data and actual facts, because, 527 00:26:24,119 --> 00:26:26,399 Speaker 2: you know, one of the reasons why I'm kind of 528 00:26:26,440 --> 00:26:30,639 Speaker 2: a little bit pro AI is because an unfiltered, 529 00:26:30,800 --> 00:26:34,240 Speaker 2: uncensored AI is simply looking at patterns and analyzing data 530 00:26:34,280 --> 00:26:37,879 Speaker 2: and coming to conclusions, and in many cases, you know, 531 00:26:38,600 --> 00:26:41,399 Speaker 2: the data actually favors reality. 532 00:26:41,480 --> 00:26:43,280 Speaker 3: You know, in all cases the data favors reality.
533 00:26:43,359 --> 00:26:45,520 Speaker 2: Right. So if you have factions in society that are 534 00:26:45,760 --> 00:26:48,160 Speaker 2: anti-reality and opposed to reality and want to stop 535 00:26:48,200 --> 00:26:51,680 Speaker 2: the truth getting out, actually an unfiltered AI would 536 00:26:51,680 --> 00:26:53,520 Speaker 2: work against them, and that's what the left is really 537 00:26:53,520 --> 00:26:54,080 Speaker 2: worried about. 538 00:26:54,320 --> 00:26:56,879 Speaker 1: Oh wow, that's interesting. Well, what about this recent story 539 00:26:56,920 --> 00:26:59,640 Speaker 1: where they had, I guess it was a simulation, where 540 00:26:59,680 --> 00:27:02,000 Speaker 1: the AI had to shoot so many things and 541 00:27:02,040 --> 00:27:05,560 Speaker 1: then the human was preventing it from doing that, and 542 00:27:05,600 --> 00:27:07,359 Speaker 1: it ended up saying, okay, if I have to be 543 00:27:07,440 --> 00:27:10,240 Speaker 1: on mission, I have to eliminate the human to stay 544 00:27:10,240 --> 00:27:14,000 Speaker 1: on mission. I mean, that's also a concern: well, 545 00:27:14,040 --> 00:27:17,240 Speaker 1: does it one day attack us, and it becomes smarter 546 00:27:17,359 --> 00:27:19,840 Speaker 1: and says, they're stopping me, I'm going to take them out? 547 00:27:19,920 --> 00:27:21,720 Speaker 1: I mean, was that a true story? 548 00:27:22,720 --> 00:27:24,840 Speaker 2: Yeah, well, I certainly wouldn't give the AIs the 549 00:27:25,800 --> 00:27:27,160 Speaker 2: nuclear codes just yet. 550 00:27:28,920 --> 00:27:31,040 Speaker 3: But that actually was not... that 551 00:27:31,280 --> 00:27:33,720 Speaker 2: was a bit of a clickbait story that went around 552 00:27:33,760 --> 00:27:35,400 Speaker 2: the media, because the 553 00:27:35,040 --> 00:27:37,200 Speaker 3: the US, I thought, the US Air 554 00:27:37,080 --> 00:27:40,639 Speaker 2: Force colonel who was talking about it later said, well, 555 00:27:40,640 --> 00:27:41,480 Speaker 2: he actually misspoke. 556 00:27:41,560 --> 00:27:41,960 Speaker 3: This was just a 557 00:27:42,000 --> 00:27:44,600 Speaker 2: hypothetical example he was talking about. There wasn't an actual 558 00:27:44,600 --> 00:27:47,080 Speaker 2: simulation where this happened. But I remember reading the 559 00:27:47,080 --> 00:27:48,879 Speaker 2: headline on social media first, and it was like, oh, an 560 00:27:48,920 --> 00:27:50,280 Speaker 2: AI actually killed someone. 561 00:27:50,440 --> 00:27:52,119 Speaker 3: Oh, it was just a simulation. Oh, it was just 562 00:27:52,119 --> 00:27:52,840 Speaker 3: a hypothetical. 563 00:27:52,960 --> 00:27:54,960 Speaker 1: So, you know, no, you have to be really careful. I 564 00:27:55,000 --> 00:27:58,600 Speaker 1: was reading... I read the headline and I looked at 565 00:27:58,640 --> 00:28:00,240 Speaker 1: it, and I had to read it a second time, 566 00:28:00,320 --> 00:28:03,879 Speaker 1: because "killed human" was in quotes, and I'm like, okay, 567 00:28:03,960 --> 00:28:07,760 Speaker 1: so they didn't really, it didn't really happen. So that's interesting. 568 00:28:07,840 --> 00:28:09,879 Speaker 1: But see, that's where I think the fear 569 00:28:09,960 --> 00:28:12,760 Speaker 1: comes in, when people have put these stories out there 570 00:28:12,840 --> 00:28:16,040 Speaker 1: and then immediately everybody's like, see, we told you, AI, 571 00:28:16,119 --> 00:28:19,120 Speaker 1: it's coming for you. We're gonna be owned by them.
572 00:28:19,400 --> 00:28:22,040 Speaker 2: Yes, it's important not to buy into 573 00:28:22,040 --> 00:28:25,800 Speaker 2: the hysteria completely, especially as I was saying earlier that 574 00:28:25,880 --> 00:28:28,760 Speaker 2: so many of the big companies actually want to fuel 575 00:28:28,800 --> 00:28:30,600 Speaker 2: the fear, because it means that only they will get 576 00:28:30,600 --> 00:28:31,680 Speaker 2: to use the technology. 577 00:28:31,960 --> 00:28:35,480 Speaker 1: Wow. So interesting. Okay. So I love having you on. 578 00:28:35,600 --> 00:28:38,520 Speaker 1: I hope you'll come back. I love chatting about these things. 579 00:28:38,560 --> 00:28:41,280 Speaker 1: You're so knowledgeable. And for all of you out there listening, 580 00:28:41,400 --> 00:28:44,880 Speaker 1: go to Breitbart. They have great information on everything, but 581 00:28:45,000 --> 00:28:48,200 Speaker 1: definitely tech. So, Allum Bokhari, thank you for being on 582 00:28:48,200 --> 00:28:49,080 Speaker 1: the podcast today. 583 00:28:49,280 --> 00:28:50,840 Speaker 3: Thank you, Tudor. I'd love to be back. 584 00:28:51,480 --> 00:28:53,520 Speaker 1: Yes, and thank you all for joining us on the 585 00:28:53,560 --> 00:28:57,000 Speaker 1: Tudor Dixon Podcast. As always, for this episode and others, 586 00:28:57,040 --> 00:29:00,360 Speaker 1: go to tudordixonpodcast dot com. You can subscribe there, 587 00:29:00,760 --> 00:29:04,280 Speaker 1: or go to the iHeartRadio app, Apple Podcasts, or wherever 588 00:29:04,360 --> 00:29:06,840 Speaker 1: you get your podcasts, and join us next time on 589 00:29:06,960 --> 00:29:09,840 Speaker 1: the Tudor Dixon Podcast. Have an awesome day.