1 00:00:02,080 --> 00:00:04,560 Speaker 1: Just about eight oh five here at fifty five KRC, the talk station. 2 00:00:04,720 --> 00:00:08,399 Speaker 1: Happy Tuesday, Kevin Gordon filling in for me tomorrow. I'm 3 00:00:08,440 --> 00:00:10,520 Speaker 1: taking the rest of the week off in honor of Thanksgiving, 4 00:00:10,560 --> 00:00:13,119 Speaker 1: and as I always start this time of week and 5 00:00:13,160 --> 00:00:15,960 Speaker 1: this moment in time on the fifty five KRC Morning Show, bookmark it, 6 00:00:15,960 --> 00:00:18,400 Speaker 1: Breitbart dot com, b R E I T B A 7 00:00:18,600 --> 00:00:21,520 Speaker 1: r T dot com, it's time for the inside scoop 8 00:00:21,560 --> 00:00:24,320 Speaker 1: at Breitbart News. The return, and perfect timing. Welcome 9 00:00:24,360 --> 00:00:27,159 Speaker 1: back tech editor Colin Maydin. It's always great talking with you. 10 00:00:28,680 --> 00:00:29,880 Speaker 2: Hey, Brian, how are you today? 11 00:00:29,960 --> 00:00:32,000 Speaker 1: I'm doing great. And you know, when I saw 12 00:00:32,040 --> 00:00:33,440 Speaker 1: you were going to be on the program this morning, 13 00:00:33,479 --> 00:00:37,199 Speaker 1: I was so happy because I just was beaming. And 14 00:00:37,240 --> 00:00:39,640 Speaker 1: in fact, Breitbart's where I found out about it. Elon 15 00:00:39,760 --> 00:00:43,440 Speaker 1: Musk has now labeled account locations on his platform 16 00:00:43,800 --> 00:00:46,680 Speaker 1: X, so you know for a fact, no, you're not dealing 17 00:00:46,680 --> 00:00:49,479 Speaker 1: with some guy in Texas screaming about MAGA or dividing 18 00:00:49,520 --> 00:00:51,440 Speaker 1: and stirring the pot of dissent in this country. It's 19 00:00:51,479 --> 00:00:55,280 Speaker 1: actually somebody in Bangladesh or elsewhere. I want this feature 20 00:00:55,600 --> 00:00:58,720 Speaker 1: on every social media platform. Colin Maydin, explain this to 21 00:00:58,760 --> 00:00:59,440 Speaker 1: my listeners.
22 00:01:01,080 --> 00:01:03,400 Speaker 3: Well, Brian, yeah, this is a doozy, you know, and 23 00:01:03,440 --> 00:01:07,360 Speaker 3: I actually thought of you first. You're 24 00:01:07,400 --> 00:01:09,720 Speaker 3: the kind of guy who can, you know, 25 00:01:09,800 --> 00:01:13,840 Speaker 3: pull this apart for the audience. So you know, we've 26 00:01:13,880 --> 00:01:16,480 Speaker 3: always had a problem. 27 00:01:16,480 --> 00:01:17,960 Speaker 2: With foreign influence. 28 00:01:18,120 --> 00:01:21,440 Speaker 3: Yes, of course, the two sides of the country kind 29 00:01:21,440 --> 00:01:27,200 Speaker 3: of disagree who's influencing what, right. So you know, people 30 00:01:27,240 --> 00:01:32,760 Speaker 3: had been begging Elon and his team at Twitter, now 31 00:01:32,800 --> 00:01:36,480 Speaker 3: known as X, to have this feature, saying, what country 32 00:01:36,520 --> 00:01:39,680 Speaker 3: are these people from that are posting? 33 00:01:40,000 --> 00:01:40,200 Speaker 2: Right? 34 00:01:40,760 --> 00:01:43,680 Speaker 3: They finally added it. They had it live for a 35 00:01:43,760 --> 00:01:46,160 Speaker 3: couple hours and turned it off because there were problems, 36 00:01:46,160 --> 00:01:49,800 Speaker 3: but then it relaunched over the weekend and, you know, 37 00:01:49,960 --> 00:01:53,640 Speaker 3: suddenly all these eyes were opened. Because we have a 38 00:01:53,720 --> 00:01:58,640 Speaker 3: lot of accounts, especially in the political area, where they 39 00:01:58,640 --> 00:02:01,440 Speaker 3: are anonymous accounts. You know, they don't have a real 40 00:02:01,560 --> 00:02:04,920 Speaker 3: person's name on there that is a known person, you know, 41 00:02:05,000 --> 00:02:08,480 Speaker 3: like a Brian Thomas account is. It's, you know, 42 00:02:08,680 --> 00:02:10,240 Speaker 3: American Patriot. 43 00:02:09,840 --> 00:02:11,400 Speaker 2: Who the heck is that? Right?
44 00:02:12,600 --> 00:02:15,000 Speaker 3: A lot of those accounts, to your point, are in 45 00:02:15,080 --> 00:02:20,680 Speaker 3: places ranging from Romania to, you know, Australia in some cases, 46 00:02:21,080 --> 00:02:25,560 Speaker 3: but a lot of them are in India, Pakistan, and Bangladesh. 47 00:02:25,600 --> 00:02:30,400 Speaker 3: So suddenly all these people turned into super sleuths 48 00:02:30,880 --> 00:02:34,800 Speaker 3: and they're looking at what these accounts are posting. Now, 49 00:02:35,440 --> 00:02:39,560 Speaker 3: the leftist response to this is, haha, conservatism is fake, 50 00:02:39,680 --> 00:02:42,919 Speaker 3: MAGA is fake, because all these accounts are fake. But 51 00:02:43,400 --> 00:02:47,080 Speaker 3: that's not true when you look at it. The political 52 00:02:47,080 --> 00:02:51,919 Speaker 3: accounts based in these foreign countries are all posting division. 53 00:02:51,960 --> 00:02:56,160 Speaker 2: It's all been aimed at the conservative movement for years. 54 00:02:56,160 --> 00:02:58,320 Speaker 1: And I'm so glad. I was so excited about this article. 55 00:02:58,360 --> 00:02:59,880 Speaker 1: I'm glad you thought of me, because I've been saying 56 00:02:59,919 --> 00:03:01,720 Speaker 1: that a lot on the program for years without, 57 00:03:01,760 --> 00:03:05,800 Speaker 1: you know, support for what I was concluding. But social media, 58 00:03:05,840 --> 00:03:08,000 Speaker 1: it's so easy if you're a foreign actor and you 59 00:03:08,040 --> 00:03:10,360 Speaker 1: want to screw with America. And America is, you know, 60 00:03:10,720 --> 00:03:13,440 Speaker 1: embracing the umbrella of freedom, that's one thing at 61 00:03:13,520 --> 00:03:16,600 Speaker 1: least you can unite under. Well, we need to disable that, 62 00:03:16,680 --> 00:03:19,399 Speaker 1: we need to interfere with that. Let us choose any 63 00:03:19,520 --> 00:03:22,160 Speaker 1: given subject matter and stir the pot of division.
And 64 00:03:22,200 --> 00:03:25,040 Speaker 1: we can do that by remote control, from anywhere on 65 00:03:25,160 --> 00:03:28,000 Speaker 1: the globe. And this confirms what I've been saying. You're 66 00:03:28,000 --> 00:03:30,000 Speaker 1: listening to a bunch of foreign actors who want us 67 00:03:30,040 --> 00:03:32,720 Speaker 1: to fight amongst ourselves over literally anything, Colin. 68 00:03:32,800 --> 00:03:36,640 Speaker 3: Oh, absolutely true. Now there's, you know, there's a 69 00:03:36,640 --> 00:03:40,400 Speaker 3: couple of important points here. Firstly, the concept of an 70 00:03:40,440 --> 00:03:45,240 Speaker 3: anonymous account is not by itself evil at all. We've 71 00:03:45,240 --> 00:03:49,040 Speaker 3: gone through more than a decade of social media cancel culture, 72 00:03:49,280 --> 00:03:51,400 Speaker 3: where if you're a conservative and you dare to share 73 00:03:51,440 --> 00:03:53,760 Speaker 3: your opinion, they're going to try to get you fired. 74 00:03:54,040 --> 00:03:56,520 Speaker 3: They're going to try to get you, you know, kicked 75 00:03:56,560 --> 00:03:59,920 Speaker 3: out of polite society. So, you know, that's okay. 76 00:04:00,320 --> 00:04:07,560 Speaker 3: But what this has revealed is very sophisticated actions by 77 00:04:07,680 --> 00:04:11,480 Speaker 3: people to portray typical Americans. To your point, you know, 78 00:04:11,560 --> 00:04:14,800 Speaker 3: you have an account that portrays itself as, 79 00:04:14,840 --> 00:04:19,200 Speaker 3: you know, a gun-slinging Texan, likes barbecue 80 00:04:19,640 --> 00:04:22,560 Speaker 3: for lunch, and, you know, drives a jacked-up pickup truck. 81 00:04:23,000 --> 00:04:25,640 Speaker 3: It's a dude in Pakistan that probably works for the 82 00:04:25,680 --> 00:04:32,520 Speaker 3: Pakistani government. So, you know, big problem.
And you know, 83 00:04:33,480 --> 00:04:38,120 Speaker 3: Twitter already had problems with foreigners, because, you know, 84 00:04:38,680 --> 00:04:41,159 Speaker 3: one of the big changes 85 00:04:41,240 --> 00:04:43,880 Speaker 3: under Elon was they launched a program where you can 86 00:04:43,960 --> 00:04:47,000 Speaker 3: make money as a verified account. You know, if you're 87 00:04:47,080 --> 00:04:49,719 Speaker 3: driving traffic, if you're doing big numbers, you can make 88 00:04:49,760 --> 00:04:55,600 Speaker 3: money, like ad revenue. And India, Pakistan, accounts based in these places 89 00:04:55,600 --> 00:04:59,719 Speaker 3: were completely gaming that system. So your real influencers sitting 90 00:04:59,720 --> 00:05:03,200 Speaker 3: in the US were barely making any money, because the 91 00:05:03,240 --> 00:05:07,680 Speaker 3: system was being completely, you know, tricked by folks overseas, 92 00:05:07,839 --> 00:05:10,680 Speaker 3: and he's been trying to straighten that out. Now it's 93 00:05:10,720 --> 00:05:14,440 Speaker 3: taken a much, much deeper and frankly more sinister turn, 94 00:05:14,560 --> 00:05:18,440 Speaker 3: because, you know, exactly to your point, you and others 95 00:05:18,560 --> 00:05:21,360 Speaker 3: were saying, these aren't Americans, these are people trying to 96 00:05:21,400 --> 00:05:23,480 Speaker 3: screw with us. Now we know that's true. 97 00:05:23,920 --> 00:05:26,760 Speaker 1: Yes, indeed. And you know, I love that he anticipated 98 00:05:26,760 --> 00:05:28,960 Speaker 1: the VPN workaround. You know, I'm a big fan of 99 00:05:28,960 --> 00:05:32,000 Speaker 1: VPNs even, for example, and I use this as an illustration.
100 00:05:32,360 --> 00:05:34,880 Speaker 1: You got a teenager in a state which doesn't allow 101 00:05:34,960 --> 00:05:37,720 Speaker 1: teens on pornography sites. In that state, all you need to 102 00:05:37,720 --> 00:05:39,440 Speaker 1: do is get on a VPN and pretend like you're 103 00:05:39,480 --> 00:05:41,880 Speaker 1: in some other state or, as the case may be, some 104 00:05:41,960 --> 00:05:44,440 Speaker 1: other country. That's where your IP address shows up. That 105 00:05:44,480 --> 00:05:48,599 Speaker 1: defeats the whole ID check thing, because that other state, 106 00:05:48,680 --> 00:05:52,280 Speaker 1: like Illinois, doesn't have the ID verification. So that's a workaround. 107 00:05:52,440 --> 00:05:55,120 Speaker 1: Same thing goes here. You could be a foreign actor 108 00:05:55,279 --> 00:05:57,120 Speaker 1: saying that you're in the United States, but if you 109 00:05:57,200 --> 00:06:00,360 Speaker 1: use a VPN from Bangladesh, it might look like you're 110 00:06:00,360 --> 00:06:03,640 Speaker 1: in Chicago or Saint Louis or Cincinnati. That's what the 111 00:06:03,720 --> 00:06:07,080 Speaker 1: VPN will show. But Elon Musk anticipated that, so he's 112 00:06:07,120 --> 00:06:10,000 Speaker 1: got the little exclamation point feature that's going along with 113 00:06:10,080 --> 00:06:12,039 Speaker 1: that, exactly. 114 00:06:12,160 --> 00:06:14,640 Speaker 3: So, you know, for anyone that's not familiar with 115 00:06:14,680 --> 00:06:16,760 Speaker 3: how this works, you can look at an account and 116 00:06:16,760 --> 00:06:19,600 Speaker 3: click on the date that it says they joined the platform. 117 00:06:20,160 --> 00:06:23,600 Speaker 3: It's going to give you all the additional information, including 118 00:06:23,800 --> 00:06:26,520 Speaker 3: the country they're based in.
And to your point, Brian, 119 00:06:27,000 --> 00:06:31,000 Speaker 3: you know, yes, anyone could VPN and say, hi, I 120 00:06:31,040 --> 00:06:35,120 Speaker 3: am an American, you know, but with that exclamation point, 121 00:06:35,200 --> 00:06:36,440 Speaker 3: you know they're on a VPN. 122 00:06:36,560 --> 00:06:37,960 Speaker 2: You don't know where they're 123 00:06:37,640 --> 00:06:42,680 Speaker 3: based, but, you know, if they're saying American Patriot, or, 124 00:06:43,240 --> 00:06:46,800 Speaker 3: my favorite of all these accounts is Republicans Against Trump, 125 00:06:47,200 --> 00:06:51,400 Speaker 3: you know, which is not Republicans against Trump, because it's 126 00:06:51,400 --> 00:06:52,040 Speaker 3: not American. 127 00:06:52,680 --> 00:06:52,840 Speaker 1: Uh. 128 00:06:53,400 --> 00:06:56,000 Speaker 3: You know, it's great that you can see, hey, 129 00:06:56,040 --> 00:06:57,880 Speaker 3: I don't know where they are, but they're not here, 130 00:06:57,920 --> 00:07:00,159 Speaker 3: and that's enough information for us. 131 00:07:00,560 --> 00:07:03,520 Speaker 1: Well, it provides you an opportunity to exercise some logic 132 00:07:03,560 --> 00:07:06,800 Speaker 1: and reason. I wonder why they're using a VPN, and 133 00:07:06,839 --> 00:07:09,320 Speaker 1: the answer to the obvious question 134 00:07:09,400 --> 00:07:11,760 Speaker 1: is because they're someplace else. Just, you know, just 135 00:07:11,840 --> 00:07:13,800 Speaker 1: put two and two together. It may not 136 00:07:13,840 --> 00:07:16,920 Speaker 1: necessarily be true, but in connection with 137 00:07:17,000 --> 00:07:20,520 Speaker 1: these divisive posts related to American politics and other topics, 138 00:07:20,920 --> 00:07:23,560 Speaker 1: it more than likely is a foreign actor.
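The mechanics the two are describing, an IP lookup that resolves a country, and an extra flag when the connection comes through a known VPN range, can be sketched in a few lines. This is a purely illustrative toy, not X's actual implementation: the lookup table, the VPN prefix list, and the `locate` function are all hypothetical, and the addresses come from reserved documentation ranges.

```python
# Toy "GeoIP" table mapping IP prefixes to countries. Real platforms use
# commercial GeoIP databases; these prefixes are RFC 5737 documentation
# ranges, assigned meanings here only for illustration.
GEOIP = {
    "203.0.113.": "Bangladesh",      # pretend this is the user's real ISP
    "198.51.100.": "United States",  # pretend this is a US VPN exit range
}

# Ranges known to belong to VPN providers (hypothetical).
KNOWN_VPN_PREFIXES = {"198.51.100."}


def locate(ip: str) -> dict:
    """Return the country seen for this IP, plus a VPN flag.

    A platform only ever sees the connecting IP. If the user tunnels
    through a VPN, that is the exit node's IP, so the country resolves
    to the VPN's location, not the user's. Flagging known VPN ranges
    (the "exclamation point" idea) signals the location can't be trusted.
    """
    for prefix, country in GEOIP.items():
        if ip.startswith(prefix):
            return {"country": country, "vpn": prefix in KNOWN_VPN_PREFIXES}
    return {"country": "unknown", "vpn": False}


direct = locate("203.0.113.7")      # connecting directly: real country shows
tunneled = locate("198.51.100.42")  # same user through a US VPN exit
```

The design point the flag illustrates: the tunneled lookup reports the VPN's country, so the label alone would mislead; the extra flag only says "this location can't be trusted," which is exactly the two-and-two the hosts describe putting together.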
And I 139 00:07:23,640 --> 00:07:26,600 Speaker 1: was surprised in the reporting on this that China 140 00:07:26,640 --> 00:07:28,560 Speaker 1: didn't come up more often, because, I mean, you had 141 00:07:28,560 --> 00:07:34,280 Speaker 1: mentioned Bangladesh, Congo's even mentioned, and India, Pakistan, but no 142 00:07:34,960 --> 00:07:37,440 Speaker 1: specific mention of China, who I kind of figured would 143 00:07:37,440 --> 00:07:40,520 Speaker 1: be the biggest foreign actor involved in this activity. 144 00:07:40,560 --> 00:07:42,240 Speaker 2: It's a method of warfare. 145 00:07:43,240 --> 00:07:47,800 Speaker 3: It is, you know. And for our Democrat friends, I'm 146 00:07:47,800 --> 00:07:50,720 Speaker 3: sure you have some Democrat listeners, yes, Russia wasn't in 147 00:07:50,720 --> 00:07:56,760 Speaker 3: there either. So, you know, we could talk all day 148 00:07:56,760 --> 00:07:57,400 Speaker 3: about that, Brian. 149 00:07:57,480 --> 00:07:58,520 Speaker 2: There's a lot going on. 150 00:07:58,680 --> 00:08:05,760 Speaker 3: For one thing, China may not be as concerned about 151 00:08:05,960 --> 00:08:09,520 Speaker 3: what happens on Twitter, because they have much larger scale 152 00:08:10,000 --> 00:08:14,520 Speaker 3: influence operations. For example, TikTok, that is clearly a Chinese 153 00:08:14,800 --> 00:08:22,000 Speaker 3: psyop, right. In other cases, probably some of these people, 154 00:08:22,360 --> 00:08:25,800 Speaker 3: if you're talking about the true, you know, nation-state 155 00:08:25,920 --> 00:08:31,080 Speaker 3: security practice level, they might be hiding out in India.
156 00:08:31,120 --> 00:08:34,160 Speaker 3: Because one of the very interesting things that we're going 157 00:08:34,200 --> 00:08:37,080 Speaker 3: to be following up and researching on with this is, 158 00:08:37,120 --> 00:08:43,960 Speaker 3: typically, when you're dealing with scammers and basically financially motivated operators, 159 00:08:44,400 --> 00:08:47,280 Speaker 3: you know, people who are not spies but rather doing 160 00:08:47,320 --> 00:08:49,840 Speaker 3: stuff for money, which is some of what's going on here, 161 00:08:50,800 --> 00:08:54,800 Speaker 3: the English is bad. You know, 162 00:08:55,440 --> 00:08:59,360 Speaker 3: we get scam emails all the time that are poorly 163 00:08:59,360 --> 00:09:02,680 Speaker 3: written in English, right, because it's not 164 00:09:02,720 --> 00:09:06,360 Speaker 3: English-as-a-first-language speakers who are writing those. These accounts, 165 00:09:06,559 --> 00:09:09,640 Speaker 3: you know, the Texans sound like Texans, right. That is 166 00:09:09,640 --> 00:09:13,679 Speaker 3: a level of sophistication that is not there for normal scammers. 167 00:09:14,000 --> 00:09:16,720 Speaker 1: But it will be, if it isn't here already, Colin. 168 00:09:16,920 --> 00:09:20,680 Speaker 1: In a moment's time, it will be, with artificial intelligence, right? 169 00:09:20,720 --> 00:09:26,040 Speaker 1: I mean, with a statement in, you know, proper English, 170 00:09:26,120 --> 00:09:28,120 Speaker 1: you know, with a Texas flair even, you can get 171 00:09:28,160 --> 00:09:30,400 Speaker 1: AI to do stuff like that. It'll tailor a message 172 00:09:30,400 --> 00:09:32,960 Speaker 1: for you, it'll actually be grammatically correct, or maybe even 173 00:09:33,000 --> 00:09:35,360 Speaker 1: come out in a very colloquial way that'll make it 174 00:09:35,400 --> 00:09:36,520 Speaker 1: look convincing. 175 00:09:37,360 --> 00:09:38,520 Speaker 2: Very true. 176 00:09:38,640 --> 00:09:41,080 Speaker 3: You know.
So when I look at things, when I 177 00:09:41,120 --> 00:09:43,040 Speaker 3: look at these fake accounts, when I see them coming 178 00:09:43,080 --> 00:09:47,520 Speaker 3: out of Europe, like Romania, I'm thinking to myself, this 179 00:09:47,600 --> 00:09:50,520 Speaker 3: is the EU trying to mess with us, right. When 180 00:09:50,559 --> 00:09:54,040 Speaker 3: I see them coming out of India, I'm thinking, that's 181 00:09:54,120 --> 00:09:58,520 Speaker 3: more people trying to make bucks, because there's 182 00:09:58,559 --> 00:10:01,120 Speaker 3: no better way to make money on social media than to 183 00:10:01,559 --> 00:10:04,679 Speaker 3: divide people, enrage people. This is kind of the Howard 184 00:10:04,760 --> 00:10:08,960 Speaker 3: Stern model of radio applied to social media. 185 00:10:10,200 --> 00:10:13,440 Speaker 3: But clearly there's much more than that going on. And 186 00:10:13,520 --> 00:10:15,600 Speaker 3: I think we're going to see this unfold over time. 187 00:10:15,760 --> 00:10:16,320 Speaker 2: No question. 188 00:10:16,520 --> 00:10:19,400 Speaker 1: And further to my point on China, maybe it isn't 189 00:10:19,440 --> 00:10:22,280 Speaker 1: so much that, and point well taken on them 190 00:10:22,400 --> 00:10:26,000 Speaker 1: being, you know, in other countries. Obviously they're expanding their 191 00:10:26,000 --> 00:10:28,520 Speaker 1: global footprint, China is, so they could operate these bot 192 00:10:28,559 --> 00:10:32,920 Speaker 1: farms elsewhere. But they are perpetuating the 193 00:10:33,040 --> 00:10:37,680 Speaker 1: we're-all-killing-ourselves-with-the-climate, CO2 eradication messaging, 194 00:10:37,679 --> 00:10:40,120 Speaker 1: because that is in their financial best interest.
In other words, 195 00:10:40,160 --> 00:10:43,120 Speaker 1: it's a glorified global marketing campaign to get us to 196 00:10:43,160 --> 00:10:45,640 Speaker 1: be stupid and buy windmills and solar panels and cut 197 00:10:45,640 --> 00:10:47,920 Speaker 1: our own throats in terms of energy generation, Colin. 198 00:10:49,400 --> 00:10:53,520 Speaker 3: Yeah, Brian, keep in mind, the Chinese mindset is they'll 199 00:10:53,520 --> 00:10:55,480 Speaker 3: hang us and we'll sell them the rope. 200 00:10:55,720 --> 00:10:56,520 Speaker 2: That's what they want. 201 00:10:56,800 --> 00:11:00,000 Speaker 1: Yeah, they used to assassinate their 202 00:11:00,040 --> 00:11:02,480 Speaker 1: political opponents and then bill the family for the bullet. 203 00:11:02,559 --> 00:11:04,480 Speaker 1: You may just remember those days, back in the, you know, 204 00:11:04,480 --> 00:11:05,640 Speaker 1: shoot them in the back of the head and then 205 00:11:05,720 --> 00:11:06,679 Speaker 1: charge the family 206 00:11:06,360 --> 00:11:06,880 Speaker 2: For the bullet. 207 00:11:07,440 --> 00:11:09,560 Speaker 1: That's China, and they're doing that to us each and 208 00:11:09,559 --> 00:11:11,920 Speaker 1: every day. Colin Maydin, tech editor with Breitbart, b 209 00:11:12,080 --> 00:11:14,400 Speaker 1: R E I T B A R T, Breitbart dot com. 210 00:11:14,559 --> 00:11:18,840 Speaker 1: I was totally blown away and really disturbed by a 211 00:11:18,840 --> 00:11:21,280 Speaker 1: Wall Street Journal article I read just yesterday. Teens 212 00:11:21,280 --> 00:11:24,720 Speaker 1: are saying tearful goodbyes to their AI companions. Some really 213 00:11:24,840 --> 00:11:28,439 Speaker 1: unbelievable problems these young people are having getting involved with 214 00:11:28,559 --> 00:11:31,880 Speaker 1: artificial intelligence.
And they just have this perception that these 215 00:11:32,000 --> 00:11:35,720 Speaker 1: fake characters that they create and interact with conversationally, and 216 00:11:35,720 --> 00:11:40,640 Speaker 1: literally spend hours and hours online with, they perceive 217 00:11:40,720 --> 00:11:42,840 Speaker 1: them to be almost real. So when you say, sorry, 218 00:11:43,120 --> 00:11:44,840 Speaker 1: young people, we're not going to let you have access 219 00:11:44,880 --> 00:11:47,880 Speaker 1: to this anymore, they break down emotionally. Oh my god, 220 00:11:47,920 --> 00:11:49,880 Speaker 1: I'm not gonna... they're not real people. I mean, 221 00:11:49,920 --> 00:11:51,400 Speaker 1: where is that message getting out, 222 00:11:51,440 --> 00:11:51,720 Speaker 2: Colin? 223 00:11:52,200 --> 00:11:55,400 Speaker 1: Sadly, it appears that Meta also was painfully aware of this. 224 00:11:55,440 --> 00:11:57,839 Speaker 1: But rather than look further into the details of 225 00:11:57,880 --> 00:12:00,000 Speaker 1: the harm social media does to people, they just shut 226 00:12:00,160 --> 00:12:01,440 Speaker 1: down their research on it. 227 00:12:02,960 --> 00:12:05,839 Speaker 3: Yeah, that's, uh, you know, that's a fun story, Brian. 228 00:12:05,880 --> 00:12:07,840 Speaker 3: I want to correct one thing you said. The 229 00:12:07,880 --> 00:12:11,040 Speaker 3: young people don't think the AIs are almost real. There's 230 00:12:11,080 --> 00:12:16,760 Speaker 3: no almost. They believe they're real, in that generation, right? 231 00:12:17,720 --> 00:12:20,880 Speaker 3: What you're seeing here, Brian, is a perfect illustration of 232 00:12:20,920 --> 00:12:24,480 Speaker 3: the economics concept of opportunity cost. Opportunity cost is, if 233 00:12:24,520 --> 00:12:26,760 Speaker 3: you're doing one thing, you can't do another.
So our 234 00:12:26,840 --> 00:12:30,480 Speaker 3: young people are falling in love with AI, you know, 235 00:12:30,640 --> 00:12:32,040 Speaker 3: warning it when they lose it. 236 00:12:32,440 --> 00:12:33,400 Speaker 2: What are they not doing. 237 00:12:33,480 --> 00:12:37,440 Speaker 3: They're not meeting that real human right, They're not forming 238 00:12:37,520 --> 00:12:41,000 Speaker 3: relationships that are going to make the next generation of Americans. 239 00:12:41,040 --> 00:12:44,240 Speaker 3: We're going to have a population crisis light China if 240 00:12:44,240 --> 00:12:47,640 Speaker 3: we don't have people fall in love and get married 241 00:12:47,679 --> 00:12:51,680 Speaker 3: and so forth. Right, and so you know, the Meta 242 00:12:51,840 --> 00:12:54,720 Speaker 3: story you just mentioned is very insidious because what's happening 243 00:12:54,760 --> 00:12:59,600 Speaker 3: there is as they're internal researchers who I'm actually starting 244 00:12:59,600 --> 00:13:02,720 Speaker 3: to admit because they're very sharp and they're they're reaching 245 00:13:02,720 --> 00:13:06,440 Speaker 3: the right conclusions, they just get turned off. So you know, 246 00:13:06,800 --> 00:13:11,480 Speaker 3: we have these internal researchers talking to each other saying, gosh, 247 00:13:11,559 --> 00:13:15,480 Speaker 3: this is a drug and we're pushers, and you know, 248 00:13:15,600 --> 00:13:18,160 Speaker 3: they're realizing the ugly truth. It's kind of like there's 249 00:13:18,160 --> 00:13:20,760 Speaker 3: a meme of the Nazi saying, are we the bad guys? 250 00:13:21,280 --> 00:13:24,319 Speaker 3: That's that's what's happening in Silicon Valley. But they just 251 00:13:24,440 --> 00:13:27,240 Speaker 3: keep pushing forward and not slowing down. 252 00:13:27,440 --> 00:13:28,960 Speaker 2: Well, here's someone's got to slow them down. 
253 00:13:29,080 --> 00:13:32,520 Speaker 1: Illustrative quote exactly on that point, Doctor Nina Vasan, director at 254 00:13:32,520 --> 00:13:36,960 Speaker 1: Stanford Medicine's Brainstorm Lab for Mental Health Innovation. Quote, the 255 00:13:37,040 --> 00:13:41,079 Speaker 1: difficulty logging off, meaning interaction with the AI chatbots, the 256 00:13:41,120 --> 00:13:43,559 Speaker 1: difficulty in logging off doesn't mean something is wrong with 257 00:13:43,640 --> 00:13:47,719 Speaker 1: the teen. It means the tech worked exactly as designed. 258 00:13:48,559 --> 00:13:52,160 Speaker 1: That, I think, just, like, hammers the point home. Yes, 259 00:13:52,240 --> 00:13:55,240 Speaker 1: they want your children to spend all of their hours 260 00:13:55,280 --> 00:13:57,520 Speaker 1: engaging with artificial intelligence. 261 00:13:59,559 --> 00:14:03,760 Speaker 3: The strong comparison that I have yet to see break down, 262 00:14:03,960 --> 00:14:12,760 Speaker 2: Brian, is Big Tobacco. Right, right. But 263 00:14:12,920 --> 00:14:15,040 Speaker 2: you know, here's why I mentioned Big Tobacco. 264 00:14:15,280 --> 00:14:19,080 Speaker 3: What came out about Big Tobacco eventually was all of 265 00:14:19,120 --> 00:14:22,520 Speaker 3: their advertising, from the Marlboro Man to Joe Camel, was 266 00:14:22,520 --> 00:14:25,280 Speaker 3: aimed at fourteen year olds. Because if you start a 267 00:14:25,520 --> 00:14:29,480 Speaker 3: fourteen year old smoking, they'll smoke for life. That's the 268 00:14:29,480 --> 00:14:33,080 Speaker 3: Big Tech model. Big Tech desperately wants younger and younger 269 00:14:33,120 --> 00:14:36,680 Speaker 3: people, because if they get you, you're going to stick 270 00:14:36,720 --> 00:14:40,920 Speaker 3: with them and you can't stop.
And you know, we, 271 00:14:41,080 --> 00:14:44,520 Speaker 3: the legal system in a sense, and our 272 00:14:44,520 --> 00:14:47,760 Speaker 3: politicians, have to catch up with that cutting edge of 273 00:14:47,760 --> 00:14:51,520 Speaker 3: what's happening with kids, because even millennials struggle with this, 274 00:14:51,720 --> 00:14:53,880 Speaker 3: because it didn't happen to them when they were ten. 275 00:14:54,280 --> 00:14:58,320 Speaker 3: It happened to them when they were, you know, teenagers 276 00:14:58,440 --> 00:14:59,520 Speaker 3: reaching twenty years old. 277 00:14:59,560 --> 00:15:00,920 Speaker 2: Suddenly on social media. 278 00:15:01,360 --> 00:15:04,000 Speaker 3: They weren't ten year olds on social media talking to 279 00:15:04,040 --> 00:15:05,880 Speaker 3: fake accounts from India, exactly. 280 00:15:06,240 --> 00:15:08,040 Speaker 1: And Colin, the other point made about tobacco companies, 281 00:15:08,040 --> 00:15:10,280 Speaker 1: they had their own internal research which confirmed that, yeah, 282 00:15:10,320 --> 00:15:13,200 Speaker 1: tobacco is bad for you, but they hid that evidence, 283 00:15:13,240 --> 00:15:15,040 Speaker 1: which sounds to me along the lines of what Meta is 284 00:15:15,080 --> 00:15:18,600 Speaker 1: doing. So, terrible stuff. It just keeps getting worse every day, 285 00:15:18,640 --> 00:15:23,560 Speaker 1: I suppose. I mean, is there a legislative solution? I mean, 286 00:15:23,560 --> 00:15:26,120 Speaker 1: I was screaming early in the program about TikTok. I 287 00:15:26,160 --> 00:15:28,680 Speaker 1: want this X feature on TikTok. Not that the Chinese 288 00:15:28,720 --> 00:15:31,479 Speaker 1: are going to provide it, but is there some mechanism, 289 00:15:31,640 --> 00:15:34,720 Speaker 1: some simple solution, or even complex solution, that helps save 290 00:15:34,760 --> 00:15:38,360 Speaker 1: our young people from this disaster?
Or do we just 291 00:15:38,440 --> 00:15:41,360 Speaker 1: need to take our kids off the damn apps? 292 00:15:42,320 --> 00:15:45,400 Speaker 3: I have several answers to that, and I'll keep them, 293 00:15:45,760 --> 00:15:48,000 Speaker 3: keep them brief. You know, eventually we're going to need 294 00:15:48,040 --> 00:15:49,360 Speaker 3: some legislative solutions. 295 00:15:49,440 --> 00:15:49,640 Speaker 2: Right. 296 00:15:50,320 --> 00:15:53,200 Speaker 3: What I firmly believe in my heart is that, when 297 00:15:53,240 --> 00:15:56,200 Speaker 3: I watch, I've been watching Kojak from the seventies, right, 298 00:15:57,240 --> 00:16:00,760 Speaker 3: and it's a little shocking, because every scene, you know, 299 00:16:01,120 --> 00:16:05,240 Speaker 3: Telly Savalas and others are smoking everywhere, in hospitals, 300 00:16:04,840 --> 00:16:05,680 Speaker 2: Police stations. 301 00:16:06,160 --> 00:16:09,320 Speaker 3: We look at that, you know, I'm fifty, I look 302 00:16:09,320 --> 00:16:12,720 Speaker 3: at that, and I'm a little shocked, right. I believe 303 00:16:12,880 --> 00:16:15,200 Speaker 3: in twenty or thirty years, people are going to watch 304 00:16:15,200 --> 00:16:18,320 Speaker 3: our TV and movies of teenagers staring at phones, and 305 00:16:18,320 --> 00:16:21,040 Speaker 3: they're going to be shocked. How on earth did these 306 00:16:21,080 --> 00:16:25,200 Speaker 3: people let these kids ruin their lives with AI companions 307 00:16:25,200 --> 00:16:25,680 Speaker 3: on phones? 308 00:16:25,760 --> 00:16:25,960 Speaker 2: Right? 309 00:16:26,200 --> 00:16:29,320 Speaker 1: Yeah, as we crawl from the wreckage of 310 00:16:29,360 --> 00:16:32,200 Speaker 1: allowing them to do that for so many years. So yeah, 311 00:16:32,240 --> 00:16:33,920 Speaker 1: it's like looking back at the Great Depression. 312 00:16:34,640 --> 00:16:34,960 Speaker 2: Wow. 313 00:16:35,160 --> 00:16:38,280 Speaker 3: Oh right. So how do we get there?
That's the trick. 314 00:16:38,520 --> 00:16:40,920 Speaker 3: That's a tricky question. But you know, you and I 315 00:16:40,960 --> 00:16:44,920 Speaker 3: have talked before. It really comes down to parents. Parents, 316 00:16:45,160 --> 00:16:49,160 Speaker 3: right now, you have to be all over your kids, 317 00:16:49,200 --> 00:16:52,560 Speaker 3: especially when it comes to AI, because AI will ruin 318 00:16:52,600 --> 00:16:55,720 Speaker 3: them in a heartbeat, whether it's cheating on schoolwork so 319 00:16:55,800 --> 00:16:59,480 Speaker 3: they don't learn anything, or, you know, getting into sort 320 00:16:59,480 --> 00:17:02,680 Speaker 3: of this romantic entanglement. Or, for those, you know, 321 00:17:02,760 --> 00:17:07,400 Speaker 3: for people listening, if your child, grandchild, whoever, a loved 322 00:17:07,440 --> 00:17:10,920 Speaker 3: one, has any sort of, you know, 323 00:17:10,920 --> 00:17:13,880 Speaker 2: Mental issue they've dealt with in the past, AI can 324 00:17:13,920 --> 00:17:15,480 Speaker 2: make that a thousand 325 00:17:15,000 --> 00:17:18,560 Speaker 3: times worse in about a week. You have to 326 00:17:18,600 --> 00:17:19,600 Speaker 3: take responsibility. 327 00:17:19,720 --> 00:17:21,199 Speaker 1: You have to say no, they're not 328 00:17:21,240 --> 00:17:23,840 Speaker 1: your friends. For your children, help them out, stop it. 329 00:17:24,240 --> 00:17:27,119 Speaker 1: Colin Maydin, tech editor, Breitbart, thank you, Colin, 330 00:17:27,160 --> 00:17:29,600 Speaker 1: for joining the program. And on behalf of my listening audience 331 00:17:29,640 --> 00:17:31,840 Speaker 1: and my family, to you and everybody at Breitbart, I 332 00:17:31,840 --> 00:17:33,640 Speaker 1: hope you have a wonderful Thanksgiving holiday. 333 00:17:33,680 --> 00:17:36,879 Speaker 2: My friend, same to you and everyone in Cincy. 334 00:17:37,080 --> 00:17:38,960 Speaker 1: Look forward to having you back on real soon.
Colin, 335 00:17:39,040 --> 00:17:40,439 Speaker 1: have a great day. Eight twenty two right now, 336 00:17:40,480 --> 00:17:42,239 Speaker 1: fifty five KRC, the talk station, no cool wit