Buck Sexton: Hey, everybody, welcome to The Buck Sexton Show. Kara Frederick is with us now. She's formerly of Facebook, currently with the Heritage Foundation, where she's Director of Tech Policy. We're going to talk to her about AI, about government snooping on social media platforms, and all kinds of good stuff, or bad stuff, but interesting stuff. Kara, welcome.

Kara Frederick: Thanks for having me.

Buck Sexton: So let's start with Elon Musk. He recently spoke to Tucker Carlson on Fox, we saw it, and he said not only was he flabbergasted by how much access the government had to the communications and just the general goings-on at the social media platform Twitter, and I'm sure that's true of other places, but that DMs were easily accessible by the government too. What did you think of that one?

Kara Frederick: Yeah, so honestly, it's not surprising, especially given the backgrounds that you and I had there. We do know, first and foremost, that there are legitimate reasons, or at least there were legitimate reasons, to work with some federal entities when it comes to things like child sexual abuse material, you know, sniffing out the bad guys. I worked on foreign Islamic terrorism when I was at a big tech platform and before. So there are times when you want the government, which is ostensibly there to protect the security of Americans, to maybe have some sort of access to these internal communications. Unfortunately, over the past two to three years, we've seen an abject dereliction of duty when it comes to, number one, keeping United States citizens safe, and number two, really prioritizing the security of Americans, especially from external hostile forces. Instead, they've been looking inward at the American populace and inflating the definition of domestic extremism and terrorism and whatnot.
So, in my mind, the government has lost our trust, and deservedly so, and they have been misprioritizing when they should be looking at real, actual threats. Then, okay, maybe we could think about some of these surveillance capabilities. But that's all gone, that's been washed away, especially given what Elon exposed in the Twitter Files. And it is very telling, I think. We at the Heritage Foundation published a report in February of 2022, and we said there's a symbiotic relationship between these big tech companies and the government, that it goes as far as collusion, and that any sort of collusion between government actors and these big tech companies to silence the speech of Americans should be prohibited. And people were like, "Oh, come on, you're just a fearmonger. What are you doing?" But we've been proven right. So the fact that the government had full access to DMs on Twitter... again, we knew a long time ago that Twitter DMs were probably compromised, and this just proves it. And frankly, this proves the Heritage Foundation right yet again.

Buck Sexton: So you worked at Facebook. I would assume, and tell me if it's an incorrect assumption, that the very, very cozy and all too close relationship between Twitter pre-Elon and the government, and, as he described it, Twitter being effectively a bloated activist organization masquerading as a social media platform... I have to assume, and you tell me if it's not right, that the politics of Facebook, Google, and all the rest are basically the same, and perhaps even worse.

Kara Frederick: I would say post the 2016 Trump election is when everything became exacerbated. You have a cadre of us who went in before that election. I did, in early 2016. We were there to solve these big problems, like the foreign Islamic terrorism issue, and you had really talented people, a lot of them patriots.
I came right out of a Naval Special Warfare command, where I had been doing counterterrorism analysis as a targeter, and I went right into Facebook sort of thinking that we were going to do the same thing: we were going to protect Americans and their users from this foreign Islamic menace, which was taking the form of ISIS at the time. And then, I would say in 2016, something really changed, and it was Trump's election. They just went hysterical. And the people they started recruiting post-election, a lot of them went in sort of thinking they had a mandate to stifle conservative voices in America, I'd say. Prior to that, there were some interesting data points, I would say, when it came to the way that we did our analysis. And this is no secret now: these companies have been using organizations and NGOs like the Southern Poverty Law Center to actually help them formulate their policies and the way that they treated specific actors on these platforms, and they thought nothing of it. They thought, okay, the SPLC is an honest broker. So I think part of it is ignorance, part of it is the sea that they swim in in the Bay Area, and the other part of it is the 2016 election just really unhinging people and making them come in mission-oriented to stifle the voices of the people they disagreed with.

Buck Sexton: Is it fixable at these places? How would you actually... if you were able to get Zuckerberg and his top lieutenants in a room and they were willing to listen to reason... I mean, I think Facebook has just turned. I'm amazed that there are still apparently as many people using it in America as there are.
I find it to be like an unwieldy trash heap of nonsense. It's really hard to even figure out where anything is anymore, and it all looks like it was made by a bunch of computer engineers, and I don't mean that in a good way. It looks like this kind of cobbled-together thing, and all of the original ease is gone. I think it's turned into a bad user experience and user interface. But anyway, forget about fixing that for a second. If you were just trying to fix the fact that Facebook is not fair to half of the country politically, is it possible, or do we just have to build our own thing? By the way, "build your own Facebook" doesn't sound as crazy anymore. Maybe it's "buy your own Facebook." That's more expensive than Twitter. We'd have to call Elon on that one.

Kara Frederick: Yeah, you know, I will say personally, when I was in the company, I thought Mark Zuckerberg's instincts were more libertarian than leftist. That is sort of the old guard of the builders in Silicon Valley. These are engineers, and they think like engineers, they think like programmers. They want to solve problems and fix things. Unfortunately, I do think he's been there for so long, and he's been, frankly, led astray by others in the C-suite and that other layer of upper management, who are very concerned with PR fires. And clearly, we know that public relations only goes one way at this point, and that's against conservatives. So you look at that and sort of see how he's been co-opted, when his instincts were maybe initially good. I remember a speech he gave in October of 2019 at Georgetown University where he said Facebook had to be the alternative to an authoritarian China, which was propagating its digital platforms here in the US.
You know what happened to that? That gave me a little bit of hope. But he has since spent four hundred million dollars pushing certain Democrats in elections under the auspices of these get-out-the-vote measures, and we know that money went to Democrat activist organizations in blue states that were then getting out the vote only for Democrats. So unfortunately, I think those instincts he previously had have been quashed, and I don't think there's any coming back from it. I do think it is too far gone at this point.

Buck Sexton: Let's come back in a second and talk about TikTok, since you mentioned communist China. We'll get into that in just a second here. But everybody at home, if you haven't tried the Giza Dream Sheets, I'll tell you right now, you're missing out. Mike Lindell's got a lot of amazing products. Kara, do you have Giza Dream Sheets? We're gonna get you hooked up.

Kara Frederick: Planning on it. They're in the cart, they're in the cart.

Buck Sexton: All right, there we go. The Giza Dream Sheets from Mike Lindell. You've got to get them for the whole family. I'm telling you, they're amazing, coming in as low as $29.99 when you go to MyPillow.com and use promo code BUCK. Multiple colors, styles, and sizes. You can upgrade your bedding now. By the way, everybody, I know you think, "Oh, I've already got sheets." Sheets only last you a couple of years. You wash them, they get a little too worn, threadbare, and they're not very comfortable, and they start to look like you need new sheets, because you do. So go to MyPillow.com, promo code BUCK, $29.98 for Giza Dream Sheets. All MyPillow products come with a sixty-day money-back guarantee and a ten-year warranty. So go to MyPillow.com, click on radio listener specials, and use promo code BUCK for the Giza Dream Sheets, under thirty bucks.
Giza Dream Sheets, go check them out today. I sleep on them every night. All right, so TikTok. I gotta tell you, I think that YouTube is much more concerning for American freedom and everything else than TikTok, and I've been saying this; more and more people are saying it now. About a month ago it seemed to be all "Oh, TikTok," and I'm sitting here like, what about the social media companies that are already throwing elections? Anyway, we'll put that aside for a second. How worried are you? Let's just look at TikTok, and I won't do the whataboutism with YouTube, Google, Facebook, or Instagram and all that stuff, although I could. How bad is TikTok really? How big a problem is it?

Kara Frederick: Yeah. I think TikTok falls into three problem buckets, I would say. The first and foremost is the most obvious one: it has ByteDance as its parent company, which is headquartered in Beijing and therefore subject to the laws and policies of the People's Republic of China. One of the laws we like to talk about is the 2017 National Intelligence Law, which effectively compels private companies to do the work of the state. So if the Chinese state, the CCP, the Chinese Communist Party, decides that they want specific data, they want access to this, they want access to that, then by virtue of this law ByteDance has to comply. And this is not my original phrasing, but I think it illustrates the problem pretty well: China doesn't have rule of law, they have rule by law. So if you're the ByteDance CEO, you're kind of powerless to resist at this point, and why would you? When you look at ByteDance, one of the three board members of ByteDance's main domestic subsidiary is a card-carrying Chinese Communist government official.
If you look at some of the good reporting that Forbes has done, they scoured LinkedIn and found three hundred plus profiles of current ByteDance employees with current or former links to a Chinese propaganda arm or to Chinese state media. So you have active and former members of the Chinese Communist Party, particularly in the information realm, working in ByteDance. And there are so many other data points I could talk about, but that's the first one: ByteDance has very close links to, and frankly has been infiltrated by, the CCP. They have an internal committee; a DOJ report in September 2020, coming out of the Trump administration, assessed as much. And we know they are deeply involved in the inner workings of TikTok as well. So that's one thing, and again, just the tip of the iceberg. We can go into just how odious some of those connections are when it comes to American data as well, but I'll table that for now. Then, number two, there's the influence campaign aspect, the manipulation of the information environment. This is something we were dealing with in the intelligence community and especially in the big tech companies, and looking at it now, what we've seen is pro-CCP narratives pushed on these platforms. We've seen an actual Chinese state account come to TikTok and say, how can we push our information? And we've seen information from those accounts pushed and not labeled as state media. So we know they're trying to do the proverbial sowing of discord, such as promoting Democratic candidates in the 2022 midterm election to the detriment of Republican candidates, pushing stories about abortion and incendiary things to help sow dissent among the American population, something everyone always accused the Russians of doing. So that's another aspect, that information environment manipulation.
And then third, you have the kids, what it's doing to children. We know that TikTok in particular, with the For You algorithm, just lights a fire under these social contagions. We know that they're in bed with the transgender lobby, featuring prominent transgender activists and prominent LGBTQ activists all over their websites, raising awareness. They're very, very open about doing this. And there are pediatric hospitals reporting actual physical manifestations in patients because they use TikTok, things called TikTok tics, which most researchers are classifying as pure social contagion movements. And this is, again, something TikTok is very, very efficient at. That doesn't necessarily distinguish it wholesale from the Instagrams of the world, if you want to talk about whataboutism; we know the stats on that in terms of mental health harms. But TikTok is in a different position, especially because, as Christopher Wray, the Director of the FBI, says, China controls the algorithm. That's even more problematic when they're feeding our kids this poison.

Buck Sexton: But how much of the algorithm is just that the kids click on the things they click on, and then it's reflected back to them, right? I ask this because, remember Russia collusion? Back in 2016, they were saying, oh, Hillary Clinton lost because Trump worked with Russia. And then they talked about the Facebook ads. I think there was like a hundred thousand dollars spent on these bogus, Russian-backed or whatever, Facebook ads. And when you look at the ads, a lot of them were ridiculous. It looked like a guy named Yuri, working in sub-basement C of some FSB outpost on the outer ring of Moscow, with a little English dictionary.
I mean, it was preposterous, right? Their understanding of US politics, beyond "lock her up" for Hillary Clinton, which they got right, was very weak. I just hear all these people saying, oh my gosh, the Chinese are gonna brainwash our kids and make them lazy or whatever, and I look at them like, what do you guys think Disney's doing? This is the part of it... there's so much more to be upset about in what's going on. I understand the spying thing, and if they can suck up your information, that's a separate issue. But it sounds to me like there's also just a content component to this, and I feel like TikTok is the shiny object where people in politics and in power get to pretend they're doing something, like, "Oh, we're protecting the kids from the bad influences online." They're protecting them from one of dozens of major and endless minor influences online that are all being pushed by the Democrat Party anyway, like transgenderism. Beijing's not pushing transgenderism via TikTok on kids; Democrat Disney is pushing transgenderism on kids.

Kara Frederick: Well, number one, you're right. But number two, we also don't know that Beijing isn't pushing this stuff, and that's part of the problem too. China is intimately involved in the algorithm, which it says it won't give up: when there's been talk of a forced divestiture of TikTok to an American company, they've said there's no way we're giving up the algorithm. So in my mind, that means, yeah, they want to retain control of it.
There's a commercial element to it, because it is really good, but there's also that information environment manipulation potential. They want to keep it in Chinese hands so badly, and they've said as much, which flies in the face of a lot of the assurances these TikTok executives are providing to Congress members with their Project Texas, their, you know, attempted mollification of our representatives. But I also do want to address your point. I think that when you have over sixty-seven percent of American teenagers, as of last year, on a particular platform, and you have thirty percent of preteens, nine-to-eleven-year-olds, on that platform according to a Pew poll in 2020, and we have new data coming out of the UK saying a decent percentage of toddlers are now exposed to this content, then you have a problem. Number one, all of these children are on it to a much greater degree than, you know... Facebook, as you talked about, is hemorrhaging users, especially in this demographic, and Instagram is hemorrhaging users as well. Then it becomes a source of information, and we know they're going to it for their news now too. They're not just looking at those cute videos; they're using it to be informed about the world. A top Google executive said as much at a tech conference. He said when young kids want their news, they go to TikTok, and Google is very much aware of that and keeping their antennae up. And then you have American adults. The kid stuff matters a lot, but American adults as well: in two years, the share of American adults who get their news from TikTok has tripled. That's problematic, I think, from a civic perspective. I see you want to say something, but I want to say one more thing before I let you talk.
Buck Sexton: I'm standing in the way of the train. By all means, go ahead. Sorry, you were saying.

Kara Frederick: So I think the last thing is that a number of enterprising journalists have taken it upon themselves to create their own TikTok accounts. What they do is register as users around thirteen to fourteen years old, and they have found, to a man, that within minutes they are fed self-harm content, suicidal content, and eating disorder content, especially if they register as girls rather than boys. So we do know there is something there that isn't just responsive to, shall we say, what the children want. Granted, the TikTok algorithm is based off your engagement, not necessarily your network, so how long your eyes linger over a specific video. If you're interested in depression, yes, it's more likely to feed you self-harm content and suicidal content. But this appears to be pretty uniform across the board for a lot of these journalists experimenting anyway. So there's something in TikTok that is particularly, I would say, nefarious when it comes to our children and the noxious content they're being fed.

Buck Sexton: I mean, do you ever go on TikTok? Because I gotta say, some of those shuffle dance moves, very catchy.

Kara Frederick: Buck, no, I am not going on TikTok, nor will I ever go on TikTok.

Buck Sexton: See, this is where I usually get to point at all the other people in America, especially the Democrats, and call them commies and have fun with it. Apparently on this one, I've got like a soft spot for communist China. I think TikTok is super entertaining, I gotta tell you. For me, it's how to sear the perfect steak and different red meat. I follow this Max the Meat Guy, who makes wagyu and briskets and all these things.
Guys who know how to make tomahawks out of the stuff you find in your backyard. And of course, cute bulldog videos. I'm just like, what is this? How is this supposedly in some way doing anything that is going to damage me? But then again, I'm an adult and things are different, and, you know, that can be a little bit of a challenge. And yeah, so I've got to get to the Oxford Gold Group here for a second, and when we come back, we're going to talk about AI, because I think Kara Frederick has to explain whether or not Skynet is going to become self-aware and cause the nuclear war that James Cameron warned us about in Terminator one and two, and probably the others, but no one saw the other ones because the other Terminator movies sucked. She knows it's true. There was a recent banking collapse, the nation's largest collapse of a financial institution since 2008, so stuff can go bad real fast, you know this. And fiat currency is pretty imperfect, because we're in a situation now where we have inflation, and we also have thirty-some trillion dollars of debt. How about using gold and silver as protection for your portfolio? Have a little gold and silver on hand just in case. I've got gold and silver right here with me in the radio studio. Now is the time. Don't wait, because if a crisis hits, you're gonna need to have it on hand, and also those prices of gold and silver are gonna go way up. So now's the time to call my friends at the Oxford Gold Group. Securing your IRA or 401(k) with real gold and silver, by the way, is also a fantastic portfolio protection plan. All you have to do is call the Oxford Gold Group. You can own real precious metals just like I do.
Call the Oxford Gold Group at 833-404-GOLD. That's 833-404-G-O-L-D. Okay, so Ms. Kara Frederick, who is on the task force, I believe, at the Heritage Foundation for dealing with AI-related matters, so you would have some insights into this. Look, is Elon Musk both wealthier and smarter than me? Yes. But I think all this stuff about how the world is going to end because of AI is pretty crazy. I'm looking at this like, how does this even happen? Everyone's getting all freaked out. Actually, I usually tell people everything's gonna be okay, because most hysteria is just people wanting attention. But is AI really... I know it's a big deal and it's gonna matter a lot. I'm not saying it doesn't matter a lot. But as a threat, what is the threat from AI? That's the part of this that no one's really been able to explain to me. It's like, "Oh, it's going to hurt our democracy because of the disinformation." Watch CNN. Look what they do.

Kara Frederick: Fair point. And I do think you stand in good company with some of the skeptics. One of the things, as you know, in the intel community, though: there was always that person who was like, "Everyone, everyone, China isn't ten feet tall, right?" "Everyone, everyone, al Qaeda is not looking at external operations." And they're the naysayers, and everyone...

Buck Sexton: Their whole thing is they're playing the role of... what's the word we're looking for here? You know, when you're just being in opposition to be in opposition? I forget what the word is. You know what I'm saying. Contrarian, thank you. They're playing the contrarian.

Kara Frederick: Yes, yes, yes. So there's that aspect of the AI community as well.
But I do think there's reason to be worried, and I like to quote two authoritarians on this matter. I thought this was commonplace, but apparently not many people know it: Putin, a few years ago, said whoever is going to lead in AI is going to rule the world. I think that's the significant discussion to be had, especially when Putin's saying that, and then Xi in China says, we want to dominate in AI by 2030. So if our adversaries have their eye on this and they see it as some geopolitical strategic key, then I think it's important to pay attention. And the reason I think they know that harnessing these technologies is really going to catapult them to the front of the line of global dominance is that doing things at machine speed tends to be better than doing things at human speed. When you're talking about warfighting, when you're talking about even the stuff that intel pukes like us used to do, computer vision algorithms can do it better. It's what Project Maven tried to do: instead of a human being sitting in front of an FMV screen labeling a truck, labeling a rock, labeling a car, you have machines able to do that. And then when machines can make decisions... and again, the autonomous weapons we have are mostly semi-autonomous, so there's always a human in the loop at this point, and there's a lot of debate and discussion about that. But when you get to the point that a machine is taking on that work, all of that analytical rigor can be applied elsewhere, to things only humans can do, and that's going to give the warfighter a massive advantage.
So if you have a computer vision algorithm determining that that's a rock and that's a tree and you shouldn't hit it with a kinetic strike, that's better, and then you can have an analyst doing things like determining whether there's positive identification of that actual terrorist actor, which would cause that Hellfire to rain down. So I think it's offering specific advantages because of that machine speed vice human speed. People talk a lot about drone swarms too: instead of necessarily training up a human pilot, what my old boss at the Center for a New American Security used to say is that the human being acts as the quarterback, and you let your smaller, cheaper systems do some of the other work for you, so you're not wasting a human being and that human capital on that kind of thing. So I think those are just two examples of what AI can help accomplish in war. And we know Xi is on a war footing. We know Putin is currently at war right now too. So that's another thing. The information environment, like what you were talking about before, that's a whole other kettle of worms, and we could talk about that all day as well.

Buck Sexton: Well, I don't know. It seems to me like if there's a little AI that can clean up after me and tell me how great I am all the time, and not ask any questions beyond that, sounds good. I'm not that worried about it turning into the Terminator and deciding that it has feelings too and it's self-aware and all this. I don't know. I guess I do think it's definitely going to be interesting for high school kids who want some program that can easily write their kind of B-minus-level term paper for them, which we know it can do pretty quickly.
But friends, you know what you need? Not AI. You need Chalk. Chalk provides all-natural supplements that help people restore their energy potential every day. It's a daily supplement formulated to restore lower testosterone levels in men to the levels that men used to have. Our diets and stress levels just don't naturally provide for the kind of testosterone that we need. The leading ingredient in Chalk's Male Vitality Stack is proven to restore twenty percent of those lower levels in just three months' time. You'll feel the positive effect and experience an energy potential and focus that you haven't in a long time. Chalk produces their products with a high level of purity, which makes them potent and impactful. That's why the Male Vitality Stack is as effective as it is. Sign yourself up and take delivery of Chalk's Male Vitality Stack, or any of the other products available via subscription. Get thirty-five percent off any Chalk subscription for life when you use my name at their website, choq.com. That's C-H-O-Q dot com. Be sure to use my name, BUCK, to get thirty-five percent off. Go to choq.com and use my name, BUCK, B-U-C-K, for thirty-five percent off.

So, Kara, when you're not trying to save the Internet for the purposes of freedom, humanity, world peace, and all that stuff, what else do we need to know about Kara Frederick? You were a Navy intel analyst, is that what I'm getting? Because you said "vice" in a way only people from the intelligence community ever say "vice," meaning "instead." Do you know that you'll never come across anybody who did not work in the IC who will say, "Well, I think that's a good idea, vice this other idea"? You're like, wait, what? We're the only ones who do that.

Kara Frederick: No way. Oh man, bad habits.

Buck Sexton: Oh, I'll give you more. I'll give you more: "optics."
Only people from DC talk about "from this optic" or "from that optic" or whatever. There are some things... the intel people in particular, we have this weird nerd vernacular. And I caught you doing some nerd vernacular during this podcast, so at least I know you're the real deal. At least I know you were poring over those reports, eight cups of coffee deep.

Kara Frederick: Yeah, it's funny. You know Get Smart, with Steve Carell and Anne Hathaway? It's a movie, like a, I don't know, spy caper, and you have the intel analyst who finally gets his chance to be an operations officer in the field, and he's talking to some operator types and he's like, "Wait, did nobody read my report?" That was me. I was like, did nobody read my report? I was that person who would go out with the guys in special operations forces and sort of be like, "Hey, everybody, this is what we should be thinking about. Here's the target. You should go there." So I was always a civilian, a civilian intelligence officer, and we called them targeters when I was working with Naval Special Warfare Development Group. But that was my job. I was an al Qaeda analyst, first and foremost, and all-source.

Buck Sexton: So what were they, like, original gangster, old-school al Qaeda, or AQI? Because I was AQI.

Kara Frederick: Oh, you were?

Buck Sexton: They brought me in to do CTC, AQI, and then I got moved to OIA, and AQI basically, so...

Kara Frederick: No way. Yeah, I was looking at guys over... yeah, the original guys, some of the guys in hiding in Iran for a little bit. That is now in the open.
When I deployed a couple of times, that was my thing, looking at al Qaeda external operations operatives.

Buck Sexton: So what did you think of 13 Hours, the movie, by the way? Because I always thought it was so funny that the case officers in that were so smug, which is just great. The analysts were all sitting there like, yeah, maybe we were just back at headquarters making coffee, but at least we respected the paramilitary guys.

Kara Frederick: Yeah, oh, I don't know. I had a lot of OGA friends when I was out there, so you know, you've gotta keep those relationships warm. So I liked your side of the...

Buck Sexton: You know, but did you notice in the movie, for some reason, just to really hammer it home, one of the American case officers in 13 Hours just randomly has kind of a French accent? And he's turning around to, what's his name, John Krasinski, and all these guys who are all jacked, and he's like, "I had to do fancy things, dude. I am so cool." And you're like, where did this... why is his name like Jacques Cousteau? Where did this guy come from? He's an American. I just love it, because those guys, the GRS guys in the movie... it's funny the things they would say about the case officers. "I'm just sorry, I was just observing. I'm just observing."

Kara Frederick: Buck, every time I left the Tactical Operations Center when we were forward, all the guys in a chorus would go, "Beat it, nerd." So yeah. But you guys were like...

Buck Sexton: We had, we had our reports and our cool stuff, you know, in the meetings that we did too. So interesting.
All right, well, Kara, where should people go to check out your work and the stuff that you're up to and all that good stuff?

Kara Frederick: Yeah, first and foremost, go to Heritage.org. All of our work is on the Heritage website. I direct the Tech Policy Center there, and we're looking at five big lines of effort, so go to that website to see what we're up to. Personally, I'm, again, kind of in the belly of the beast: I'm on Twitter as Kara A. Frederick and on Instagram as Kara Fredd, with two Ds, so you can check out my work there. I do some personal stuff, but mostly...

Buck Sexton: And you've got a tiny baby too, right? Right now you're like a...

Kara Frederick: Yeah, yeah, yeah. She's an infant, and I don't hear her, but she's probably gonna start crying a little bit.

Buck Sexton: I'm glad we didn't get the move where you pretend you hear the infant crying so you can end the interview early. So that's good. That means we kept it moving here. Check out Kara's stuff, everybody. Kara Frederick, thank you so much. Appreciate you joining us.