Speaker 1: You're listening to The Buck Sexton Show podcast. Make sure you subscribe to the podcast on the iHeartRadio app or wherever you get your podcasts.

Speaker 2: Hey, everybody, welcome to The Buck Sexton Show. Kara Frederick is with us now. She's formerly of Facebook, currently with the Heritage Foundation, where she's Director of Tech Policy. We're going to talk to her about AI, about government snooping on social media platforms, and all kinds of good stuff or bad stuff, but interesting stuff. Kara, welcome.

Speaker 3: Thanks for having me.

Speaker 2: So let's start with Elon Musk. He recently spoke to Tucker Carlson on Fox, we saw it, and he said not only was he, Elon, flabbergasted by how much access the government had to the communications and just the general stuff going on at the social media platform Twitter, which I'm sure is true of other places, but that DMs were easily accessible by the government too. What do you think of that one?

Speaker 3: Yeah, so this, you know, honestly, it's not surprising, I mean, especially given the backgrounds that you and I had there. We do know, first and foremost, there are legitimate reasons, or at least there were legitimate reasons, to, you know, work with some federal entities when it comes to things like child sexual abuse material, you know.

Speaker 1: Sniffing out the bad guys.

Speaker 3: I worked on foreign Islamic terrorism when I was working for a big tech platform and before, so, you know, there are times when you want the government that is ostensibly there to protect the security of Americans to maybe have some sort of access to these internal communications. Unfortunately, over the past two to three years, we've seen that there's been an abject dereliction of duty when it comes to, you know, number one, keeping United States citizens safe, and number two, really prioritizing the security of Americans, especially from external hostile forces, and instead looking inward at the American populace and inflating the definition of domestic extremism and terrorism and whatnot.
Speaker 3: So, in my mind, the government has lost our trust, and deservedly so, and they have been misprioritizing when they should be looking at real, actual threats. Then, okay, maybe we can think about some of these surveillance capabilities, but that's all gone, that's been washed away, given especially what Elon exposed in the Twitter files. And it is very telling, I think. We at the Heritage Foundation published a report in February of twenty twenty-two, and we said there's a symbiotic relationship between these big tech companies and the government, it goes as far as collusion, and any sort of collusion between government actors and these big tech companies to silence the speech of Americans should be prohibited. And people were like, oh, come on, you're just a fearmonger, what are you doing? But we've been proven right. So the fact that the government had full access to DMs on Twitter... again, we knew a long time ago Twitter DMs were probably compromised, and this just proves it. And frankly, this proves the Heritage Foundation right yet again.

Speaker 2: So you worked at Facebook. I would assume, and tell me if it's an incorrect assumption, that the very, very cozy and all-too-close relationship between Twitter pre-Elon and the government, and, as he described it, effectively it was a bloated activist organization, Twitter, masquerading as a social media platform... I have to assume, and you tell me if it's not right, that the politics of Facebook, Google, all the rest are basically the same and perhaps even worse.

Speaker 3: I would say post the twenty sixteen Trump election, that's when everything became exacerbated. So you have, you know, a cadre of us who went in before that election. I did, there in twenty sixteen, early twenty sixteen, and, you know, we were there to solve these big problems like the foreign Islamic terrorism issue, and you had really talented people, a lot of times patriots.
Speaker 3: I came right out of a naval special warfare command, where I had been doing counterterrorism analysis as a targeter, and I went right into Facebook sort of thinking that, you know, we were going to do the same thing. We were going to protect Americans and their users from this, you know, foreign Islamic menace, which was taking the form of ISIS at the time. And then, I would say in twenty sixteen, something really changed, and it was Trump's election, and they just went hysterical. And the people that they started recruiting post-election, you know, a lot of them went in sort of thinking that they had a mandate to stifle the conservative voices, I'd say, in America. You know, prior to that, there were some interesting, I would say, data points when it came to the way that we did our analysis, for instance, and this is no secret now: these companies have been using organizations and NGOs like the Southern Poverty Law Center to actually help them formulate their policies and help them formulate the way that they treated specific actors on these platforms, and they thought nothing of it. They thought, okay, the SPLC is an honest broker. So I think that part of it is ignorance, part of it is, you know, the sea that they swim in in the Bay Area, and then the other part of it is the twenty sixteen election just really unhinging people and making them come in mission-oriented to sort of stifle the voices of the people that they disagreed with.

Speaker 2: Is it fixable at these places? Like, how would you actually, if you were able to get, you know, Zuckerberg and his top lieutenants in a room and they were willing to listen to reason... I mean, you know, I think Facebook has just turned... I'm amazed that there's still apparently as many people using it in America as there are. I find it to be like an unwieldy trash heap of nonsense, like it's really hard to even figure out where anything is anymore.
Speaker 2: It all looks like it was made by a bunch of, you know, computer engineers, and I don't mean that in a good way. Like, it looks like it's just this kind of cobbled-together thing, and all of the original... just the ease... and it's just, they're throwing all this... I think it's turned into a bad user experience and user interface. But anyway, forget about fixing that for a second. If you were just trying to fix the fact that Facebook is not fair to half of the country politically, is it possible, or do we just have to build our own thing? By the way, build your own Facebook doesn't sound as crazy. Maybe it's buy your own Facebook, and that's more expensive than Twitter. Not to call out Elon on that one.

Speaker 3: Yeah, you know, I will say, personally, when I was in the company, I thought Mark Zuckerberg's instincts were more libertarian than, you know, leftist. And that is sort of the old guard of the builders in Silicon Valley. You know, these are engineers, and they think like engineers, they think like programmers. They want to solve problems and fix things. Unfortunately, I do think that he's been there for so long, and he's been, frankly, led astray by others in the C-suite and that other sort of layer of upper management as well, who are very concerned with PR fires. And clearly we know that public relations only goes one way, and that's against conservatives at this point. So you look at that and sort of see how he's been co-opted when his instincts were maybe initially good. I remember a speech that he gave in October of twenty nineteen at Georgetown University where he said Facebook had to be the alternative to an authoritarian China which was propagating its digital platforms here in the US. You know, what happened to that? That gave me a little bit of hope.
Speaker 3: But he has since spent four hundred million dollars pushing certain Democrats in elections under the auspices of these get-out-the-vote measures, but we know that they were in blue states, going to Democrat activist organizations who were then pushing only getting out the vote for Democrats. So unfortunately, I think those instincts that he previously had have been quashed, and I don't think there's any coming back from it. I do think it is too far gone at this point.

Speaker 2: Let's come back in a second and talk about TikTok, since you mentioned communist China. We'll get into that in just a second here. But everybody at home, if you haven't tried the Giza Dream Sheets, I'll tell you right now, you're missing out. Mike Lindell's got a lot of amazing products. Kara, do you have Giza Dream Sheets? We're gonna get you hooked up.

Speaker 1: Planning on it. They're in the cart, they're in the cart.

Speaker 2: I'm like, all right, there we go, the Giza Dream Sheets from Mike Lindell. You got to get them for the whole family. I'm telling you, they're amazing, coming as low as twenty nine ninety nine when you go to MyPillow dot com, use promo code Buck. Multiple colors, styles and sizes, and you can upgrade your bedding now. By the way, everybody, I know you think, oh, I've already got sheets. Sheets only last you a couple of years. You know, you wash them, they get kind of a little too worn, threadbare, and they're not very comfortable, and they start to, you know, just look like you need new sheets, because you do. So go to MyPillow dot com, promo code Buck, twenty nine ninety eight for Giza Dream Sheets. All MyPillow products come with a sixty day money back guarantee and a ten year warranty. So go to MyPillow dot com, click on radio listener specials, and use promo code Buck for the Giza Dream Sheets, under thirty bucks. Giza Dream Sheets, go check them out today. I sleep on them every night. All right.
Speaker 2: So TikTok, I gotta tell you, I mean, I think that YouTube is much more concerning for American freedom and everything else than TikTok. And I've been saying this, and more and more people are saying it now. It seemed to be, about a month ago, it was like, oh, TikTok, and I'm sitting here, I'm like, what about the social media companies? They are already throwing elections anyway. We've put that aside for a second. I mean, how worried are you? Let's just look at TikTok, and I won't do the, you know, the what-about with YouTube, Google, Facebook or Instagram, all that stuff, although I could. How bad is TikTok really? How big a problem is it?

Speaker 3: Yeah, I think TikTok falls into three problem buckets, I would say, and the first and foremost is the most obvious one. It's that ByteDance, its parent company, is headquartered in Beijing and therefore subject to the laws and policies of the People's Republic of China. One of the laws that we like to talk about is the twenty seventeen National Intelligence Law, which effectively compels private companies to do the work of the state. So if the Chinese state, the CCP, the Chinese Communist Party, decides that they want specific data, they want access to this, they want access to that, then by virtue of this law, ByteDance has to comply. And this is not my original phrasing, but I think it illustrates the problem pretty well: China doesn't have rule of law, they have rule by law. If you're, you know, the ByteDance CEO, you're kind of powerless to resist at this point, and why would you? When you look at ByteDance, one of the three board members of ByteDance's main domestic subsidiary is a card-carrying Chinese Communist government official.
Speaker 3: If you look at, like, some of the good reporting that Forbes has done, they scoured LinkedIn and found that three hundred plus profiles had either current or former links of these current ByteDance employees to a Chinese propaganda arm, to Chinese state media. So you have active and former members of the Chinese Communist Party, particularly in the information realm, working in ByteDance. And there are so many other data points that I could talk about, but that's the first one: ByteDance has very close links to, and infiltration by, frankly, the CCP. They have an internal committee, a DOJ report in September twenty twenty coming out of the Trump administration assessed as much, and we know that they are deeply involved in the inner workings of TikTok as well. So that's one thing, and again, just the tip of the iceberg. We can go into just how odious some of those connections are when it comes to the connection with American data as well, but I'll table that for now. And then, number two, there's that influence campaign aspect, and, you know, we talk about the manipulation of the information environment. This is something that we were dealing with in the intelligence community and especially in big tech companies, and looking at it now, what we've seen is pro-CCP narratives pushed on these platforms. We've seen an actual Chinese state account come to TikTok and say, how can we push our information? And we've seen information from those accounts pushed and not labeled as state media as well. So we know that they're trying to do the proverbial sowing of discord, such as promoting Democratic candidates in the twenty twenty-two midterm election to the detriment of Republican candidates. They're trying to push stories about abortion and incendiary things to help sow dissent among the American population, something everyone always accused, sort of, the Russians of doing. So that's another aspect of that information environment manipulation.
Speaker 3: And then third, you have the kids, you know, what it's doing to children. And we know that TikTok, in particular with the For You algorithm, just lights a fire under these social contagions. We know that they're in bed with the transgender lobby, featuring, you know, prominent transgender activists, prominent LGBTQ activists all over their websites raising awareness. They're very, very open about doing this, and there are pediatric hospitals that are reporting actual physical manifestations coming out in patients because they use TikTok, like things called TikTok tics, which most researchers are classifying as, you know, pure social contagion movements. So, you know, and this is again something that TikTok is very, very efficient at, and it doesn't necessarily distinguish it wholesale from the Instagrams of the world, if you want to talk about whataboutism. We know the stats on that in terms of mental health harms. But TikTok is poised, especially because, as Christopher Wray, director of the FBI, says, China controls the algorithm. That's even more problematic when they're feeding our kids this poison.

Speaker 2: But how much of the algorithm is just the kids click on the things they click on, and then it's reflected back to them, right? Like, you know, I ask this because, remember Russia, remember Russia collusion? Back in twenty sixteen, they're saying, oh, Hillary Clinton lost because Trump worked with Russia. And then they talked about the Facebook ads. I think there was like one hundred thousand dollars that were spent on these bogus, you know, or Russian-backed or whatever Facebook ads. And when you look at the ads, I mean, a lot of them were ridiculous. I mean, it looked like a guy named Yuri, you know, working in, like, sub-basement C of some, you know, FSB outpost, you know, in the outer ring of Moscow, was, like, looking at a, you know, a little dictionary in English.
Speaker 2: I mean, it was preposterous, right? I mean, their understanding of US politics, beyond "lock her up" for Hillary Clinton, which, you know, they got that right, but their understanding was very weak. I mean, the idea that... I just hear all these people saying, oh my gosh, you know, the Chinese are going to brainwash our kids and make them lazy or whatever. I look at them, like, what do you guys think Disney's doing? This is the part of it, there's so much more upset about what's going on with the... I understand the spying thing, and, like, if they can suck up your information and do that, that's a separate issue, but it sounds to me like there's also just a content component of this, and I just feel like TikTok is the shiny object where people in politics and in power get to pretend that they're doing something. It's like, oh, we're protecting the kids from the bad influences online. They're protecting them from one of dozens of major and endless minor influences online that are all being pushed by the Democrat Party anyway, like transgenderism. Beijing's not pushing transgenderism via TikTok on kids. The Democrat Disney is pushing transgenderism on kids.

Speaker 3: Well, number one, you know, you're right, but number two, we also don't know that Beijing isn't pushing this stuff, and that's, you know, part of the problem too. If China is so intimately involved in the algorithm, which it says it won't give up. If there's a forced divestiture of TikTok to an American company, it said there's no way we're giving up the algorithm. So in my mind, that means, yeah, they want to retain control of it.
Speaker 3: There's a commercial element to it, because it is really good, but there's also that information environment manipulation potential there, because they want to keep it in Chinese hands so badly, and they've said as much, which flies in the face of a lot of the assurances that these TikTok executives are providing to Congress members, with their, you know, Project Texas, potential, you know, mollification of our representatives. But I also do want to address that. I think that when you have over sixty seven percent of American teenagers, as of last year, on a particular platform, and you have thirty percent, according to a poll in twenty twenty, of preteens, nine to eleven year olds, on a specific platform, and we have new data coming out of the UK saying a decent percentage of toddlers are now exposed to this content, then you have a problem. Then you have the fact that, you know, number one, all of these children are on it to a much greater degree than, you know, Facebook, which, as you talked about, is hemorrhaging users, especially in this demographic; Instagram is hemorrhaging users as well. Then that becomes a source of information, and we know that they're getting it for their news now as well. They're not just looking at those cute videos. They're getting it to be informed about the world. A top Google executive said as much as well at a tech conference. He said, when young people, young kids, want their news, they go to TikTok, and Google is very much aware of that and, you know, keeping their antenna up. And then you have American adults. So, looking at the kid stuff, that matters a lot, but American adults as well. So, you know, in two years the number of American adults that get their news from TikTok has tripled. That's problematic, I think, from a civic perspective. I see you want to say something, but I want to say one more thing before I let you talk.

Speaker 2: I'm standing in the way of the train.
Speaker 2: By all means, go ahead. Sorry, you were saying? Yeah.

Speaker 3: So I think the last thing is, a number of enterprising journalists have taken it upon themselves to create their own TikTok accounts. And what they do is they register as users from around thirteen to fourteen, and they have found, to a man, within minutes they are fed content that is composed of self-harm content, suicidal content, eating disorder content, especially if they're registering as girls versus registering as boys. So we do know that there is something that isn't just responsive to, shall we say, what the children want. Granted, the TikTok algorithm is based off your engagement, not necessarily your network, so how long your eyes linger over a specific video. If you're interested in depression, yes, it's more likely to feed you self-harm content and suicidal content. But this appears to be pretty uniform across the board for a lot of these journalists experimenting anyway. So there's something in TikTok that is particularly, I would say, nefarious when it comes to our children and the noxious content they're being fed.

Speaker 2: I mean, do you ever go on TikTok? Because I gotta say, some of those shuffle dance moves, very catchy.

Speaker 3: Buck, no, I am not going on TikTok, nor will I ever go on TikTok, okay?

Speaker 2: This is where I get to point out all the other people in America, usually especially the Democrats, but I get to call them commies and have fun with it. Apparently on this one, I've got like a soft spot for communist China. So I'm like, I think TikTok is super entertaining, I gotta tell you. It's all, for me, it's, yeah, it's how to sear, like, the perfect steak and different red meat. I follow this, like, Max the Meat Guy, who makes, like, Wagyu and briskets and all these things.
Speaker 2: Uh, guys who know how to make, like, tomahawks out of the stuff you find in your backyard, like, this is... and of course cute bulldog videos. Like, I was like, what is this? How is this supposedly in some way doing anything that is, uh, you know, going to damage me? But then again, I'm an adult and things are different, and, you know, that can be a little bit of a challenge. And yeah, so I got to get to the Oxford Gold Group here for a second, and when we come back, we're going to talk about AI, because I think Kara Frederick has to explain whether or not Skynet is going to become self-aware and cause the nuclear war that James Cameron warned us about in Terminator one and two, and probably the other ones, but no one saw the other ones because the other Terminator movies sucked. So, you know... there was a... she knows it's true. There was a recent banking collapse, the nation's largest collapse of a financial institution since two thousand and eight. So stuff can go bad real fast, you know this. And fiat currency is pretty imperfect; as it is, now we have inflation. We also have thirty two trillion dollars of debt. How about using gold and silver as a protection for your portfolio? Have a little gold and silver on hand just in case. I've got gold and silver right here with me in the radio studio. Now is the time, don't wait, because if a crisis hits, you're gonna need to have it on hand, and also those prices of gold and silver are going to go way up. So now's the time to call my friends at the Oxford Gold Group. Securing your IRA or 401(k) with real gold and silver, by the way, is also a fantastic portfolio protection plan. All you have to do is call the Oxford Gold Group. You can own real precious metals just like I do.
Speaker 2: Call the Oxford Gold Group at eight three three, four zero four, gold. Eight three three, four zero four, G O L D. Okay, so, Ms. Kara Frederick, who is on the task force, I believe, at the Heritage Foundation for dealing with AI-related matters, so you would have some insights into this. I think that the AI... and look, is Elon Musk both wealthier and smarter than me? Yes. But I think all this stuff about how the world is going to end because of AI is pretty crazy. Like, I'm looking at this, I'm like, how does this even happen? Everyone's getting all freaked out. Do I... I'm usually... actually, no, I usually tell people everything's gonna be okay, because most hysteria is just people wanting attention. But is AI really... I know it's a big deal and it's gonna matter and all, I'm not saying it doesn't matter a lot, but as a threat, what is the threat from AI? That's the part of this that I still haven't... no one's really been able to explain to me. It's like, oh, like, it's going to hurt our democracy because of the disinformation. Watch CNN, look what they do.

Speaker 3: Fair point. And I do think you stand in good company with some of the skeptics.

Speaker 2: You know.

Speaker 3: One of the things, as you know, in the Intel community, though, there was always that person who was like, everyone, everyone, China isn't ten feet tall, right? Everyone, everyone, al Qaeda is not looking at external operations. And they, like, are the naysayers, and everyone kind of...

Speaker 2: Their whole thing is, they're the, uh, you know, they're playing the role of... what's the word we're looking for here? You know, when you're just being in opposition to be in opposition. I forget what the word is. You know what I'm saying. Contrarian. Thank you. They're playing the contrarian.

Speaker 3: Yes, yes, yes. So, you know, there's that aspect of, you know, the AI community as well.
Speaker 3: But I do think there's reason to be worried. And I like to quote, you know, two authoritarians on this matter, and I thought this was commonplace, but apparently not many people know that Putin, a few years ago, said whoever is going to lead in AI is going to rule the world. I think that that's the significant discussion to be had, especially when Putin's saying that, and then Xi in China says, you know, we want to dominate in AI by twenty thirty. So, you know, if our adversaries have their eye on this and they see it as some geopolitical strategic key, then I think that it's important to sort of pay attention. And the reason why I think they know harnessing these technologies is really going to, uh, catapult them to the front of the line of global dominance is because they're doing things at machine speed. That tends to be better than doing things at human speed. And when you're talking about war fighting, when you're talking about things like, even, you know, the stuff that the Intel pukes like us used to do, that computer vision algorithms can do better, you know, instead of... what Project Maven tried to do: instead of a human being sitting in front of an FMV screen, you know, labeling a truck, labeling a rock, labeling a car, you have machines able to do that. And then when machines can make decisions, and again, you know, the autonomous weapons that we have are mostly semi-autonomous, so there's always a human in the loop at this point, and there's a lot of debate and discussion about that. But when you get to the point that a machine is basically cutting out all of that, so analytical rigor can be applied elsewhere to do things only humans can do, that's going to give the war fighter a massive advantage.
Speaker 3: So if you have a computer vision algorithm determining that that's a rock and that's a tree, and that you shouldn't, you know, hit it with a kinetic strike, then that's going to be better, and you have another analyst sort of doing things like determining if there's a positive identification of that actual terrorist actor, which would cause that Hellfire to rain down. So I think that, you know, it's offering specific advantages because of that machine speed vice human speed. People talk a lot about drone swarms too, so instead of, you know, training up a human pilot necessarily, these... you know, what my old boss at the Center for a New American Security used to say is that, you know, the human being sort of acts as the quarterback, and you let your smaller, your cheaper systems do some of the other work for you, so you're not sort of wasting a human being and the human capital on that kind of thing too. So I think those are just two examples of what AI can help accomplish in war. And we know that Xi is on a war footing. We know that Putin is currently at war right now too, so that's another thing. The information environment, like what you were talking about before, that's a whole nother kettle of worms, and we can talk about that all day as well.

Speaker 2: Oh, I don't know. It seems to me like if there's a little machine that can clean up after me and tell me how great I am all the time, and, you know, not ask any questions beyond that, sounds good. I'm not that worried about it, like, turning into the Terminator and deciding that, you know, it has feelings too, and it's self-aware and all this. I don't know. I mean, I guess I do think it's definitely going to be interesting for high school kids who want to have some program that can easily write their, you know, kind of B-minus-level term paper for them, that we know it can do pretty quickly.
Speaker 2: But friends, you know what you need? Not AI. You need Choq. Choq provides all-natural supplements that help people restore their energy potential every day. It's a daily supplement formulated to restore lower testosterone levels in men to the levels that men used to have. Our diets and stress levels just don't naturally provide for the kind of testosterone that we need. Choq's leading ingredient in their Male Vitality Stack has proven to restore twenty percent of those lower levels in just three months' time. You'll feel the positive effects and experience an energy potential and focus that you haven't in a long time. Choq produces their products with a high level of purity that makes them potent and impactful. That's why the Male Vitality Stack is as effective as it is. Sign yourself up, take delivery of Choq's Male Vitality Stack, or any of the other products available via subscription. Get thirty five percent off any Choq subscription for life when you use my name at their website, choq dot com. That's the website, choq dot com. Be sure to use my name, Buck, to get thirty five percent off. Go to c h o q, choq dot com, use my name Buck, B U C K, for thirty five percent off. So, Kara, when you're not trying to save the Internet for the purposes of freedom, humanity, and world peace and all that stuff, what else do we need to know about Kara Frederick? You were a Navy intel analyst, is that what I'm getting? Because you said "vice" in a way only people from the intelligence community ever say "vice" like that, meaning "instead," you know. No one else... you'll never come across anybody who did not work in the IC who will be like, well, I think that's a good idea vice this other idea. You're like, wait, what? We're the only ones who do that.

Speaker 1: No way. Oh man, about Habita.

Speaker 2: Oh, I'll give you more, I'll give you more: optics.
Speaker 2: Only people from DC talk about, well, from this optic or from that optic or whatever. There are some things that the Intel people in particular... we have this weird nerd vernacular, and I caught you doing some nerd vernacular during this podcast. So at least I know you're the real deal. At least I know that, you know, you were poring over those reports eight cups of coffee deep.

Speaker 3: Yeah, it's funny. You know Get Smart, with Steve Carell and Anne Hathaway? It's a movie, it's like a, I don't know, spy caper. And you have the Intel analyst who, like, finally gets this chance to, like, be an operations officer in the field, and he's, you know, talking to some operator types, and he's like, wait, did nobody read my report? Like, that was me. I was like, did nobody read my report? That person who would go out with the guys in Special Operations Forces and sort of be like, hey, everybody, this is what we should be thinking about. Here's the target, you should go there. Uh, so I was always a civilian, a civilian intelligence officer, and we called them targeters when I was working with Naval Special Warfare Development Group. But that was, yeah, that was my job. I was an al Qaeda analyst first and foremost, and all-source.

Speaker 2: So was it, like, original gangster, old-school al Qaeda, or the, uh, AQI? Because I was AQI.

Speaker 3: Oh, you were?

Speaker 2: They brought me in to do CTC AQI, and then I got moved to, uh, OIA, and, uh, AQI basically.

Speaker 3: No way. Yeah, no, I was... I was looking at the guys over... yeah, the original guys. Some of the guys have been hiding in Iran for a little bit; that is now in the open. And yeah, when I deployed a couple of times, that was my thing, looking at al Qaeda external operations operatives.
Speaker 2: So what did you think of Thirteen Hours, the movie, by the way? Because I always thought it was so funny that the case officers in that were, like, so smug, which is just great. It was just great. The analysts were all sitting there like, yeah, maybe we were just back at headquarters making coffee, but, like, at least we respected the paramilitary guys.

Speaker 3: Yeah, oh, don't... I don't know. I had a lot of OGA friends when I was out there, so, you know, you gotta keep those relationships warm. So I liked your side of the...

Speaker 2: You know, but did you notice in the movie, for some reason, just to really hammer it home, like, one of the American case officers in Thirteen Hours just randomly has kind of like a French accent, and he's turning around to, like... he's turning around to, uh, you know, what's his name, John Krasinski, and, like, all these guys were all, you know, jacked, and these badasses there, and he's like, he's like, I had to do fancy things, dude, like, I am so cool. They're like, where did this... why is his name, like, like, Jacques Cousteau? Like, where did this guy... he's an American? Like, they just... I love it, because those guys, uh, the GRS guys in the movie, like, oh, it's funny the things they would say about the case officer. I'm just sorry, I just was observing, I'm just observing.

Speaker 3: Buck, every time I left, when we were forward, every time I left the Tactical Operations Center, all the guys in a chorus would go, beat it, nerd. So yeah, it's a... yeah, but you guys were like...

Speaker 2: We had... we had our reports and our cool stuff, you know, in the meetings that we did too. So interesting. All right, well, Kara, where should people go to check out your work and the stuff that you're up to and all that good stuff?

Speaker 3: Yeah, first and foremost, go to Heritage dot org.
Speaker 3: So all of our work is on the Heritage website. I direct the Tech Policy Center there, and yeah, we're looking at, you know, five big lines of effort. So go to that website to see what we're up to. Personally, I'm, again, kind of in the belly of the beast. I'm on Twitter, Kara A Frederick, and on Instagram as KaraFredd, with two Ds, so you can check out my work. I do a lot of... some personal, but mostly...

Speaker 2: And you got a tiny... you got a tiny baby too, right? Right now you're, like, a... yeah.

Speaker 3: Yeah, yeah, yeah. So she's an infant, and I don't hear her, but she's probably gonna start crying a little bit.

Speaker 2: So this is... I'm glad we didn't get the thing where you pretend you hear the infant crying so you can end the interview early. So that's good. That means we kept it moving here. Check out Kara's stuff, everybody. Kara Frederick, thank you so much. Appreciate you joining us.