1 00:00:00,440 --> 00:00:04,600 Speaker 1: We had Congressman Mike Waltz on the show recently from Florida, 2 00:00:04,640 --> 00:00:07,960 Speaker 1: and he raised some important questions about Thomas Matthew Crooks, 3 00:00:08,320 --> 00:00:11,120 Speaker 1: the guy who attempted to assassinate Donald Trump. He raised 4 00:00:11,200 --> 00:00:14,240 Speaker 1: questions like why would Thomas Crooks have encrypted accounts in 5 00:00:14,280 --> 00:00:17,680 Speaker 1: three different countries, also asking about, you know, how did 6 00:00:17,680 --> 00:00:21,160 Speaker 1: he learn how to build these IEDs with remote detonators, 7 00:00:21,239 --> 00:00:23,959 Speaker 1: three of which were built and ready to go. He 8 00:00:24,079 --> 00:00:27,320 Speaker 1: raised the point that it's almost as if Crooks's online 9 00:00:27,360 --> 00:00:30,880 Speaker 1: presence was sterilized on purpose. So we're going to get 10 00:00:30,920 --> 00:00:33,120 Speaker 1: to the bottom of that today and ask someone with a 11 00:00:33,520 --> 00:00:36,919 Speaker 1: background in that arena. We're also going to ask her 12 00:00:37,120 --> 00:00:39,960 Speaker 1: about recent reports that a group of hackers linked to the 13 00:00:40,040 --> 00:00:44,120 Speaker 1: Chinese government were able to target US internet service providers. 14 00:00:44,200 --> 00:00:47,600 Speaker 1: We all remember recently AT&T notifying millions of 15 00:00:47,640 --> 00:00:48,720 Speaker 1: customers that their 16 00:00:48,600 --> 00:00:49,959 Speaker 2: data was likely stolen. 17 00:00:50,720 --> 00:00:53,240 Speaker 1: So what kind of cyber attacks can we expect in 18 00:00:53,280 --> 00:00:55,840 Speaker 1: the future? What does that look like? Finally, we'll get 19 00:00:55,840 --> 00:00:58,880 Speaker 1: to a big one with Mark Zuckerberg and this bombshell 20 00:00:59,080 --> 00:01:02,080 Speaker 1: letter that he just sent to Chairman Jim Jordan, 21 00:01:02,480 --> 00:01:06,319 Speaker 1: admitting three things, according to the Judiciary Committee: one, the Biden 22 00:01:06,360 --> 00:01:10,880 Speaker 1: Harris administration pressured Facebook to censor Americans; two, that Facebook 23 00:01:10,920 --> 00:01:14,440 Speaker 1: censored Americans; and three, that Facebook throttled the Hunter Biden 24 00:01:14,520 --> 00:01:18,280 Speaker 1: laptop story. We'll get into a court case regarding TikTok 25 00:01:18,280 --> 00:01:20,360 Speaker 1: and what it means for Section two thirty. So a 26 00:01:20,400 --> 00:01:25,080 Speaker 1: lot of wonky data, cyber stuff, and who better to 27 00:01:25,120 --> 00:01:27,880 Speaker 1: have on the show than my friend Kara Frederick. She 28 00:01:28,160 --> 00:01:30,959 Speaker 1: is the director of Tech Policy at the Heritage Foundation. 29 00:01:31,080 --> 00:01:35,160 Speaker 1: She also has a long background in all of this stuff. 30 00:01:35,760 --> 00:01:40,800 Speaker 1: She was team lead for the Headquarters Regional Intelligence team at Facebook. 31 00:01:41,440 --> 00:01:44,679 Speaker 1: She was a senior intelligence analyst for the US Naval 32 00:01:44,800 --> 00:01:49,080 Speaker 1: Special Warfare Command. She spent six years as a counterintelligence analyst 33 00:01:49,080 --> 00:01:52,360 Speaker 1: at the Department of Defense, deployed to Afghanistan three times. So 34 00:01:52,440 --> 00:01:55,000 Speaker 1: she's done a bunch of different things in all these arenas.
35 00:01:55,160 --> 00:01:56,720 Speaker 1: So we're going to get to the bottom of all 36 00:01:56,760 --> 00:01:59,400 Speaker 1: of this with her. So stay tuned for my friend 37 00:01:59,480 --> 00:02:08,320 Speaker 1: Kara Frederick. Kara Frederick, it's great to have you on 38 00:02:08,320 --> 00:02:11,240 Speaker 1: the show. I always love when I get to see you. 39 00:02:11,240 --> 00:02:13,880 Speaker 1: You're just an awesome person and so smart. So love 40 00:02:13,919 --> 00:02:15,919 Speaker 1: having you on. Appreciate you making the time, my friend. 41 00:02:16,240 --> 00:02:18,560 Speaker 3: Oh, it is mutual, Lisa Boothe. Thanks for having me. 42 00:02:18,880 --> 00:02:21,080 Speaker 1: You're the best. Okay, so I wanted to ask you. 43 00:02:21,160 --> 00:02:23,519 Speaker 1: I had Congressman Mike Waltz on this show 44 00:02:23,560 --> 00:02:25,680 Speaker 1: the other day, and he was just talking about raising 45 00:02:25,760 --> 00:02:29,560 Speaker 1: questions about, you know, why would Thomas Matthew Crooks, that 46 00:02:29,800 --> 00:02:31,600 Speaker 1: would be, you know, the wannabe assassin trying 47 00:02:31,600 --> 00:02:35,240 Speaker 1: to assassinate Donald Trump, why would he have three encrypted 48 00:02:35,240 --> 00:02:40,160 Speaker 1: accounts, and encrypted accounts in three different countries? So, as 49 00:02:40,160 --> 00:02:43,400 Speaker 1: someone who's worked in counterterrorism, national security, data, all 50 00:02:43,440 --> 00:02:47,120 Speaker 1: those things, all those things aligning, why would he have 51 00:02:47,360 --> 00:02:49,200 Speaker 1: encrypted accounts in three different countries? 52 00:02:49,240 --> 00:02:49,280 Speaker 4: Like? 53 00:02:49,320 --> 00:02:53,240 Speaker 2: What does that signal to you? How concerning is that? 54 00:02:53,320 --> 00:02:56,000 Speaker 3: Lisa, it could mean everything or it could mean nothing. So 55 00:02:56,200 --> 00:02:58,119 Speaker 3: one of the issues here when 56 00:02:58,120 --> 00:03:01,040 Speaker 3: we talk about encrypted accounts is, as I've told people before, 57 00:03:01,120 --> 00:03:04,160 Speaker 3: you know, all forms of encryption are not created equal. 58 00:03:04,520 --> 00:03:07,360 Speaker 3: So some people have something as simple as a Telegram 59 00:03:07,440 --> 00:03:11,359 Speaker 3: account and they consider it encrypted, whereas it's not. There's 60 00:03:11,400 --> 00:03:13,680 Speaker 3: a lot less than meets the eye when it comes 61 00:03:13,720 --> 00:03:18,079 Speaker 3: to Telegram being a widely touted encrypted platform. So 62 00:03:18,120 --> 00:03:20,679 Speaker 3: this could mean he just had, you know, Telegram accounts 63 00:03:20,680 --> 00:03:25,720 Speaker 3: that were bouncing off IPs in different countries. Or it 64 00:03:25,760 --> 00:03:28,600 Speaker 3: could be indicative of something a little more sinister.
And 65 00:03:29,080 --> 00:03:32,720 Speaker 3: my bigger question in all of this with Thomas Crooks 66 00:03:32,720 --> 00:03:36,680 Speaker 3: and that attempt is, you know, with this guy, there had 67 00:03:36,720 --> 00:03:41,560 Speaker 3: to have been some form of help, really, that he 68 00:03:41,680 --> 00:03:44,800 Speaker 3: had people that he was talking to. Because when we 69 00:03:44,800 --> 00:03:48,680 Speaker 3: were targeting al Qaeda first online, and then ISIS, ISIS 70 00:03:48,720 --> 00:03:52,360 Speaker 3: would actually send through the dark web these, you know, 71 00:03:52,480 --> 00:03:55,520 Speaker 3: slick propaganda videos. Everybody knows that. But then they would 72 00:03:55,560 --> 00:03:57,560 Speaker 3: have these things like how to build a bomb in 73 00:03:57,560 --> 00:03:59,960 Speaker 3: the kitchen of your mom, and it would, you know, 74 00:04:00,160 --> 00:04:03,520 Speaker 3: pass through certain hands, people would send it to other 75 00:04:03,600 --> 00:04:06,680 Speaker 3: propagandists, you would have cheerleaders who would send it over. 76 00:04:06,800 --> 00:04:10,040 Speaker 3: So there should be some of what we call digital 77 00:04:10,160 --> 00:04:14,120 Speaker 3: chains of custody that Thomas Crooks left in his wake 78 00:04:14,480 --> 00:04:17,760 Speaker 3: when it came to using all of these digital platforms 79 00:04:17,800 --> 00:04:21,039 Speaker 3: that the FBI says they know about. So to me, 80 00:04:21,400 --> 00:04:26,000 Speaker 3: this just provides more questions than it does answers, 81 00:04:26,080 --> 00:04:27,280 Speaker 3: like I'm sure it does for a lot of the 82 00:04:27,320 --> 00:04:33,000 Speaker 3: congressmen who are conducting oversight over this. But it smacks 83 00:04:33,040 --> 00:04:38,600 Speaker 3: of something a lot deeper in my mind than 84 00:04:38,800 --> 00:04:44,240 Speaker 3: just a really smart kid who is pretty good online. 85 00:04:44,320 --> 00:04:47,040 Speaker 3: Because you also remember the first thing that we were 86 00:04:47,080 --> 00:04:50,400 Speaker 3: told about this guy is that he didn't have a 87 00:04:50,440 --> 00:04:54,640 Speaker 3: digital footprint. You know, this young guy who came of 88 00:04:54,680 --> 00:04:58,200 Speaker 3: age during COVID, when the entire world went digital, was 89 00:04:58,200 --> 00:05:01,360 Speaker 3: supposed to not have any sort of digital exhaust, 90 00:05:01,600 --> 00:05:04,680 Speaker 3: which is nearly impossible for even us, you know, geriatric 91 00:05:04,800 --> 00:05:06,000 Speaker 3: millennials these days. 92 00:05:06,240 --> 00:05:07,480 Speaker 2: So we don't want to go that far. 93 00:05:10,160 --> 00:05:14,279 Speaker 3: Fair enough, fair enough, but something just doesn't smell right. 94 00:05:14,320 --> 00:05:16,440 Speaker 3: As I've said before, you know, it doesn't pass the 95 00:05:16,480 --> 00:05:19,640 Speaker 3: smell test. Everything that we're being told about this guy, 96 00:05:19,680 --> 00:05:22,599 Speaker 3: and especially about his digital life, does not pass the 97 00:05:22,640 --> 00:05:25,440 Speaker 3: smell test. And the questions are piling up and the 98 00:05:25,520 --> 00:05:27,680 Speaker 3: answers are very scant at this point.
99 00:05:28,000 --> 00:05:29,920 Speaker 1: Well, you know, that's what he pointed out too, because 100 00:05:29,960 --> 00:05:33,160 Speaker 1: he was asking, you know, how did Thomas Crooks learn 101 00:05:33,279 --> 00:05:36,440 Speaker 1: how to build these IEDs with remote detonators, three of 102 00:05:36,480 --> 00:05:38,720 Speaker 1: which were built and ready to go. And then, you know, 103 00:05:38,720 --> 00:05:40,880 Speaker 1: he was talking about how it's almost as if his online 104 00:05:40,920 --> 00:05:47,560 Speaker 1: presence was sterilized on purpose. I guess, who would sterilize 105 00:05:47,600 --> 00:05:50,760 Speaker 1: his online presence and what would the intention be behind that? 106 00:05:51,400 --> 00:05:54,840 Speaker 3: Yeah, that's a good question, especially since, you know, there 107 00:05:54,839 --> 00:05:58,279 Speaker 3: are things that you can do beforehand, you know, basic 108 00:05:58,320 --> 00:06:02,040 Speaker 3: digital hygiene that he could have conducted. But we 109 00:06:02,160 --> 00:06:04,880 Speaker 3: know that there are all sorts of offensive tricks that 110 00:06:05,000 --> 00:06:08,159 Speaker 3: you can get after online too, like you can spoof 111 00:06:08,200 --> 00:06:10,479 Speaker 3: IP addresses, and if he was doing that to sort 112 00:06:10,520 --> 00:06:13,640 Speaker 3: of throw people off his scent, that is a 113 00:06:13,680 --> 00:06:15,719 Speaker 3: possibility as well. We know, you know, for a kid with 114 00:06:15,760 --> 00:06:18,800 Speaker 3: a computer, it's not very hard. But again, where 115 00:06:18,800 --> 00:06:22,080 Speaker 3: are the digital forensic guys and the information that they're 116 00:06:22,320 --> 00:06:27,240 Speaker 3: supposedly exploiting? Like, every time I worked with special operations 117 00:06:27,320 --> 00:06:29,800 Speaker 3: units in the field, and every time we were in Afghanistan 118 00:06:29,839 --> 00:06:31,960 Speaker 3: and those guys would hit a target, they would come 119 00:06:32,000 --> 00:06:34,440 Speaker 3: back with a bunch of cell phones, with a bunch 120 00:06:34,480 --> 00:06:36,880 Speaker 3: of things to exploit, and we the nerds would get 121 00:06:36,880 --> 00:06:39,760 Speaker 3: to work figuring out, you know, who are in those contacts, 122 00:06:39,800 --> 00:06:42,920 Speaker 3: what are they saying? And we called it exploiting those devices. 123 00:06:43,120 --> 00:06:45,880 Speaker 3: Those devices, if any of these intel analysts in the 124 00:06:45,880 --> 00:06:49,080 Speaker 3: government are worth their salt, should have been exploited 125 00:06:49,360 --> 00:06:53,839 Speaker 3: twenty ways from Sunday by now. And his ability to 126 00:06:53,920 --> 00:06:57,040 Speaker 3: wipe that clean, if he actually did that, and we 127 00:06:57,200 --> 00:07:00,320 Speaker 3: can't get any information as the US government from these devices, 128 00:07:00,680 --> 00:07:03,480 Speaker 3: then all of these guys should be fired, because that 129 00:07:03,640 --> 00:07:07,640 Speaker 3: is something that, you know, even the most seasoned al 130 00:07:07,720 --> 00:07:11,160 Speaker 3: Qaeda operative is not as good as apparently this guy 131 00:07:11,280 --> 00:07:14,280 Speaker 3: was when it comes to operational security. So again, 132 00:07:14,640 --> 00:07:17,600 Speaker 3: doesn't really pass the smell test. 133 00:07:17,640 --> 00:07:20,520 Speaker 1: You know, so I guess, why do we not know 134 00:07:20,640 --> 00:07:23,600 Speaker 1: more about this guy?
Do you think our government does 135 00:07:23,640 --> 00:07:26,240 Speaker 1: and they're just not telling us? Or, you know, it's 136 00:07:26,240 --> 00:07:28,480 Speaker 1: just strange that we don't know more. 137 00:07:29,160 --> 00:07:31,360 Speaker 3: No, I agree. I think there's, like, a lack of 138 00:07:31,440 --> 00:07:34,960 Speaker 3: curiosity first and foremost from the government, as we 139 00:07:35,000 --> 00:07:37,360 Speaker 3: all know, because Donald Trump happens to be on the 140 00:07:37,360 --> 00:07:40,480 Speaker 3: wrong side of a lot of the faceless bureaucrats who 141 00:07:40,520 --> 00:07:46,080 Speaker 3: are employed in it. And then secondly, to me, 142 00:07:46,880 --> 00:07:49,920 Speaker 3: I've heard a number of things. I've heard 143 00:07:49,920 --> 00:07:54,840 Speaker 3: people say that the technical capability to exploit some of 144 00:07:54,880 --> 00:07:57,800 Speaker 3: this stuff is beyond the people that we have right 145 00:07:57,840 --> 00:08:00,960 Speaker 3: now employed by the US government. I don't really buy that, 146 00:08:01,320 --> 00:08:04,480 Speaker 3: to be quite honest. And then the other thing is, 147 00:08:04,640 --> 00:08:07,200 Speaker 3: you know, they have information and they're just sitting on it, 148 00:08:07,360 --> 00:08:10,920 Speaker 3: kind of like they released the body for cremation before 149 00:08:11,160 --> 00:08:14,240 Speaker 3: most people could actually have an interest in figuring out, 150 00:08:14,520 --> 00:08:17,040 Speaker 3: you know, how this guy actually died, et cetera, et cetera. 151 00:08:17,160 --> 00:08:21,720 Speaker 3: So to me, it's all just a lack of transparency. 152 00:08:21,880 --> 00:08:24,920 Speaker 3: And when you have a lack of transparency in the 153 00:08:24,960 --> 00:08:29,160 Speaker 3: information environment, then lots of things will quickly fill that vacuum, 154 00:08:29,480 --> 00:08:30,440 Speaker 3: and none of them good. 155 00:08:30,760 --> 00:08:32,920 Speaker 1: No, I know, my head's definitely filling that vacuum. And 156 00:08:32,960 --> 00:08:36,839 Speaker 1: I try, I try, I try to be careful about 157 00:08:36,840 --> 00:08:39,800 Speaker 1: how much I verbalize that without facts. 158 00:08:41,800 --> 00:08:44,280 Speaker 2: But you know, my mind's filling that void, I'll assure you. 159 00:08:44,520 --> 00:08:46,400 Speaker 1: All right. Well, you know, we've just 160 00:08:46,559 --> 00:08:48,839 Speaker 1: seen a lot of cyber attacks. You know, there were 161 00:08:48,920 --> 00:08:51,679 Speaker 1: these reports recently that a group of hackers linked to 162 00:08:51,679 --> 00:08:56,200 Speaker 1: the Chinese government had, you know, exposed 163 00:08:56,200 --> 00:08:59,840 Speaker 1: some vulnerability and was able to target US internet service providers. 164 00:09:00,200 --> 00:09:03,360 Speaker 1: We saw not too long ago AT&T notifying 165 00:09:03,400 --> 00:09:05,800 Speaker 1: all their customers that basically, like, you know, everyone was 166 00:09:05,840 --> 00:09:07,760 Speaker 1: hacked more or less, or that, you know, information was 167 00:09:07,800 --> 00:09:11,560 Speaker 1: stolen for a period of time. I guess it seems 168 00:09:11,559 --> 00:09:15,680 Speaker 1: like this is becoming more frequent with these cyber attacks. One, 169 00:09:15,800 --> 00:09:18,760 Speaker 1: who do you think is behind these increased cyber attacks?
170 00:09:18,800 --> 00:09:21,720 Speaker 1: Obviously, you know, that one was China. Why do you 171 00:09:21,800 --> 00:09:23,960 Speaker 1: think they're happening, and sort of what do you 172 00:09:24,000 --> 00:09:26,280 Speaker 1: expect in the future with cyber attacks? 173 00:09:26,640 --> 00:09:28,600 Speaker 3: Oh yeah, I think, you know, you can't discount the 174 00:09:28,640 --> 00:09:31,640 Speaker 3: usual suspects, right? There's China, there's Iran, there's North Korea, 175 00:09:31,760 --> 00:09:34,720 Speaker 3: and there's Russia. But then there's also a bevy of 176 00:09:34,800 --> 00:09:40,319 Speaker 3: what we call patriotic netizens, so people linked to the 177 00:09:40,800 --> 00:09:44,319 Speaker 3: CCP in very indirect ways who kind of feel 178 00:09:44,400 --> 00:09:47,199 Speaker 3: like they're doing their duty for China. So you can't 179 00:09:47,240 --> 00:09:50,840 Speaker 3: discount those guys. There's chaos agents always, and then sort 180 00:09:50,880 --> 00:09:54,080 Speaker 3: of activists that are always percolating in this environment too. 181 00:09:54,200 --> 00:09:57,560 Speaker 3: So there's a litany of actors that are all sort 182 00:09:57,600 --> 00:09:59,960 Speaker 3: of, you know, creating the problems that we have now. 183 00:10:00,280 --> 00:10:02,880 Speaker 3: But I do think zeroing in on China is really 184 00:10:02,960 --> 00:10:06,319 Speaker 3: critical here, because you look at from twenty fourteen onward, 185 00:10:06,360 --> 00:10:09,400 Speaker 3: and even before that, of course, but twenty fourteen, twenty fifteen, 186 00:10:09,480 --> 00:10:13,199 Speaker 3: you have the OPM hacks, right, where most of the 187 00:10:13,280 --> 00:10:18,240 Speaker 3: information of government employees was exploited by China-linked actors. 188 00:10:18,480 --> 00:10:21,440 Speaker 3: You have the Anthem hack, that's healthcare. You 189 00:10:21,440 --> 00:10:24,240 Speaker 3: have Equifax, that's money, obviously, credit. And then you 190 00:10:24,280 --> 00:10:27,079 Speaker 3: have the Marriott hack, which most people know. You know, 191 00:10:27,160 --> 00:10:32,040 Speaker 3: a lot of government, especially DOD, personnel use Marriott to travel. 192 00:10:32,480 --> 00:10:34,520 Speaker 3: That's sort of where they get their points, their 193 00:10:34,520 --> 00:10:37,760 Speaker 3: system and whatnot. So you have China, state-linked actors 194 00:10:37,800 --> 00:10:42,120 Speaker 3: rather, taking all of these data sets and, you know, 195 00:10:42,200 --> 00:10:44,560 Speaker 3: being able to at this point integrate them and then 196 00:10:44,720 --> 00:10:48,880 Speaker 3: use artificial intelligence to single people out for blackmail, for espionage, 197 00:10:49,480 --> 00:10:53,320 Speaker 3: for future exploitation, or anything, really, they want to do 198 00:10:53,400 --> 00:10:56,640 Speaker 3: with it. They tend to hack first and ask questions later. 199 00:10:57,280 --> 00:10:59,400 Speaker 3: I think this is the biggest problem that we face 200 00:10:59,440 --> 00:11:04,600 Speaker 3: in terms of the cybersecurity environment. That is China. Iran's 201 00:11:04,720 --> 00:11:07,120 Speaker 3: really, really good.
We don't really think about them too 202 00:11:07,160 --> 00:11:10,520 Speaker 3: often when it comes to cyberspace, but they are very good, 203 00:11:10,559 --> 00:11:13,320 Speaker 3: and they've been probing our critical infrastructure, just like China, 204 00:11:13,760 --> 00:11:17,760 Speaker 3: for a long time. And probing attacks are really interesting. 205 00:11:17,800 --> 00:11:20,720 Speaker 3: They're almost like a hey, come and get me, or like 206 00:11:20,800 --> 00:11:25,600 Speaker 3: sort of big-manning the US in specific ways, 207 00:11:25,640 --> 00:11:28,199 Speaker 3: because they're like, look, we're in your systems. We haven't 208 00:11:28,240 --> 00:11:30,400 Speaker 3: done anything yet, but we just want you to know 209 00:11:30,480 --> 00:11:32,600 Speaker 3: that we're here, and we could if we wanted to. 210 00:11:33,000 --> 00:11:37,400 Speaker 3: So that's another interesting element to the cybersecurity environment that 211 00:11:37,520 --> 00:11:40,320 Speaker 3: I don't think many people talk about. And it's very 212 00:11:40,320 --> 00:11:43,600 Speaker 3: problematic, because then you can sort of turn on those 213 00:11:43,640 --> 00:11:46,600 Speaker 3: bigger attacks that eventually cripple and take down elements of 214 00:11:46,600 --> 00:11:50,440 Speaker 3: our critical infrastructure whenever they want, or at critical times, 215 00:11:50,440 --> 00:11:54,080 Speaker 3: say maybe if Iran and Israel start to get into 216 00:11:54,120 --> 00:11:57,480 Speaker 3: a big hot war, that kind of thing. So cybersecurity, 217 00:11:57,520 --> 00:12:00,840 Speaker 3: I tell people this: you might not be interested in cybersecurity, 218 00:12:00,920 --> 00:12:04,240 Speaker 3: but cybersecurity is interested in you. So you've got to 219 00:12:04,240 --> 00:12:07,720 Speaker 3: be eternally vigilant when it comes to the cyberspace environment, 220 00:12:07,800 --> 00:12:09,079 Speaker 3: in particular in America. 221 00:12:09,120 --> 00:12:11,640 Speaker 1: Yeah, and God forbid, like, the electrical grid or 222 00:12:11,679 --> 00:12:13,800 Speaker 1: anything like that. I was on a panel one time, 223 00:12:13,840 --> 00:12:16,880 Speaker 1: I was moderating it, and one of the guys said 224 00:12:16,920 --> 00:12:18,960 Speaker 1: that there's two types of businesses: the ones who know 225 00:12:18,960 --> 00:12:20,560 Speaker 1: they've been hacked and the ones who don't know they've 226 00:12:20,559 --> 00:12:23,720 Speaker 1: been hacked, you know, basically implying that everyone's been hacked. 227 00:12:23,920 --> 00:12:26,160 Speaker 2: We've got to take a quick commercial break. More with Kara 228 00:12:26,240 --> 00:12:31,240 Speaker 1: on the other side. You had mentioned a sort 229 00:12:31,240 --> 00:12:34,600 Speaker 1: of usage for espionage, and is that for, like, coercion, 230 00:12:34,840 --> 00:12:37,200 Speaker 1: you know? And then, if so, like, how many members 231 00:12:37,200 --> 00:12:43,040 Speaker 1: of Congress do you think have been compromised then? 232 00:12:42,920 --> 00:12:45,120 Speaker 3: Yeah, that's part of it too. I think with their sort of 233 00:12:45,320 --> 00:12:49,920 Speaker 3: United Front work, you know, they've been pretty successful in softer ways. 234 00:12:50,400 --> 00:12:52,280 Speaker 3: I really think of Peter Schweizer, who I think you've 235 00:12:52,280 --> 00:12:53,600 Speaker 3: probably had on your show. 236 00:12:53,720 --> 00:12:55,640 Speaker 2: He's great. I love him. Excellent.
237 00:12:55,720 --> 00:12:58,319 Speaker 3: He's excellent. And when I 238 00:12:58,360 --> 00:13:01,200 Speaker 3: think of American politicians and how they're compromised, I think 239 00:13:01,240 --> 00:13:04,160 Speaker 3: of that phrase that he has really brought to 240 00:13:04,240 --> 00:13:07,200 Speaker 3: our consciousness: big help with a little badmouth. You know, 241 00:13:07,280 --> 00:13:10,680 Speaker 3: Tim Walz comes to mind in particular. The whole 242 00:13:10,720 --> 00:13:13,640 Speaker 3: idea is, you know, you have American politicians that are 243 00:13:13,679 --> 00:13:17,160 Speaker 3: really helping the efforts of the Chinese Communist Party, but 244 00:13:17,200 --> 00:13:20,280 Speaker 3: then they'll say bad things about them every now and then, like, oh, 245 00:13:20,400 --> 00:13:23,720 Speaker 3: you know, the Uyghurs in concentration camps, that's not great. 246 00:13:23,920 --> 00:13:27,760 Speaker 3: But overall, China is not an adversary, they're really just 247 00:13:27,840 --> 00:13:31,040 Speaker 3: a competitor, you know, they're 248 00:13:31,080 --> 00:13:34,240 Speaker 3: a challenger at times. So I think they've pretty much 249 00:13:34,280 --> 00:13:36,960 Speaker 3: got that whole big help with a little badmouth 250 00:13:37,800 --> 00:13:41,360 Speaker 3: and United Front work sort of going. And the United 251 00:13:41,360 --> 00:13:45,000 Speaker 3: Front work is the softer aspect of it, so not 252 00:13:45,040 --> 00:13:49,040 Speaker 3: necessarily those hard cybersecurity vulnerabilities that we're talking about in 253 00:13:49,120 --> 00:13:52,920 Speaker 3: terms of getting troves of information to integrate and then 254 00:13:53,000 --> 00:13:56,960 Speaker 3: potentially use later for dossiers and whatnot. But when it 255 00:13:57,000 --> 00:14:00,880 Speaker 3: comes to that kind of work, we 256 00:14:01,040 --> 00:14:06,400 Speaker 3: know that they have long and fulsome dossiers on Australian politicians, 257 00:14:06,679 --> 00:14:09,440 Speaker 3: and obviously there's a proximity issue there. So they've been 258 00:14:09,480 --> 00:14:13,520 Speaker 3: targeting the West via Australia for a very long time. 259 00:14:13,760 --> 00:14:16,960 Speaker 3: And we know that they also have dossiers on UK 260 00:14:17,200 --> 00:14:21,160 Speaker 3: politicians as well too. So I'm sure that there's some 261 00:14:21,360 --> 00:14:25,160 Speaker 3: element of, hey, we know all of these things, we 262 00:14:25,360 --> 00:14:29,000 Speaker 3: could, you know, let the American people know, or you 263 00:14:29,040 --> 00:14:31,600 Speaker 3: could, you know, work with us in other ways. 264 00:14:31,600 --> 00:14:36,320 Speaker 3: So I'm sure those elements have occurred before. Obviously, we 265 00:14:36,360 --> 00:14:39,320 Speaker 3: know the Biden family is in deep via the University 266 00:14:39,360 --> 00:14:42,760 Speaker 3: of Pennsylvania and their think tanks and whatnot, and all 267 00:14:42,800 --> 00:14:47,680 Speaker 3: of the reporting on Hunter Biden's associations with people in 268 00:14:47,720 --> 00:14:51,080 Speaker 3: the intelligence world in China. So I have no doubt 269 00:14:51,160 --> 00:14:54,880 Speaker 3: that the espionage, the exploitation, the blackmail game 270 00:14:54,920 --> 00:14:58,080 Speaker 3: has been occurring in the corridors of Washington, DC and 271 00:14:58,120 --> 00:14:58,960 Speaker 3: elsewhere as well.
272 00:14:58,840 --> 00:15:02,920 Speaker 1: By China. I wanted to get your take on this 273 00:15:03,040 --> 00:15:07,280 Speaker 1: bombshell from Mark Zuckerberg, you know, writing a letter to 274 00:15:07,480 --> 00:15:11,560 Speaker 1: Chairman Jim Jordan, saying in the letter, in twenty twenty one, 275 00:15:11,680 --> 00:15:14,360 Speaker 1: senior officials from the Biden administration, including the White House, 276 00:15:14,760 --> 00:15:17,640 Speaker 1: repeatedly pressured our teams for months to 277 00:15:17,760 --> 00:15:21,960 Speaker 1: censor certain COVID nineteen content. You know, it goes on, 278 00:15:22,040 --> 00:15:24,600 Speaker 1: but basically just, you know, admitting what we had known, 279 00:15:24,600 --> 00:15:27,520 Speaker 1: admitting that the FBI was meddling with the, you know, 280 00:15:27,560 --> 00:15:31,400 Speaker 1: the fake Russian disinformation story ahead of the twenty twenty 281 00:15:31,440 --> 00:15:35,760 Speaker 1: election as well. What do you make of this bombshell, 282 00:15:35,880 --> 00:15:37,160 Speaker 1: and why now? 283 00:15:38,000 --> 00:15:40,760 Speaker 3: Yeah, this is a great question. And what I've been 284 00:15:40,760 --> 00:15:43,800 Speaker 3: trying to do right now is to tell Republicans, put 285 00:15:43,840 --> 00:15:47,680 Speaker 3: the ticker tape parade away. Everybody, stop this. 286 00:15:47,880 --> 00:15:50,960 Speaker 3: Stop taking a victory lap for free speech and thinking 287 00:15:50,960 --> 00:15:52,880 Speaker 3: that your work here is done and packing up and 288 00:15:52,880 --> 00:15:56,160 Speaker 3: going home. This is not that. The timing is critical, 289 00:15:56,240 --> 00:15:59,000 Speaker 3: as you noted, because of what I think Mark Zuckerberg has 290 00:15:59,040 --> 00:16:01,880 Speaker 3: done here, because he is a very savvy actor. He 291 00:16:01,960 --> 00:16:04,920 Speaker 3: is a very smart man, and his instincts are very 292 00:16:04,960 --> 00:16:08,120 Speaker 3: good when it comes to some of the political environment. 293 00:16:08,400 --> 00:16:11,280 Speaker 3: You know, I've said that he lets his PR people 294 00:16:11,400 --> 00:16:13,640 Speaker 3: and some of the politicals in the building kind 295 00:16:13,640 --> 00:16:16,240 Speaker 3: of mess him up and mess with what he really 296 00:16:16,240 --> 00:16:18,240 Speaker 3: wants to do, and get him to make bad decisions. 297 00:16:18,280 --> 00:16:21,400 Speaker 3: But when he's left to his own devices, he's really, 298 00:16:21,440 --> 00:16:23,960 Speaker 3: really smart. And what I think he's done here is 299 00:16:24,040 --> 00:16:27,080 Speaker 3: taken a look at the landscape and basically said, okay: 300 00:16:27,640 --> 00:16:31,560 Speaker 3: in June, the Supreme Court, in a decision in 301 00:16:31,840 --> 00:16:36,080 Speaker 3: Murthy v. Missouri, found no standing for plaintiffs 302 00:16:36,080 --> 00:16:39,440 Speaker 3: that were demanding an end to this exact type of 303 00:16:39,480 --> 00:16:43,320 Speaker 3: collusion between Big Tech and the Biden administration and Biden 304 00:16:43,360 --> 00:16:46,800 Speaker 3: campaign officials at the time, and they rejected, this is 305 00:16:46,800 --> 00:16:49,960 Speaker 3: the Supreme Court again, rejected state efforts to permanently block 306 00:16:50,000 --> 00:16:53,520 Speaker 3: the Biden administration from working with Big Tech to censor speech.
307 00:16:53,640 --> 00:16:55,120 Speaker 3: What they did is they kicked it down to the 308 00:16:55,160 --> 00:16:57,960 Speaker 3: lower court. They had a chance to say, this is 309 00:16:58,200 --> 00:17:01,840 Speaker 3: a violation of the First Amendment, this cannot stand, but 310 00:17:01,920 --> 00:17:04,679 Speaker 3: they didn't. They were more like, hmm, lower courts, you 311 00:17:04,720 --> 00:17:06,600 Speaker 3: decide it. Like, we're not going to decide this case 312 00:17:06,600 --> 00:17:09,200 Speaker 3: on the merits, we're gonna sort of have a process 313 00:17:09,280 --> 00:17:12,040 Speaker 3: decision here. Like, you guys do it in the lower 314 00:17:12,080 --> 00:17:15,639 Speaker 3: courts again. So that was a big blow to the 315 00:17:15,720 --> 00:17:18,600 Speaker 3: efforts on my side to hold Big Tech accountable. And 316 00:17:18,760 --> 00:17:21,720 Speaker 3: obviously Mark Zuckerberg did not release his letter when that 317 00:17:21,920 --> 00:17:25,000 Speaker 3: lawsuit was going on, before that decision was occurring. So 318 00:17:25,000 --> 00:17:28,360 Speaker 3: that's point one. Point two is that he knows Democrats 319 00:17:28,640 --> 00:17:31,760 Speaker 3: like what had happened here. They like what's continuing to 320 00:17:31,800 --> 00:17:36,680 Speaker 3: happen on Facebook's platform, aka helping the left, censoring conservative speech, 321 00:17:36,720 --> 00:17:39,440 Speaker 3: so they're not going to do anything to him. Republicans, 322 00:17:39,760 --> 00:17:44,240 Speaker 3: they are amazing at writing these strongly worded letters, at fundraising, 323 00:17:44,400 --> 00:17:46,639 Speaker 3: but ultimately, when it comes down to it, they're really 324 00:17:46,680 --> 00:17:48,800 Speaker 3: not going to do anything to Facebook, because it's a 325 00:17:48,800 --> 00:17:52,120 Speaker 3: private company, and private companies are the engines of innovation 326 00:17:52,160 --> 00:17:54,400 Speaker 3: in America and you can't touch them, that kind of thing. 327 00:17:54,480 --> 00:17:56,760 Speaker 3: So Mark's like, all right, we don't know what's going 328 00:17:56,800 --> 00:17:59,320 Speaker 3: to happen at the end of this year with 329 00:17:59,440 --> 00:18:02,120 Speaker 3: the election, so I'm going to pander a little bit 330 00:18:02,160 --> 00:18:04,159 Speaker 3: to the Joe Rogan crowd. I'm going to get on 331 00:18:04,200 --> 00:18:06,000 Speaker 3: my, you know, UFC. I'm going to roll on the 332 00:18:06,040 --> 00:18:08,040 Speaker 3: mat a little bit. I'm going to surf with an 333 00:18:08,080 --> 00:18:10,920 Speaker 3: American flag. And I'm going to say, you know, Trump, 334 00:18:11,280 --> 00:18:13,680 Speaker 3: you know, when he stood up after he got shot, 335 00:18:13,720 --> 00:18:16,760 Speaker 3: that was a bad mamma jamma kind of action. 336 00:18:17,200 --> 00:18:19,840 Speaker 3: And I'm going to give them this little nugget, and 337 00:18:19,880 --> 00:18:23,040 Speaker 3: they'll probably be happy, and they'll probably, you know, take 338 00:18:23,080 --> 00:18:25,640 Speaker 3: the pressure off me a little bit. So I think 339 00:18:25,720 --> 00:18:28,600 Speaker 3: that's all this is.
Because the last thing that you'll 340 00:18:28,640 --> 00:18:30,520 Speaker 3: have to remember, and then I promise I'll stop, is 341 00:18:30,560 --> 00:18:33,840 Speaker 3: that all the while this is happening, Facebook is still 342 00:18:34,240 --> 00:18:40,120 Speaker 3: censoring conservative and conservative-coded and heterodox information. Twenty-nine, 343 00:18:40,200 --> 00:18:44,720 Speaker 3: thirty days ago at this point, Facebook apologized for 344 00:18:44,880 --> 00:18:48,840 Speaker 3: censoring Trump, the iconic picture of him raising his fist 345 00:18:48,920 --> 00:18:53,760 Speaker 3: in the air. They labeled it misinformation. And again, these mistakes, 346 00:18:53,880 --> 00:18:58,040 Speaker 3: these content moderation, quote unquote, mistakes, only seem to go 347 00:18:58,200 --> 00:19:00,679 Speaker 3: in one direction, the one that benefits the left. So 348 00:19:00,720 --> 00:19:02,480 Speaker 3: Mark can say whatever he has to say, but 349 00:19:02,560 --> 00:19:05,439 Speaker 3: Facebook is still crushing conservative speech. 350 00:19:06,520 --> 00:19:10,399 Speaker 1: Is there any world where he might start 351 00:19:10,480 --> 00:19:12,720 Speaker 1: liking Trump? And, you know, you had mentioned 352 00:19:12,760 --> 00:19:14,919 Speaker 1: the fact that he, you know, he does like 353 00:19:15,040 --> 00:19:16,639 Speaker 1: MMA and, like, these different, you know... do you think 354 00:19:16,640 --> 00:19:19,320 Speaker 1: there's any world where he really did see Trump get 355 00:19:19,320 --> 00:19:21,160 Speaker 1: shot and get back up and was like, all right, 356 00:19:21,320 --> 00:19:24,960 Speaker 1: like, that's pretty cool? Yeah, we've seen Silicon 357 00:19:25,040 --> 00:19:28,520 Speaker 1: Valley move a little bit more, you know, towards Trump 358 00:19:28,560 --> 00:19:31,399 Speaker 1: via David Sacks doing that big fundraiser, I think it 359 00:19:31,440 --> 00:19:34,840 Speaker 1: was back in June. You've got more people coming out 360 00:19:35,000 --> 00:19:36,920 Speaker 1: in the tech world being like, you know, I 361 00:19:36,960 --> 00:19:38,040 Speaker 1: actually kind of like Trump. 362 00:19:38,040 --> 00:19:38,920 Speaker 2: I was fooled before. 363 00:19:39,000 --> 00:19:40,879 Speaker 1: So I don't know, like, do you think there's a 364 00:19:40,920 --> 00:19:44,239 Speaker 1: world where maybe he is kind of like part of 365 00:19:44,280 --> 00:19:47,479 Speaker 1: that group that is sort of coming around to Donald Trump, 366 00:19:47,880 --> 00:19:49,879 Speaker 1: or do you think it's just for, you know, 367 00:19:50,080 --> 00:19:53,040 Speaker 1: face-saving purposes, as you pointed out? 368 00:19:53,040 --> 00:19:53,840 Speaker 3: No, I think it's actually both. 369 00:19:53,880 --> 00:19:56,280 Speaker 3: So, like I said, his instincts... like, this is the 370 00:19:56,320 --> 00:19:59,960 Speaker 3: guy who in October twenty nineteen went to Georgetown 371 00:20:00,040 --> 00:20:02,679 Speaker 3: and gave a speech, and he basically said, China 372 00:20:02,760 --> 00:20:07,760 Speaker 3: censors people, China is overly censorious and awful, and Facebook 373 00:20:07,840 --> 00:20:10,959 Speaker 3: is going to be the free speech platform that you 374 00:20:11,040 --> 00:20:14,000 Speaker 3: guys should basically cling to. We are going to be 375 00:20:14,119 --> 00:20:18,760 Speaker 3: that counterweight to an authoritarian, digitally minded China.
And I 376 00:20:18,840 --> 00:20:20,639 Speaker 3: remember, I listened to it live. I was like, wow, 377 00:20:20,800 --> 00:20:23,760 Speaker 3: this is okay, okay. And it really tracked with some 378 00:20:23,800 --> 00:20:25,719 Speaker 3: of the things that I heard him say in 379 00:20:25,760 --> 00:20:27,840 Speaker 3: some of his, you know, public quarterly meetings and whatnot 380 00:20:27,840 --> 00:20:29,800 Speaker 3: when I worked at Facebook. I was like, this guy 381 00:20:29,840 --> 00:20:33,080 Speaker 3: is really compelling. Like, his vision, if he could enact 382 00:20:33,119 --> 00:20:36,359 Speaker 3: it purely, is kind of interesting. And again, his 383 00:20:36,480 --> 00:20:38,720 Speaker 3: instincts are pretty good. And then you have, you know, 384 00:20:38,760 --> 00:20:41,439 Speaker 3: the Logan Pauls, these famous golfers, all sort of 385 00:20:41,480 --> 00:20:45,399 Speaker 3: gravitating toward the common-man nature that Trump evinces. 386 00:20:45,600 --> 00:20:48,760 Speaker 3: So I think there's something in Mark Zuckerberg, who also 387 00:20:48,880 --> 00:20:52,399 Speaker 3: said, I believe, the antidote to bad speech is more speech. 388 00:20:52,600 --> 00:20:57,200 Speaker 3: There's something in Mark that wants to be a good guy, right? 389 00:20:57,280 --> 00:21:00,919 Speaker 3: There's something in him that just wants to be good 390 00:21:01,119 --> 00:21:04,720 Speaker 3: and wants to understand the truth with a capital T. 391 00:21:05,240 --> 00:21:11,239 Speaker 3: But again, he gets so bamboozled, I think, by 392 00:21:11,800 --> 00:21:14,840 Speaker 3: all of the voices that are around him. And then, 393 00:21:14,960 --> 00:21:18,440 Speaker 3: generally, right, like, let's be honest, Facebook is probably 394 00:21:18,480 --> 00:21:21,000 Speaker 3: a net negative for the world. And I have said 395 00:21:21,000 --> 00:21:23,439 Speaker 3: this before, like, it is not a good thing, this 396 00:21:23,520 --> 00:21:27,480 Speaker 3: platform, and Instagram, that, you know, connects perverts with young children. 397 00:21:27,560 --> 00:21:30,439 Speaker 3: And the Wall Street Journal has done tons and tons of 398 00:21:30,600 --> 00:21:34,280 Speaker 3: very technical, great exposés of what these algorithms do, 399 00:21:34,359 --> 00:21:36,880 Speaker 3: and what these platforms do and what they knowingly do 400 00:21:36,960 --> 00:21:39,479 Speaker 3: and double down on, especially when it comes to children. 401 00:21:39,720 --> 00:21:42,880 Speaker 3: So on balance, not a great guy, not a great 402 00:21:42,920 --> 00:21:45,280 Speaker 3: thing, in my mind, that Facebook came into the world. 403 00:21:45,840 --> 00:21:48,640 Speaker 3: But there's something inside of Mark Zuckerberg I do think 404 00:21:48,760 --> 00:21:50,760 Speaker 3: can be turned in the end. We might see it 405 00:21:50,800 --> 00:21:53,080 Speaker 3: in about twenty years, but we're not quite there yet. 406 00:21:53,359 --> 00:21:56,359 Speaker 2: Well, Mark, if you're listening, we'll have you on if you 407 00:21:56,520 --> 00:21:57,560 Speaker 2: promise to be a good boy. 408 00:21:57,640 --> 00:22:00,480 Speaker 1: We've got more with Kara, but first: we are quickly 409 00:22:00,520 --> 00:22:03,880 Speaker 1: approaching the one year anniversary of the horrific Hamas attacks 410 00:22:03,880 --> 00:22:06,760 Speaker 1: on Israel, and still the Holy Land continues to be 411 00:22:06,800 --> 00:22:11,600 Speaker 1: attacked on multiple fronts.
Deadly threats are increasing in northern Israel. 412 00:22:12,200 --> 00:22:15,399 Speaker 1: Constant rocket attacks from Hezbollah have been fired at Israel, 413 00:22:15,440 --> 00:22:20,720 Speaker 1: causing widespread damage, with raging wildfires destroying precious farmland. Since 414 00:22:20,760 --> 00:22:23,600 Speaker 1: the war started, the International Fellowship of Christians and Jews 415 00:22:23,600 --> 00:22:26,240 Speaker 1: has been on the forefront in Israel, addressing the needs 416 00:22:26,240 --> 00:22:29,800 Speaker 1: of the most vulnerable. That's why I'm partnering with IFCJ today. 417 00:22:30,280 --> 00:22:33,119 Speaker 1: Your life-saving donation will help provide emergency food as 418 00:22:33,160 --> 00:22:36,840 Speaker 1: well as critical security needs such as flak jackets, firefighting equipment, 419 00:22:36,960 --> 00:22:40,520 Speaker 1: armored vehicles, bomb shelters and more. We're looking for five 420 00:22:40,600 --> 00:22:43,840 Speaker 1: hundred listeners to join the fellowship and me by donating 421 00:22:43,880 --> 00:22:46,919 Speaker 1: one hundred and fifty dollars to meet these urgent security needs. 422 00:22:47,320 --> 00:22:50,240 Speaker 1: And thanks to a generous IFCJ supporter, your gift will 423 00:22:50,240 --> 00:22:53,080 Speaker 1: be matched, doubling your impact in the Holy Land. Call 424 00:22:53,160 --> 00:22:55,760 Speaker 1: to make your gift right now at eight eight eight 425 00:22:55,960 --> 00:22:59,760 Speaker 1: four eight eight IFCJ. That's eight eight eight four eight 426 00:22:59,840 --> 00:23:03,760 Speaker 1: eight IFCJ, or four three two five, or go online 427 00:23:03,880 --> 00:23:08,080 Speaker 1: to support IFCJ dot org to give. That's one word, 428 00:23:08,480 --> 00:23:12,600 Speaker 1: supportIFCJ dot org. Israel needs your support now. 429 00:23:15,840 --> 00:23:19,719 Speaker 1: I wanted to ask you about this TikTok case that 430 00:23:20,160 --> 00:23:23,479 Speaker 1: has ramifications for Section two thirty. You know, sort of 431 00:23:23,560 --> 00:23:27,400 Speaker 1: explain the significance of this and how it might then impact, 432 00:23:27,840 --> 00:23:29,720 Speaker 1: you know, some of these other social media companies. 433 00:23:30,160 --> 00:23:34,040 Speaker 3: Yeah, so this is interesting, potentially huge, potentially bigger than 434 00:23:34,160 --> 00:23:37,720 Speaker 3: the Zuckerberg bombshell. And what it is, is the Third 435 00:23:37,800 --> 00:23:42,159 Speaker 3: Circuit court decided to start to pierce the armor 436 00:23:42,359 --> 00:23:45,679 Speaker 3: of Big Tech when it comes to Section two thirty 437 00:23:46,560 --> 00:23:49,719 Speaker 3: immunity from civil liability. So a lot of people have 438 00:23:49,800 --> 00:23:52,720 Speaker 3: now heard of this famous Section two thirty. What it is: 439 00:23:52,760 --> 00:23:55,800 Speaker 3: it's part of the nineteen ninety six Communications Decency Act 440 00:23:56,040 --> 00:23:59,960 Speaker 3: that basically says, okay, Big Tech, you are not liable, 441 00:24:00,480 --> 00:24:02,480 Speaker 3: you're not gonna have to face all of these 442 00:24:02,560 --> 00:24:07,040 Speaker 3: lawsuits that could potentially inhibit your growth as nascent platforms, 443 00:24:07,080 --> 00:24:10,240 Speaker 3: for third-party content that's hosted on your platform.
So 444 00:24:10,520 --> 00:24:13,719 Speaker 3: if someone's in the comment section saying, like, really gross, horrible, 445 00:24:13,760 --> 00:24:17,560 Speaker 3: obscene, disgusting things and somebody wants to sue you 446 00:24:17,640 --> 00:24:21,320 Speaker 3: for it, you are going to be safe from civil liability, 447 00:24:21,600 --> 00:24:25,200 Speaker 3: because you are a platform. You're not a publisher, 448 00:24:25,240 --> 00:24:28,560 Speaker 3: you're a platform. So big tech companies have really hidden 449 00:24:28,600 --> 00:24:33,760 Speaker 3: behind this when their platforms have really, I would say, 450 00:24:33,760 --> 00:24:38,080 Speaker 3: propagated all manner of evil and really ridiculous things. And 451 00:24:38,280 --> 00:24:42,480 Speaker 3: one of these platforms, TikTok, more evil than most, 452 00:24:42,840 --> 00:24:46,760 Speaker 3: faced a lawsuit from a Pennsylvania mother whose 453 00:24:46,840 --> 00:24:49,840 Speaker 3: ten-year-old, in her feed, by the 454 00:24:49,920 --> 00:24:53,680 Speaker 3: For You algorithm of TikTok, was fed the blackout challenge, 455 00:24:54,080 --> 00:24:57,160 Speaker 3: where kids try to choke themselves until they pass out. 456 00:24:57,440 --> 00:25:00,000 Speaker 3: She watched one of these videos, she went to the closet, 457 00:25:00,320 --> 00:25:02,240 Speaker 3: she tried to use, I believe it was her mother's 458 00:25:02,280 --> 00:25:04,679 Speaker 3: belt or a belt in the closet, to attempt the 459 00:25:04,680 --> 00:25:09,000 Speaker 3: blackout challenge, and she killed herself by accident. And her 460 00:25:09,040 --> 00:25:12,160 Speaker 3: mother ended up suing. It was tossed out based off 461 00:25:12,160 --> 00:25:15,919 Speaker 3: of Section two thirty protections, but the Third Circuit actually 462 00:25:15,960 --> 00:25:19,560 Speaker 3: said, no, no, TikTok, you could be liable, because of 463 00:25:19,600 --> 00:25:23,440 Speaker 3: your algorithms, for pushing this kind of thing to this 464 00:25:23,600 --> 00:25:27,800 Speaker 3: child who eventually killed herself, because without that algorithm surfacing 465 00:25:28,160 --> 00:25:30,440 Speaker 3: that content, she never would have seen it, never would 466 00:25:30,440 --> 00:25:33,639 Speaker 3: have done this. So this could be the first chink 467 00:25:33,640 --> 00:25:36,359 Speaker 3: in the armor of Section two thirty that big tech 468 00:25:36,400 --> 00:25:40,080 Speaker 3: has really cloaked itself in for the past few decades. 469 00:25:40,119 --> 00:25:43,800 Speaker 3: Which, we believe, and here's my mantra, this 470 00:25:43,880 --> 00:25:47,000 Speaker 3: is axiomatic at this point: if Justice Thomas says it, 471 00:25:47,000 --> 00:25:47,480 Speaker 3: it's right. 472 00:25:47,840 --> 00:25:53,359 Speaker 4: So Justice Thomas has really, really been questioning the expanded 473 00:25:53,400 --> 00:25:57,040 Speaker 4: interpretation of Section two thirty, when tech companies don't really 474 00:25:57,080 --> 00:26:01,560 Speaker 4: have any responsibilities in tandem with this immunity. So the 475 00:26:01,600 --> 00:26:03,000 Speaker 4: decision really expands on that. 476 00:26:03,040 --> 00:26:05,600 Speaker 3: At least one of the judges says there's something to 477 00:26:06,040 --> 00:26:11,880 Speaker 3: Clarence Thomas's interpretation of this overly broad Section two thirty protection. 478 00:26:12,240 --> 00:26:14,840 Speaker 3: It's too extended.
It's been interpreted by the courts over 479 00:26:14,880 --> 00:26:17,320 Speaker 3: the years in too broad a fashion. Maybe it's time 480 00:26:17,359 --> 00:26:20,600 Speaker 3: to rein it in. So my friend Matt Stoller, kind 481 00:26:20,600 --> 00:26:24,480 Speaker 3: of a famous antitrust influencer, he's gone so far 482 00:26:24,520 --> 00:26:28,720 Speaker 3: as to say Big Tech's business model is over, because 483 00:26:28,760 --> 00:26:32,440 Speaker 3: Section two thirty might not be that perfect set of armor 484 00:26:32,480 --> 00:26:35,240 Speaker 3: that Big Tech's been using against families, against children, 485 00:26:35,320 --> 00:26:36,679 Speaker 3: against the American people, really. 486 00:26:36,800 --> 00:26:39,600 Speaker 1: Yeah, I think we need bracelets that say, what 487 00:26:39,640 --> 00:26:40,840 Speaker 1: would Clarence Thomas do? 488 00:26:42,480 --> 00:26:44,399 Speaker 2: Wait, we'll have one. I've got one. 489 00:26:44,640 --> 00:26:49,120 Speaker 1: He's a good barometer of wrong versus right, so I agree. 490 00:26:49,400 --> 00:26:52,280 Speaker 1: Kara Frederick, love you. Thanks so much for coming on. 491 00:26:52,359 --> 00:26:54,720 Speaker 1: You're so smart. Always appreciate you making the time. I 492 00:26:54,760 --> 00:26:58,040 Speaker 1: always love seeing you. Anything for you. All right, come 493 00:26:58,040 --> 00:26:58,840 Speaker 1: to Florida and visit. 494 00:26:59,119 --> 00:27:00,440 Speaker 3: Will do. 495 00:27:00,640 --> 00:27:03,879 Speaker 1: That was Kara Frederick with Heritage. Appreciate her making the 496 00:27:03,920 --> 00:27:05,600 Speaker 1: time to come on the show. Appreciate you guys at 497 00:27:05,600 --> 00:27:08,560 Speaker 1: home for listening every Monday and Thursday, but of course 498 00:27:08,600 --> 00:27:10,080 Speaker 1: you can listen throughout the week. I want to thank 499 00:27:10,119 --> 00:27:12,240 Speaker 1: John Cassio and my producer for putting the show together. 500 00:27:12,359 --> 00:27:13,080 Speaker 1: Until next time.