1 00:00:15,280 --> 00:00:24,800 Speaker 1: From Kaleidoscope and iHeart Podcasts. This is Kill Switch. 2 00:00:26,239 --> 00:00:30,120 Speaker 1: I'm Dexter Thomas. Maybe you've heard of Signalgate. If not, 3 00:00:30,400 --> 00:00:32,960 Speaker 1: I'll get you up to speed real quick. Back in March, 4 00:00:33,040 --> 00:00:36,040 Speaker 1: a bunch of top White House staff, including the Vice President, 5 00:00:36,320 --> 00:00:39,560 Speaker 1: Secretary of Defense, and a lot of other people, were 6 00:00:39,560 --> 00:00:42,800 Speaker 1: discussing military operations in Yemen. And they were having this 7 00:00:42,840 --> 00:00:46,120 Speaker 1: discussion on a chat app called Signal, which you'll hear 8 00:00:46,159 --> 00:00:48,640 Speaker 1: more about in a second. But amongst all the White 9 00:00:48,640 --> 00:00:53,080 Speaker 1: House officials was an outsider. Somebody, and that somebody turned 10 00:00:53,120 --> 00:00:56,160 Speaker 1: out to be US National Security Advisor Mike Waltz, had 11 00:00:56,320 --> 00:00:59,520 Speaker 1: accidentally added a journalist from The Atlantic to that group chat. 12 00:01:00,320 --> 00:01:03,400 Speaker 1: That journalist did what journalists are supposed to do: tell 13 00:01:03,440 --> 00:01:06,880 Speaker 1: the public what's happening. It was embarrassing for the White House, 14 00:01:07,080 --> 00:01:09,280 Speaker 1: and it also raised a lot of questions about what 15 00:01:09,400 --> 00:01:12,600 Speaker 1: exactly the US is doing overseas. It's a big enough 16 00:01:12,640 --> 00:01:16,080 Speaker 1: deal that Signalgate has its own Wikipedia entry, and the 17 00:01:16,120 --> 00:01:23,279 Speaker 1: story might have ended there. But then, fast forward to May.
First, 18 00:01:23,520 --> 00:01:26,360 Speaker 1: a picture started to circulate of the very same guy, 19 00:01:26,640 --> 00:01:30,160 Speaker 1: Mike Waltz, looking at his phone in a meeting using 20 00:01:30,160 --> 00:01:33,120 Speaker 1: an app that looked familiar, sort of. 21 00:01:33,520 --> 00:01:36,039 Speaker 2: He was in a cabinet meeting with President Trump and 22 00:01:36,080 --> 00:01:39,839 Speaker 2: other senior officials, and the Reuters photographer got him looking 23 00:01:39,920 --> 00:01:43,160 Speaker 2: down at his phone, and if you zoomed in, it 24 00:01:43,200 --> 00:01:45,080 Speaker 2: looked like he was using Signal, which is of course 25 00:01:45,120 --> 00:01:48,480 Speaker 2: very, very funny and very, very ironic. So people saw 26 00:01:48,520 --> 00:01:52,400 Speaker 2: that, they started tweeting it and posting on Bluesky saying, 27 00:01:52,440 --> 00:01:56,080 Speaker 2: ha ha, look, Mike Waltz is using Signal in this photo. 28 00:01:56,320 --> 00:01:58,680 Speaker 2: And then of course when you zoom in there are names. 29 00:01:58,840 --> 00:02:02,600 Speaker 2: It shows the chat, and there it says Rubio. I only 30 00:02:02,640 --> 00:02:05,840 Speaker 2: know of one senior official who has the surname Rubio, 31 00:02:06,160 --> 00:02:09,919 Speaker 2: which obviously relates to the Department of State. There's Gabbard, 32 00:02:10,000 --> 00:02:12,760 Speaker 2: I think probably Tulsi Gabbard. And then there's one that 33 00:02:12,800 --> 00:02:14,160 Speaker 2: just straight up says JD Vance. 34 00:02:14,440 --> 00:02:16,280 Speaker 1: Yeah, I was gonna say.
JD Vance is right in there, and 35 00:02:16,280 --> 00:02:20,200 Speaker 1: I'm just, I'm reading part of JD Vance's messages right here, 36 00:02:20,240 --> 00:02:24,200 Speaker 1: just in this picture, which brings up an entirely different question, 37 00:02:24,639 --> 00:02:29,880 Speaker 1: which is: should you be reading your text messages during 38 00:02:29,919 --> 00:02:32,320 Speaker 1: a meeting in which there are photographers? But I suppose 39 00:02:32,360 --> 00:02:36,960 Speaker 1: we can leave that to the side. This is my 40 00:02:37,080 --> 00:02:40,320 Speaker 1: friend and former colleague from my Vice News days, Joseph Cox. 41 00:02:40,760 --> 00:02:43,800 Speaker 1: He now reports for and co-owns the online publication 42 00:02:43,960 --> 00:02:44,880 Speaker 1: 404 Media. 43 00:02:45,400 --> 00:02:47,760 Speaker 2: Somebody put it in our work group chat 44 00:02:47,800 --> 00:02:51,359 Speaker 2: and I was like, I look at Signal every day, 45 00:02:51,880 --> 00:02:54,440 Speaker 2: like way too much, like twelve hours a day. I'm 46 00:02:54,440 --> 00:02:57,680 Speaker 2: looking at that app. This is not Signal. There's 47 00:02:57,760 --> 00:02:59,799 Speaker 2: something ever so slightly different. And when I look 48 00:02:59,840 --> 00:03:03,000 Speaker 2: at the bottom, there's usually a message in Signal that says 49 00:03:03,000 --> 00:03:05,480 Speaker 2: please verify your Signal PIN. And this is so you 50 00:03:05,480 --> 00:03:07,320 Speaker 2: remember the PIN so you don't get locked out, and 51 00:03:07,360 --> 00:03:09,480 Speaker 2: it keeps your account more secure. It's a very good thing. 52 00:03:09,960 --> 00:03:13,520 Speaker 2: But at the bottom it said please remember your TM 53 00:03:13,960 --> 00:03:17,560 Speaker 2: SGNL PIN. And I'm like, what the hell 54 00:03:17,680 --> 00:03:20,720 Speaker 2: is that?
And then I Google around, and that leads 55 00:03:20,760 --> 00:03:25,760 Speaker 2: me to TeleMessage and their modified version of Signal. 56 00:03:26,040 --> 00:03:28,680 Speaker 1: You really have to be hyper-focused on this stuff 57 00:03:29,040 --> 00:03:31,880 Speaker 1: to notice it at all. And then you further zoomed 58 00:03:31,880 --> 00:03:34,359 Speaker 1: in and said, yo, this ain't Signal. 59 00:03:34,160 --> 00:03:37,360 Speaker 2: This is not Signal, this is some weird modified version. 60 00:03:38,360 --> 00:03:42,640 Speaker 2: So yeah, we report that Mike Waltz has inadvertently revealed 61 00:03:43,160 --> 00:03:47,080 Speaker 2: the weird Signal clone that the Trump administration is using. 62 00:03:49,040 --> 00:03:52,360 Speaker 1: Just to clarify here, Signal, the actual Signal, is a 63 00:03:52,360 --> 00:03:55,640 Speaker 1: messaging app that uses what's called end to end encryption, 64 00:03:56,120 --> 00:03:59,240 Speaker 1: which means that when you send a message, nobody, not 65 00:03:59,360 --> 00:04:02,640 Speaker 1: the government, not even Signal, can read your messages. Even 66 00:04:02,640 --> 00:04:05,880 Speaker 1: if they do intercept them, only the intended recipient can 67 00:04:05,920 --> 00:04:09,800 Speaker 1: read the message. It's basically the gold standard for secure messaging. 68 00:04:10,520 --> 00:04:13,080 Speaker 1: What Joseph noticed was that the app Mike Waltz was 69 00:04:13,160 --> 00:04:16,520 Speaker 1: using in the picture looked like the actual Signal, but 70 00:04:16,760 --> 00:04:20,480 Speaker 1: that something was off. This app looked kind of similar, 71 00:04:20,720 --> 00:04:23,400 Speaker 1: but it was clear that someone else had modified it, 72 00:04:23,800 --> 00:04:27,000 Speaker 1: and his suspicion was that this modified version might not 73 00:04:27,080 --> 00:04:29,880 Speaker 1: have had the same security that the actual Signal does.
74 00:04:30,760 --> 00:04:33,000 Speaker 1: So Joseph wrote an article on 404 Media that 75 00:04:33,120 --> 00:04:36,240 Speaker 1: explained the issue here, and the key was that phrase 76 00:04:36,279 --> 00:04:42,040 Speaker 1: at the bottom, TM SGNL. Joseph started digging around and realized 77 00:04:42,080 --> 00:04:45,039 Speaker 1: that this was probably from a company called TeleMessage. 78 00:04:45,600 --> 00:04:50,320 Speaker 1: This company basically makes clones of popular messaging apps like WhatsApp, Telegram, 79 00:04:50,480 --> 00:04:55,479 Speaker 1: WeChat, and yes, Signal, with one extra feature. It 80 00:04:55,600 --> 00:04:59,320 Speaker 1: saves copies of all the messages sent or received onto 81 00:04:59,360 --> 00:05:02,719 Speaker 1: another server. Not to spoil the next part, but maybe 82 00:05:02,760 --> 00:05:06,520 Speaker 1: you're already seeing the problem here. So let's talk about 83 00:05:06,520 --> 00:05:10,279 Speaker 1: the timeline. This picture of Mike Waltz circulates and Joseph 84 00:05:10,320 --> 00:05:14,800 Speaker 1: publishes that article on May first. On May third, two 85 00:05:14,920 --> 00:05:19,200 Speaker 1: days later, an anonymous hacker contacts another reporter named Micah 86 00:05:19,279 --> 00:05:22,080 Speaker 1: Lee, saying that they've managed to hack TeleMessage, 87 00:05:22,600 --> 00:05:26,280 Speaker 1: and so Micah and Joseph put together another article explaining 88 00:05:26,400 --> 00:05:27,760 Speaker 1: how bad this really is. 89 00:05:32,080 --> 00:05:34,960 Speaker 2: I'm getting these screenshots and I'm getting sent this data. 90 00:05:35,440 --> 00:05:37,640 Speaker 2: One of the first examples is a screenshot of a 91 00:05:37,680 --> 00:05:40,760 Speaker 2: TeleMessage back end, which, you know, a normal person should 92 00:05:40,760 --> 00:05:42,120 Speaker 2: not be able to access, and a hacker shouldn't be 93 00:05:42,200 --> 00:05:44,320 Speaker 2: able to access it.
And it has all of these 94 00:05:44,720 --> 00:05:49,120 Speaker 2: contact details for officials from Customs and Border Protection, implying 95 00:05:49,560 --> 00:05:51,960 Speaker 2: that they use the tool as well for some reason. 96 00:05:52,320 --> 00:05:57,160 Speaker 2: So I spend my Sunday phoning up numbers in this 97 00:05:57,400 --> 00:06:00,800 Speaker 2: list of CBP officials' contact information, and they answer with 98 00:06:00,839 --> 00:06:02,919 Speaker 2: the name, and I say, you're with Customs and Border Protection? And 99 00:06:02,960 --> 00:06:06,360 Speaker 2: they confirm, so it verifies the data. They then hang 100 00:06:06,440 --> 00:06:08,120 Speaker 2: up after I explain what's happening. 101 00:06:08,279 --> 00:06:08,520 Speaker 3: Wow. 102 00:06:08,640 --> 00:06:12,240 Speaker 2: But then there's the much more serious stuff, which is 103 00:06:12,920 --> 00:06:16,040 Speaker 2: messages that they may have thought were secure with a 104 00:06:16,080 --> 00:06:18,760 Speaker 2: modified version of Signal, and these messages are going off 105 00:06:18,760 --> 00:06:23,400 Speaker 2: to get archived somewhere. The hacker was able to essentially intercept 106 00:06:23,480 --> 00:06:26,320 Speaker 2: them in the middle of that process, and we were 107 00:06:26,360 --> 00:06:29,919 Speaker 2: reading messages that were clearly sent that day of internal 108 00:06:29,960 --> 00:06:34,960 Speaker 2: conversations between various people. One example, it 109 00:06:35,000 --> 00:06:37,640 Speaker 2: looks like it was from Galaxy Digital, which is a 110 00:06:37,640 --> 00:06:41,400 Speaker 2: cryptocurrency-connected company, and they were talking about some sort 111 00:06:41,400 --> 00:06:44,960 Speaker 2: of bill or law that the Dems may or may 112 00:06:45,000 --> 00:06:49,240 Speaker 2: not support.
I'm not particularly interested in the specifics of 113 00:06:49,240 --> 00:06:52,000 Speaker 2: that bill, but it just shows you, whoa, this is 114 00:06:52,040 --> 00:06:54,479 Speaker 2: a really sensitive conversation, even if I don't really care 115 00:06:54,520 --> 00:06:57,160 Speaker 2: about cryptocurrency. This isn't something that a hacker should be 116 00:06:57,200 --> 00:06:58,720 Speaker 2: able to get, and this isn't something that I should 117 00:06:58,760 --> 00:07:02,120 Speaker 2: be able to read. But the implication is that 118 00:07:02,240 --> 00:07:06,239 Speaker 2: Mike Waltz and people he was talking to, like Rubio 119 00:07:06,400 --> 00:07:10,360 Speaker 2: or whoever, were on this same platform the hacker just managed 120 00:07:10,400 --> 00:07:13,520 Speaker 2: to break into and get the contents of messages. And 121 00:07:13,560 --> 00:07:16,560 Speaker 2: that's massive, because who knows who else had access to this. 122 00:07:17,160 --> 00:07:20,080 Speaker 1: So just giving these government officials the benefit of the 123 00:07:20,120 --> 00:07:23,720 Speaker 1: doubt for a moment, maybe this is what happened. Using 124 00:07:23,760 --> 00:07:27,360 Speaker 1: an app that can automatically delete messages is not okay 125 00:07:27,400 --> 00:07:30,600 Speaker 1: for government officials, because the public is supposed to have 126 00:07:30,640 --> 00:07:34,120 Speaker 1: access to non-classified information, and one of the things 127 00:07:34,200 --> 00:07:38,400 Speaker 1: TeleMessage promises to do is help organizations comply with regulations 128 00:07:38,480 --> 00:07:41,920 Speaker 1: about retaining records. Now, we have no evidence of the 129 00:07:41,960 --> 00:07:45,200 Speaker 1: reasoning or timing of the government's use of TeleMessage, but 130 00:07:45,520 --> 00:07:48,280 Speaker 1: that would be a charitable reading here. Maybe they did 131 00:07:48,280 --> 00:07:49,360 Speaker 1: it to obey the law.
132 00:07:49,840 --> 00:07:53,520 Speaker 2: It is a really, really, really, really hard trade-off. 133 00:07:53,880 --> 00:07:56,200 Speaker 2: On one side, we want the government to keep records 134 00:07:56,200 --> 00:07:58,280 Speaker 2: so we can FOIA them and there can be accountability 135 00:07:58,320 --> 00:08:00,560 Speaker 2: and transparency whenever we need that, you know, either in 136 00:08:00,600 --> 00:08:03,320 Speaker 2: the current moment or at a later date. And then 137 00:08:03,360 --> 00:08:06,080 Speaker 2: when they do seemingly choose a tool to do that, 138 00:08:06,760 --> 00:08:08,800 Speaker 2: it blows up massively in their face, and it's actually 139 00:08:08,840 --> 00:08:10,920 Speaker 2: probably even worse than the problem they were trying to 140 00:08:10,920 --> 00:08:13,600 Speaker 2: solve in the first place. TeleMessage, who don't 141 00:08:13,640 --> 00:08:16,440 Speaker 2: make Signal, they don't develop Signal in any capacity. They 142 00:08:16,440 --> 00:08:19,040 Speaker 2: don't do it for WeChat, WhatsApp, or Telegram either, but 143 00:08:19,080 --> 00:08:22,320 Speaker 2: they take those apps and they like shove an archiving ability 144 00:08:22,720 --> 00:08:26,280 Speaker 2: onto it. So presumably you might just not end up with a 145 00:08:26,360 --> 00:08:31,480 Speaker 2: fully secure solution. But then, even more fundamentally, there are 146 00:08:31,520 --> 00:08:34,719 Speaker 2: just always going to be risks involved in this, and 147 00:08:35,040 --> 00:08:38,040 Speaker 2: it kind of reminds me of, you know, governments always 148 00:08:38,040 --> 00:08:42,880 Speaker 2: asking for backdoors into communication products. I'm not saying 149 00:08:42,960 --> 00:08:47,400 Speaker 2: TeleMessage is absolutely a backdoor, and they don't use 150 00:08:47,440 --> 00:08:53,760 Speaker 2: that terminology, but practically it's kind of the same.
It's 151 00:08:53,880 --> 00:08:56,400 Speaker 2: taking the messages and copying them elsewhere, which is what 152 00:08:56,480 --> 00:09:00,559 Speaker 2: we think of as a backdoor. Fundamentally, by introducing another 153 00:09:00,600 --> 00:09:05,040 Speaker 2: party to the chat, you're introducing more risk in some form. 154 00:09:05,080 --> 00:09:07,920 Speaker 1: Right, exactly, because the idea here, what it sounds like 155 00:09:07,960 --> 00:09:13,160 Speaker 1: TeleMessage is promising, is that, well, with Signal your message is 156 00:09:13,679 --> 00:09:15,800 Speaker 1: encrypted from one end to the other. It's end to 157 00:09:16,000 --> 00:09:20,120 Speaker 1: end encrypted, so nobody can intercept that. But then it's 158 00:09:20,200 --> 00:09:24,080 Speaker 1: going through TeleMessage, being copied somewhere else, and so 159 00:09:24,240 --> 00:09:28,760 Speaker 1: by default, by definition, it's not encrypted end to end anymore. 160 00:09:29,000 --> 00:09:31,880 Speaker 2: So when we first reported on sort of the Trump 161 00:09:31,920 --> 00:09:34,640 Speaker 2: administration's use of TeleMessage, the New York Times followed up 162 00:09:34,640 --> 00:09:36,079 Speaker 2: a few days later, and they were talking about some 163 00:09:36,080 --> 00:09:38,160 Speaker 2: of the security risks about this, and they spoke to 164 00:09:38,800 --> 00:09:42,720 Speaker 2: some executive at Smarsh, which is the American company that 165 00:09:42,840 --> 00:09:47,199 Speaker 2: owns the Israeli company TeleMessage, and that executive was 166 00:09:47,240 --> 00:09:49,800 Speaker 2: telling The New York Times something like, we 167 00:09:49,880 --> 00:09:54,200 Speaker 2: don't de-encrypt, which is a bizarre term I've never 168 00:09:54,240 --> 00:09:57,600 Speaker 2: heard anyone use in any serious context.
But 169 00:09:57,679 --> 00:10:00,880 Speaker 2: what they were trying to say was basically, it's still secure, 170 00:10:01,040 --> 00:10:03,440 Speaker 2: you know, even though we're adding this archiving thing, 171 00:10:03,480 --> 00:10:06,800 Speaker 2: it's still secure. And I mean, they say that, they 172 00:10:06,840 --> 00:10:09,960 Speaker 2: say it's still end to end encrypted, and of course 173 00:10:09,960 --> 00:10:12,559 Speaker 2: they're free to say whatever they want that they're comfortable 174 00:10:12,559 --> 00:10:16,240 Speaker 2: saying in their marketing material. But Micah Lee, the security 175 00:10:16,240 --> 00:10:18,400 Speaker 2: researcher and journalist we worked with on the hack, has 176 00:10:18,440 --> 00:10:21,760 Speaker 2: done a very detailed analysis about how TeleMessage does 177 00:10:22,200 --> 00:10:25,240 Speaker 2: have access to the plain text of messages. But then, 178 00:10:25,280 --> 00:10:28,520 Speaker 2: I think even just more importantly, we were looking at 179 00:10:28,559 --> 00:10:31,200 Speaker 2: the contents of messages. It's clearly not end to end encrypted. 180 00:10:31,240 --> 00:10:35,040 Speaker 2: We were reading the messages. And maybe there is 181 00:10:35,080 --> 00:10:37,000 Speaker 2: a way, and maybe there are some messages that are 182 00:10:37,080 --> 00:10:39,720 Speaker 2: encrypted by TeleMessage, but what we've seen, and what 183 00:10:39,760 --> 00:10:41,240 Speaker 2: I'm saying, is some are definitely not. 184 00:10:42,960 --> 00:10:45,880 Speaker 1: And just to be clear, we don't necessarily know that 185 00:10:46,040 --> 00:10:48,959 Speaker 1: every single top-ranking official in the government is using 186 00:10:49,000 --> 00:10:52,320 Speaker 1: this TeleMessage clone of Signal.
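The point Joseph and Micah are making can be sketched in a few lines of toy code. To be clear, this is an illustration of the concept only, not TeleMessage's actual implementation: the `ArchiveServer` name, the stand-in "encryption," and the sample message are all made up for the example.

```python
# Toy illustration only -- not TeleMessage's real code. The point:
# once a client uploads a plaintext copy to an archive server,
# whoever can read that server can read the messages, no keys needed.

from dataclasses import dataclass, field

@dataclass
class ArchiveServer:
    """Stand-in for a third-party archive that keeps plaintext copies."""
    stored: list = field(default_factory=list)

def send_with_archiving(plaintext: str, archive: ArchiveServer) -> bytes:
    # Stand-in for real end-to-end encryption between the two phones.
    ciphertext = plaintext.encode()[::-1]
    # The archiving "feature": a plaintext copy leaves the endpoints.
    archive.stored.append(plaintext)
    return ciphertext

archive = ArchiveServer()
wire = send_with_archiving("meet at 14:00", archive)

# A breach of the archive exposes the message without ever touching
# the encrypted traffic between the endpoints:
print(archive.stored[0])  # prints: meet at 14:00
```

The flaw is the `archive.stored.append(plaintext)` step: however strong the encryption on the wire is, once a readable copy exists on a third-party server, the end to end guarantee is gone.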
It looks like Tele 187 00:10:52,360 --> 00:10:55,400 Speaker 1: Message acts enough like regular Signal to where you can 188 00:10:55,440 --> 00:10:57,960 Speaker 1: send messages back and forth and you wouldn't know if 189 00:10:58,000 --> 00:11:00,280 Speaker 1: the person on the other side is using the clone. 190 00:11:00,840 --> 00:11:02,959 Speaker 1: But even if one person in a group chat is 191 00:11:03,040 --> 00:11:05,840 Speaker 1: using it, and we know that former National Security Advisor 192 00:11:05,880 --> 00:11:09,320 Speaker 1: Mike Waltz is using it, that potentially means that everyone 193 00:11:09,360 --> 00:11:12,240 Speaker 1: they talk to, or everyone in a group chat, is 194 00:11:12,280 --> 00:11:15,520 Speaker 1: having their messages sent to a third-party server, and 195 00:11:15,559 --> 00:11:18,480 Speaker 1: that someone else would be able to see those messages. 196 00:11:19,679 --> 00:11:24,240 Speaker 1: In the article, you pretty explicitly say, we're not going 197 00:11:24,280 --> 00:11:27,200 Speaker 1: to tell you how the hacker did it. So it 198 00:11:27,240 --> 00:11:30,559 Speaker 1: sounds like the hacker communicated to you, this is how 199 00:11:30,600 --> 00:11:34,480 Speaker 1: I did it. You don't make that public. Why is 200 00:11:34,520 --> 00:11:36,320 Speaker 1: that? Yeah, I 201 00:11:36,320 --> 00:11:38,000 Speaker 2: mean, first of all, we ask as part of the 202 00:11:38,080 --> 00:11:40,559 Speaker 2: verification process if they can explain how they did it 203 00:11:40,600 --> 00:11:44,680 Speaker 2: in a verifiable manner. That's good, or even better.
Of course, 204 00:11:45,040 --> 00:11:47,200 Speaker 2: we can't always fully verify that, because we're not going 205 00:11:47,280 --> 00:11:51,120 Speaker 2: to go do the hack ourselves, obviously. So 206 00:11:51,400 --> 00:11:53,640 Speaker 2: it really depends, and you have to be really, really 207 00:11:53,679 --> 00:11:57,120 Speaker 2: careful about what you're amplifying, because if you include certain 208 00:11:57,160 --> 00:12:01,320 Speaker 2: information, that could give other hackers or random people the 209 00:12:01,360 --> 00:12:04,160 Speaker 2: ability to go basically do it again, and who knows 210 00:12:04,200 --> 00:12:05,800 Speaker 2: what they're going to then do with that data. We 211 00:12:05,840 --> 00:12:08,080 Speaker 2: don't want to cause more harm, so there's a trade-212 00:12:08,080 --> 00:12:11,880 Speaker 2: off there, and it also relates to how security researchers 213 00:12:12,040 --> 00:12:16,400 Speaker 2: often disclose vulnerabilities. What hackers will often do is they 214 00:12:16,440 --> 00:12:20,560 Speaker 2: will probe a server or reverse engineer an app. They'll go 215 00:12:20,600 --> 00:12:23,120 Speaker 2: to the company: Hey, I found this vulnerability. I think 216 00:12:23,160 --> 00:12:25,800 Speaker 2: you should fix it. The company will say, wow, that's 217 00:12:25,800 --> 00:12:27,920 Speaker 2: really bad. Here's ten thousand dollars and a T-shirt 218 00:12:27,960 --> 00:12:31,040 Speaker 2: or a coffee mug or something, and they'll fix it, 219 00:12:31,120 --> 00:12:34,760 Speaker 2: hopefully within thirty days. Blah blah blah. That's called responsible disclosure. 220 00:12:34,960 --> 00:12:38,400 Speaker 2: That doesn't really involve the media, at least traditionally.
The 221 00:12:38,440 --> 00:12:40,320 Speaker 2: other way that some hackers do it, and they did it 222 00:12:40,360 --> 00:12:44,000 Speaker 2: in this case, is they find a vulnerability, they might actually 223 00:12:44,040 --> 00:12:47,439 Speaker 2: exploit that vulnerability to show it's real and to get data, 224 00:12:47,480 --> 00:12:50,320 Speaker 2: and then they go to the press. And in this case, 225 00:12:50,400 --> 00:12:52,560 Speaker 2: the hacker said they did this because they thought the 226 00:12:52,559 --> 00:12:53,680 Speaker 2: company might cover it up. 227 00:12:54,040 --> 00:12:57,280 Speaker 1: So after the 404 Media article comes out, NBC 228 00:12:57,440 --> 00:13:02,079 Speaker 1: reports that they'd spoken to a second hacker. Is it 229 00:13:02,200 --> 00:13:06,520 Speaker 1: possible that there's another hacker who's gotten information and not 230 00:13:06,760 --> 00:13:08,400 Speaker 1: reported it to a journalist? 231 00:13:08,960 --> 00:13:14,520 Speaker 2: It's absolutely possible. Like, already you have one hacker who 232 00:13:14,520 --> 00:13:17,240 Speaker 2: provided information to us, you then have a second hacker 233 00:13:17,440 --> 00:13:19,760 Speaker 2: who also breaks into the system. And that's quite rare, 234 00:13:19,960 --> 00:13:23,560 Speaker 2: to be honest. Like, I report on so many data breaches, 235 00:13:24,040 --> 00:13:27,160 Speaker 2: and it's not common for a media outlet to report 236 00:13:27,160 --> 00:13:29,360 Speaker 2: on a hack and then a day or so later, 237 00:13:29,440 --> 00:13:31,880 Speaker 2: another media outlet says, another hacker got it. That 238 00:13:32,000 --> 00:13:36,160 Speaker 2: just doesn't really happen. Now, let's assume those hackers 239 00:13:36,160 --> 00:13:38,079 Speaker 2: are probably random hackers who were just talking to the 240 00:13:38,120 --> 00:13:42,720 Speaker 2: press or whatever.
If you are from a nation-state 241 00:13:42,760 --> 00:13:47,000 Speaker 2: intelligence agency, like China or Russia or Iran, if you 242 00:13:47,840 --> 00:13:50,640 Speaker 2: haven't even looked at that server, you should probably lose 243 00:13:50,640 --> 00:13:55,360 Speaker 2: your job, because that's literally your job, is to find 244 00:13:55,400 --> 00:14:00,120 Speaker 2: out where these vulnerabilities are. You don't really have a 245 00:14:00,160 --> 00:14:04,600 Speaker 2: more pressing target than the people around Donald Trump or 246 00:14:04,640 --> 00:14:08,240 Speaker 2: Donald Trump himself. That's sort of an inner circle. It 247 00:14:08,360 --> 00:14:10,720 Speaker 2: is your job to monitor these people. 248 00:14:11,360 --> 00:14:14,440 Speaker 1: Should people stop using Signal after all this came out? 249 00:14:14,760 --> 00:14:18,880 Speaker 2: Absolutely not. More people should use Signal or a secure 250 00:14:18,880 --> 00:14:21,760 Speaker 2: alternative that makes sense in their life. Don't let the 251 00:14:21,840 --> 00:14:25,880 Speaker 2: TeleMessage debacle throw you off, because you probably don't 252 00:14:25,920 --> 00:14:28,360 Speaker 2: use a tool like that. It's like an archiving tool. 253 00:14:29,400 --> 00:14:33,320 Speaker 1: Okay, so hopefully we haven't scared you away from using Signal. Truly, 254 00:14:33,440 --> 00:14:36,240 Speaker 1: it is a great app. There's nothing wrong with Signal itself. 255 00:14:36,840 --> 00:14:40,640 Speaker 1: The issue is with TeleMessage. But there's something else 256 00:14:40,680 --> 00:14:43,080 Speaker 1: we need to talk about, because we've talked about the 257 00:14:43,120 --> 00:14:45,800 Speaker 1: technical side of all this, and now we need to 258 00:14:45,840 --> 00:14:49,200 Speaker 1: talk about the human side. Once a vulnerability is found, 259 00:14:49,640 --> 00:14:52,520 Speaker 1: the next step is figuring out what the potential damage is.
260 00:14:53,240 --> 00:14:55,120 Speaker 1: And to know that, you need to know whether this 261 00:14:55,240 --> 00:14:58,600 Speaker 1: vulnerability was a mistake or it was there on purpose. 262 00:14:59,080 --> 00:15:02,960 Speaker 1: And this is where things get tricky, because TeleMessage, the company, 263 00:15:03,400 --> 00:15:06,360 Speaker 1: not only has access to servers where messages are archived, 264 00:15:06,680 --> 00:15:09,600 Speaker 1: but they theoretically would have access to all the chat 265 00:15:09,600 --> 00:15:14,040 Speaker 1: logs of everything ever said using their service. And TeleMessage 266 00:15:14,080 --> 00:15:16,720 Speaker 1: is based in Israel, founded by someone who worked in 267 00:15:16,720 --> 00:15:20,160 Speaker 1: intelligence for the Israel Defense Forces. So what does this 268 00:15:20,240 --> 00:15:23,840 Speaker 1: mean for the US government, and what might they do next? 269 00:15:25,000 --> 00:15:39,720 Speaker 1: That's after the break. Politics is not really my forte. 270 00:15:40,000 --> 00:15:42,360 Speaker 1: But in order to understand what exactly is at stake 271 00:15:42,360 --> 00:15:44,560 Speaker 1: in this hack, I knew I'd have to learn a 272 00:15:44,560 --> 00:15:46,640 Speaker 1: little bit more about what people are saying in DC. 273 00:15:47,360 --> 00:15:49,960 Speaker 1: So I called up yet another friend and former colleague, 274 00:15:50,040 --> 00:15:54,400 Speaker 1: Evan McMorris-Santoro. He's a longtime politics reporter and he 275 00:15:54,440 --> 00:15:57,600 Speaker 1: now writes a daily newsletter for a nonprofit media organization 276 00:15:57,800 --> 00:15:59,880 Speaker 1: called NOTUS, that's N-O-T-U-S. 277 00:16:00,880 --> 00:16:04,280 Speaker 3: It kind of sort of broke into journalism, I'd say, 278 00:16:04,320 --> 00:16:06,720 Speaker 3: around that Snowden era. That's when I first started 279 00:16:07,040 --> 00:16:08,160 Speaker 3: hearing about it.
280 00:16:08,160 --> 00:16:11,480 Speaker 1: It's not just government officials who use Signal. Actually, outside 281 00:16:11,480 --> 00:16:14,360 Speaker 1: of tech people, journalists were some of the earliest adopters. 282 00:16:14,760 --> 00:16:16,920 Speaker 1: The fact that this app is not only secure, but 283 00:16:17,000 --> 00:16:20,160 Speaker 1: that it would automatically delete messages, is a big benefit. 284 00:16:20,720 --> 00:16:22,680 Speaker 1: This stuff is really important when you're talking to a 285 00:16:22,760 --> 00:16:24,400 Speaker 1: source about something sensitive. 286 00:16:25,560 --> 00:16:28,040 Speaker 3: And you know, an important thing about Washington is everybody 287 00:16:28,040 --> 00:16:30,760 Speaker 3: that you're talking to is old for the most part, 288 00:16:31,280 --> 00:16:34,480 Speaker 3: and their tech savviness is very low, and so it 289 00:16:34,560 --> 00:16:38,240 Speaker 3: takes a long time for something like this to 290 00:16:38,280 --> 00:16:41,000 Speaker 3: kind of matriculate out through the entire system. So now 291 00:16:41,080 --> 00:16:43,880 Speaker 3: we're in a world where the Vice President is 292 00:16:43,920 --> 00:16:46,480 Speaker 3: on it, and the National Security Advisor is on it, 293 00:16:46,680 --> 00:16:48,560 Speaker 3: and the Secretary of Defense is on it. You know, 294 00:16:48,640 --> 00:16:51,480 Speaker 3: these are particularly old people as far as politics goes. 295 00:16:51,840 --> 00:16:54,800 Speaker 3: But it takes from that sort of Snowden time 296 00:16:54,880 --> 00:16:56,960 Speaker 3: all those years ago to now for it to become 297 00:16:57,080 --> 00:17:01,160 Speaker 3: very, very prevalent, as people sort of start to catch 298 00:17:01,200 --> 00:17:03,840 Speaker 3: on to something that's new.
It is not like in 299 00:17:03,880 --> 00:17:07,520 Speaker 3: the movies, where the movie shows, you know, the government 300 00:17:07,600 --> 00:17:12,840 Speaker 3: officials have the coolest, slickest, most amazing technology that has 301 00:17:13,760 --> 00:17:17,280 Speaker 3: the best features. In reality, you know, tech in the 302 00:17:17,280 --> 00:17:22,000 Speaker 3: government is a slow, slow-moving process, and one that 303 00:17:22,040 --> 00:17:24,879 Speaker 3: has a lot of lawyering involved. There are laws, very 304 00:17:24,880 --> 00:17:28,960 Speaker 3: specific laws, about the kinds of records that government officials 305 00:17:29,000 --> 00:17:31,359 Speaker 3: have to keep, and so, you know, there's sort of like, 306 00:17:31,440 --> 00:17:34,000 Speaker 3: you know, these government officials will tell you, we're keeping records, 307 00:17:34,359 --> 00:17:37,200 Speaker 3: we are doing things the right way. Some Biden officials 308 00:17:37,200 --> 00:17:38,520 Speaker 3: talked about how they would use it to, you know, 309 00:17:39,080 --> 00:17:41,159 Speaker 3: plan, like, where they would meet up for dinner or 310 00:17:41,200 --> 00:17:43,200 Speaker 3: something like that if they were on the road somewhere, 311 00:17:43,359 --> 00:17:46,200 Speaker 3: things like that. But as we saw with this Signalgate, 312 00:17:46,840 --> 00:17:50,959 Speaker 3: there was a real comfort in using this service for 313 00:17:51,320 --> 00:17:57,400 Speaker 3: something that would absolutely be governed by classified records acts 314 00:17:57,600 --> 00:18:00,000 Speaker 3: that are very specific in how classified records must be handled, 315 00:18:00,359 --> 00:18:03,439 Speaker 3: and also record-keeping acts that are very specific in 316 00:18:03,520 --> 00:18:07,359 Speaker 3: what kinds of things administration officials have to keep to 317 00:18:07,400 --> 00:18:08,520 Speaker 3: be archived for history.
318 00:18:08,960 --> 00:18:13,320 Speaker 1: Why does this stuff need to be archived? 319 00:18:12,840 --> 00:18:15,480 Speaker 3: Well, you know, acts like the Presidential Records Act are 320 00:18:15,520 --> 00:18:19,440 Speaker 3: designed for history. When you're president, when you're a government official, 321 00:18:20,200 --> 00:18:23,280 Speaker 3: your privacy is different. The work that you do, the 322 00:18:23,359 --> 00:18:26,439 Speaker 3: work product that you do as a government official, it 323 00:18:26,520 --> 00:18:29,080 Speaker 3: kind of belongs to the government, which means it belongs 324 00:18:29,080 --> 00:18:32,120 Speaker 3: to everybody. That's true of the executive branch. It's not 325 00:18:32,160 --> 00:18:36,679 Speaker 3: true of Congress. A very important part of what 326 00:18:36,960 --> 00:18:41,480 Speaker 3: an administration does is create a history of how decisions 327 00:18:41,480 --> 00:18:44,920 Speaker 3: are made, so future decisions can be made better. This 328 00:18:44,920 --> 00:18:47,800 Speaker 3: is not your personal private email. Those 329 00:18:47,840 --> 00:18:52,119 Speaker 3: emails are public record and can be looked at by 330 00:18:52,160 --> 00:18:53,680 Speaker 3: people and can be requested by people. 331 00:18:54,040 --> 00:18:57,240 Speaker 1: So look, one of the first things that I thought 332 00:18:57,240 --> 00:18:59,679 Speaker 1: of when I saw this whole TeleMessage thing and the 333 00:18:59,720 --> 00:19:01,639 Speaker 1: hack, the first thing that popped into my mind, 334 00:19:02,119 --> 00:19:04,359 Speaker 1: was, this kind of reminds me of Hillary Clinton and 335 00:19:04,400 --> 00:19:06,320 Speaker 1: the emails. And it's become kind of a meme at 336 00:19:06,320 --> 00:19:08,359 Speaker 1: this point.
You know, I've heard the memes so much 337 00:19:08,400 --> 00:19:11,360 Speaker 1: that I've actually kind of forgotten what the email thing 338 00:19:11,600 --> 00:19:12,440 Speaker 1: was about. 339 00:19:12,680 --> 00:19:17,920 Speaker 3: There are actually three separate email scandals from the Clinton 340 00:19:18,280 --> 00:19:21,240 Speaker 3: campaign in twenty sixteen. All right, so let's start with the 341 00:19:21,240 --> 00:19:26,920 Speaker 3: first one. So Hillary Clinton, rather than use the email 342 00:19:27,040 --> 00:19:30,280 Speaker 3: system that was given to her as Secretary of 343 00:19:30,320 --> 00:19:33,520 Speaker 3: State, which goes through the government, she sent all of 344 00:19:33,560 --> 00:19:38,240 Speaker 3: her emails through a private email server that she had 345 00:19:38,280 --> 00:19:41,080 Speaker 3: set up herself. So what happened was that 346 00:19:41,119 --> 00:19:44,920 Speaker 3: when she was done being Secretary of State, she had 347 00:19:44,960 --> 00:19:47,720 Speaker 3: to do the official records part of her job, 348 00:19:47,880 --> 00:19:51,280 Speaker 3: and so she turned over all of these emails. And 349 00:19:51,280 --> 00:19:52,919 Speaker 3: then it turned out that the way those emails had 350 00:19:52,960 --> 00:19:56,760 Speaker 3: been turned over was that she and her lawyers had 351 00:19:56,800 --> 00:19:59,840 Speaker 3: gone through all these emails on this server that she 352 00:20:00,040 --> 00:20:04,160 Speaker 3: owned and said, these are the ones that are professional, 353 00:20:04,160 --> 00:20:06,760 Speaker 3: these are the ones that you need to categorize, and 354 00:20:07,000 --> 00:20:10,720 Speaker 3: thousands of other ones, we're keeping those. They're 355 00:20:10,760 --> 00:20:13,400 Speaker 3: personal emails, they're not subject to the Official Records Act.
356 00:20:13,640 --> 00:20:16,439 Speaker 3: And then also, we're kind of deleting them for 357 00:20:16,520 --> 00:20:19,040 Speaker 3: all time, so they're never to be found again. The 358 00:20:19,080 --> 00:20:23,280 Speaker 3: second email scandal was the Clinton campaign itself. When it 359 00:20:23,359 --> 00:20:29,000 Speaker 3: was running, its senior advisor, John Podesta, used his personal 360 00:20:29,160 --> 00:20:33,760 Speaker 3: Gmail for all of his communications with the campaign. He 361 00:20:33,840 --> 00:20:35,359 Speaker 3: was a top advisor. He got a lot of emails, 362 00:20:35,359 --> 00:20:37,880 Speaker 3: a lot of very sensitive, serious emails. So he gets 363 00:20:37,880 --> 00:20:41,000 Speaker 3: this weird email one day that asks him for, like, 364 00:20:41,080 --> 00:20:43,159 Speaker 3: his information and passwords and stuff, which, you know, 365 00:20:43,280 --> 00:20:46,000 Speaker 3: reads right away as a phishing attack. He sends it to 366 00:20:46,040 --> 00:20:48,919 Speaker 3: the IT guy and says, is this bad? And the 367 00:20:49,000 --> 00:20:51,439 Speaker 3: IT guy sends back an email saying, this is 368 00:20:51,440 --> 00:20:53,720 Speaker 3: a legitimate email, but actually he meant to say it 369 00:20:53,760 --> 00:20:58,320 Speaker 3: was an illegitimate email. Podesta clicks on it. Spear phishing attack. 370 00:20:58,480 --> 00:21:01,600 Speaker 3: All of his emails are harvested. Huge scandal for the 371 00:21:01,640 --> 00:21:04,560 Speaker 3: Clinton campaign, all right. So that's the second one. The 372 00:21:04,600 --> 00:21:08,480 Speaker 3: third one is that Huma Abedin, who was another top 373 00:21:08,520 --> 00:21:11,360 Speaker 3: senior Clinton advisor while she was Secretary of State and 374 00:21:11,520 --> 00:21:15,000 Speaker 3: on her campaign, had a personal laptop of her own, 375 00:21:15,920 --> 00:21:20,120 Speaker 3: and she would get all of Hillary Clinton's emails.
And 376 00:21:20,359 --> 00:21:24,080 Speaker 3: so, you know, they had had all this conversation. The 377 00:21:24,160 --> 00:21:27,879 Speaker 3: FBI had investigated the private email server thing, they talked about it. 378 00:21:28,119 --> 00:21:29,960 Speaker 3: The FBI had said, look, this is not great that 379 00:21:30,040 --> 00:21:32,960 Speaker 3: you did this, but we don't see anything worth prosecuting here. 380 00:21:33,520 --> 00:21:36,960 Speaker 3: That happened early in the campaign, but at the very end, 381 00:21:37,119 --> 00:21:40,360 Speaker 3: in the last stretch of the campaign, Huma Abedin's then 382 00:21:40,440 --> 00:21:45,720 Speaker 3: husband, Anthony Weiner, who was a congressman from New York. 383 00:21:46,080 --> 00:21:50,240 Speaker 3: He was involved in a very, like, embarrassing and, like, ongoing, 384 00:21:50,440 --> 00:21:55,199 Speaker 3: endless kind of sex scandal involving him texting people, some 385 00:21:55,240 --> 00:21:57,719 Speaker 3: of them underage, sexting them. 386 00:21:57,960 --> 00:21:58,120 Speaker 1: Right. 387 00:21:58,640 --> 00:22:01,080 Speaker 3: So, in the course of investigating that, they 388 00:22:01,160 --> 00:22:04,560 Speaker 3: uncover that, oh, he's been using this laptop that has, 389 00:22:04,680 --> 00:22:07,560 Speaker 3: of course, archived on it, in its own version of 390 00:22:08,480 --> 00:22:11,959 Speaker 3: Outlook or whatever its email software was, all of 391 00:22:12,000 --> 00:22:15,760 Speaker 3: these old Hillary Clinton emails, and maybe some of them 392 00:22:15,760 --> 00:22:18,919 Speaker 3: are the ones that she, you know, tried to delete 393 00:22:18,920 --> 00:22:20,399 Speaker 3: from her server. We can go back and we can 394 00:22:20,440 --> 00:22:21,720 Speaker 3: look and see if any of them are going to 395 00:22:21,760 --> 00:22:24,679 Speaker 3: violate the law.
So this comes out right near the 396 00:22:24,760 --> 00:22:28,560 Speaker 3: very end of the campaign, and all of a sudden, 397 00:22:28,760 --> 00:22:31,359 Speaker 3: Hillary Clinton's emails are back, and then she loses in 398 00:22:31,400 --> 00:22:34,439 Speaker 3: a very close election to Donald Trump, and a lot 399 00:22:34,480 --> 00:22:37,400 Speaker 3: of people think it was at least one of these 400 00:22:37,400 --> 00:22:41,080 Speaker 3: email scandals that did it, possibly all three, but really it 401 00:22:41,160 --> 00:22:43,879 Speaker 3: was email all along. But it does also feel like 402 00:22:43,920 --> 00:22:46,640 Speaker 3: a different time. I will give you a great example 403 00:22:46,680 --> 00:22:49,600 Speaker 3: that just happened. You know, the one hundred days of 404 00:22:49,600 --> 00:22:53,440 Speaker 3: Trump's administration just passed, and so everybody is trying to 405 00:22:53,480 --> 00:22:56,080 Speaker 3: score an interview with the president and write a big piece 406 00:22:56,280 --> 00:22:59,800 Speaker 3: about the one hundred days. The Atlantic magazine, which of 407 00:22:59,840 --> 00:23:03,080 Speaker 3: course is the place where Signalgate was revealed, 408 00:23:03,200 --> 00:23:07,199 Speaker 3: the reporters revealed that at one point they just 409 00:23:07,359 --> 00:23:10,720 Speaker 3: called up the president on his cell phone and Trump 410 00:23:10,760 --> 00:23:14,120 Speaker 3: answered and agreed to do an interview. 411 00:23:14,320 --> 00:23:14,680 Speaker 1: Like just. 412 00:23:16,320 --> 00:23:19,920 Speaker 3: Hello, mister President, I'd like an interview. Okay, sure. We're 413 00:23:19,920 --> 00:23:22,600 Speaker 3: not talking about going through layers of security here. We're 414 00:23:22,640 --> 00:23:26,040 Speaker 3: not talking about going through, you know, arbiters that are 415 00:23:26,119 --> 00:23:28,919 Speaker 3: checking on this.
This is a guy who just has a 416 00:23:28,960 --> 00:23:32,160 Speaker 3: phone that can ring, and you can pick it up. When 417 00:23:32,760 --> 00:23:36,159 Speaker 3: President Obama was in power, you know, he wanted to 418 00:23:36,240 --> 00:23:38,760 Speaker 3: keep his personal BlackBerry. You know, this is how long 419 00:23:38,760 --> 00:23:40,760 Speaker 3: ago it was: the BlackBerry. Part of, like, what 420 00:23:40,840 --> 00:23:42,840 Speaker 3: made Obama cool was that he was, like, a guy 421 00:23:42,880 --> 00:23:45,240 Speaker 3: that knew how to use a BlackBerry, and he wanted 422 00:23:45,240 --> 00:23:47,679 Speaker 3: to keep it. There was some conversation that maybe he 423 00:23:47,720 --> 00:23:51,879 Speaker 3: had been given this, like, super secure government version of 424 00:23:51,880 --> 00:23:54,239 Speaker 3: a BlackBerry near the end of his administration. You can 425 00:23:54,280 --> 00:23:56,320 Speaker 3: go on YouTube and you can watch this. He goes 426 00:23:56,480 --> 00:24:00,720 Speaker 3: on Jimmy Fallon and talks about getting his first smartphone 427 00:24:00,760 --> 00:24:03,480 Speaker 3: near the end of his administration, that the government gives 428 00:24:03,520 --> 00:24:06,520 Speaker 3: him a smartphone that he can use, and Obama says 429 00:24:06,520 --> 00:24:09,040 Speaker 3: he can't take a picture, can't do a phone call, 430 00:24:09,280 --> 00:24:12,680 Speaker 3: can't send a text message, because all that stuff makes 431 00:24:12,720 --> 00:24:15,679 Speaker 3: it not secure. The president having the ability to 432 00:24:15,680 --> 00:24:18,199 Speaker 3: do all that, it can be hacked, not secure. The 433 00:24:18,240 --> 00:24:22,159 Speaker 3: president now gets a phone call from reporters. There was 434 00:24:22,240 --> 00:24:25,120 Speaker 3: no conversation at all about the president just having a 435 00:24:25,160 --> 00:24:29,760 Speaker 3: phone you can call.
This is an incredibly different idea 436 00:24:30,359 --> 00:24:33,000 Speaker 3: of what it means to be a secure administration, and 437 00:24:33,040 --> 00:24:35,920 Speaker 3: nobody cares anymore. It's just not even talked about. It's 438 00:24:35,960 --> 00:24:38,160 Speaker 3: a completely different landscape. 439 00:24:38,560 --> 00:24:40,199 Speaker 1: And I mean this has been different even on the 440 00:24:40,200 --> 00:24:43,680 Speaker 1: Republican side. I mean, didn't Bush just not use email, 441 00:24:43,920 --> 00:24:44,840 Speaker 1: not use personal email? 442 00:24:44,960 --> 00:24:49,920 Speaker 3: That's correct. George W. Bush famously sent a last email 443 00:24:50,320 --> 00:24:54,200 Speaker 3: to all of his friends in his email list, like, look, 444 00:24:54,200 --> 00:24:56,320 Speaker 3: I'm not gonna be able to email anymore because they 445 00:24:56,320 --> 00:24:57,800 Speaker 3: can see my email. So I'm not going to do 446 00:24:57,840 --> 00:25:00,960 Speaker 3: that anymore. Goodbye for now. And he's not the only 447 00:25:01,000 --> 00:25:03,640 Speaker 3: person to do this. I remember an interview with Lindsey Graham. 448 00:25:03,720 --> 00:25:05,560 Speaker 3: He has talked about how he has never in his 449 00:25:05,680 --> 00:25:08,640 Speaker 3: life sent an email. Wow, right? Why would I send 450 00:25:08,640 --> 00:25:10,320 Speaker 3: an email, he says, why would I do that? And 451 00:25:10,359 --> 00:25:12,840 Speaker 3: the thing is, you know, part of being a senator is 452 00:25:12,880 --> 00:25:14,320 Speaker 3: you have a staff and they can handle this 453 00:25:14,359 --> 00:25:17,479 Speaker 3: communication for you. He's like, why would I send 454 00:25:17,480 --> 00:25:19,280 Speaker 3: an email? An email can only get me in trouble. 455 00:25:19,280 --> 00:25:20,240 Speaker 3: Why would I ever do it?
456 00:25:20,480 --> 00:25:22,520 Speaker 1: Let me jump to Signalgate real quick, because let's 457 00:25:22,520 --> 00:25:26,280 Speaker 1: go with Signalgate. You've got these high-level senior 458 00:25:26,320 --> 00:25:30,680 Speaker 1: officials in the Trump administration talking about essentially war plans, 459 00:25:30,960 --> 00:25:35,280 Speaker 1: right, doing it on Signal. Strictly speaking, what were 460 00:25:35,280 --> 00:25:36,680 Speaker 1: they supposed to have been using? 461 00:25:37,320 --> 00:25:43,320 Speaker 3: There are extremely secure government communication tools that are used 462 00:25:43,760 --> 00:25:48,320 Speaker 3: to talk about things like national security stuff. But a chat 463 00:25:48,359 --> 00:25:52,919 Speaker 3: that involves the Vice President, the Secretary of Defense, the 464 00:25:52,960 --> 00:25:57,119 Speaker 3: Secretary of State, the National Security Advisor, and the chief 465 00:25:57,119 --> 00:25:59,800 Speaker 3: of staff and some of their deputies, every single person 466 00:25:59,800 --> 00:26:04,399 Speaker 3: on that chat has access to millions of dollars of 467 00:26:04,480 --> 00:26:12,520 Speaker 3: equipment specifically designed to let them communicate without breaching national security. 468 00:26:12,680 --> 00:26:16,920 Speaker 3: For whatever reason, they chose not to use that stuff. 469 00:26:17,800 --> 00:26:21,480 Speaker 3: And the result, in this case, is they ended up 470 00:26:21,520 --> 00:26:24,040 Speaker 3: inviting a reporter in.
There's been a lot of reporting 471 00:26:24,040 --> 00:26:27,080 Speaker 3: from other government officials who talk about how, yes, we 472 00:26:27,200 --> 00:26:32,560 Speaker 3: use Signal, but just how incredibly dumb, unbelievably, unfathomably 473 00:26:32,640 --> 00:26:36,399 Speaker 3: dumb it was to use a Signal group chat to 474 00:26:36,640 --> 00:26:41,040 Speaker 3: plan and execute a military strike and to then detail 475 00:26:41,440 --> 00:26:44,200 Speaker 3: how that worked. The problem is not the reporter you 476 00:26:44,240 --> 00:26:48,440 Speaker 3: inadvertently put on, it's the foreign power that can hack 477 00:26:48,520 --> 00:26:50,920 Speaker 3: it and look at it. So, even in a world 478 00:26:50,920 --> 00:26:55,800 Speaker 3: where people do increasingly use Signal, this particular thing was 479 00:26:55,840 --> 00:26:58,520 Speaker 3: seen as one of the dumbest things that people can 480 00:26:58,560 --> 00:27:01,600 Speaker 3: imagine in a long, long time in Washington, a town 481 00:27:01,640 --> 00:27:04,000 Speaker 3: where dumb things happen pretty much every day. 482 00:27:05,520 --> 00:27:10,159 Speaker 1: So Senator Ron Wyden, he basically is demanding, and 483 00:27:10,160 --> 00:27:13,320 Speaker 1: he's released this letter, he's demanding an investigation, right. 484 00:27:13,720 --> 00:27:17,360 Speaker 1: And there's one part of this letter that I found 485 00:27:18,000 --> 00:27:20,080 Speaker 1: kind of wild, right. And this is kind of near 486 00:27:20,119 --> 00:27:22,800 Speaker 1: the middle, near the end. So he's talking about 487 00:27:22,880 --> 00:27:26,159 Speaker 1: TeleMessage, and he says, I'm gonna read this here.
It 488 00:27:26,240 --> 00:27:29,560 Speaker 1: remains unclear whether the design of this system was merely 489 00:27:29,680 --> 00:27:32,359 Speaker 1: the result of incompetence on the part of the foreign 490 00:27:32,400 --> 00:27:36,720 Speaker 1: company, whose senior leadership are former intelligence officers, or a 491 00:27:36,760 --> 00:27:43,080 Speaker 1: backdoor designed to facilitate foreign intelligence collection against US government officials. 492 00:27:43,400 --> 00:27:46,920 Speaker 1: So the foreign government here and the foreign intelligence officers 493 00:27:46,960 --> 00:27:52,159 Speaker 1: he's speaking about, this is Israel, which, regardless of 494 00:27:52,200 --> 00:27:55,760 Speaker 1: one's personal beliefs, the official US stance is 495 00:27:55,760 --> 00:27:58,280 Speaker 1: that we're very friendly with Israel. This is an ally. 496 00:27:58,880 --> 00:28:03,159 Speaker 1: To even leave open the door to suggest that 497 00:28:03,960 --> 00:28:11,720 Speaker 1: the creation of the app is intentional, active espionage? 498 00:28:11,840 --> 00:28:14,280 Speaker 1: Am I just reading this wrong? Or is that actually 499 00:28:14,400 --> 00:28:15,919 Speaker 1: just a wild thing to say in public? 500 00:28:16,840 --> 00:28:20,680 Speaker 3: One, Israel has absolutely spied on the United States a number 501 00:28:20,720 --> 00:28:23,920 Speaker 3: of times. You know that a country is going to 502 00:28:23,960 --> 00:28:26,280 Speaker 3: say something in public, but you really want to know 503 00:28:26,320 --> 00:28:30,040 Speaker 3: what they're talking about in private, right? And so even 504 00:28:30,160 --> 00:28:34,359 Speaker 3: if that country is ostensibly your ally, you want to 505 00:28:34,359 --> 00:28:37,800 Speaker 3: know what's going on.
And two, this has of course 506 00:28:37,840 --> 00:28:42,360 Speaker 3: been a central concern when it comes to information technology 507 00:28:42,440 --> 00:28:45,160 Speaker 3: and the government for a long time now. This idea 508 00:28:45,240 --> 00:28:48,680 Speaker 3: that this kind of technology can be used by a 509 00:28:48,720 --> 00:28:53,600 Speaker 3: foreign government to hack our stuff, even this foreign government, 510 00:28:53,640 --> 00:28:59,680 Speaker 3: even this ally, is not an unlikely one. What Wyden is saying is, look, 511 00:29:00,000 --> 00:29:04,760 Speaker 3: this piece of technology, TeleMessage, was so sort of 512 00:29:04,840 --> 00:29:08,920 Speaker 3: cartoonishly hackable, according to the reporting that we have read 513 00:29:08,960 --> 00:29:12,880 Speaker 3: about it, that either someone is just, like, really bad 514 00:29:12,960 --> 00:29:16,120 Speaker 3: at making something and has scammed people into buying it, 515 00:29:16,720 --> 00:29:20,920 Speaker 3: or they're stupid like a fox and they made this 516 00:29:21,000 --> 00:29:24,440 Speaker 3: thing so they could just harvest all the information that 517 00:29:24,480 --> 00:29:30,040 Speaker 3: they could get. Either way, you're looking at a huge 518 00:29:30,440 --> 00:29:37,040 Speaker 3: failure of the American government to protect our secrets. 519 00:29:37,080 --> 00:29:40,880 Speaker 3: A clear, like, undeniable fuck up. And the question is 520 00:29:40,920 --> 00:29:42,720 Speaker 3: just, like, how much of a fuck up is it? 521 00:29:43,080 --> 00:29:45,160 Speaker 3: And they're not sure yet, but that is what he 522 00:29:45,240 --> 00:29:49,360 Speaker 3: is saying needs to be investigated. Pete Hegseth, the 523 00:29:49,400 --> 00:29:54,000 Speaker 3: Secretary of Defense, was on this Signal chat.
Later reporting 524 00:29:54,040 --> 00:29:58,040 Speaker 3: discovered he had put an unsecured line into his office 525 00:29:58,040 --> 00:30:01,800 Speaker 3: in the Pentagon that he had added, maybe to use Signal. 526 00:30:02,400 --> 00:30:06,120 Speaker 3: In many other contexts, the President would say, okay, we 527 00:30:06,200 --> 00:30:10,120 Speaker 3: got to do an investigation into what exactly we have been 528 00:30:10,240 --> 00:30:14,000 Speaker 3: spreading all over the place through bad opsec by 529 00:30:14,040 --> 00:30:17,600 Speaker 3: our Secretary of Defense, right? Very serious job, right? The 530 00:30:17,640 --> 00:30:20,120 Speaker 3: President said, look, I talked to him again. It was 531 00:30:20,160 --> 00:30:22,800 Speaker 3: a serious conversation. But you know, he's going to get 532 00:30:22,800 --> 00:30:25,600 Speaker 3: it together now. That was the 533 00:30:25,640 --> 00:30:30,360 Speaker 3: President's response to this. So in that context, I would 534 00:30:30,360 --> 00:30:35,440 Speaker 3: say it is not that likely that Ron Wyden is 535 00:30:35,440 --> 00:30:39,280 Speaker 3: going to get either of the investigations he wants from 536 00:30:39,280 --> 00:30:42,680 Speaker 3: the Trump administration. This administration is much more focused on 537 00:30:42,720 --> 00:30:47,560 Speaker 3: protecting its own, much more focused on circling the wagons here. 538 00:30:47,760 --> 00:30:50,760 Speaker 1: Evan, okay, I get that. But do people in DC 539 00:30:50,960 --> 00:30:53,880 Speaker 1: even understand this stuff? Do they even care?
540 00:30:54,320 --> 00:30:56,719 Speaker 3: Well, you know, we are asking them to understand this stuff, right? 541 00:30:56,760 --> 00:30:58,240 Speaker 3: These are often the people who have 542 00:30:58,280 --> 00:31:01,160 Speaker 3: to write regulations about cryptocurrency for us; these are 543 00:31:01,120 --> 00:31:02,520 Speaker 3: the ones who have to figure out what we're going 544 00:31:02,560 --> 00:31:05,960 Speaker 3: to do about artificial intelligence. Right. The woman who is 545 00:31:06,000 --> 00:31:10,000 Speaker 3: in charge of determining how much we're going to use 546 00:31:10,160 --> 00:31:15,760 Speaker 3: artificial intelligence in American education recently referred to it as 547 00:31:15,920 --> 00:31:19,920 Speaker 3: A1, like the steak sauce, not AI but 548 00:31:20,080 --> 00:31:24,800 Speaker 3: A1, in a public forum. Wait, this is the Secretary 549 00:31:24,800 --> 00:31:28,520 Speaker 3: of Education, Linda McMahon. She was talking about artificial intelligence 550 00:31:28,520 --> 00:31:29,000 Speaker 3: and education. 551 00:31:29,200 --> 00:31:31,320 Speaker 1: Okay, just as an aside here. I had to look 552 00:31:31,360 --> 00:31:33,840 Speaker 1: this up later and Evan was right. This was at 553 00:31:33,880 --> 00:31:36,840 Speaker 1: an education summit in San Diego, and she said 554 00:31:37,000 --> 00:31:38,040 Speaker 1: A1 twice. 555 00:31:38,800 --> 00:31:42,760 Speaker 3: And so if you're a person involved in technology, in 556 00:31:42,760 --> 00:31:44,680 Speaker 3: the tech world, if you're a person trying to get 557 00:31:44,680 --> 00:31:48,480 Speaker 3: these regulations or stop these regulations, these are the people 558 00:31:48,480 --> 00:31:51,280 Speaker 3: that you're working with.
And I think it explains a 559 00:31:51,280 --> 00:31:53,520 Speaker 3: lot about some of the scandals that we have seen 560 00:31:53,560 --> 00:31:56,840 Speaker 3: when it comes to technology, and it also explains a 561 00:31:56,880 --> 00:32:01,400 Speaker 3: lot about some of the confusing reality that we live 562 00:32:01,480 --> 00:32:03,080 Speaker 3: in in America when it comes to technology. 563 00:32:03,640 --> 00:32:05,959 Speaker 1: Man. So maybe the last one I'll ask you here, 564 00:32:06,000 --> 00:32:09,720 Speaker 1: then, is, you know, you've been in the game 565 00:32:09,760 --> 00:32:13,239 Speaker 1: for a minute. You've been reporting on politics, and this 566 00:32:13,280 --> 00:32:15,600 Speaker 1: is how I know you, right. I know you, you 567 00:32:15,640 --> 00:32:17,720 Speaker 1: know, from when we were at Vice. I knew you as, like, if 568 00:32:17,760 --> 00:32:20,400 Speaker 1: I have a question about politics, let me ask Evan. 569 00:32:20,560 --> 00:32:21,760 Speaker 1: He's gonna understand this stuff. 570 00:32:21,960 --> 00:32:24,200 Speaker 3: Yeah, I'm a nerd. I'm the nerd, I know. 571 00:32:24,640 --> 00:32:26,280 Speaker 3: I own it. 572 00:32:26,400 --> 00:32:29,800 Speaker 1: But hold on, you're the politics nerd. Now you're kind 573 00:32:29,800 --> 00:32:32,959 Speaker 1: of having to be the technology nerd a little bit. 574 00:32:32,960 --> 00:32:35,760 Speaker 1: I'm starting to notice that tech is starting to creep 575 00:32:35,880 --> 00:32:39,800 Speaker 1: into politics. You got to understand Signal. Five years ago, 576 00:32:39,840 --> 00:32:41,760 Speaker 1: I wouldn't have been asking you questions about apps, 577 00:32:41,880 --> 00:32:45,920 Speaker 1: that's right. I'm starting to notice that politics reporting is 578 00:32:46,000 --> 00:32:49,160 Speaker 1: starting to change. Frankly, I think we should start requiring our 579 00:32:49,240 --> 00:32:52,480 Speaker 1: reporters to know more about technology.
Maybe they did in the past. 580 00:32:52,680 --> 00:32:54,240 Speaker 1: Like, you can't get that pass, you know what I 581 00:32:54,320 --> 00:32:55,680 Speaker 1: mean, for not knowing stuff? 582 00:32:56,200 --> 00:33:01,000 Speaker 3: Oh, one hundred percent, this is a huge problem in politics. It 583 00:33:01,080 --> 00:33:04,680 Speaker 3: matters what you can do with AI and things like 584 00:33:05,200 --> 00:33:09,200 Speaker 3: misinformation and fake videos. It matters what you can do 585 00:33:09,320 --> 00:33:13,280 Speaker 3: with things like the hackability of some of these apps. 586 00:33:13,520 --> 00:33:17,120 Speaker 3: All this stuff matters. But you're still living in a 587 00:33:17,160 --> 00:33:22,000 Speaker 3: world in politics, often, where it's like, hey, it's just Twitter, 588 00:33:22,280 --> 00:33:25,400 Speaker 3: like it's twenty ten, or hey, it's just Signal, right, 589 00:33:25,400 --> 00:33:28,200 Speaker 3: we're just Signaling. Who cares? It's just Signal. Like, what 590 00:33:28,240 --> 00:33:31,360 Speaker 3: are we doing? Who cares? And that is a huge problem. 591 00:33:31,560 --> 00:33:35,040 Speaker 3: It is a huge problem, and these things are changing the nature 592 00:33:35,640 --> 00:33:40,680 Speaker 3: of the way politics works, and unless we're able to 593 00:33:40,840 --> 00:33:44,239 Speaker 3: get up to speed, we are hosed. So you are 594 00:33:44,280 --> 00:33:47,440 Speaker 3: one hundred percent right there. We absolutely need to 595 00:33:47,520 --> 00:33:51,560 Speaker 3: know about this stuff. The tech is starting to eat 596 00:33:52,560 --> 00:33:58,440 Speaker 3: the politics, and the politics is not ready to stand 597 00:33:58,520 --> 00:34:02,280 Speaker 3: up to that yet. And maybe they will be, maybe 598 00:34:02,280 --> 00:34:05,600 Speaker 3: it'll happen, but right now it absolutely is not.
599 00:34:08,080 --> 00:34:10,160 Speaker 1: And this all brings us back to where we started: 600 00:34:10,600 --> 00:34:13,880 Speaker 1: the technology. I asked a similar question to our first guest, 601 00:34:14,120 --> 00:34:17,560 Speaker 1: Joseph Cox from 404 Media. I've been reading 602 00:34:17,800 --> 00:34:21,279 Speaker 1: your reporting forever, and it seems that a lot of 603 00:34:21,280 --> 00:34:26,319 Speaker 1: your stuff was focused specifically on companies. But it 604 00:34:26,360 --> 00:34:30,239 Speaker 1: seems like a lot of politics reporting nowadays is necessarily 605 00:34:30,280 --> 00:34:32,440 Speaker 1: tech reporting, especially with this administration. 606 00:34:32,840 --> 00:34:37,759 Speaker 2: Oh yeah, absolutely. When you have the CEOs of the 607 00:34:37,760 --> 00:34:41,840 Speaker 2: biggest tech companies in America, you know, Jeff Bezos, Mark Zuckerberg, 608 00:34:42,120 --> 00:34:44,920 Speaker 2: Tim Cook, Elon Musk obviously, but he's almost in his 609 00:34:44,960 --> 00:34:47,320 Speaker 2: own category because he essentially is a member of the 610 00:34:47,400 --> 00:34:50,919 Speaker 2: US government. When you have all of those people going 611 00:34:50,960 --> 00:34:54,640 Speaker 2: to Trump's inauguration and donating a million dollars each or 612 00:34:54,680 --> 00:35:00,600 Speaker 2: whatever to it, that inherently makes tech political. Three, four, 613 00:35:00,680 --> 00:35:03,320 Speaker 2: or five years ago, you'd have people say, oh, stick 614 00:35:03,360 --> 00:35:07,080 Speaker 2: to tech, you're getting too political. Tech is inherently political. 615 00:35:07,239 --> 00:35:10,960 Speaker 2: It's the tools and the companies that provide the technical 616 00:35:11,000 --> 00:35:16,920 Speaker 2: infrastructure for how the world functions, and political decisions directly 617 00:35:16,960 --> 00:35:21,400 Speaker 2: dictate that.
And tech decisions can directly dictate political decisions 618 00:35:21,680 --> 00:35:23,879 Speaker 2: as well, so they're completely inseparable. 619 00:35:26,040 --> 00:35:28,239 Speaker 1: Maybe this is sort of obvious, but when I talk 620 00:35:28,280 --> 00:35:31,000 Speaker 1: to my friends who report on politics or the economy 621 00:35:31,080 --> 00:35:34,560 Speaker 1: or art or whatever, they're being expected to understand technology 622 00:35:34,600 --> 00:35:37,960 Speaker 1: for their work now. This stuff is connected, and I 623 00:35:38,000 --> 00:35:41,920 Speaker 1: think it always was, but especially now, we can't ignore it. 624 00:35:42,280 --> 00:35:44,160 Speaker 1: And that's part of what we're trying to explore with 625 00:35:44,360 --> 00:35:46,719 Speaker 1: kill Switch. If you want to keep up with Evan's work, 626 00:35:46,800 --> 00:35:49,680 Speaker 1: I highly recommend checking it out at NOTUS. That's 627 00:35:49,760 --> 00:35:53,560 Speaker 1: n-o-t-u-s. You should also check out Joseph Cox's articles 628 00:35:53,560 --> 00:35:55,799 Speaker 1: on 404 Media, and if you want a really 629 00:35:55,840 --> 00:35:59,399 Speaker 1: technical analysis of TeleMessage and how cartoonishly bad 630 00:35:59,440 --> 00:36:02,600 Speaker 1: it is, I highly recommend checking out Micah Lee's article 631 00:36:02,640 --> 00:36:05,000 Speaker 1: on his own site, and all those links are in 632 00:36:05,040 --> 00:36:14,640 Speaker 1: the show notes. Thank you so much for listening to 633 00:36:14,719 --> 00:36:17,160 Speaker 1: kill Switch. Let us know what you think, and yo, 634 00:36:17,239 --> 00:36:18,799 Speaker 1: if there's something you want us to cover, let us 635 00:36:18,840 --> 00:36:21,040 Speaker 1: know about that too.
You can hit us at kill 636 00:36:21,080 --> 00:36:24,200 Speaker 1: Switch at Kaleidoscope dot NYC, or you can hit me 637 00:36:24,280 --> 00:36:26,840 Speaker 1: at dex digi, that's d e x d i g 638 00:36:27,040 --> 00:36:29,279 Speaker 1: i, on the Gram or on Blue Sky if that's 639 00:36:29,280 --> 00:36:32,120 Speaker 1: your thing. Also, leave us a review, that helps other 640 00:36:32,160 --> 00:36:34,680 Speaker 1: people find the show, which in turn helps us keep 641 00:36:34,719 --> 00:36:38,280 Speaker 1: doing our thing. Kill Switch is hosted by me, Dexter Thomas. 642 00:36:38,800 --> 00:36:42,680 Speaker 1: It's produced by Sena Ozaki, Darl Lukkatts and Kate Osborne, 643 00:36:42,760 --> 00:36:44,720 Speaker 1: and this week we have production help from Alista 644 00:36:44,760 --> 00:36:47,920 Speaker 1: Midcalf. Our theme song is by me and Kyle Murdoch, 645 00:36:48,040 --> 00:36:51,719 Speaker 1: and Kyle also mixed the show. From Kaleidoscope, our executive 646 00:36:51,719 --> 00:36:56,160 Speaker 1: producers are Oz Woloshyn, Mangesh Hattikudur and Kate Osborne. From 647 00:36:56,200 --> 00:36:59,400 Speaker 1: iHeart, our executive producers are Katrina Norvell and Nikki 648 00:36:59,560 --> 00:36:59,719 Speaker 3: Ettore. 649 00:37:00,280 --> 00:37:03,200 Speaker 1: Catch you on the next one. Goodbye.