1 00:00:12,840 --> 00:00:16,239 Speaker 1: Hey, it's Oz Woloshyn, host of TechStuff. This week, 2 00:00:16,239 --> 00:00:18,040 Speaker 1: we want to do something a little bit different and 3 00:00:18,079 --> 00:00:20,079 Speaker 1: share an episode of a show that's new to the 4 00:00:20,160 --> 00:00:29,280 Speaker 1: Kaleidoscope and iHeart Podcast family. It's called Kill Switch. Kill 5 00:00:29,320 --> 00:00:33,760 Speaker 1: Switch is a technology survival guide to modern culture. Host 6 00:00:33,840 --> 00:00:37,800 Speaker 1: Dexter Thomas answers questions big and small, pulling back the 7 00:00:37,840 --> 00:00:41,400 Speaker 1: curtain on the systems we rely on daily, as well 8 00:00:41,440 --> 00:00:45,400 Speaker 1: as the more absurd corners of the Internet. So here's 9 00:00:45,440 --> 00:00:46,920 Speaker 1: an episode of Kill Switch. 10 00:00:54,320 --> 00:00:58,760 Speaker 2: Quick question, do you know about the xz Utils backdoor hack? 11 00:00:59,360 --> 00:00:59,600 Speaker 3: So what? 12 00:01:00,240 --> 00:01:03,360 Speaker 2: Or wait? Wait, wait, wait, wait, what? The xz Utils 13 00:01:03,400 --> 00:01:06,240 Speaker 2: backdoor? I have no idea what you're talking about. 14 00:01:06,400 --> 00:01:07,480 Speaker 1: I don't know what that is. 15 00:01:09,600 --> 00:01:12,920 Speaker 2: This is something almost nobody's heard of, but in the 16 00:01:12,959 --> 00:01:16,679 Speaker 2: spring of twenty twenty four we narrowly avoided a complete 17 00:01:16,720 --> 00:01:20,440 Speaker 2: technological disaster. So you've never heard of this though? 18 00:01:20,840 --> 00:01:21,160 Speaker 1: Nope? 19 00:01:22,040 --> 00:01:24,640 Speaker 3: Yeah, I just searched it up on Wikipedia and it 20 00:01:24,680 --> 00:01:25,800 Speaker 3: seems way too dense to read about. 21 00:01:25,680 --> 00:01:28,880 Speaker 2: These aren't just random people. These are 22 00:01:28,959 --> 00:01:32,360 Speaker 2: other journalists, people in general who keep up with the news. 23 00:01:32,680 --> 00:01:34,920 Speaker 3: Okay, I was like, wait, what did I miss? And 24 00:01:35,000 --> 00:01:38,319 Speaker 3: I feel bad, but I guess maybe I'm not the 25 00:01:38,400 --> 00:01:41,720 Speaker 3: only one. What, a journalist who doesn't know what this is about? 26 00:01:42,040 --> 00:01:44,600 Speaker 2: Even they didn't really know about what could have been 27 00:01:44,800 --> 00:01:47,480 Speaker 2: the biggest hack in the history of the Internet. 28 00:01:48,440 --> 00:01:50,880 Speaker 3: If this had not been caught, then this would have 29 00:01:51,000 --> 00:01:54,960 Speaker 3: been a skeleton key that would have allowed these attackers 30 00:01:55,040 --> 00:02:01,040 Speaker 3: to break into tens of millions of incredibly important servers 31 00:02:01,080 --> 00:02:04,240 Speaker 3: around the world. We probably would have had airlines not working, 32 00:02:04,600 --> 00:02:08,160 Speaker 3: trading halted, ATMs not working, banks not working, people not 33 00:02:08,160 --> 00:02:11,640 Speaker 3: able to get their money. You'd have a huge loss 34 00:02:11,720 --> 00:02:14,320 Speaker 3: of credibility of technology in people's lives. 35 00:02:15,120 --> 00:02:19,320 Speaker 2: Alex Stamos is a cybersecurity expert. Specifically, he's the Chief 36 00:02:19,400 --> 00:02:23,919 Speaker 2: Information Security Officer, or CISO, at a cybersecurity company called 37 00:02:23,960 --> 00:02:27,960 Speaker 2: SentinelOne, and he's the former CISO at Facebook.
He's 38 00:02:27,960 --> 00:02:30,800 Speaker 2: also a lecturer in the computer science department at Stanford, 39 00:02:31,280 --> 00:02:34,400 Speaker 2: and this attempted hack is something that is still keeping 40 00:02:34,480 --> 00:02:35,119 Speaker 2: him up at night. 41 00:02:36,040 --> 00:02:41,240 Speaker 3: It's fallen out of popular discussion, but among people in 42 00:02:41,320 --> 00:02:44,799 Speaker 3: security we're still talking about it. It uncovered a real 43 00:02:45,800 --> 00:02:50,960 Speaker 3: fundamental weakness that terrifies lots of people who have responsibility 44 00:02:50,960 --> 00:02:52,200 Speaker 3: in this area. 45 00:02:52,160 --> 00:02:55,399 Speaker 2: What scares them most, and this should scare us too, 46 00:02:56,000 --> 00:02:58,320 Speaker 2: is that this was caught by complete chance. 47 00:02:58,680 --> 00:03:02,880 Speaker 3: We just got lucky, like one dude got really bored 48 00:03:03,240 --> 00:03:07,640 Speaker 3: and noticed a tiny little change in the speed of 49 00:03:07,840 --> 00:03:11,400 Speaker 3: one program executing and pulled the thread, and on the 50 00:03:11,520 --> 00:03:14,799 Speaker 3: end of this thread was a humongous, ticking time bomb. 51 00:03:15,360 --> 00:03:18,679 Speaker 3: It was one dude, and he should never have to 52 00:03:18,680 --> 00:03:22,680 Speaker 3: buy a beer for himself ever again. Andres Freund, I'm 53 00:03:22,760 --> 00:03:25,640 Speaker 3: raising a toast to you right now. This is just water, 54 00:03:25,760 --> 00:03:27,400 Speaker 3: but I wish it was more. 55 00:03:33,800 --> 00:03:44,520 Speaker 2: From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm 56 00:03:44,600 --> 00:04:09,280 Speaker 2: Dexter Thomas. If you've never heard about this, 57 00:04:09,520 --> 00:04:12,520 Speaker 2: that's no reason to feel bad, but if it hadn't 58 00:04:12,560 --> 00:04:16,919 Speaker 2: been caught, it absolutely would have affected you. What kind 59 00:04:17,000 --> 00:04:19,200 Speaker 2: of activity are we talking about here? 60 00:04:19,360 --> 00:04:21,159 Speaker 3: Well, we really don't know, and because we don't know 61 00:04:21,160 --> 00:04:23,400 Speaker 3: who the attackers are, we don't know whether that would 62 00:04:23,400 --> 00:04:28,880 Speaker 3: have been used for really quiet surveillance. It could have 63 00:04:28,920 --> 00:04:34,039 Speaker 3: been used for national security intelligence gathering purposes, it could 64 00:04:34,040 --> 00:04:36,560 Speaker 3: have been used for a humongous heist of hundreds of 65 00:04:36,600 --> 00:04:40,240 Speaker 3: millions or billions of dollars of cryptocurrency, or it could 66 00:04:40,240 --> 00:04:42,880 Speaker 3: have been used as part of a massive cyber attack 67 00:04:42,960 --> 00:04:46,600 Speaker 3: to shut down millions of computers and cause massive disruptions. 68 00:04:47,880 --> 00:04:50,560 Speaker 2: One of the main reasons that this potential attack isn't 69 00:04:50,600 --> 00:04:54,960 Speaker 2: talked about much is because the details are kind of technical. Well, 70 00:04:55,160 --> 00:04:57,440 Speaker 2: some of the details are. A lot of this stuff 71 00:04:57,480 --> 00:05:00,320 Speaker 2: is really just basic human behavior. It's stuff that you 72 00:05:00,440 --> 00:05:02,760 Speaker 2: or I could do if we really wanted, and it 73 00:05:02,839 --> 00:05:06,640 Speaker 2: shows us that sometimes the best hacks are the simplest ones.
74 00:05:07,240 --> 00:05:10,400 Speaker 2: Let me break it down for you. In late March 75 00:05:10,400 --> 00:05:14,000 Speaker 2: of twenty twenty four, Andres Freund, who's an engineer at Microsoft, 76 00:05:14,279 --> 00:05:16,640 Speaker 2: was sitting at his desk doing his job when he 77 00:05:16,720 --> 00:05:19,640 Speaker 2: discovered a malicious piece of code in this little known 78 00:05:19,720 --> 00:05:23,240 Speaker 2: tool called xz Utils. This code created a method that 79 00:05:23,240 --> 00:05:27,080 Speaker 2: would allow hackers to access a lot of different computers. 80 00:05:27,480 --> 00:05:30,039 Speaker 2: Maybe right now you're thinking, okay, so why is this 81 00:05:30,120 --> 00:05:32,920 Speaker 2: a problem for me? I mean, I don't use xz Utils, 82 00:05:32,960 --> 00:05:35,960 Speaker 2: so they couldn't get on my computer. And yeah, maybe 83 00:05:36,000 --> 00:05:39,080 Speaker 2: you've never heard of xz Utils. Actually I hadn't either, 84 00:05:39,560 --> 00:05:41,840 Speaker 2: and I did what most people do when they don't 85 00:05:41,880 --> 00:05:45,880 Speaker 2: understand something about a computer: call an expert. But it 86 00:05:45,920 --> 00:05:49,560 Speaker 2: turns out that this really well respected expert found 87 00:05:49,600 --> 00:05:51,640 Speaker 2: out about xz Utils when I did. 88 00:05:52,560 --> 00:05:55,120 Speaker 3: Yeah, so I personally had not heard of xz Utils 89 00:05:55,640 --> 00:05:57,080 Speaker 3: before this, even. 90 00:05:56,920 --> 00:06:00,760 Speaker 2: You really? Yeah, I had definitely not heard of xz Utils. 91 00:06:01,160 --> 00:06:03,680 Speaker 2: I figured you would have. Hearing that you had not 92 00:06:04,040 --> 00:06:07,440 Speaker 2: heard of it before all this happened, frankly, that's a 93 00:06:07,440 --> 00:06:10,320 Speaker 2: little bit more scary to me now. So why does 94 00:06:10,360 --> 00:06:13,240 Speaker 2: this backdoor into a program that no one seems to 95 00:06:13,279 --> 00:06:17,560 Speaker 2: know about matter so much? And what is xz Utils? 96 00:06:17,839 --> 00:06:22,680 Speaker 3: This is the brilliance of what these attackers did. xz 97 00:06:22,720 --> 00:06:25,480 Speaker 3: Utils is an ingredient to an ingredient to an ingredient 98 00:06:25,520 --> 00:06:29,520 Speaker 3: to something really important. So the thing that they wanted 99 00:06:29,520 --> 00:06:32,400 Speaker 3: to have a backdoor into is a really important program 100 00:06:32,440 --> 00:06:35,480 Speaker 3: called OpenSSH. So this is something that every techie 101 00:06:35,520 --> 00:06:36,000 Speaker 3: has heard of. 102 00:06:36,880 --> 00:06:39,840 Speaker 2: All right, but what if you're not a techie? So 103 00:06:39,960 --> 00:06:42,320 Speaker 2: in order to understand the xz Utils hack, we do 104 00:06:42,400 --> 00:06:45,080 Speaker 2: need to back up and understand something that xz Utils 105 00:06:45,160 --> 00:06:47,520 Speaker 2: is used in: this thing called OpenSSH. 106 00:06:47,920 --> 00:06:53,640 Speaker 3: This is the program that the majority of Unix-like systems, 107 00:06:53,760 --> 00:06:59,400 Speaker 3: especially Linux, but also Macs and some other operating systems, use to allow 108 00:06:59,480 --> 00:07:02,400 Speaker 3: you to access them remotely over the Internet. 109 00:07:03,080 --> 00:07:06,000 Speaker 2: I'll get to the open later. But SSH stands for 110 00:07:06,080 --> 00:07:09,359 Speaker 2: secure shell, and let's just focus on secure right now.
111 00:07:09,880 --> 00:07:12,000 Speaker 2: If you think of the difference between posting a tweet 112 00:07:12,040 --> 00:07:15,320 Speaker 2: online and DMing someone, you're actually kind of halfway there. 113 00:07:15,880 --> 00:07:19,040 Speaker 2: OpenSSH allows you to communicate with a remote computer 114 00:07:19,400 --> 00:07:21,080 Speaker 2: just like you were sitting there right in front of it. 115 00:07:21,680 --> 00:07:23,840 Speaker 2: So even though you're far away, if you want to 116 00:07:23,880 --> 00:07:27,440 Speaker 2: send a message, or install programs or delete files, you 117 00:07:27,520 --> 00:07:30,080 Speaker 2: know that the connection is safe and that nobody else 118 00:07:30,120 --> 00:07:32,920 Speaker 2: can see what you're doing or tamper with that connection. 119 00:07:33,520 --> 00:07:35,360 Speaker 3: So you know, when you see, like, people in The 120 00:07:35,360 --> 00:07:39,400 Speaker 3: Matrix typing really fast, you see a lot of text, right? 121 00:07:39,440 --> 00:07:42,760 Speaker 3: If somebody is doing that remotely, it's probably over OpenSSH. 122 00:07:42,880 --> 00:07:44,760 Speaker 2: You might think this doesn't matter for you because you 123 00:07:44,760 --> 00:07:48,440 Speaker 2: don't use OpenSSH, but you do, because that's what 124 00:07:48,480 --> 00:07:51,520 Speaker 2: you use to connect to systems running Linux around the world. 125 00:07:52,000 --> 00:07:57,280 Speaker 3: Linux has become the standard operating system for the cloud. 126 00:07:57,760 --> 00:08:00,880 Speaker 3: So when you talk to Google, you're talking to a Linux system. 127 00:08:00,920 --> 00:08:02,960 Speaker 3: When you talk to Facebook, you're talking to a Linux system. 128 00:08:02,960 --> 00:08:06,480 Speaker 3: When you talk to Apple, you're probably talking to a 129 00:08:06,480 --> 00:08:08,880 Speaker 3: Linux system. Right now, the system that we're talking to 130 00:08:08,880 --> 00:08:11,720 Speaker 3: each other with almost certainly is running Linux. So the 131 00:08:11,840 --> 00:08:13,800 Speaker 3: vast majority of systems you talk to in the cloud 132 00:08:14,280 --> 00:08:15,120 Speaker 3: are running Linux. 133 00:08:15,520 --> 00:08:18,960 Speaker 2: Linux is used for Apple's iCloud, for social media sites 134 00:08:19,000 --> 00:08:23,240 Speaker 2: like Facebook, Instagram, for YouTube, for Twitter. It's used for 135 00:08:23,240 --> 00:08:25,960 Speaker 2: the New York Stock Exchange. Gamers use it when they 136 00:08:26,080 --> 00:08:29,160 Speaker 2: run Steam or they play games online, and the list 137 00:08:29,200 --> 00:08:32,360 Speaker 2: goes on. The vast majority of the Internet runs on 138 00:08:32,440 --> 00:08:36,000 Speaker 2: Linux, and OpenSSH makes sure that it's you logging 139 00:08:36,040 --> 00:08:37,720 Speaker 2: in and not somebody else. 140 00:08:38,160 --> 00:08:40,800 Speaker 3: When you log in and you get your mail, the 141 00:08:40,840 --> 00:08:43,240 Speaker 3: server that holds your mail has SSH on it. The server 142 00:08:43,280 --> 00:08:47,440 Speaker 3: that holds your social media content has SSH. The servers 143 00:08:47,440 --> 00:08:49,960 Speaker 3: that have your banking information have SSH. It's the door 144 00:08:50,000 --> 00:08:51,600 Speaker 3: by which you get into these systems.
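To make that idea concrete, here is a minimal sketch of what talking to a remote Linux machine over SSH can look like from Python, using the standard subprocess module to drive the ordinary ssh client. The hostname, user, and command are placeholders for illustration, not details from the episode, and it assumes an ssh client and key-based login are already set up.

    import subprocess

    # Run one command on a remote server over SSH and capture its output.
    # "admin@server.example.com" and "uname -a" are placeholder values.
    result = subprocess.run(
        ["ssh", "admin@server.example.com", "uname -a"],
        capture_output=True,  # collect stdout/stderr instead of printing them
        text=True,            # decode bytes to str
        timeout=30,
    )
    print(result.stdout)      # e.g. the remote machine's kernel and OS details

Everything described above, reading mail off a server, deploying code, deleting files, is some version of that loop: a command goes over the encrypted SSH connection, the remote machine runs it, and the result comes back.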
145 00:08:52,320 --> 00:08:56,160 Speaker 2: So OpenSSH is incredibly important to the Internet and 146 00:08:56,320 --> 00:08:59,240 Speaker 2: all the cloud systems that we rely on, and because 147 00:08:59,280 --> 00:09:01,560 Speaker 2: of that, it has a lot of eyes on it. 148 00:09:02,160 --> 00:09:05,920 Speaker 2: Trying to hack OpenSSH directly would pretty much be impossible. 149 00:09:06,400 --> 00:09:08,240 Speaker 2: Someone would catch you pretty quick. 150 00:09:09,000 --> 00:09:10,959 Speaker 3: People pay a lot of attention to it, a lot 151 00:09:10,960 --> 00:09:14,560 Speaker 3: of people run their code scanners on it, a lot 152 00:09:14,559 --> 00:09:16,640 Speaker 3: of people look for bugs in it, and so it 153 00:09:16,640 --> 00:09:19,880 Speaker 3: has been a while since OpenSSH has had itself 154 00:09:20,480 --> 00:09:23,160 Speaker 3: a humongous security flaw in it. If you just joined 155 00:09:23,240 --> 00:09:25,679 Speaker 3: the OpenSSH project and said, hey, I'm a new 156 00:09:25,720 --> 00:09:30,160 Speaker 3: guy that nobody ever knew, here's my code, everybody would 157 00:09:30,160 --> 00:09:34,160 Speaker 3: be super suspicious, right, and whoever these bad guys are, 158 00:09:34,679 --> 00:09:37,360 Speaker 3: they know that. So what they did was they looked 159 00:09:37,400 --> 00:09:41,760 Speaker 3: at OpenSSH and they looked at what we call its dependency graph. 160 00:09:41,760 --> 00:09:43,640 Speaker 3: They looked at all the stuff that 161 00:09:43,679 --> 00:09:47,240 Speaker 3: goes into OpenSSH, and what they saw was OpenSSH 162 00:09:47,320 --> 00:09:50,120 Speaker 3: depends on other things. 163 00:09:51,720 --> 00:09:55,360 Speaker 2: This is where xz Utils comes in. xz Utils is 164 00:09:55,400 --> 00:09:59,079 Speaker 2: one of the things that OpenSSH depends on. What 165 00:09:59,120 --> 00:10:02,400 Speaker 2: does xz Utils actually do? It's a compression library. 166 00:10:02,559 --> 00:10:06,200 Speaker 3: So it's just a library that is used to make 167 00:10:06,400 --> 00:10:09,360 Speaker 3: data that comes in smaller so that if you're moving 168 00:10:09,400 --> 00:10:11,000 Speaker 3: like a big file back and forth, it can fit 169 00:10:11,120 --> 00:10:13,600 Speaker 3: down a smaller pipe. Right, you might be talking to 170 00:10:13,640 --> 00:10:15,600 Speaker 3: a server on a satellite link, you might be talking 171 00:10:15,640 --> 00:10:17,640 Speaker 3: over a modem. Right, you might be talking over a 172 00:10:17,640 --> 00:10:19,720 Speaker 3: cell phone, and so you want your big file to 173 00:10:19,880 --> 00:10:21,040 Speaker 3: fit into a smaller pipe. 174 00:10:21,080 --> 00:10:23,120 Speaker 2: If you've ever used a zip file on your computer, 175 00:10:23,280 --> 00:10:26,720 Speaker 2: you get the general idea. Smaller files can be transferred faster, 176 00:10:27,160 --> 00:10:29,439 Speaker 2: which is important when you're dealing with so much data 177 00:10:29,440 --> 00:10:33,240 Speaker 2: flowing back and forth. xz Utils allows OpenSSH to be 178 00:10:33,360 --> 00:10:39,120 Speaker 2: both safe and fast. But that's the trick. By inserting 179 00:10:39,120 --> 00:10:42,160 Speaker 2: a backdoor into xz Utils, the hackers created a way 180 00:10:42,160 --> 00:10:46,319 Speaker 2: to access anything being transmitted via OpenSSH.
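If you want to see what a compression library like xz Utils actually does, here is a minimal sketch using Python's built-in lzma module, which reads and writes the same .xz format that the xz tools use; the sample data is just a placeholder.

    import lzma

    # Some repetitive placeholder data, the kind of thing that compresses well.
    data = b"example log line that repeats over and over\n" * 1000

    compressed = lzma.compress(data)        # produces .xz-format bytes by default
    restored = lzma.decompress(compressed)  # gets the original bytes back

    print(len(data), "bytes before,", len(compressed), "bytes after")
    assert restored == data                 # compression is lossless

That shrink-and-restore step is all a compression library is supposed to do for the software that uses it, which is part of why so few people were watching this one closely.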
That backdoor meant 181 00:10:46,320 --> 00:10:49,640 Speaker 2: they could not only read supposedly secure messages, but remotely 182 00:10:49,720 --> 00:10:53,120 Speaker 2: run code on any server that uses OpenSSH. And 183 00:10:53,200 --> 00:10:56,839 Speaker 2: since basically the entire Internet uses this thing, once you're 184 00:10:56,880 --> 00:10:59,080 Speaker 2: in there, you can do anything you want. 185 00:11:00,360 --> 00:11:02,960 Speaker 3: You could have used it for a bunch of very 186 00:11:03,200 --> 00:11:08,280 Speaker 3: quiet surgical attacks over a multi-year period, or you 187 00:11:08,320 --> 00:11:11,040 Speaker 3: could have done one humongous big bang where you knock 188 00:11:11,040 --> 00:11:12,400 Speaker 3: out a huge chunk of the Internet. 189 00:11:12,160 --> 00:11:14,760 Speaker 2: All at once. But how did hackers get access to 190 00:11:14,920 --> 00:11:18,280 Speaker 2: xz Utils in the first place? Well, remember when I promised 191 00:11:18,280 --> 00:11:21,480 Speaker 2: to tell you about the open in OpenSSH? OpenSSH 192 00:11:21,600 --> 00:11:25,880 Speaker 2: and also Linux are open source programs. This means 193 00:11:25,880 --> 00:11:28,240 Speaker 2: that anyone can look at the source code, because it's 194 00:11:28,280 --> 00:11:31,839 Speaker 2: open and it's posted publicly. The idea is that if 195 00:11:31,880 --> 00:11:34,720 Speaker 2: everyone works together on the code, it'll be better and 196 00:11:34,760 --> 00:11:38,000 Speaker 2: the public benefits, and so anyone's free to look at 197 00:11:38,000 --> 00:11:40,160 Speaker 2: the code, to learn from the code, or even to 198 00:11:40,240 --> 00:11:42,720 Speaker 2: remix it for their own use. And even if you 199 00:11:42,800 --> 00:11:45,560 Speaker 2: have no interest in all that nerd stuff, you still 200 00:11:45,640 --> 00:11:48,920 Speaker 2: use versions of open source code every day on basically 201 00:11:49,000 --> 00:11:50,560 Speaker 2: all of your devices. 202 00:11:50,600 --> 00:11:54,120 Speaker 3: When you're running open source software, which people don't understand, 203 00:11:54,200 --> 00:11:56,520 Speaker 3: basically everybody is, right. So what kind of phone do 204 00:11:56,520 --> 00:11:58,120 Speaker 3: you have? Do you have an iPhone or Android? 205 00:11:58,360 --> 00:11:59,240 Speaker 2: I actually have an Android. 206 00:11:59,280 --> 00:12:02,240 Speaker 3: Yeah, okay, Android. A humongous chunk of that code is 207 00:12:02,280 --> 00:12:05,000 Speaker 3: open source, right right. And that is code that is 208 00:12:05,040 --> 00:12:08,480 Speaker 3: maintained by volunteers, and you have no idea who those 209 00:12:08,520 --> 00:12:11,080 Speaker 3: people are. Google has no idea who those people are, right. 210 00:12:11,360 --> 00:12:14,160 Speaker 3: Google collects all this code from around the internet, they 211 00:12:14,160 --> 00:12:16,520 Speaker 3: package it all up, and then they put it on 212 00:12:16,559 --> 00:12:18,240 Speaker 3: a phone, or they send it to Samsung, and Samsung 213 00:12:18,240 --> 00:12:19,040 Speaker 3: puts it on the phone. 214 00:12:19,160 --> 00:12:22,240 Speaker 2: And before we get any further, iPhone people, this applies 215 00:12:22,280 --> 00:12:25,040 Speaker 2: to you too. Your iPhone uses a lot of open 216 00:12:25,080 --> 00:12:28,280 Speaker 2: source code also. And don't get me wrong, this is 217 00:12:28,360 --> 00:12:29,200 Speaker 2: not a bad thing.
218 00:12:29,559 --> 00:12:32,120 Speaker 3: It's great because it's free and it makes the phone cheaper, 219 00:12:32,160 --> 00:12:34,640 Speaker 3: and it's cool that we all get to contribute. But 220 00:12:34,720 --> 00:12:38,600 Speaker 3: the flip side is that, yes, OpenSSH itself gets 221 00:12:38,600 --> 00:12:41,280 Speaker 3: lots of love, the Linux kernel gets lots of love, right. 222 00:12:41,600 --> 00:12:44,880 Speaker 3: But something like xz Utils, which is this tiny little 223 00:12:44,960 --> 00:12:47,640 Speaker 3: component over here, does not get lots of love, and 224 00:12:47,800 --> 00:12:51,320 Speaker 3: xz Utils at the time was maintained by one person. 225 00:12:51,760 --> 00:12:54,400 Speaker 3: That one dude was then manipulated into giving up control 226 00:12:54,440 --> 00:12:57,000 Speaker 3: of it, and the person he gave up control of 227 00:12:57,040 --> 00:12:59,600 Speaker 3: it to turned out to be a totally fake persona 228 00:13:00,000 --> 00:13:00,600 Speaker 3: that does not exist. 229 00:13:03,200 --> 00:13:04,720 Speaker 2: This is where we get to the human part of 230 00:13:04,720 --> 00:13:08,120 Speaker 2: the story. The one guy who was maintaining xz Utils, 231 00:13:08,360 --> 00:13:11,679 Speaker 2: his name was Lasse Collin. He'd been maintaining xz 232 00:13:11,800 --> 00:13:14,080 Speaker 2: Utils since two thousand and nine and he was the 233 00:13:14,080 --> 00:13:17,360 Speaker 2: sole maintainer for the project. He wasn't being paid for it. 234 00:13:17,480 --> 00:13:20,840 Speaker 2: He was a volunteer. That's usually how open source projects go. 235 00:13:21,320 --> 00:13:24,160 Speaker 2: In twenty twenty two, Lasse Collin started to get a 236 00:13:24,160 --> 00:13:27,280 Speaker 2: lot of requests to make updates to the code. Throughout 237 00:13:27,320 --> 00:13:31,280 Speaker 2: the year, multiple accounts, seemingly out of nowhere, started complaining 238 00:13:31,360 --> 00:13:34,959 Speaker 2: that Collin wasn't working fast enough and implying that if 239 00:13:34,960 --> 00:13:37,760 Speaker 2: he wasn't interested in doing this anymore, maybe he wasn't 240 00:13:37,800 --> 00:13:40,920 Speaker 2: the guy for the job. And the pressure was getting 241 00:13:40,920 --> 00:13:44,120 Speaker 2: to him. In June of twenty twenty two, Collin wrote 242 00:13:44,120 --> 00:13:47,560 Speaker 2: in a public note, quote, I haven't lost interest, but 243 00:13:47,679 --> 00:13:50,960 Speaker 2: my ability to care has been fairly limited, mostly due 244 00:13:50,960 --> 00:13:53,760 Speaker 2: to long-term mental health issues, but also due to 245 00:13:53,800 --> 00:13:56,880 Speaker 2: some other things. He also went on to remind people 246 00:13:57,000 --> 00:13:59,760 Speaker 2: that, quote, it's also good to keep in mind that 247 00:13:59,800 --> 00:14:04,559 Speaker 2: this is an unpaid hobby project. Thankfully, right about 248 00:14:04,559 --> 00:14:07,920 Speaker 2: that time, a new programmer had come in to help. This 249 00:14:08,000 --> 00:14:11,320 Speaker 2: new person's name was Jia Tan. Collin seemed a little 250 00:14:11,360 --> 00:14:15,680 Speaker 2: relieved that finally someone wasn't just complaining but helping.
In 251 00:14:15,720 --> 00:14:17,960 Speaker 2: that same note from June, he wrote that he'd been 252 00:14:18,000 --> 00:14:21,160 Speaker 2: working a bit with Jia Tan on xz Utils to address 253 00:14:21,200 --> 00:14:24,400 Speaker 2: all of those complaints, and he said about Jia, quote, 254 00:14:24,720 --> 00:14:27,040 Speaker 2: perhaps he will have a bigger role in the future. 255 00:14:27,400 --> 00:14:31,680 Speaker 2: We'll see. Over the course of a few years, Jia 256 00:14:31,760 --> 00:14:35,360 Speaker 2: Tan really started to gain Lasse Collin's trust. Jia Tan 257 00:14:35,600 --> 00:14:38,600 Speaker 2: was the ideal contributor. He didn't just help when he 258 00:14:38,640 --> 00:14:40,880 Speaker 2: was asked to, but he would offer to take on 259 00:14:41,040 --> 00:14:44,080 Speaker 2: more work, and by twenty twenty four, Collin had made 260 00:14:44,160 --> 00:14:47,440 Speaker 2: Jia Tan a co-maintainer on the project, which allowed 261 00:14:47,480 --> 00:14:49,440 Speaker 2: him to add code without needing approval. 262 00:14:50,680 --> 00:14:52,640 Speaker 3: This is a human attack, right. It all happened in 263 00:14:52,680 --> 00:14:55,040 Speaker 3: the open, but the way they did it was they 264 00:14:55,120 --> 00:14:58,520 Speaker 3: created these fake personas where one guy's super friendly and 265 00:14:58,600 --> 00:15:01,960 Speaker 3: one guy's a jerk, and the jerk basically is abusing 266 00:15:02,280 --> 00:15:05,720 Speaker 3: the person who's maintaining the software and saying, oh, I 267 00:15:05,760 --> 00:15:07,640 Speaker 3: need this change, I need this change. You're so slow. 268 00:15:07,680 --> 00:15:09,320 Speaker 3: Why are you so slow? And remember, this guy's not 269 00:15:09,320 --> 00:15:13,840 Speaker 3: getting paid, right? And so eventually they basically bully this 270 00:15:13,920 --> 00:15:16,920 Speaker 3: guy into saying, oh, I'm tired of doing this. I 271 00:15:16,960 --> 00:15:18,720 Speaker 3: don't want to do it anymore. And then the nice 272 00:15:18,760 --> 00:15:21,520 Speaker 3: guy's like, oh, well you know, I'll do it for you. 273 00:15:21,600 --> 00:15:24,640 Speaker 3: I'll take over, man, let me take this burden 274 00:15:24,360 --> 00:15:26,880 Speaker 2: for you. Right, very convenient, right. 275 00:15:27,520 --> 00:15:30,320 Speaker 3: And this took several years, and so this shows you 276 00:15:30,440 --> 00:15:34,720 Speaker 3: kind of the long play. They're willing to spend months 277 00:15:34,720 --> 00:15:38,200 Speaker 3: and months and months and in fact years building these 278 00:15:38,200 --> 00:15:41,160 Speaker 3: personas because, like, look, if you just created an account 279 00:15:41,200 --> 00:15:43,200 Speaker 3: and you're like, hey, I've got code, take it, that 280 00:15:43,200 --> 00:15:46,400 Speaker 3: wouldn't work. So what these people figured out is that 281 00:15:46,440 --> 00:15:49,200 Speaker 3: you have to create these personas. They have to seem real. 282 00:15:49,920 --> 00:15:53,120 Speaker 3: You have to make posts, you have to contribute legit stuff, 283 00:15:53,640 --> 00:15:56,120 Speaker 3: you've got to create kind of a history, build a relationship. 284 00:15:56,200 --> 00:15:58,160 Speaker 3: You have to build a relationship. And so the guy 285 00:15:58,200 --> 00:16:00,480 Speaker 3: who maintains it gives it up, like, oh, thank 286 00:16:00,480 --> 00:16:02,480 Speaker 3: you so much
for taking this burden from me, because 287 00:16:02,480 --> 00:16:04,280 Speaker 3: look at these jerks. Now, of course he doesn't know 288 00:16:04,520 --> 00:16:07,120 Speaker 3: that the jerks worked for the same team, or maybe 289 00:16:07,120 --> 00:16:10,240 Speaker 3: are even the same person as the nice guy, right. 290 00:16:10,720 --> 00:16:12,520 Speaker 3: And then he hands it over to this nice guy 291 00:16:12,520 --> 00:16:15,160 Speaker 3: who's a friend of his, and then the friend takes 292 00:16:15,160 --> 00:16:17,960 Speaker 3: it over and then does a bunch of legitimate stuff, 293 00:16:18,280 --> 00:16:20,040 Speaker 3: and then in the middle of all that legitimate stuff 294 00:16:20,040 --> 00:16:21,920 Speaker 3: inserts a very, very subtle backdoor. 295 00:16:22,280 --> 00:16:26,480 Speaker 2: I've seen this backdoor talked about using the phrase sophisticated, 296 00:16:26,480 --> 00:16:29,360 Speaker 2: that it was very sophisticated. Yes, in some ways it 297 00:16:29,400 --> 00:16:31,840 Speaker 2: sounds sophisticated, but in some ways it sounds like it 298 00:16:31,960 --> 00:16:34,800 Speaker 2: kind of wasn't, because a lot of it just revolved 299 00:16:34,840 --> 00:16:37,920 Speaker 2: around getting somebody to give them some access. 300 00:16:38,600 --> 00:16:42,800 Speaker 3: The code was sophisticated, the method of getting in there 301 00:16:42,960 --> 00:16:45,400 Speaker 3: was very human. It was bugging a guy until he 302 00:16:45,440 --> 00:16:50,000 Speaker 3: gave up control. Yes, right, just being a nuisance. Just 303 00:16:50,000 --> 00:16:54,040 Speaker 3: being a nuisance. So who was behind those fake personas? 304 00:16:54,680 --> 00:16:57,560 Speaker 3: We don't know for sure, but Alex has a theory. 305 00:16:58,240 --> 00:17:10,440 Speaker 3: That's after the break. Over the course of years, the 306 00:17:10,480 --> 00:17:14,479 Speaker 3: one guy maintaining this very important tool called xz Utils, 307 00:17:14,760 --> 00:17:19,000 Speaker 3: Lasse Collin, was being bullied and manipulated online to 308 00:17:19,040 --> 00:17:22,159 Speaker 3: give a persona called Jia Tan a lead role in 309 00:17:22,200 --> 00:17:27,960 Speaker 3: handling the code. But who is Jia Tan? Everybody's been 310 00:17:28,000 --> 00:17:30,480 Speaker 3: asking this question of, like, who did this, who's behind this? 311 00:17:31,040 --> 00:17:34,000 Speaker 3: Most of the names have kind of an Asian origin, right. 312 00:17:34,080 --> 00:17:37,840 Speaker 3: So there's accounts like Jigar Kumar. The key one is 313 00:17:37,960 --> 00:17:40,800 Speaker 3: Jia Tan, which is like, could be Chinese, could be Korean. 314 00:17:41,520 --> 00:17:44,560 Speaker 3: Most of either the names or the technical indicators 315 00:17:44,600 --> 00:17:47,080 Speaker 3: point to Asia, right. So the time zones that this 316 00:17:47,119 --> 00:17:50,119 Speaker 3: person was working in are kind of the East Asian 317 00:17:50,119 --> 00:17:53,199 Speaker 3: time zones, so it's like Beijing or Korea. The names 318 00:17:53,240 --> 00:17:56,119 Speaker 3: are Asian. Everything points to Asia, which makes a lot 319 00:17:56,119 --> 00:17:59,320 Speaker 3: of people think it's Russia actually, because it's just too perfect, right. 320 00:18:00,000 --> 00:18:04,360 Speaker 3: Which is like, somebody spent three years doing all 321 00:18:04,400 --> 00:18:08,640 Speaker 3: this work, and then you're like, like, let's say you're Chinese.
322 00:18:08,840 --> 00:18:10,880 Speaker 3: Are you gonna use like a Chinese name as your 323 00:18:10,880 --> 00:18:13,560 Speaker 3: fake name? Are you going to spend three years but 324 00:18:13,600 --> 00:18:17,400 Speaker 3: then work in your normal time zone? And generally 325 00:18:17,520 --> 00:18:20,680 Speaker 3: the only actor who has shown this level of patience, 326 00:18:20,720 --> 00:18:23,720 Speaker 3: who's been willing to spend three years working on a 327 00:18:23,720 --> 00:18:26,000 Speaker 3: backdoor like this, the only people who have ever 328 00:18:26,040 --> 00:18:30,000 Speaker 3: done that is either the United States or the SVR. 329 00:18:30,920 --> 00:18:33,920 Speaker 3: So Russia okay, yeah, are really the only groups where 330 00:18:33,960 --> 00:18:37,120 Speaker 3: you've seen people spend years kind of doing this kind 331 00:18:37,119 --> 00:18:39,000 Speaker 3: of work. And a lot of people don't think it 332 00:18:39,040 --> 00:18:41,080 Speaker 3: would be the US doing something like this, that they 333 00:18:41,080 --> 00:18:43,920 Speaker 3: would never mess with something this important, because also the 334 00:18:43,960 --> 00:18:45,520 Speaker 3: thing the Russians really like to do is blame other 335 00:18:45,560 --> 00:18:49,320 Speaker 3: people, right. Again, because we never got to the point 336 00:18:49,359 --> 00:18:53,240 Speaker 3: of it being used. Usually attribution is done after something's used, 337 00:18:53,600 --> 00:18:55,800 Speaker 3: and so it's a lot easier to figure out, because 338 00:18:55,880 --> 00:18:59,479 Speaker 3: then you can ask cui bono, right, who benefits. But like 339 00:18:59,640 --> 00:19:02,440 Speaker 3: all of these indicators pointing specifically to kind of China 340 00:19:02,520 --> 00:19:07,160 Speaker 3: or Korea makes you think it's just a little too obvious. 341 00:19:07,960 --> 00:19:11,720 Speaker 2: A major theory in cybersecurity circles is that Jia Tan isn't 342 00:19:11,760 --> 00:19:16,520 Speaker 2: one person, it's potentially multiple people, but likely Russian hackers 343 00:19:16,560 --> 00:19:20,040 Speaker 2: working for the SVR, which is Russia's foreign intelligence service, 344 00:19:20,560 --> 00:19:23,159 Speaker 2: and that they tried to cover their tracks even if 345 00:19:23,240 --> 00:19:24,520 Speaker 2: it wasn't consistent. 346 00:19:24,920 --> 00:19:27,480 Speaker 3: The guys who work for the professionals will change their 347 00:19:27,480 --> 00:19:31,639 Speaker 3: time zones specifically around either what allows them to 348 00:19:31,680 --> 00:19:35,640 Speaker 3: avoid detection, or specifically around whatever they're doing for attribution. 349 00:19:35,960 --> 00:19:39,040 Speaker 2: Well, there were some times that the time zones actually 350 00:19:39,080 --> 00:19:42,680 Speaker 2: pointed to an Eastern European time zone or 351 00:19:42,720 --> 00:19:43,760 Speaker 2: another time zone, right? 352 00:19:43,680 --> 00:19:45,800 Speaker 3: Yeah, I mean, there it is a little mixed, right. 353 00:19:46,080 --> 00:19:50,320 Speaker 3: So somebody could be working from the eastern side of Russia, 354 00:19:50,440 --> 00:19:52,480 Speaker 3: or they could be waking up early in Moscow or Saint 355 00:19:52,480 --> 00:19:54,320 Speaker 3: Petersburg and then they slipped, right.
356 00:19:54,800 --> 00:19:56,679 Speaker 2: In other words, they might have just slipped up and 357 00:19:56,680 --> 00:19:59,960 Speaker 2: forgotten to change their time zones, because remember, this happened 358 00:20:00,080 --> 00:20:02,840 Speaker 2: over the course of years. Maybe somebody had an 359 00:20:02,840 --> 00:20:05,600 Speaker 2: off day and forgot to change the computer settings. But 360 00:20:05,960 --> 00:20:09,480 Speaker 2: Alex has another reason for suspecting Russia over China. 361 00:20:10,800 --> 00:20:14,320 Speaker 3: Chinese hackers, for the most part, work very rigorous hours. 362 00:20:14,480 --> 00:20:17,880 Speaker 3: You can almost always tell when Chinese hackers are working 363 00:20:17,960 --> 00:20:20,680 Speaker 3: because they work office hours. They work eight to five, 364 00:20:20,800 --> 00:20:24,000 Speaker 3: eight to six. Really, it's like very regular, yeah okay. 365 00:20:24,040 --> 00:20:26,600 Speaker 3: Whereas it's much harder to do time zone stuff 366 00:20:26,600 --> 00:20:28,920 Speaker 3: for the Russians, because they will work whatever hours 367 00:20:28,920 --> 00:20:30,919 Speaker 3: they need to work. You know that scene in 368 00:20:31,000 --> 00:20:33,240 Speaker 3: like one of the Bourne movies where it's like the 369 00:20:33,280 --> 00:20:35,160 Speaker 3: club scene? I always think about this with Russia. 370 00:20:35,160 --> 00:20:37,000 Speaker 3: There's like a club scene in Russia, and it's like 371 00:20:37,000 --> 00:20:38,040 Speaker 3: you think it's the middle of the night and he 372 00:20:38,080 --> 00:20:40,439 Speaker 3: walks out, it's like ten am or something. Right, It's like, 373 00:20:40,440 --> 00:20:43,399 Speaker 3: that's what I think about with Russian hackers. Whereas like 374 00:20:43,640 --> 00:20:46,680 Speaker 3: for China, it's amazing because it's like, oh, six pm 375 00:20:46,680 --> 00:20:49,760 Speaker 3: in Beijing, you know, it's like you know, everybody goes home. 376 00:20:49,680 --> 00:20:50,399 Speaker 2: The hacking stops. 377 00:20:50,560 --> 00:20:53,400 Speaker 3: Yeah, or like Chinese New Year, Lunar New Year, 378 00:20:54,080 --> 00:20:56,080 Speaker 3: everybody goes home, goes and sees their parents in the village 379 00:20:56,119 --> 00:20:58,640 Speaker 3: or whatever, and hacking stops. It's amazing. 380 00:20:59,359 --> 00:21:01,840 Speaker 2: And in the case of xz Utils, looking at 381 00:21:01,840 --> 00:21:05,240 Speaker 2: the timing of when this Jia Tan was submitting code, there's 382 00:21:05,320 --> 00:21:08,280 Speaker 2: a bunch of submissions during Lunar New Year, but during 383 00:21:08,359 --> 00:21:14,800 Speaker 2: big Eastern European holidays like Christmas, crickets. But that leaves 384 00:21:14,840 --> 00:21:18,640 Speaker 2: a question, what's the motive? Why would the Russian SVR 385 00:21:18,840 --> 00:21:19,760 Speaker 2: want to do this? 386 00:21:20,440 --> 00:21:23,680 Speaker 3: So OpenSSH, everybody uses it. That's why this is so powerful: 387 00:21:23,960 --> 00:21:25,560 Speaker 3: you don't have to have a specific target in mind, 388 00:21:25,560 --> 00:21:27,359 Speaker 3: which is why you'd also spend three years doing it. 389 00:21:27,640 --> 00:21:32,080 Speaker 3: Because let's say you're at the SVR. You know, no 390 00:21:32,119 --> 00:21:34,879 Speaker 3: matter what war you're involved with, no matter what target 391 00:21:34,920 --> 00:21:37,480 Speaker 3: you're going after, OpenSSH is going to be useful.
392 00:21:37,760 --> 00:21:40,320 Speaker 3: So this is probably a team at the SVR who 393 00:21:40,480 --> 00:21:42,359 Speaker 3: don't know what it's going to be used for. They 394 00:21:42,400 --> 00:21:43,680 Speaker 3: just know they're going to get a medal. 395 00:21:43,720 --> 00:21:45,639 Speaker 2: You'll be able to use this at some point, yeah, 396 00:21:45,680 --> 00:21:46,359 Speaker 2: who knows for what? 397 00:21:46,480 --> 00:21:48,440 Speaker 3: And the US does the same thing, right? Like, there's 398 00:21:48,480 --> 00:21:50,800 Speaker 3: people whose job it is to get the capability, and 399 00:21:51,119 --> 00:21:54,399 Speaker 3: it's other guys' job, who understand the geopolitics, who understand 400 00:21:54,400 --> 00:21:56,159 Speaker 3: the intelligence, to use it. 401 00:21:57,840 --> 00:22:02,200 Speaker 2: But thankfully, last spring, Andres Freund, the Microsoft engineer, 402 00:22:02,640 --> 00:22:05,439 Speaker 2: was able to discover the backdoor. But this was 403 00:22:05,440 --> 00:22:07,920 Speaker 2: all by chance. He wasn't looking for it. 404 00:22:08,640 --> 00:22:12,240 Speaker 3: He works on a database called Postgres, so he doesn't 405 00:22:12,240 --> 00:22:15,879 Speaker 3: work on xz Utils. He works on Postgres, which is a big 406 00:22:15,920 --> 00:22:20,480 Speaker 3: open source database program that Microsoft uses in their Azure cloud. 407 00:22:20,560 --> 00:22:23,320 Speaker 3: So I'm guessing that's why Microsoft pays him. And in 408 00:22:24,040 --> 00:22:28,639 Speaker 3: the next version of Debian, so a popular Linux distribution, 409 00:22:29,240 --> 00:22:32,360 Speaker 3: Postgres was running a little bit slower, just tiny, tiny, 410 00:22:32,400 --> 00:22:33,760 Speaker 3: a little bit, tiny, tiny. 411 00:22:33,520 --> 00:22:36,439 Speaker 2: A tiny little bit, right. So tiny, like how much slower? 412 00:22:36,920 --> 00:22:39,480 Speaker 3: Like in one specific circumstance, it was taking a couple 413 00:22:39,480 --> 00:22:40,720 Speaker 3: of milliseconds longer to do something. 414 00:22:40,720 --> 00:22:42,480 Speaker 2: Right, so like a millisecond. 415 00:22:42,560 --> 00:22:45,000 Speaker 3: Yeah, but like if you're a database guy, that's a lot, right. 416 00:22:45,119 --> 00:22:49,160 Speaker 3: And so he's super looking into what is going on, 417 00:22:49,600 --> 00:22:52,840 Speaker 3: and he realizes, oh, it's not actually Postgres that's doing this, 418 00:22:52,960 --> 00:22:55,720 Speaker 3: it's OpenSSH. And so he could have 419 00:22:55,720 --> 00:22:57,520 Speaker 3: stopped there, because he could have been like, oh, well, 420 00:22:57,520 --> 00:23:00,040 Speaker 3: it's not my problem, right, it's not my thing, and 421 00:23:00,080 --> 00:23:02,040 Speaker 3: then maybe nobody would have looked at it, right? Like, 422 00:23:02,200 --> 00:23:04,320 Speaker 3: you see in open source that people pass 423 00:23:04,680 --> 00:23:07,800 Speaker 3: problems to each other all the time, right. So 424 00:23:07,880 --> 00:23:10,560 Speaker 3: I think a normal person, even an 425 00:23:10,640 --> 00:23:14,280 Speaker 3: open source developer, would have been like, oh, okay, 426 00:23:15,400 --> 00:23:18,520 Speaker 3: I looked at this, it's not me. I'm gonna let 427 00:23:18,520 --> 00:23:20,720 Speaker 3: it go. But he did not let it go.
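To give a sense of how a slowdown that small even gets noticed, here is a minimal sketch of the kind of measurement a developer might run: time the same command many times and compare the averages between two builds. The paths and commands are placeholders for illustration, and this is not Freund's actual benchmark.

    import subprocess
    import time

    def average_runtime(argv, runs=20):
        # Run a command repeatedly and return its average wall-clock time in seconds.
        total = 0.0
        for _ in range(runs):
            start = time.perf_counter()
            subprocess.run(argv, capture_output=True)
            total += time.perf_counter() - start
        return total / runs

    # Placeholder paths: compare an older build of a tool against a newer one.
    old_build = average_runtime(["/usr/bin/ssh", "-V"])
    new_build = average_runtime(["/usr/local/bin/ssh", "-V"])

    print(f"old: {old_build * 1000:.1f} ms, new: {new_build * 1000:.1f} ms")
    print(f"difference: {(new_build - old_build) * 1000:.1f} ms")

A consistent difference of a few milliseconds in a program that should not have changed is exactly the kind of anomaly that made Freund keep pulling the thread.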
He 428 00:23:20,920 --> 00:23:25,120 Speaker 3: ended up digging into, okay, well, what changed in 429 00:23:25,119 --> 00:23:26,800 Speaker 3: OpenSSH? And then he looks into 430 00:23:26,880 --> 00:23:29,600 Speaker 3: OpenSSH and sees this code. And so what 431 00:23:29,640 --> 00:23:32,960 Speaker 3: the attackers did is they created what's called a 432 00:23:33,119 --> 00:23:35,359 Speaker 3: NOBUS backdoor: nobody but us. 433 00:23:36,280 --> 00:23:39,320 Speaker 2: NOBUS, or nobody but us, is a way of 434 00:23:39,359 --> 00:23:43,800 Speaker 2: creating a backdoor into something where nobody but us, 435 00:23:44,040 --> 00:23:45,560 Speaker 2: the hackers, has the key. 436 00:23:46,280 --> 00:23:48,880 Speaker 3: They want a skeleton key that only they can use, 437 00:23:49,160 --> 00:23:51,439 Speaker 3: but NOBUS backdoors, nobody but us backdoors, 438 00:23:51,960 --> 00:23:56,040 Speaker 3: are actually hard to sneak in because they're pretty, 439 00:23:56,119 --> 00:23:57,160 Speaker 3: like, obviously sketchy. 440 00:23:57,640 --> 00:24:00,320 Speaker 2: So instead of doing everything all at once, they delivered 441 00:24:00,440 --> 00:24:04,399 Speaker 2: multiple patches in multiple different places, little things here and 442 00:24:04,480 --> 00:24:06,920 Speaker 2: there that wouldn't raise suspicion if you looked at one 443 00:24:07,160 --> 00:24:10,120 Speaker 2: or two or three of them, but layered on top 444 00:24:10,160 --> 00:24:13,480 Speaker 2: of each other, it created a key that only they 445 00:24:13,520 --> 00:24:14,120 Speaker 2: could use. 446 00:24:15,160 --> 00:24:17,159 Speaker 3: And so because they did all this stuff to kind 447 00:24:17,160 --> 00:24:19,760 Speaker 3: of obfuscate it and make it super secret, they actually 448 00:24:19,800 --> 00:24:26,520 Speaker 3: created the performance impact that Andres saw and then went 449 00:24:26,600 --> 00:24:28,080 Speaker 3: way out of his way to pull on. And then he 450 00:24:28,119 --> 00:24:32,240 Speaker 3: posts in a public post, guys, this is super sketchy, right? Like, 451 00:24:32,480 --> 00:24:36,160 Speaker 3: look at this code. There's no good argument for what's going 452 00:24:35,960 --> 00:24:41,520 Speaker 2: on here. Right. So, I mean I kind of have 453 00:24:41,600 --> 00:24:44,720 Speaker 2: to wonder about what the implications for this are. I mean, 454 00:24:44,760 --> 00:24:47,840 Speaker 2: this clearly almost worked. Do you think there's hackers 455 00:24:47,840 --> 00:24:51,040 Speaker 2: out there saying, okay, yeah, yeah, let me change my 456 00:24:51,560 --> 00:24:53,080 Speaker 2: approach and maybe this is the way to do it? 457 00:24:53,640 --> 00:24:55,320 Speaker 3: Yeah. I mean, what I'm afraid of is we haven't 458 00:24:55,320 --> 00:24:58,040 Speaker 3: found any other ones like this. So what I thought 459 00:24:58,320 --> 00:25:00,920 Speaker 3: would happen at the time is, I'm like, oh, man, 460 00:25:00,960 --> 00:25:02,919 Speaker 3: we'll have one or two more of these, because everybody 461 00:25:03,000 --> 00:25:06,160 Speaker 3: started looking. And then nobody else found any other ones, 462 00:25:06,200 --> 00:25:07,000 Speaker 3: which terrifies me. 463 00:25:07,440 --> 00:25:08,840 Speaker 2: You think there's more like this out there? 464 00:25:09,000 --> 00:25:10,880 Speaker 3: I think it's quite possible there's more like this.
Yeah. 465 00:25:11,160 --> 00:25:13,520 Speaker 3: Like, if anybody has an idea, two or three other 466 00:25:13,520 --> 00:25:15,920 Speaker 3: people have had the idea, right? So I can't imagine 467 00:25:15,920 --> 00:25:17,880 Speaker 3: these are the only people who are like, oh, I'm 468 00:25:17,880 --> 00:25:21,160 Speaker 3: gonna go bully some maintainer of one of the five 469 00:25:21,200 --> 00:25:24,280 Speaker 3: thousand libraries on Linux to go take it over or 470 00:25:24,320 --> 00:25:27,760 Speaker 3: submit a patch. I can't imagine there aren't other ones. Now, 471 00:25:27,800 --> 00:25:30,359 Speaker 3: are they in OpenSSH or are they something much more subtle? 472 00:25:30,640 --> 00:25:33,120 Speaker 3: I don't know. I mean, this would have been both 473 00:25:33,240 --> 00:25:37,199 Speaker 3: in kind of one of the worst possible places, and 474 00:25:37,240 --> 00:25:39,879 Speaker 3: it would have been a skeleton key that only this 475 00:25:39,920 --> 00:25:42,080 Speaker 3: attacker could have used, which is like kind of the 476 00:25:42,080 --> 00:25:45,200 Speaker 3: worst case scenario. It's also the hardest level of difficulty, right, 477 00:25:45,240 --> 00:25:48,560 Speaker 3: these people picked the hardest-level thing. If you want 478 00:25:48,600 --> 00:25:51,359 Speaker 3: to do something much, much simpler, you go after 479 00:25:51,520 --> 00:25:56,159 Speaker 3: a much lesser used service that's specifically at the target 480 00:25:56,200 --> 00:25:58,480 Speaker 3: that you're going after. If you're going after a specific 481 00:25:58,520 --> 00:26:01,760 Speaker 3: target and you're like, oh, they use this 482 00:26:01,880 --> 00:26:05,080 Speaker 3: one specific service that's much less popular, that doesn't have 483 00:26:05,119 --> 00:26:07,199 Speaker 3: all these eyeballs on it, then you don't have to 484 00:26:07,240 --> 00:26:08,040 Speaker 3: be as tricky. 485 00:26:09,480 --> 00:26:12,920 Speaker 2: There haven't been any like this in OpenSSH, but there 486 00:26:12,960 --> 00:26:15,920 Speaker 2: have been other attempts that the Open Source Security Foundation 487 00:26:16,160 --> 00:26:19,720 Speaker 2: and the OpenJS Foundation have found that use similar 488 00:26:19,760 --> 00:26:23,760 Speaker 2: social tactics. One project received emails from accounts asking to 489 00:26:23,800 --> 00:26:28,119 Speaker 2: be designated as project maintainers despite having little prior involvement, 490 00:26:28,520 --> 00:26:32,560 Speaker 2: and two other projects saw very similar suspicious patterns. This 491 00:26:32,680 --> 00:26:35,920 Speaker 2: kind of social engineering is really effective because you don't 492 00:26:35,920 --> 00:26:39,159 Speaker 2: have to manipulate code. You just manipulate the person who 493 00:26:39,240 --> 00:26:41,879 Speaker 2: has their hands on the code. And it's only going 494 00:26:41,960 --> 00:26:45,040 Speaker 2: to get easier to do and harder to detect. 495 00:26:45,880 --> 00:26:48,320 Speaker 3: Now we're at the point where, with AI, like, you 496 00:26:48,320 --> 00:26:50,359 Speaker 3: could be fake now and I have no idea if 497 00:26:50,400 --> 00:26:52,040 Speaker 3: you really exist, or vice versa. 498 00:26:52,520 --> 00:26:55,679 Speaker 2: Wait, are you suggesting that doing something like 499 00:26:55,720 --> 00:26:57,800 Speaker 2: this might be a little bit easier because somebody could 500 00:26:57,880 --> 00:27:01,359 Speaker 2: fake that they actually exist?
Oh yeah, with a phone 501 00:27:01,359 --> 00:27:03,080 Speaker 2: conversation or a video conversation. 502 00:27:03,240 --> 00:27:06,080 Speaker 3: Oh yeah, we're already seeing that from the ransomware actors. 503 00:27:06,320 --> 00:27:09,840 Speaker 3: It's easy for phone, right. So you're already seeing them 504 00:27:10,000 --> 00:27:12,640 Speaker 3: fake people's voices. So people are getting phone calls from, 505 00:27:12,680 --> 00:27:16,200 Speaker 3: like, their CEO. The CEO goes on CNBC for two minutes, 506 00:27:16,520 --> 00:27:19,040 Speaker 3: they get their voice from CNBC. They plug it into 507 00:27:19,680 --> 00:27:23,880 Speaker 3: an AI voice library, and then you call and, like, hey, 508 00:27:23,920 --> 00:27:27,040 Speaker 3: it's Bob, I need you to do a million dollar transfer, right? 509 00:27:27,160 --> 00:27:29,920 Speaker 3: So that kind of stuff, and now you see real 510 00:27:29,960 --> 00:27:33,280 Speaker 3: time video too. It's not perfect, but it's getting there. 511 00:27:33,400 --> 00:27:33,880 Speaker 2: Yeah. 512 00:27:33,920 --> 00:27:36,160 Speaker 3: The trick, by the way, if this happens to any 513 00:27:36,200 --> 00:27:38,639 Speaker 3: of your listeners, the trick is you can ask people 514 00:27:38,680 --> 00:27:42,399 Speaker 3: to move, touch things in the background, do a three sixty 515 00:27:42,400 --> 00:27:44,080 Speaker 3: on the head. It's harder for them to do ears 516 00:27:44,160 --> 00:27:47,199 Speaker 3: for whatever reason, but they'll get there, right. So, like if 517 00:27:47,240 --> 00:27:48,639 Speaker 3: I asked you to take your glasses off, it'd be 518 00:27:48,720 --> 00:27:50,720 Speaker 3: very hard for the model. Like, take your glasses off. 519 00:27:50,880 --> 00:27:54,480 Speaker 2: By the way, hold on, for those of y'all listening 520 00:27:54,480 --> 00:27:57,359 Speaker 2: at home, I took my glasses off here, just to double 521 00:27:57,440 --> 00:27:57,760 Speaker 2: check it. 522 00:27:58,359 --> 00:27:59,840 Speaker 3: Oh, you kind of froze on me when you did that. 523 00:28:00,080 --> 00:28:05,960 Speaker 3: That's sketch, man, it's sketch AF, as my students say. Sorry, 524 00:28:05,880 --> 00:28:09,760 Speaker 3: they keep me up on my Stanford slang. But you know, in 525 00:28:09,680 --> 00:28:13,480 Speaker 2: the future, though, it is going to be easier to 526 00:28:13,520 --> 00:28:17,800 Speaker 2: spoof people's personalities and stuff like that. So these things 527 00:28:17,800 --> 00:28:19,800 Speaker 2: that you're suggesting right now, they work now, but are they 528 00:28:19,800 --> 00:28:21,280 Speaker 2: going to work in a year? 529 00:28:21,400 --> 00:28:24,880 Speaker 3: So, I mean, the good thing about this is open source 530 00:28:24,880 --> 00:28:29,119 Speaker 3: developers have become much more paranoid, right. So people have 531 00:28:29,160 --> 00:28:31,359 Speaker 3: become much more paranoid about new people. And there's a 532 00:28:31,359 --> 00:28:32,960 Speaker 3: downside of that, right, that if you're trying to get 533 00:28:32,960 --> 00:28:36,040 Speaker 3: into open source it's harder. There are now projects where 534 00:28:36,040 --> 00:28:38,640 Speaker 3: it's like, okay, great, let's meet up in person. If 535 00:28:38,640 --> 00:28:40,959 Speaker 3: somebody is only willing to communicate with you in email, 536 00:28:41,360 --> 00:28:44,200 Speaker 3: then you have to be kind of sketched out now. 537 00:28:44,760 --> 00:28:46,520 Speaker 3: So there have been some changes since this.
I think 538 00:28:46,520 --> 00:28:48,520 Speaker 3: people have been more paranoid. There's been a bunch of 539 00:28:48,600 --> 00:28:52,320 Speaker 3: work. On the flip side of AI, traditional 540 00:28:53,120 --> 00:28:56,440 Speaker 3: code scanning tools, pre-AI code scanning tools, are not 541 00:28:57,120 --> 00:29:00,080 Speaker 3: extremely good at detecting this kind of malicious code. But 542 00:29:00,160 --> 00:29:02,120 Speaker 3: there is some hope that some of the newer AI 543 00:29:02,200 --> 00:29:04,360 Speaker 3: based code scanning tools could do this kind of 544 00:29:04,360 --> 00:29:06,720 Speaker 3: stuff at scale. The flip side is, AI is 545 00:29:06,760 --> 00:29:11,640 Speaker 3: really good at writing code, so, you know, you 546 00:29:11,880 --> 00:29:15,520 Speaker 3: don't have to be SVR level anymore to be able 547 00:29:15,520 --> 00:29:16,680 Speaker 3: to write a backdoor that's good. 548 00:29:16,720 --> 00:29:19,880 Speaker 2: That's probably true as well. Is open source too much 549 00:29:19,920 --> 00:29:22,280 Speaker 2: of a risk in the age of AI, and can 550 00:29:22,360 --> 00:29:25,840 Speaker 2: we protect ourselves from another hack like this? That's after 551 00:29:25,880 --> 00:29:42,719 Speaker 2: the break. So this, and I want to get back 552 00:29:42,760 --> 00:29:44,360 Speaker 2: into kind of the play-by-play here, but a 553 00:29:44,360 --> 00:29:49,520 Speaker 2: lot of this hinges on open source. And I 554 00:29:49,520 --> 00:29:54,360 Speaker 2: think one of the really kind of concerning things about 555 00:29:55,080 --> 00:29:58,240 Speaker 2: this entire thing that happened, or almost happened, is the 556 00:29:58,240 --> 00:30:01,320 Speaker 2: fact that it basically happened in broad daylight. Yes, and 557 00:30:01,680 --> 00:30:05,560 Speaker 2: it happened because this is open source. The thing about 558 00:30:05,680 --> 00:30:08,480 Speaker 2: open source, I think, is when you start to explain 559 00:30:08,560 --> 00:30:10,719 Speaker 2: it to somebody who's never heard of it. Are you 560 00:30:10,720 --> 00:30:13,120 Speaker 2: familiar with the galaxy brain meme? 561 00:30:13,760 --> 00:30:14,000 Speaker 3: Yeah? 562 00:30:14,080 --> 00:30:15,400 Speaker 2: Do you know what I'm talking about? Yeah? So I 563 00:30:15,400 --> 00:30:17,320 Speaker 2: feel like this is like that galaxy brain meme, 564 00:30:17,320 --> 00:30:19,800 Speaker 2: where at the very top, when you tell somebody about 565 00:30:19,880 --> 00:30:23,040 Speaker 2: open source, the response is, this is a terrible idea, 566 00:30:23,080 --> 00:30:25,880 Speaker 2: everybody can see the code. Yeah. And then you get 567 00:30:26,200 --> 00:30:27,840 Speaker 2: a little bit further down, it's, oh, this is a 568 00:30:27,840 --> 00:30:30,200 Speaker 2: great idea, everybody can see the code. And then they 569 00:30:30,200 --> 00:30:32,080 Speaker 2: hear about xz Utils, when we get down to the bottom, 570 00:30:32,080 --> 00:30:35,200 Speaker 2: and it's, this is a terrible idea, everybody can see the code. 571 00:30:36,040 --> 00:30:38,640 Speaker 2: What's the true galaxy brain take on this for open source? 572 00:30:39,200 --> 00:30:43,320 Speaker 3: I mean, people go back and forth. So one of 573 00:30:43,360 --> 00:30:45,720 Speaker 3: the ideas is that if you can see all the code, 574 00:30:45,760 --> 00:30:48,080 Speaker 3: you can see all the bugs, right. The idea is 575 00:30:48,200 --> 00:30:52,200 Speaker 3: that because it's open source, it should be more 576 00:30:52,240 --> 00:30:55,520 Speaker 3: secure than closed source, because you can see the flaws, right. 577 00:30:55,560 --> 00:30:58,480 Speaker 3: I don't think that has empirically turned out to be true. Right. 578 00:30:58,920 --> 00:31:03,160 Speaker 3: And so I think what I would say is 579 00:31:03,200 --> 00:31:05,240 Speaker 3: I'm a big proponent of open source. I think it's great. 580 00:31:05,280 --> 00:31:08,800 Speaker 3: I think it has a humongous economic benefit to the world. 581 00:31:09,280 --> 00:31:11,560 Speaker 3: The truth is, the entire kind of cloud computing 582 00:31:11,600 --> 00:31:14,720 Speaker 3: revolution we're all living through only exists because of open 583 00:31:14,760 --> 00:31:18,440 Speaker 3: source software. So that's an incredible thing. That's a wonderful thing. 584 00:31:18,760 --> 00:31:21,440 Speaker 3: Open source is great from an economic perspective. It is 585 00:31:21,440 --> 00:31:24,760 Speaker 3: great from an innovation perspective. We should not pretend that 586 00:31:24,800 --> 00:31:29,080 Speaker 3: it magically solves trust and security problems. And if you're 587 00:31:29,120 --> 00:31:32,600 Speaker 3: a company that relies upon open source, you have an 588 00:31:32,880 --> 00:31:37,120 Speaker 3: ethical and moral obligation to deal with the security aspects 589 00:31:37,120 --> 00:31:40,240 Speaker 3: of it and to contribute back. And I do think 590 00:31:40,400 --> 00:31:42,680 Speaker 3: that is something that's gotten lost, is that people have 591 00:31:42,760 --> 00:31:45,760 Speaker 3: just kind of assumed somebody else is dealing with it, 592 00:31:46,040 --> 00:31:49,240 Speaker 3: and everybody assumed somebody else is doing the security work, 593 00:31:49,280 --> 00:31:50,480 Speaker 3: and that turns out not to be true. 594 00:31:51,280 --> 00:31:53,800 Speaker 2: You know, I think that really gets to the core of 595 00:31:53,800 --> 00:31:57,320 Speaker 2: what a lot of this is, because if somebody sees, 596 00:31:58,280 --> 00:32:01,760 Speaker 2: xz Utils, there was a potential security flaw in that, okay, well, 597 00:32:02,080 --> 00:32:03,840 Speaker 2: I don't care about that, what's that? Oh, well, you 598 00:32:03,880 --> 00:32:05,760 Speaker 2: know, it's involved with OpenSSH. Well, I 599 00:32:05,800 --> 00:32:07,160 Speaker 2: don't use that either. I don't have that app on 600 00:32:07,160 --> 00:32:09,080 Speaker 2: my phone. I don't know what you're talking about. And 601 00:32:09,640 --> 00:32:14,680 Speaker 2: in this weird way, I feel like the more and 602 00:32:14,800 --> 00:32:19,400 Speaker 2: more technology actually starts to become just magic, that things 603 00:32:19,520 --> 00:32:23,080 Speaker 2: just work. Yeah, we are less and less actually tech literate. 604 00:32:23,360 --> 00:32:27,840 Speaker 2: All the stuff that was science fiction even ten years ago, 605 00:32:28,720 --> 00:32:32,240 Speaker 2: or two years ago, frankly, it's just normal now. 606 00:32:32,480 --> 00:32:32,760 Speaker 3: Yeah.
Speaker 2: And so we're able to do so much with technology, just regular people, things we just do with our phones every day, that we've become really removed from the technology itself, and so fewer and fewer of us actually know how to use a computer. Yeah, and so this feels totally removed from us. This is like, oh, this is some weird nerd shit. I'm like, I don't use that nerd program, it doesn't affect me.

Speaker 3: Yeah, no, you're totally right. I mean, I tell my Stanford students, security is one of the best fields to get into professionally, because it's the only part of computers that gets worse every year. Everything else magically gets better, man. So you could find yourself in any other field being made irrelevant, but if you get into security, you have job security for life, because every year I've been in it, it's gotten worse. And one of the reasons is, because you say it's nerd shit... but even the nerds, the normal median nerd, get further and further away from the truth, the reality of what's going on on computers. So when I learned how to program, I learned assembly language, right? I learned how to write, like, the lowest-level languages. And then, you know, they stopped teaching assembly language unless you took special classes, and you learn, like, in Python, right? Like a very high-level language where you don't even, you know, you don't learn how to, like, do memory management.

Speaker 2: Right. I mean, Python, and even just to bring this down, like, Python, for a casual person, you can look at it, you can kind of tell what's going on. It basically looks like English. Yeah. Assembly is letters and numbers, right? Right.

Speaker 3: But the nice thing about assembly is it's the truth of the matter, right? It has a one-to-one matching to what the processor itself is doing. And from a security perspective, if you look at it, the reality of what a security flaw is is seen in the assembly.
Speaker 2: Right.

Speaker 3: In Python, you get further... you get abstracted away, you get further from the reality of what's actually going on on the computer. Now, what you see is incredibly powerful. It's incredibly cool, and so I'm not gonna crap on it, because I think it's an incredibly good thing for people. But you look at, like, Claude 3.7 Code, you know, this new Claude model, and you see people on Twitter who don't know anything about computers and they're able to program now, because they can go into there and they can say, build me software that does X. And that is going to be terrible for security. It's super cool for people's economic opportunities, because anybody can become a programmer now. But man, are people in security gonna love it, because now you don't need to know anything about how computers work, and you're just going to ask the AI system to build it for you. And I see it with my students. Stanford is, like, one of the top computer science programs in the world, and you can graduate and not actually really understand how operating systems work. I apologize to the Stanford computer science department, right, but really, like, you can have a totally productive career in Silicon Valley and not really understand what's going on three or four layers down. In fact, it's better for you not to, right? It's better because you're at the high level where you're much more productive, right? You're much more productive having the AI do the work for you. You're much more productive having GitHub Copilot help you rewrite stuff. You're much more productive using all the cloud intermediation layers. And so that's one of the reasons why security gets worse every single year: we add these layers of abstraction that make things easier for people. And AI is the ultimate abstraction layer, because now you can talk to computers in plain English and have them do incredibly complex things.
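For readers who want to see the abstraction gap Stamos is describing, here is a minimal Python sketch (an illustration, not something from the episode; the greet function is just a made-up example). It uses Python's standard dis module to print the bytecode instructions the CPython interpreter actually runs for a tiny, English-looking function, one layer closer to "the truth of the matter" than the source text.

import dis

def greet(name):
    # One readable line of Python; string building, memory allocation, and
    # reference counting are all handled for us underneath.
    return "hello, " + name

# Print the bytecode the CPython interpreter actually executes for greet(),
# a small peek beneath the abstraction layer.
dis.dis(greet)

On a recent CPython, the output is a short list of instructions like LOAD_CONST, LOAD_FAST, BINARY_OP, and RETURN_VALUE, already a step removed from English, and the native machine code sits several layers further down still.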
Speaker 2: The thing about this whole story... I mean, I'm thinking about, you know, we're in a time right now where anything bad happens, or almost happens, and it's a Netflix documentary, a Hulu documentary, a true crime podcast at some point. I don't see that happening with this. This is something that, as you were saying, was almost... it truly could have been catastrophic. Yeah, but it's also kind of boring. It's money... you don't think?

Speaker 3: I could get... I could sell ten episodes to Netflix on this.

Speaker 2: If you can hire me as a producer, I'd love to help. But you see what I'm saying, it takes a while to even explain what the heck we're talking about. Yeah, and I think that comes back to some of this: in the same way that this vulnerability was introduced via social engineering, a lot of this is social. I mean, a lot of your work, you probably think about this. How do you get people to care about something like this?

Speaker 3: I mean, so that's... that's a challenge. That's one of the biggest challenges. If you're, like, a chief information security officer, one of your big jobs is getting the rest of the company to care about security. CISOs, we have a reputation of being the people who say no all the time. So I was the CISO of Facebook, and I once walked into a meeting with a bunch of other VPs, and somebody literally said, like, "Oh shit, Stamos is here." And I'm like, hey guys, I can hear you. I can hear you. And they're like, no, no, it's not you, it's just, like, whenever you come, it's because you're telling us, like, there's a coup in Turkey or something terrible. Because I was just the bearer of bad news, right? But this is what's a real challenge for my colleagues, and it's a real challenge for us as a society: people don't want to think that the systems that they rely upon are fragile. And I think that's, like, a real problem.

Speaker 2: What do we learn from this? What is it?
Let me just say, because I... I personally don't think, just being out and talking to people, if I was trying to... if I try to tell somebody, hey, yeah man, what do you think about the xz Utils thing? Have you...?

Speaker 3: You know, "Hey buddy, what's up?"

Speaker 2: Yeah. Has it changed anything about how... yeah, has that changed anything about how you go about your life? People would tell me no. So I've got to ask somebody who's actually closer to this. Has this changed how you approach things? Has this changed how the industry approaches things? Has this changed how... I mean, the theory that you're putting out is that this is a state actor. Has this changed how national security is being looked at?

Speaker 3: So for companies that know what they're doing, it has changed how they approach open source. For a handful of really big ones, you know, like the Googles, the Metas, the Amazons, the Microsofts, the really big tech companies that do a lot of open source work, they are looking more carefully at open source. For security companies that do this work, we're investing in software and AI that can do this work for us. But it has not changed anything massively, right? We're still running Linux, we're still all pulling in fifty thousand packages. We have these humongous dependency graphs. The truth is, you can't just pivot off all these things, right? It has made us more concerned about these problems. When you talk to CISOs, my colleagues and I, we're all more concerned. But we can't magically pivot off of the infrastructure we have built over a decade. I do not think we've dealt with the fact that if you get on the subway in the morning and you look around, most of the people on that train, in their pocket... xz Utils is in their pocket. Every single person in there, hundreds of copies of their Social Security number are sitting on servers that would have been backdoored by this attack.
Speaker 3: That's how you can think of it, right? So that's how close we came.

Speaker 2: Man. So, just some closing thoughts here. Again, the reason that most people don't know about what was almost the biggest hack in the history of the Internet is because this is really hard to describe to a non-technical audience. I mean, when you say xz or Linux or OpenSSH, people's eyes just roll into the back of their heads. But we can't allow tech literacy to be a barrier to understanding how the world works. And the truth is, even beyond all the tech jargon, a lot of these things are very human, and they're not so hard to understand. And so that's one of the things that we're really trying to do here on Kill Switch as we keep doing these episodes: to open it up so that more people are able to feel like they're part of the conversations that affect all of us.

Speaker 2: And that is it for this one, for real. Thank y'all so much for listening to Kill Switch. You can hit us up at killswitch at kaleidoscope dot NYC if you've got any thoughts, or if there's anything you want us to cover in the future. And you can hit me at dexdigi, that's d-e-x-d-i-g-i, on Instagram or Bluesky if that's more your thing. And if you like the episode, you know, take that phone out of your pocket and leave us a review. It helps people find the show, which in turn helps us keep doing our thing. This thing is hosted by me, Dexter Thomas. It's produced by Sena Ozaki, Darl Luk Potts, and Kate Osborne. Our theme song is by Kyle Murdoch, who also mixes the show. From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, our executive producers are Katrina Norvell and Nikki Ettore. That's it for this time. Catch you on the next one.