1 00:00:09,400 --> 00:00:13,280 Speaker 1: Quick question, do you know about the XZ Utils back 2 00:00:13,280 --> 00:00:15,360 Speaker 1: door hack? What backdoor? 3 00:00:15,760 --> 00:00:15,960 Speaker 2: Wait? 4 00:00:16,000 --> 00:00:18,959 Speaker 1: Wait wait wait, wait, what, the XZ Utils backdoor? 5 00:00:20,000 --> 00:00:21,319 Speaker 1: I have no idea what you're talking about. 6 00:00:21,480 --> 00:00:22,560 Speaker 3: I don't know what that is. 7 00:00:24,680 --> 00:00:28,000 Speaker 1: This is something almost nobody's heard of, but in the 8 00:00:28,040 --> 00:00:31,760 Speaker 1: spring of twenty twenty four, we narrowly avoided a complete 9 00:00:31,800 --> 00:00:33,000 Speaker 1: technological disaster. 10 00:00:33,880 --> 00:00:36,280 Speaker 3: So you've, you've never heard of this though? Nope? 11 00:00:37,120 --> 00:00:39,720 Speaker 2: Yeah, I just searched it up on Wikipedia and it 12 00:00:39,760 --> 00:00:40,879 Speaker 2: seems way too technical 13 00:00:40,760 --> 00:00:41,360 Speaker 3: to read about. 14 00:00:42,040 --> 00:00:45,440 Speaker 1: These aren't just random people. These are other journalists, people 15 00:00:45,520 --> 00:00:47,400 Speaker 1: in general who keep up with the news. 16 00:00:47,760 --> 00:00:49,560 Speaker 4: Okay, I was like, wait, what did I miss? 17 00:00:49,920 --> 00:00:53,080 Speaker 3: And I feel bad, but I guess maybe I'm not 18 00:00:53,320 --> 00:00:56,280 Speaker 3: the only one. What journalist doesn't know what this 19 00:00:56,400 --> 00:00:56,800 Speaker 3: is about? 20 00:00:57,120 --> 00:00:59,680 Speaker 1: Even they didn't really know about what could have been 21 00:00:59,840 --> 00:01:02,560 Speaker 1: the biggest hack in the history of the Internet. 22 00:01:03,520 --> 00:01:05,959 Speaker 2: If this had not been caught, then this would have 23 00:01:06,080 --> 00:01:10,040 Speaker 2: been a skeleton key that would have allowed these attackers 24 00:01:10,120 --> 00:01:16,120 Speaker 2: to break into tens of millions of incredibly important servers 25 00:01:16,160 --> 00:01:19,319 Speaker 2: around the world. We probably would have had airlines not working, 26 00:01:19,680 --> 00:01:23,240 Speaker 2: trading halted, ATMs not working, banks not working, people not 27 00:01:23,240 --> 00:01:26,720 Speaker 2: able to get their money. You'd have a huge loss 28 00:01:26,800 --> 00:01:29,679 Speaker 2: of credibility of technology in people's lives. 29 00:01:30,200 --> 00:01:34,399 Speaker 1: Alex Stamos is a cybersecurity expert. Specifically, he's the Chief 30 00:01:34,480 --> 00:01:39,000 Speaker 1: Information Security Officer, or CISO, at a cybersecurity company called 31 00:01:39,040 --> 00:01:43,039 Speaker 1: SentinelOne, and he's the former CISO at Facebook. He's 32 00:01:43,040 --> 00:01:45,880 Speaker 1: also a lecturer in the computer science department at Stanford, 33 00:01:46,360 --> 00:01:49,480 Speaker 1: and this attempted hack is something that is still keeping 34 00:01:49,520 --> 00:01:50,200 Speaker 1: him up at night. 35 00:01:51,120 --> 00:01:56,360 Speaker 2: It's fallen out of popular discussion, but among people in 36 00:01:56,400 --> 00:01:59,880 Speaker 2: security we're still talking about it. It uncovered a real 37 00:02:00,880 --> 00:02:06,040 Speaker 2: fundamental weakness that terrifies lots of people who have responsibility 38 00:02:06,040 --> 00:02:06,560 Speaker 2: in this area.
39 00:02:07,040 --> 00:02:10,799 Speaker 1: And what scares them most, and this should scare us too, 40 00:02:11,080 --> 00:02:13,400 Speaker 1: is that this was caught by complete chance. 41 00:02:13,760 --> 00:02:17,600 Speaker 2: We just got lucky, right, like one dude got really 42 00:02:17,639 --> 00:02:22,480 Speaker 2: bored and noticed a tiny little change in the speed 43 00:02:22,639 --> 00:02:26,360 Speaker 2: of one program executing and pulled the thread. And on 44 00:02:26,400 --> 00:02:29,880 Speaker 2: the end of this thread was a humongous ticking time bomb. 45 00:02:30,440 --> 00:02:33,800 Speaker 2: It was one dude, and he should never have to 46 00:02:33,800 --> 00:02:38,480 Speaker 2: buy a beer for himself ever again. Andres Freund, I'm raising 47 00:02:38,160 --> 00:02:40,720 Speaker 4: a toast to you right now. This is just water, 48 00:02:40,880 --> 00:02:42,639 Speaker 4: but I wish it was more. 49 00:02:53,760 --> 00:03:00,400 Speaker 1: Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm Dexter Thomas. 50 00:03:01,520 --> 00:03:02,200 Speaker 2: Amentarians. 51 00:03:23,000 --> 00:03:25,400 Speaker 1: If you've never heard about this, that's no reason to 52 00:03:25,440 --> 00:03:29,520 Speaker 1: feel bad. But if it hadn't been caught, it absolutely 53 00:03:29,520 --> 00:03:33,520 Speaker 1: would have affected you. What kind of activity are we talking 54 00:03:33,520 --> 00:03:34,200 Speaker 1: about here? 55 00:03:34,440 --> 00:03:36,240 Speaker 2: Well, we really don't know, and because we don't know 56 00:03:36,240 --> 00:03:38,480 Speaker 2: who the attackers are, we don't know whether that would 57 00:03:38,480 --> 00:03:43,920 Speaker 2: have been used for really quiet surveillance. It could have 58 00:03:44,000 --> 00:03:49,119 Speaker 2: been used for national security intelligence gathering purposes. It could 59 00:03:49,120 --> 00:03:51,640 Speaker 2: have been used for a humongous heist of hundreds of 60 00:03:51,680 --> 00:03:55,320 Speaker 2: millions or billions of dollars of cryptocurrency, or it could 61 00:03:55,320 --> 00:03:57,960 Speaker 2: have been used as part of a massive cyber attack 62 00:03:58,040 --> 00:04:01,680 Speaker 2: to shut down millions of machines and cause massive disruptions. 63 00:04:02,960 --> 00:04:05,640 Speaker 1: One of the main reasons that this potential attack isn't 64 00:04:05,680 --> 00:04:10,000 Speaker 1: talked about much is because the details are kind of technical. Well, 65 00:04:10,240 --> 00:04:12,520 Speaker 1: some of the details are. A lot of this stuff 66 00:04:12,560 --> 00:04:15,360 Speaker 1: is really just basic human behavior. It's stuff that you 67 00:04:15,520 --> 00:04:17,839 Speaker 1: or I could do if we really wanted, and it 68 00:04:17,920 --> 00:04:21,719 Speaker 1: shows us that sometimes the best hacks are the simplest ones. 69 00:04:22,320 --> 00:04:25,480 Speaker 1: Let me break it down for you. In late March 70 00:04:25,480 --> 00:04:29,080 Speaker 1: of twenty twenty four, Andres Freund, who's an engineer at Microsoft, 71 00:04:29,360 --> 00:04:31,719 Speaker 1: was sitting at his desk doing his job when he 72 00:04:31,800 --> 00:04:34,719 Speaker 1: discovered a malicious piece of code in this little known 73 00:04:34,800 --> 00:04:38,280 Speaker 1: tool called XZ Utils. This code created a method that 74 00:04:38,320 --> 00:04:40,360 Speaker 1: would allow hackers to access a 75 00:04:40,400 --> 00:04:42,080 Speaker 3: lot of different computers.
76 00:04:42,560 --> 00:04:45,120 Speaker 1: Maybe right now you're thinking, okay, so why is this 77 00:04:45,200 --> 00:04:48,000 Speaker 1: a problem for me? I mean, I don't use XZ Utils, 78 00:04:48,040 --> 00:04:51,040 Speaker 1: so they couldn't get on my computer. And yeah, maybe 79 00:04:51,080 --> 00:04:54,160 Speaker 1: you've never heard of XZ Utils. Actually I hadn't either, 80 00:04:54,640 --> 00:04:56,920 Speaker 1: and I did what most people do when they don't 81 00:04:56,960 --> 00:05:00,960 Speaker 1: understand something about a computer: call an expert. But it 82 00:05:01,000 --> 00:05:04,640 Speaker 1: turns out that this really well respected expert, he found 83 00:05:04,680 --> 00:05:06,719 Speaker 1: out about XZ Utils when I did. 84 00:05:07,640 --> 00:05:10,200 Speaker 2: Yeah, so I personally had not heard of XZ Utils 85 00:05:10,720 --> 00:05:11,600 Speaker 2: before this. 86 00:05:11,760 --> 00:05:15,039 Speaker 1: Even you? Really? Yeah, I had definitely not heard of 87 00:05:15,160 --> 00:05:18,279 Speaker 1: XZ Utils. I figured you would have. Hearing that you 88 00:05:18,440 --> 00:05:22,280 Speaker 1: had not heard of it before all this happened, frankly, 89 00:05:22,320 --> 00:05:24,760 Speaker 1: that's a little bit more scary to me. Now, so 90 00:05:25,040 --> 00:05:27,840 Speaker 1: why did this backdoor into a program that no one 91 00:05:27,880 --> 00:05:31,400 Speaker 1: seems to know about matter so much? And what is 92 00:05:31,640 --> 00:05:32,600 Speaker 1: XZ Utils? 93 00:05:32,920 --> 00:05:37,760 Speaker 2: This is the brilliance of what these attackers did. XZ 94 00:05:37,800 --> 00:05:40,560 Speaker 2: Utils is an ingredient to an ingredient to an ingredient 95 00:05:40,600 --> 00:05:44,599 Speaker 2: to something really important. So the thing that they wanted 96 00:05:44,600 --> 00:05:47,480 Speaker 2: to have a backdoor into is a really important program 97 00:05:47,520 --> 00:05:48,479 Speaker 2: called OpenSSH. 98 00:05:49,200 --> 00:05:51,080 Speaker 4: So this is something that every techie has heard of. 99 00:05:51,960 --> 00:05:54,920 Speaker 1: All right, but what if you're not a techie? So 100 00:05:55,040 --> 00:05:57,400 Speaker 1: in order to understand the XZ Utils hack, we do 101 00:05:57,480 --> 00:06:00,320 Speaker 1: need to back up and understand something that XZ Utils is 102 00:06:00,400 --> 00:06:02,600 Speaker 1: used in, this thing called OpenSSH. 103 00:06:03,000 --> 00:06:08,719 Speaker 2: This is the program by which the majority of Unix-like systems, 104 00:06:08,839 --> 00:06:14,480 Speaker 2: especially Linux, also Macs and some other operating systems, allow 105 00:06:14,560 --> 00:06:17,479 Speaker 2: you to access them remotely over the internet. 106 00:06:18,160 --> 00:06:21,880 Speaker 1: I'll get to the open later, but SSH stands for secure shell, 107 00:06:22,200 --> 00:06:25,200 Speaker 1: and let's just focus on secure right now. If you 108 00:06:25,200 --> 00:06:27,719 Speaker 1: think of the difference between posting a tweet online and 109 00:06:27,839 --> 00:06:31,919 Speaker 1: DMing someone, you're actually kind of halfway there. OpenSSH 110 00:06:32,000 --> 00:06:34,680 Speaker 1: allows you to communicate with a remote computer just like 111 00:06:34,720 --> 00:06:36,839 Speaker 1: you were sitting there right in front of it.
So 112 00:06:37,080 --> 00:06:39,200 Speaker 1: even though you're far away, if you want to send 113 00:06:39,200 --> 00:06:42,680 Speaker 1: a message, or install programs, or delete files, you know 114 00:06:42,720 --> 00:06:45,279 Speaker 1: that the connection is safe and that nobody else can 115 00:06:45,320 --> 00:06:48,000 Speaker 1: see what you're doing or tamper with that connection. 116 00:06:48,600 --> 00:06:50,440 Speaker 2: So you know, when you see like people in The 117 00:06:50,440 --> 00:06:54,480 Speaker 2: Matrix typing really fast, see a lot of text, right, 118 00:06:54,520 --> 00:06:57,840 Speaker 2: if somebody is doing that remotely, it's probably over OpenSSH. 119 00:06:57,960 --> 00:06:59,760 Speaker 1: You might think this doesn't matter for you because you 120 00:06:59,760 --> 00:07:03,520 Speaker 1: don't use OpenSSH, but you do, because that's what 121 00:07:03,560 --> 00:07:06,640 Speaker 1: you use to connect to systems running Linux around the world. 122 00:07:07,080 --> 00:07:12,360 Speaker 2: Linux has become the standard operating system for the cloud. 123 00:07:12,840 --> 00:07:15,200 Speaker 2: So when you talk to Google, you're talking to a 124 00:07:15,200 --> 00:07:17,440 Speaker 2: Linux system. When you talk to Facebook, you're talking to 125 00:07:17,480 --> 00:07:21,520 Speaker 2: a Linux system. When you talk to Apple, you're probably talking 126 00:07:21,360 --> 00:07:22,160 Speaker 4: to a Linux system. 127 00:07:22,320 --> 00:07:24,320 Speaker 2: Right now, the system that we're talking to each other 128 00:07:24,360 --> 00:07:27,560 Speaker 2: with almost certainly is running Linux. So the vast majority 129 00:07:27,600 --> 00:07:30,200 Speaker 2: of systems you talk to in the cloud are running Linux. 130 00:07:30,600 --> 00:07:34,040 Speaker 1: Linux is used for Apple's iCloud, for social media sites 131 00:07:34,080 --> 00:07:38,320 Speaker 1: like Facebook, Instagram, for YouTube, for Twitter. It's used for 132 00:07:38,320 --> 00:07:41,040 Speaker 1: the New York Stock Exchange. Gamers use it when they 133 00:07:41,160 --> 00:07:44,240 Speaker 1: run Steam or they play games online, and the list 134 00:07:44,280 --> 00:07:47,440 Speaker 1: goes on. The vast majority of the Internet runs on 135 00:07:47,520 --> 00:07:51,080 Speaker 1: Linux, and OpenSSH makes sure that it's you logging 136 00:07:51,120 --> 00:07:52,800 Speaker 1: in and not somebody else. 137 00:07:53,240 --> 00:07:55,880 Speaker 2: When you log in and you get your mail, the 138 00:07:55,920 --> 00:07:57,840 Speaker 2: server that holds your mail has SSH on it, 139 00:07:57,920 --> 00:08:02,520 Speaker 2: the server that holds your social media has SSH, the servers 140 00:08:02,520 --> 00:08:05,040 Speaker 2: that have your banking information have SSH. It's the door 141 00:08:05,080 --> 00:08:06,840 Speaker 2: by which you get into these systems. 142 00:08:07,400 --> 00:08:11,240 Speaker 1: So OpenSSH is incredibly important to the Internet and 143 00:08:11,400 --> 00:08:14,320 Speaker 1: all the cloud systems that we rely on, and because 144 00:08:14,360 --> 00:08:16,640 Speaker 1: of that, it has a lot of eyes on it. 145 00:08:17,240 --> 00:08:21,000 Speaker 1: Trying to hack OpenSSH directly would pretty much be impossible. 146 00:08:21,480 --> 00:08:23,320 Speaker 1: Someone would catch you pretty quick. 147 00:08:24,080 --> 00:08:26,000 Speaker 2: People pay a lot of attention to it.
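To make the "talking to a remote computer like you're sitting in front of it" idea concrete, here is a minimal sketch of opening an SSH session from a script using the third-party paramiko library. The server name, username, and key path are placeholders invented for this example, not anything from the episode.

```python
# A minimal, hedged sketch of using SSH from code with the third-party paramiko
# library. The hostname, username, and key path below are placeholders.
import paramiko

client = paramiko.SSHClient()
# Demo only: in real use you should verify the server's host key instead.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("server.example.com", username="admin",
               key_filename="/home/admin/.ssh/id_ed25519")

# Run a command on the remote machine as if you were typing at its console.
stdin, stdout, stderr = client.exec_command("uname -a")
print(stdout.read().decode())
client.close()
```

The point of the sketch is simply that the whole exchange, authentication included, rides over that one encrypted SSH channel, which is why a flaw in the server side of it matters so much.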
A lot 148 00:08:26,040 --> 00:08:29,640 Speaker 2: of people run their code scanners on it, a lot 149 00:08:29,640 --> 00:08:31,720 Speaker 2: of people look for bugs in it, and so it 150 00:08:31,720 --> 00:08:34,959 Speaker 2: has been a while since OpenSSH has had itself 151 00:08:35,559 --> 00:08:38,240 Speaker 2: a humongous security flaw in it. If you just joined 152 00:08:38,320 --> 00:08:40,760 Speaker 2: the OpenSSH project and said, hey, I'm a new 153 00:08:40,800 --> 00:08:45,240 Speaker 2: guy that nobody ever knew, here's my code, everybody would 154 00:08:45,240 --> 00:08:48,839 Speaker 2: be super suspicious, right? Mm-hmm. And whoever these bad 155 00:08:48,840 --> 00:08:51,959 Speaker 2: guys are, they know that. So what they did was 156 00:08:52,000 --> 00:08:55,720 Speaker 2: they looked at OpenSSH, and they looked at its 157 00:08:55,840 --> 00:08:58,319 Speaker 2: dependency graph, as we call it. They looked at all the 158 00:08:58,320 --> 00:09:01,160 Speaker 2: stuff that goes into OpenSSH, and what they saw 159 00:09:01,480 --> 00:09:05,199 Speaker 2: was OpenSSH depends on other things. 160 00:09:06,800 --> 00:09:10,440 Speaker 1: This is where XZ Utils comes in. XZ Utils is 161 00:09:10,480 --> 00:09:14,160 Speaker 1: one of the things that OpenSSH depends on. What 162 00:09:14,200 --> 00:09:15,880 Speaker 1: does XZ Utils actually do? 163 00:09:16,240 --> 00:09:20,000 Speaker 2: It's a compression library, so it's just a library that 164 00:09:20,200 --> 00:09:23,280 Speaker 2: is used to make data that comes in smaller, so 165 00:09:23,320 --> 00:09:25,640 Speaker 2: that if you're moving like a big file back and forth, 166 00:09:25,640 --> 00:09:28,040 Speaker 2: it can fit down a smaller pipe. Right, you might 167 00:09:28,080 --> 00:09:30,160 Speaker 2: be talking to a server on a satellite link, you 168 00:09:30,200 --> 00:09:32,160 Speaker 2: might be talking over a modem. Right, you might be 169 00:09:32,200 --> 00:09:34,120 Speaker 2: talking over a cell phone, and so you want your 170 00:09:34,160 --> 00:09:36,120 Speaker 2: big file to fit into a smaller pipe. 171 00:09:36,160 --> 00:09:38,200 Speaker 1: If you've ever used a zip file on your computer, 172 00:09:38,360 --> 00:09:41,800 Speaker 1: you get the general idea. Smaller files can be transferred faster, 173 00:09:42,240 --> 00:09:44,520 Speaker 1: which is important when you're dealing with so much data 174 00:09:44,520 --> 00:09:48,240 Speaker 1: flowing back and forth. XZ Utils allows OpenSSH to 175 00:09:48,280 --> 00:09:53,680 Speaker 1: be both safe and fast, and that's the trick. By 176 00:09:53,760 --> 00:09:56,960 Speaker 1: inserting a backdoor into XZ Utils, the hackers created 177 00:09:56,960 --> 00:09:59,880 Speaker 1: a way to access anything being transmitted via OpenSSH. 178 00:10:01,040 --> 00:10:03,800 Speaker 1: That meant they could not only read supposedly secure messages, 179 00:10:03,960 --> 00:10:07,840 Speaker 1: but remotely run code on any server that uses OpenSSH. 180 00:10:08,120 --> 00:10:11,720 Speaker 1: And since basically the entire Internet uses this thing, once 181 00:10:11,760 --> 00:10:14,160 Speaker 1: you're in there, you can do anything you want.
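Python's standard library ships the same LZMA/XZ compression format that the XZ Utils library implements, so a tiny sketch makes the "fit a big file down a smaller pipe" idea concrete. The sample data here is invented purely for illustration.

```python
# A minimal sketch of xz-style compression using Python's built-in lzma module,
# which implements the same LZMA/XZ format provided by the XZ Utils library.
import lzma

payload = b"a big, repetitive file going over a slow link " * 10_000

compressed = lzma.compress(payload)      # shrink it before it goes down the pipe
restored = lzma.decompress(compressed)   # expand it again on the other side

assert restored == payload
print(f"original: {len(payload):,} bytes, compressed: {len(compressed):,} bytes")
```

Because the compression step sits inside the data path, a library like this ends up loaded by much more important programs that depend on it, which is exactly the position the attackers wanted.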
182 00:10:15,440 --> 00:10:18,040 Speaker 2: You could have used it for a bunch of very 183 00:10:18,280 --> 00:10:23,360 Speaker 2: quiet surgical attacks over a multi year period, or you 184 00:10:23,400 --> 00:10:26,120 Speaker 2: could have done one humongous big bang where you knock 185 00:10:26,120 --> 00:10:27,720 Speaker 2: out a huge chunk of the Internet all at once. 186 00:10:28,080 --> 00:10:30,760 Speaker 1: But how did hackers get access to XZ Utils in the 187 00:10:30,840 --> 00:10:33,679 Speaker 1: first place? Well, remember when I promised to tell you 188 00:10:33,720 --> 00:10:37,880 Speaker 1: about the open in OpenSSH? OpenSSH and also 189 00:10:37,960 --> 00:10:41,640 Speaker 1: Linux are open source programs. This means that anyone can 190 00:10:41,679 --> 00:10:44,280 Speaker 1: look at the source code, because it's open and it's 191 00:10:44,320 --> 00:10:48,000 Speaker 1: posted publicly. The idea is that if everyone works together 192 00:10:48,080 --> 00:10:50,880 Speaker 1: on the code, it'll be better and the public will benefit. 193 00:10:51,440 --> 00:10:53,720 Speaker 1: And so anyone's free to look at the code, to 194 00:10:53,840 --> 00:10:55,880 Speaker 1: learn from the code, or even to remix it for 195 00:10:55,920 --> 00:10:58,760 Speaker 1: their own use. And even if you have no interest 196 00:10:58,760 --> 00:11:01,440 Speaker 1: in all that nerd stuff, you still use versions of 197 00:11:01,440 --> 00:11:04,480 Speaker 1: open source code every day on basically all of your 198 00:11:04,520 --> 00:11:06,000 Speaker 1: devices. When 199 00:11:05,840 --> 00:11:09,640 Speaker 2: you're running open source software, which people don't understand, basically 200 00:11:09,720 --> 00:11:11,719 Speaker 2: everybody is, right. So what kind of phone do you have? 201 00:11:11,800 --> 00:11:13,199 Speaker 2: Do you have an iPhone or Android? 202 00:11:13,440 --> 00:11:14,320 Speaker 3: I actually have an Android. 203 00:11:14,360 --> 00:11:15,520 Speaker 4: Yeah, okay, so Android. 204 00:11:15,640 --> 00:11:18,680 Speaker 2: A humongous chunk of that code is open source, right, right. 205 00:11:18,679 --> 00:11:22,560 Speaker 2: And that is code that is maintained by volunteers, and 206 00:11:22,600 --> 00:11:24,679 Speaker 2: you have no idea who those people are. Google has 207 00:11:24,679 --> 00:11:27,440 Speaker 2: no idea who those people are. Right, Google collects all 208 00:11:27,480 --> 00:11:29,840 Speaker 2: this code from around the internet, they package it all 209 00:11:29,920 --> 00:11:32,040 Speaker 2: up, and then they put it on a phone, or 210 00:11:32,040 --> 00:11:34,120 Speaker 2: they send it to Samsung, and Samsung puts it on the phone. 211 00:11:34,240 --> 00:11:37,319 Speaker 1: And before we get any further, iPhone people, this applies 212 00:11:37,360 --> 00:11:40,120 Speaker 1: to you too. Your iPhone uses a lot of open 213 00:11:40,160 --> 00:11:43,360 Speaker 1: source code also. And don't get me wrong, this is 214 00:11:43,440 --> 00:11:44,280 Speaker 1: not a bad thing. 215 00:11:44,640 --> 00:11:47,200 Speaker 2: It's great because it's free and it makes the phone cheaper, 216 00:11:47,240 --> 00:11:49,760 Speaker 2: and it's cool that we all get to contribute. But 217 00:11:49,800 --> 00:11:53,680 Speaker 2: the flip side is that, yes, OpenSSH itself gets 218 00:11:53,679 --> 00:11:56,360 Speaker 2: lots of love.
The Linux kernel gets lots of love, right. 219 00:11:56,720 --> 00:11:59,960 Speaker 2: But something like XZ Utils, which is this tiny little 220 00:12:00,080 --> 00:12:02,719 Speaker 2: component over here, does not get enough love. And 221 00:12:02,880 --> 00:12:06,400 Speaker 2: XZ Utils at the time was maintained by one person. 222 00:12:06,840 --> 00:12:09,480 Speaker 2: That one dude was then manipulated into giving up control 223 00:12:09,520 --> 00:12:12,079 Speaker 2: of it, and the person he gave up control of 224 00:12:12,120 --> 00:12:14,720 Speaker 2: it to turned out to be a totally fake persona, 225 00:12:14,960 --> 00:12:15,680 Speaker 2: to not exist. 226 00:12:18,280 --> 00:12:19,800 Speaker 1: This is where we get to the human part of 227 00:12:19,800 --> 00:12:23,200 Speaker 1: the story. The one guy who was maintaining XZ Utils, 228 00:12:23,440 --> 00:12:26,800 Speaker 1: his name was Lasse Collin. He'd been maintaining XZ 229 00:12:26,880 --> 00:12:29,160 Speaker 1: Utils since two thousand and nine, and he was the 230 00:12:29,160 --> 00:12:32,440 Speaker 1: sole maintainer for the project. He wasn't being paid for it. 231 00:12:32,559 --> 00:12:35,920 Speaker 1: He was a volunteer. That's usually how open source projects go. 232 00:12:36,400 --> 00:12:39,240 Speaker 1: In twenty twenty two, Lasse Collin started to get a 233 00:12:39,240 --> 00:12:42,360 Speaker 1: lot of requests to make updates to the code. Throughout 234 00:12:42,400 --> 00:12:46,360 Speaker 1: the year, multiple accounts, seemingly out of nowhere, started complaining 235 00:12:46,440 --> 00:12:50,040 Speaker 1: that Collin wasn't working fast enough and implying that if 236 00:12:50,040 --> 00:12:52,880 Speaker 1: he wasn't interested in doing this anymore, maybe he wasn't 237 00:12:52,880 --> 00:12:56,000 Speaker 1: the guy for the job. And the pressure was getting 238 00:12:56,000 --> 00:12:56,640 Speaker 1: to him. 239 00:12:56,880 --> 00:12:57,480 Speaker 3: In June of 240 00:12:57,440 --> 00:13:00,920 Speaker 1: twenty twenty two, Collin wrote in a public note, quote, 241 00:13:01,200 --> 00:13:04,040 Speaker 1: I haven't lost interest, but my ability to care has 242 00:13:04,080 --> 00:13:07,800 Speaker 1: been fairly limited, mostly due to long term mental health issues, 243 00:13:08,040 --> 00:13:10,839 Speaker 1: but also due to some other things. He also went 244 00:13:10,880 --> 00:13:13,960 Speaker 1: on to remind people that, quote, It's also good to 245 00:13:14,000 --> 00:13:18,880 Speaker 1: keep in mind that this is an unpaid hobby project. Thankfully, 246 00:13:19,120 --> 00:13:22,520 Speaker 1: right about that time, a new programmer had come in to help. 247 00:13:22,840 --> 00:13:26,160 Speaker 1: This new person's name was Jia Tan. Collin seemed a 248 00:13:26,200 --> 00:13:30,120 Speaker 1: little relieved that finally someone wasn't just complaining but helping. 249 00:13:30,679 --> 00:13:32,880 Speaker 1: In that same note from June, he wrote that he'd 250 00:13:32,920 --> 00:13:35,760 Speaker 1: been working a bit with Jia Tan on XZ Utils to 251 00:13:35,840 --> 00:13:39,560 Speaker 1: address all of those complaints, and he said about Jia, quote, 252 00:13:39,800 --> 00:13:42,120 Speaker 1: perhaps he will have a bigger role in the future. 253 00:13:42,480 --> 00:13:46,760 Speaker 1: We'll see. Over the course of a few years, Jia 254 00:13:46,840 --> 00:13:50,439 Speaker 1: Tan really started to gain Lasse Collin's trust.
Jia Tan 255 00:13:50,679 --> 00:13:53,679 Speaker 1: was the ideal contributor. He didn't just help when he 256 00:13:53,720 --> 00:13:55,960 Speaker 1: was asked to, but he would offer to take on 257 00:13:56,120 --> 00:13:59,160 Speaker 1: more work, and by twenty twenty four, Collin had made 258 00:13:59,240 --> 00:14:02,520 Speaker 1: Jia Tan a co-maintainer on the project, which allowed 259 00:14:02,559 --> 00:14:04,520 Speaker 1: him to add code without needing approval. 260 00:14:05,760 --> 00:14:07,120 Speaker 4: This is a human attack, right. 261 00:14:07,120 --> 00:14:09,200 Speaker 2: It all happened in the open, but the way they 262 00:14:09,200 --> 00:14:11,760 Speaker 2: did it was they created these fake personas where one 263 00:14:11,800 --> 00:14:15,320 Speaker 2: guy's super friendly and one guy's a jerk, and the 264 00:14:15,440 --> 00:14:19,800 Speaker 2: jerk basically is abusing the person who's maintaining the software 265 00:14:20,080 --> 00:14:22,080 Speaker 2: and saying, oh, I need this change, I need this change. 266 00:14:22,080 --> 00:14:23,560 Speaker 4: You're so slow. Why are you so slow? 267 00:14:23,600 --> 00:14:26,280 Speaker 2: And remember this guy's not getting paid, right? Like, and 268 00:14:26,320 --> 00:14:31,200 Speaker 2: so eventually they basically bully this guy to say, oh, I'm 269 00:14:31,240 --> 00:14:33,040 Speaker 2: tired of doing this, I don't want to do it anymore. 270 00:14:33,200 --> 00:14:34,960 Speaker 2: And then the nice guy's like, oh, well you know, 271 00:14:35,760 --> 00:14:38,640 Speaker 2: I'll do it for you. I'll take over, man, let 272 00:14:38,640 --> 00:14:39,720 Speaker 2: me take this burden 273 00:14:39,440 --> 00:14:42,760 Speaker 1: for you. Right, very convenient, right. And 274 00:14:42,760 --> 00:14:45,680 Speaker 2: this took several years, and so this shows you kind 275 00:14:45,680 --> 00:14:49,880 Speaker 2: of the long play. They're willing to spend months and 276 00:14:49,960 --> 00:14:53,920 Speaker 2: months and months and in fact years building these personas, 277 00:14:54,400 --> 00:14:56,320 Speaker 2: because like, look, if you just created an account and 278 00:14:56,320 --> 00:14:58,680 Speaker 2: you're like, hey, I've got code, take it, that wouldn't 279 00:14:58,680 --> 00:15:01,600 Speaker 2: work, right? What these people figured out is that you 280 00:15:01,680 --> 00:15:04,280 Speaker 2: have to create these personas. They have to seem real. 281 00:15:05,000 --> 00:15:08,200 Speaker 2: You have to make posts, you have to contribute legit stuff. 282 00:15:08,720 --> 00:15:09,800 Speaker 2: You've got to create 283 00:15:09,600 --> 00:15:11,520 Speaker 4: kind of a history, build a relationship, you have to 284 00:15:11,520 --> 00:15:12,480 Speaker 4: build a relationship. 285 00:15:12,520 --> 00:15:14,840 Speaker 2: And so the guy who maintains it gives it up, 286 00:15:14,880 --> 00:15:16,480 Speaker 2: like, oh, thank you so much for taking this 287 00:15:16,520 --> 00:15:18,360 Speaker 2: burden from me, because look at these jerks. 288 00:15:18,400 --> 00:15:18,520 Speaker 4: Now.
289 00:15:18,560 --> 00:15:20,600 Speaker 2: Of course he doesn't know that the jerk works for 290 00:15:20,640 --> 00:15:23,400 Speaker 2: the same team, or maybe is even the same person 291 00:15:23,920 --> 00:15:26,640 Speaker 2: as the nice guy, right. And then he hands it 292 00:15:26,680 --> 00:15:28,320 Speaker 2: over to this nice guy who's a friend of his, 293 00:15:29,080 --> 00:15:31,600 Speaker 2: and then the friend takes it over and then does 294 00:15:31,640 --> 00:15:34,000 Speaker 2: a bunch of legitimate stuff, and then in the middle 295 00:15:34,000 --> 00:15:37,000 Speaker 2: of all that legitimate stuff inserts a very, very subtle backdoor. 296 00:15:37,360 --> 00:15:40,760 Speaker 1: I've seen this backdoor talked about using the phrase 297 00:15:40,840 --> 00:15:44,360 Speaker 1: sophisticated, that it was very sophisticated. Yes, in some ways 298 00:15:44,360 --> 00:15:46,440 Speaker 1: it sounds sophisticated, but in some ways it sounds like 299 00:15:46,480 --> 00:15:49,400 Speaker 1: it kind of wasn't, because a lot of it just 300 00:15:49,480 --> 00:15:53,000 Speaker 1: revolved around getting somebody to give them some access. 301 00:15:53,680 --> 00:15:57,880 Speaker 2: The code was sophisticated. The method of getting in there 302 00:15:58,040 --> 00:16:00,520 Speaker 2: was very human. It was bugging a guy until he 303 00:16:00,560 --> 00:16:05,080 Speaker 2: gave up control. Yes, right, just being a nuisance. Just 304 00:16:05,080 --> 00:16:09,120 Speaker 2: being a nuisance. So who was behind those fake personas? 305 00:16:09,760 --> 00:16:12,640 Speaker 2: We don't know for sure, but Alex has a theory. 306 00:16:13,320 --> 00:16:25,520 Speaker 2: That's after the break. Over the course of years, the 307 00:16:25,560 --> 00:16:29,560 Speaker 2: one guy maintaining this very important tool called XZ Utils, 308 00:16:29,840 --> 00:16:34,280 Speaker 2: Lasse Collin, was being bullied and manipulated online to give 309 00:16:34,320 --> 00:16:37,680 Speaker 2: a persona called Jia Tan a lead role in handling 310 00:16:37,720 --> 00:16:43,320 Speaker 2: the code. But who is Jia Tan? Everybody's been asking 311 00:16:43,320 --> 00:16:45,560 Speaker 2: this question of like who did this, who's behind this? 312 00:16:46,120 --> 00:16:49,080 Speaker 2: Most of the names have kind of an Asian origin, right? 313 00:16:49,160 --> 00:16:52,920 Speaker 2: So there's accounts like Jigar Kumar. The key one is 314 00:16:53,040 --> 00:16:55,880 Speaker 2: Jia Tan, which is like, could be Chinese, could be Korean. 315 00:16:56,600 --> 00:16:59,640 Speaker 2: Most of either the names or the technical indicators 316 00:16:59,640 --> 00:17:02,480 Speaker 2: point to Asia, right. So the time zones that this person 317 00:17:02,560 --> 00:17:05,640 Speaker 2: was working in are kind of the East Asian time zone, 318 00:17:05,680 --> 00:17:08,879 Speaker 2: so it's like Beijing or Korea. The names are Asian. 319 00:17:09,359 --> 00:17:11,480 Speaker 2: Everything points to Asia, which makes a lot of people 320 00:17:11,480 --> 00:17:14,400 Speaker 2: think it's Russia actually, because it's just too perfect, right? 321 00:17:14,720 --> 00:17:19,280 Speaker 2: Because it's just like, somebody spent three years doing 322 00:17:19,280 --> 00:17:22,240 Speaker 2: all this work, and then you're like, like, let's say 323 00:17:22,280 --> 00:17:25,200 Speaker 2: you're Chinese. Are you going to use like a Chinese 324 00:17:25,320 --> 00:17:27,960 Speaker 2: name as your fake name?
Are you going to spend 325 00:17:28,040 --> 00:17:30,880 Speaker 2: three years but then work in your normal time zone? 326 00:17:31,040 --> 00:17:34,800 Speaker 2: And generally, the only actor who has shown this 327 00:17:34,920 --> 00:17:37,960 Speaker 2: level of patience, who's been willing to spend three years 328 00:17:38,320 --> 00:17:40,640 Speaker 2: working on a backdoor like this, the only people 329 00:17:40,640 --> 00:17:43,119 Speaker 2: who have ever done that is either the United States 330 00:17:43,680 --> 00:17:48,000 Speaker 2: or the SVR. So Russia, okay, yeah, are really the 331 00:17:48,000 --> 00:17:51,679 Speaker 2: only groups where you've seen people spend years kind of 332 00:17:51,720 --> 00:17:53,680 Speaker 2: doing this kind of work. And a lot of people 333 00:17:53,680 --> 00:17:55,720 Speaker 2: don't think it would be the US doing something like this, 334 00:17:55,960 --> 00:17:58,440 Speaker 2: that they would never mess with something this important. Because 335 00:17:58,440 --> 00:18:00,640 Speaker 2: also, the thing the Russians like is to blame other 336 00:18:00,640 --> 00:18:04,440 Speaker 2: people, right. Again, because we never got to the point 337 00:18:04,440 --> 00:18:08,359 Speaker 2: of it being used. Usually attribution is done after something's used, 338 00:18:08,680 --> 00:18:10,880 Speaker 2: and so it's a lot easier to figure out, because 339 00:18:10,960 --> 00:18:13,720 Speaker 2: then you can ask cui bono, right, who benefits? 340 00:18:14,040 --> 00:18:16,359 Speaker 4: But like all of these indicators 341 00:18:15,840 --> 00:18:19,000 Speaker 2: pointing specifically to kind of China or Korea makes you 342 00:18:19,040 --> 00:18:22,240 Speaker 2: think it's just a little too obvious. 343 00:18:23,040 --> 00:18:26,800 Speaker 1: A major theory in cybersecurity circles is that Jia Tan isn't 344 00:18:26,840 --> 00:18:31,600 Speaker 1: one person. It's potentially multiple people, but likely Russian hackers 345 00:18:31,640 --> 00:18:35,119 Speaker 1: working for the SVR, which is Russia's foreign intelligence service, 346 00:18:35,640 --> 00:18:38,280 Speaker 1: and that they tried to cover their tracks, even if 347 00:18:38,320 --> 00:18:39,600 Speaker 1: it wasn't consistent. 348 00:18:40,000 --> 00:18:40,679 Speaker 4: The guys who 349 00:18:40,520 --> 00:18:43,879 Speaker 2: work for the professionals will change their time zones specifically 350 00:18:43,960 --> 00:18:48,080 Speaker 2: around either what allows them to avoid detection or 351 00:18:48,119 --> 00:18:50,720 Speaker 2: specifically around whatever they're doing for attribution. 352 00:18:51,040 --> 00:18:54,119 Speaker 1: Well, there were some times that the time zones actually 353 00:18:54,160 --> 00:18:57,760 Speaker 1: pointed to an Eastern European time zone or 354 00:18:57,800 --> 00:18:58,840 Speaker 1: another time zone, right? 355 00:18:58,760 --> 00:19:00,880 Speaker 2: Yeah, I mean, there it is a little mixed, right. 356 00:19:01,160 --> 00:19:05,399 Speaker 2: So somebody could be working from the eastern side of Russia, 357 00:19:05,520 --> 00:19:07,560 Speaker 2: or they could be waking up early in Moscow or Saint 358 00:19:07,600 --> 00:19:09,400 Speaker 2: Petersburg and then they slipped, right. 359 00:19:09,880 --> 00:19:11,760 Speaker 1: In other words, they might have just slipped up and 360 00:19:11,760 --> 00:19:15,320 Speaker 1: forgot to change their time zones.
Because remember, this happened 361 00:19:15,359 --> 00:19:18,120 Speaker 1: over the course of years. Maybe somebody had an off 362 00:19:18,200 --> 00:19:21,399 Speaker 1: day and forgot to change the computer settings. But Alex 363 00:19:21,440 --> 00:19:24,560 Speaker 1: has another reason for suspecting Russia over China. 364 00:19:25,880 --> 00:19:29,400 Speaker 2: Chinese hackers, for the most part, work very rigorous hours. 365 00:19:29,560 --> 00:19:32,960 Speaker 2: You can almost always tell when Chinese hackers are working, 366 00:19:33,040 --> 00:19:35,760 Speaker 2: because they work office hours. They work eight to five, 367 00:19:35,880 --> 00:19:39,080 Speaker 2: eight to six, really. It's like very regular. Yeah, okay. 368 00:19:39,119 --> 00:19:41,680 Speaker 2: Whereas it's much harder to do time zone stuff 369 00:19:41,680 --> 00:19:44,000 Speaker 2: for the Russians, because they will work whatever hours 370 00:19:44,000 --> 00:19:46,000 Speaker 2: they need to work. You know that, that scene in 371 00:19:46,080 --> 00:19:48,320 Speaker 2: like one of the Bourne movies where it's like the 372 00:19:48,359 --> 00:19:50,240 Speaker 2: club scene? I always think about this with Russia. 373 00:19:50,240 --> 00:19:52,080 Speaker 2: There's like a club scene in Russia, and it's like 374 00:19:52,080 --> 00:19:53,080 Speaker 2: you think it's the middle of the night, and he 375 00:19:53,160 --> 00:19:55,520 Speaker 2: walks out, it's like ten a.m. or something. Right? It's like, 376 00:19:55,520 --> 00:19:58,480 Speaker 2: that's what I think about with Russian hackers. Whereas like 377 00:19:58,720 --> 00:20:01,760 Speaker 2: for China, it's amazing, because it's like, oh, six p.m. 378 00:20:01,760 --> 00:20:02,240 Speaker 2: in Beijing. 379 00:20:02,800 --> 00:20:04,840 Speaker 4: You know, it's like, you know, everybody goes home. 380 00:20:04,760 --> 00:20:05,479 Speaker 3: The hacking stops. 381 00:20:05,640 --> 00:20:06,960 Speaker 4: Yeah, or like Chinese. 382 00:20:06,760 --> 00:20:09,919 Speaker 2: Chinese New Year, Lunar New Year, everybody goes home, goes 383 00:20:09,960 --> 00:20:13,159 Speaker 2: and sees their parents in the village or whatever, like hacking stops. 384 00:20:13,160 --> 00:20:13,720 Speaker 4: It's amazing. 385 00:20:14,440 --> 00:20:16,919 Speaker 1: And in the case of XZ Utils, looking at 386 00:20:16,920 --> 00:20:20,320 Speaker 1: the timing when this Jia Tan was submitting code, there's 387 00:20:20,400 --> 00:20:23,399 Speaker 1: a bunch of submissions during Lunar New Year, but during 388 00:20:23,440 --> 00:20:29,879 Speaker 1: big Eastern European holidays like Christmas, crickets. But that leaves 389 00:20:29,920 --> 00:20:33,720 Speaker 1: a question: what's the motive? Why would the Russian SVR 390 00:20:33,920 --> 00:20:34,840 Speaker 1: want to do this? 391 00:20:35,520 --> 00:20:37,080 Speaker 4: So OpenSSH, everybody uses. 392 00:20:37,440 --> 00:20:39,520 Speaker 2: That's why this is so powerful, is you don't have 393 00:20:39,520 --> 00:20:41,040 Speaker 2: to have a specific target in mind, which is why 394 00:20:41,040 --> 00:20:44,680 Speaker 2: you'd also spend three years doing it. Because let's say 395 00:20:44,680 --> 00:20:47,920 Speaker 2: you're at the SVR, you know, no matter what war 396 00:20:47,960 --> 00:20:50,680 Speaker 2: you're involved with, no matter what target you're going after, 397 00:20:51,119 --> 00:20:53,720 Speaker 2: OpenSSH is gonna be useful.
So this is probably 398 00:20:53,720 --> 00:20:56,120 Speaker 2: a team at the SVR who, they don't know what it's 399 00:20:56,119 --> 00:20:58,280 Speaker 2: gonna be used for. They just know they're gonna 400 00:20:58,280 --> 00:20:58,760 Speaker 2: get a medal. 401 00:20:58,800 --> 00:21:00,719 Speaker 1: You'll be able to use this at some point. Yeah, 402 00:21:00,760 --> 00:21:01,439 Speaker 1: who knows for what? 403 00:21:01,560 --> 00:21:03,520 Speaker 2: And the US does the same thing, right? Like, there's 404 00:21:03,560 --> 00:21:05,840 Speaker 2: people whose job it is to get the capability, and 405 00:21:06,200 --> 00:21:09,479 Speaker 2: it's other guys' job, who understand the geopolitics, who understand 406 00:21:09,480 --> 00:21:10,679 Speaker 2: the intelligence, 407 00:21:10,640 --> 00:21:11,239 Speaker 3: to use it. 408 00:21:12,920 --> 00:21:17,800 Speaker 1: But thankfully, last spring, Andres Freund, the Microsoft engineer, was 409 00:21:17,840 --> 00:21:20,680 Speaker 1: able to discover the backdoor. But this was all 410 00:21:20,720 --> 00:21:23,000 Speaker 1: by chance. He wasn't looking for it. 411 00:21:23,720 --> 00:21:27,320 Speaker 2: He works on a database called Postgres, so he doesn't 412 00:21:27,320 --> 00:21:30,960 Speaker 2: work on XZ Utils. He works on Postgres, which is a big 413 00:21:31,000 --> 00:21:35,560 Speaker 2: open source database program that Microsoft uses in their Azure cloud, 414 00:21:35,640 --> 00:21:38,399 Speaker 2: so I'm guessing that's why Microsoft pays him. And in 415 00:21:39,119 --> 00:21:43,720 Speaker 2: the next version of Debian, so a popular Linux distribution, 416 00:21:44,320 --> 00:21:47,440 Speaker 2: Postgres was running a little bit slower, just a tiny, tiny 417 00:21:47,480 --> 00:21:49,080 Speaker 2: little bit. Tiny, tiny little bit. 418 00:21:49,040 --> 00:21:51,439 Speaker 1: Right, so tiny, like how much slower? 419 00:21:52,000 --> 00:21:54,600 Speaker 2: Like in one specific circumstance, it's taking a couple of 420 00:21:54,600 --> 00:21:57,840 Speaker 2: milliseconds longer to do something, right. So, like a millisecond? Yeah, 421 00:21:57,840 --> 00:21:59,919 Speaker 2: but like if you're a database guy, that's a lot, right. 422 00:22:00,240 --> 00:22:04,240 Speaker 2: And so he is super looking into what is going on, 423 00:22:04,680 --> 00:22:07,919 Speaker 2: and he realizes, oh, it's not actually Postgres that's doing this, 424 00:22:08,040 --> 00:22:11,119 Speaker 2: it's OpenSSH. And so he could have stopped 425 00:22:11,119 --> 00:22:12,760 Speaker 2: there, because he could have been like, oh, well, it's 426 00:22:12,760 --> 00:22:15,199 Speaker 2: not my problem, right, it's not my thing, and then 427 00:22:15,240 --> 00:22:17,359 Speaker 2: maybe nobody would have looked at it, right? Like, you 428 00:22:17,400 --> 00:22:20,160 Speaker 2: could see in open source that people pass problems 429 00:22:20,160 --> 00:22:23,080 Speaker 2: to each other all the time, right. So 430 00:22:23,520 --> 00:22:26,399 Speaker 2: this is, like, I think a normal, like a normal person, 431 00:22:26,880 --> 00:22:29,360 Speaker 2: even an open source developer, would have been like, oh okay, 432 00:22:30,480 --> 00:22:33,600 Speaker 2: I looked at this, it's not me. I'm gonna let 433 00:22:33,600 --> 00:22:35,800 Speaker 2: it go. But he did not let it go.
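Alex describes a regression of only a few milliseconds being the tip-off. A self-contained sketch of that kind of measurement looks something like this: time the same operation many times, compare the medians, and notice when one build drifts. The workload and the extra delay here are invented stand-ins, not Andres Freund's actual benchmark.

```python
# A rough sketch of spotting a small performance regression by benchmarking.
# The "login" functions below are invented stand-ins for illustration only.
import statistics
import time


def measure(operation, runs: int = 50) -> float:
    """Return the median wall-clock time of `operation` in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.median(samples)


def normal_login():
    sum(range(200_000))          # stand-in for the usual amount of work


def suspicious_login():
    sum(range(200_000))
    time.sleep(0.003)            # a few extra milliseconds of hidden work

print(f"baseline build:  {measure(normal_login):.2f} ms")
print(f"new build:       {measure(suspicious_login):.2f} ms")
```

A few milliseconds is invisible to a person at a keyboard, but to someone who benchmarks database logins for a living, a consistent shift in the median is exactly the kind of thread worth pulling.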
He 434 00:22:36,000 --> 00:22:40,200 Speaker 2: ended up digging into, okay, well, what changed in 435 00:22:40,200 --> 00:22:41,919 Speaker 2: OpenSSH, and then he looks into OpenSSH 436 00:22:41,920 --> 00:22:44,679 Speaker 2: and sees this code. And so what 437 00:22:44,720 --> 00:22:48,040 Speaker 2: the attackers did is they created what's called a 438 00:22:48,200 --> 00:22:50,440 Speaker 2: NOBUS backdoor, nobody but us. 439 00:22:51,359 --> 00:22:54,399 Speaker 1: NOBUS, or nobody but us, is a way of 440 00:22:54,440 --> 00:22:58,879 Speaker 1: creating a backdoor into something where nobody but us, or you, 441 00:22:59,119 --> 00:23:00,880 Speaker 1: the hackers, have the key. 442 00:23:01,359 --> 00:23:04,320 Speaker 2: They wanted a skeleton key that only they can use. But 443 00:23:04,560 --> 00:23:07,080 Speaker 2: NOBUS backdoors, nobody-but-us backdoors, 444 00:23:07,600 --> 00:23:11,280 Speaker 2: are actually hard to sneak in, because they're pretty, like, 445 00:23:11,359 --> 00:23:12,240 Speaker 2: obviously sketchy. 446 00:23:12,720 --> 00:23:15,400 Speaker 1: So instead of doing everything all at once, they delivered 447 00:23:15,480 --> 00:23:19,480 Speaker 1: multiple patches in multiple different places, little things here and 448 00:23:19,560 --> 00:23:22,000 Speaker 1: there that wouldn't raise suspicion if you looked at one 449 00:23:22,240 --> 00:23:25,200 Speaker 1: or two or three of them, but layered on top 450 00:23:25,240 --> 00:23:28,560 Speaker 1: of each other, they created a key that only they 451 00:23:28,600 --> 00:23:29,200 Speaker 1: could use. 452 00:23:30,240 --> 00:23:32,239 Speaker 2: And so because they did all this stuff to kind 453 00:23:32,240 --> 00:23:34,840 Speaker 2: of obfuscate it and make it super secret, they actually 454 00:23:34,880 --> 00:23:41,600 Speaker 2: created the performance impact that Andres saw and then went 455 00:23:41,680 --> 00:23:43,159 Speaker 2: way out of his way to pull on. And then he 456 00:23:43,200 --> 00:23:47,320 Speaker 2: posts in a public post, guys, this is super sketchy, right, like, 457 00:23:47,560 --> 00:23:51,240 Speaker 2: look at this code. There's no good argument for what's going 458 00:23:51,040 --> 00:23:56,600 Speaker 1: on here. Right. So, I mean, I kind of have 459 00:23:56,680 --> 00:23:59,600 Speaker 1: to wonder about what the implications for this are. I 460 00:23:59,600 --> 00:24:02,600 Speaker 1: mean, this clearly, it almost worked. Do you think there's 461 00:24:02,600 --> 00:24:06,159 Speaker 1: hackers out there saying, okay, yeah, yeah, let me change my 462 00:24:06,640 --> 00:24:08,159 Speaker 3: approach, and maybe this is the way to do it? 463 00:24:08,720 --> 00:24:08,920 Speaker 4: Yeah. 464 00:24:08,920 --> 00:24:10,640 Speaker 2: I mean, what I'm afraid of is we haven't found 465 00:24:10,640 --> 00:24:13,680 Speaker 2: any other ones like this. So what I thought would 466 00:24:13,680 --> 00:24:16,240 Speaker 2: happen is, at the time, I'm like, oh man, we'll 467 00:24:16,240 --> 00:24:18,359 Speaker 2: have one or two more of these, because everybody started 468 00:24:18,359 --> 00:24:21,399 Speaker 2: looking, and then nobody else found any other ones, 469 00:24:21,280 --> 00:24:22,080 Speaker 4: which terrifies me. 470 00:24:22,520 --> 00:24:23,920 Speaker 3: You think there's more like this out there? 471 00:24:24,080 --> 00:24:25,679 Speaker 4: I think it's quite possible there's more like this.
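The "nobody but us" idea can be illustrated with the textbook public-key pattern: the door only opens for input signed by a private key the attacker keeps to themselves. This is a conceptual sketch using the third-party cryptography library, not the actual xz backdoor code, and every name in it is invented for illustration.

```python
# A conceptual sketch of why a NOBUS skeleton key is usable only by its owner:
# standard digital-signature verification, using the `cryptography` package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

keyholder_private = Ed25519PrivateKey.generate()   # stays with the key holder
published_public = keyholder_private.public_key()  # what the "door" checks against

def door_opens(message: bytes, signature: bytes) -> bool:
    """Accept only messages signed with the key holder's private key."""
    try:
        published_public.verify(signature, message)
        return True
    except InvalidSignature:
        return False

msg = b"open up"
print(door_opens(msg, keyholder_private.sign(msg)))  # True: the key holder gets in
print(door_opens(msg, b"\x00" * 64))                 # False: everyone else is locked out
```

That gating is also why, as Alex says, such backdoors are hard to sneak in: code that checks a mysterious hard-coded key looks obviously sketchy unless it is broken into pieces and heavily obfuscated, which is what created the telltale slowdown.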
472 00:24:25,760 --> 00:24:28,480 Speaker 2: Yeah. Like, if anybody has an idea, two or three 473 00:24:28,480 --> 00:24:30,560 Speaker 2: other people have had the idea, right? So I can't 474 00:24:30,600 --> 00:24:32,680 Speaker 2: imagine these are the only people who are like, oh, 475 00:24:32,800 --> 00:24:35,480 Speaker 2: I'm gonna go bully some maintainer of one of the 476 00:24:35,960 --> 00:24:39,160 Speaker 2: five thousand libraries on Linux to go take it over 477 00:24:39,320 --> 00:24:41,960 Speaker 2: or submit a patch. I can't imagine there aren't other 478 00:24:42,000 --> 00:24:44,439 Speaker 2: ones. Now, are they in OpenSSH or are they 479 00:24:44,480 --> 00:24:45,439 Speaker 2: something much more subtle? 480 00:24:45,720 --> 00:24:46,159 Speaker 4: I don't know. 481 00:24:46,760 --> 00:24:49,240 Speaker 2: I mean, this would have been both in kind of 482 00:24:50,520 --> 00:24:52,720 Speaker 2: one of the worst possible places, and it would have 483 00:24:52,760 --> 00:24:56,000 Speaker 2: been a skeleton key that only this attacker could have used, 484 00:24:56,080 --> 00:24:58,320 Speaker 2: which is like kind of the worst case scenario. It's 485 00:24:58,320 --> 00:25:01,240 Speaker 2: also the hardest level of difficulty, right? These people picked 486 00:25:01,720 --> 00:25:04,120 Speaker 2: the hardest level. If you want to do something 487 00:25:04,160 --> 00:25:08,040 Speaker 2: much simpler, you go after a much lesser used 488 00:25:08,720 --> 00:25:12,160 Speaker 2: service that's specifically at the target that you're going after. 489 00:25:12,400 --> 00:25:14,760 Speaker 2: If you're going after a specific target and you're like, oh, 490 00:25:14,760 --> 00:25:19,000 Speaker 2: they use this specific, this one specific service that's much 491 00:25:19,040 --> 00:25:21,200 Speaker 2: less popular, that doesn't have all these eyeballs on it, 492 00:25:21,520 --> 00:25:23,120 Speaker 2: then you don't have to be as tricky. 493 00:25:24,560 --> 00:25:28,000 Speaker 1: There haven't been any like this in OpenSSH, but there 494 00:25:28,040 --> 00:25:31,000 Speaker 1: have been other attempts that the Open Source Security Foundation 495 00:25:31,240 --> 00:25:34,800 Speaker 1: and the OpenJS Foundation have found that use similar 496 00:25:34,840 --> 00:25:38,879 Speaker 1: social tactics. One project received emails from accounts asking to 497 00:25:38,880 --> 00:25:43,200 Speaker 1: be designated as project maintainers despite having little prior involvement, 498 00:25:43,600 --> 00:25:47,639 Speaker 1: and two other projects saw very similar suspicious patterns. This 499 00:25:47,760 --> 00:25:51,000 Speaker 1: kind of social engineering is really effective, because you don't 500 00:25:51,000 --> 00:25:54,239 Speaker 1: have to manipulate code. You just manipulate the person who 501 00:25:54,320 --> 00:25:56,960 Speaker 1: has their hands on the code. And it's only going 502 00:25:57,040 --> 00:25:59,960 Speaker 1: to get easier to do and harder to detect. 503 00:26:00,960 --> 00:26:03,399 Speaker 2: Now we're at the point where, with AI, like, you 504 00:26:03,400 --> 00:26:05,439 Speaker 2: could be fake now and I have no idea if 505 00:26:05,440 --> 00:26:07,119 Speaker 2: you really exist, or vice versa.
506 00:26:07,600 --> 00:26:10,760 Speaker 1: Wait, are you, are you suggesting that doing something like 507 00:26:10,800 --> 00:26:12,880 Speaker 1: this might be a little bit easier because somebody could 508 00:26:12,960 --> 00:26:16,439 Speaker 1: fake that they actually exist? Oh yeah, with a phone 509 00:26:16,440 --> 00:26:18,160 Speaker 1: conversation or a video conversation. 510 00:26:18,320 --> 00:26:21,160 Speaker 2: Oh yeah, we're already seeing that from the ransomware actors. 511 00:26:21,400 --> 00:26:24,920 Speaker 2: It's easy for phone, right. So you're already seeing them 512 00:26:25,080 --> 00:26:27,720 Speaker 2: fake people's voices. So people are getting phone calls from, 513 00:26:27,760 --> 00:26:31,320 Speaker 2: like, their CEO. The CEO goes on CNBC for two minutes, 514 00:26:31,600 --> 00:26:34,320 Speaker 2: they get their voice from CNBC, they plug it into 515 00:26:34,760 --> 00:26:38,960 Speaker 2: an AI voice library, and then you call and, like, hey, 516 00:26:39,000 --> 00:26:42,160 Speaker 2: it's Bob, I need you to do a million dollar transfer. Right? 517 00:26:42,240 --> 00:26:45,000 Speaker 2: So that kind of stuff, and now you see real 518 00:26:45,040 --> 00:26:48,360 Speaker 2: time video too. It's not perfect, but it's getting there. 519 00:26:48,480 --> 00:26:48,960 Speaker 3: Yeah. 520 00:26:49,000 --> 00:26:51,240 Speaker 2: The trick, by the way, if this happens to any 521 00:26:51,280 --> 00:26:53,720 Speaker 2: of your listeners, the trick is you can ask people 522 00:26:53,760 --> 00:26:57,479 Speaker 2: to move, touch things in the background, do three sixty 523 00:26:57,480 --> 00:26:59,159 Speaker 2: on the head. It's harder for them to do ears 524 00:26:59,200 --> 00:27:02,280 Speaker 2: for whatever reason, but they'll get there, right. So, like, if 525 00:27:02,320 --> 00:27:03,720 Speaker 2: I asked you to take your glasses off, it'd be 526 00:27:03,800 --> 00:27:05,760 Speaker 2: very hard for the model. Like, take your glasses off. 527 00:27:05,960 --> 00:27:09,560 Speaker 1: By the way, hold on, for those of y'all listening 528 00:27:09,560 --> 00:27:12,440 Speaker 1: at home, I took my glasses off here, just to double 529 00:27:12,560 --> 00:27:12,879 Speaker 1: check it. 530 00:27:13,440 --> 00:27:14,960 Speaker 2: Oh, you kind of froze on me when you did that. 531 00:27:15,000 --> 00:27:21,080 Speaker 2: So that's sketch, man, it's sketchy AF, as my students say. Sorry, 532 00:27:20,960 --> 00:27:22,760 Speaker 2: they keep me on my toes, my Stanford students. 533 00:27:23,680 --> 00:27:27,080 Speaker 1: But you know, in the future, though, it is going 534 00:27:27,160 --> 00:27:31,600 Speaker 1: to be easier to spoof people's personalities, yeah, and stuff 535 00:27:31,640 --> 00:27:33,800 Speaker 1: like that. So these things that you're suggesting right now, 536 00:27:33,800 --> 00:27:35,760 Speaker 1: they work now. Are they going to work in a year? 537 00:27:36,280 --> 00:27:36,359 Speaker 3: So? 538 00:27:36,480 --> 00:27:39,960 Speaker 2: I mean, the good thing about this is open source 539 00:27:39,960 --> 00:27:44,199 Speaker 2: developers have become much more paranoid, right. So people have 540 00:27:44,240 --> 00:27:46,439 Speaker 2: become much more paranoid about new people. And there's a 541 00:27:46,480 --> 00:27:48,000 Speaker 2: downside of that, right, that if you're trying to get 542 00:27:48,040 --> 00:27:51,120 Speaker 2: into open source, it's harder.
There have become projects where 543 00:27:51,119 --> 00:27:53,720 Speaker 2: it's like, okay, great, let's meet up in person. If 544 00:27:53,720 --> 00:27:56,880 Speaker 2: somebody's willing only to communicate with you an email, then 545 00:27:57,440 --> 00:28:00,199 Speaker 2: you have to be kind of sketched out. Now, there 546 00:28:00,240 --> 00:28:01,919 Speaker 2: have been some changes since this. I think people have 547 00:28:01,960 --> 00:28:04,240 Speaker 2: been more paranoid. There's been a bunch of work On 548 00:28:04,600 --> 00:28:08,880 Speaker 2: the flip side of AI is that traditional code scanning 549 00:28:08,880 --> 00:28:13,239 Speaker 2: tools PREI code scanning tools are not extremely good at 550 00:28:13,240 --> 00:28:15,679 Speaker 2: detecting this kind of malicious code. But there is some 551 00:28:15,760 --> 00:28:18,080 Speaker 2: hope that some of the newer AI based code scanning 552 00:28:18,080 --> 00:28:20,119 Speaker 2: tools could could do this kind of stuff at scale. 553 00:28:20,480 --> 00:28:22,439 Speaker 2: The flip side is is AI is really good at 554 00:28:22,440 --> 00:28:27,399 Speaker 2: writing code, So you know, do you not have to 555 00:28:27,400 --> 00:28:31,399 Speaker 2: be SVR level anymore to be able to write a backdoor? 556 00:28:31,440 --> 00:28:32,840 Speaker 4: That's good, That's probably true as. 557 00:28:32,760 --> 00:28:35,560 Speaker 1: Well, it's open source too much of a risk in 558 00:28:35,600 --> 00:28:38,560 Speaker 1: the age of AI, and can we protect ourselves from 559 00:28:38,560 --> 00:28:39,520 Speaker 1: another hack like this? 560 00:28:40,440 --> 00:28:41,680 Speaker 3: That's after the break. 561 00:28:55,280 --> 00:28:58,200 Speaker 1: So this and I want to get back into kind 562 00:28:58,200 --> 00:28:59,760 Speaker 1: of the play by play here, but a lot of 563 00:28:59,800 --> 00:29:05,480 Speaker 1: this hinges on open source. So and I think one 564 00:29:05,520 --> 00:29:10,840 Speaker 1: of the really kind of concerning things about this entire 565 00:29:10,880 --> 00:29:13,680 Speaker 1: thing that happened or almost happened is the fact that 566 00:29:13,720 --> 00:29:17,280 Speaker 1: it basically happened in broad daylight. Yes, and it happened 567 00:29:17,280 --> 00:29:21,320 Speaker 1: because this is open source. The thing about open source, 568 00:29:21,360 --> 00:29:23,760 Speaker 1: I think, is when you start to explain it to 569 00:29:23,800 --> 00:29:26,480 Speaker 1: somebody who's never heard of it. Are you familiar with 570 00:29:26,560 --> 00:29:28,200 Speaker 1: the galaxy brain meme? 571 00:29:28,840 --> 00:29:29,080 Speaker 4: Yeah? 572 00:29:29,160 --> 00:29:30,280 Speaker 3: Do you know what I'm talking about? Yeah? 573 00:29:30,280 --> 00:29:32,400 Speaker 1: So I feel like this is like that galaxy brain meme, 574 00:29:32,400 --> 00:29:34,880 Speaker 1: where at the very top, when you tell somebody to 575 00:29:34,960 --> 00:29:38,160 Speaker 1: open source, the response is, this is a terrible idea. 576 00:29:38,160 --> 00:29:41,320 Speaker 1: Everybody can see the code. And then you get a 577 00:29:41,360 --> 00:29:43,480 Speaker 1: little bit further down it's, oh, this is a great idea. 578 00:29:43,480 --> 00:29:45,640 Speaker 1: Everybody can see the code, and then they hear about 579 00:29:45,680 --> 00:29:47,400 Speaker 1: xutails when we get down to the bottom, and it's 580 00:29:47,800 --> 00:29:51,440 Speaker 1: a terrible idea. Everybody can see the code. 
What's the 581 00:29:51,480 --> 00:29:53,720 Speaker 1: true galaxy brain take on this for open source? 582 00:29:54,280 --> 00:29:58,400 Speaker 2: I mean, people go back and forth. So one of 583 00:29:58,440 --> 00:30:00,840 Speaker 2: the ideas is that if you can see all the code, 584 00:30:00,840 --> 00:30:01,680 Speaker 2: you can see all the bugs. 585 00:30:01,960 --> 00:30:02,200 Speaker 3: Right. 586 00:30:02,560 --> 00:30:05,880 Speaker 2: That's the idea, that because it's open source, it 587 00:30:05,920 --> 00:30:09,160 Speaker 2: should be more secure than closed source, because you could 588 00:30:09,160 --> 00:30:12,160 Speaker 2: see the flaws. I don't think that has empirically turned 589 00:30:12,200 --> 00:30:17,360 Speaker 2: out to be true, right. And so I think what 590 00:30:17,440 --> 00:30:19,720 Speaker 2: I would say is, I'm a big proponent of open source. 591 00:30:19,760 --> 00:30:21,640 Speaker 2: I think it's great. I think it has a humongous 592 00:30:21,680 --> 00:30:25,400 Speaker 2: economic benefit to the world. The truth is, the 593 00:30:25,520 --> 00:30:27,920 Speaker 2: entire kind of cloud computing revolution we're all living through 594 00:30:28,400 --> 00:30:31,800 Speaker 2: only exists because of open source software. So that's an 595 00:30:31,800 --> 00:30:34,800 Speaker 2: incredible thing. That's a wonderful thing. Open source is great 596 00:30:34,840 --> 00:30:38,080 Speaker 2: from an economic perspective, it is great from an innovation perspective. 597 00:30:38,360 --> 00:30:41,480 Speaker 2: We should not pretend that it magically solves trust and 598 00:30:41,560 --> 00:30:45,600 Speaker 2: security problems. And if you're a company that's relied upon 599 00:30:45,600 --> 00:30:50,000 Speaker 2: open source, you have an ethical and moral obligation to 600 00:30:50,160 --> 00:30:53,680 Speaker 2: deal with the security aspects of it, and to contribute back. 601 00:30:54,280 --> 00:30:57,080 Speaker 2: And I do think that is something that's gotten lost, 602 00:30:57,120 --> 00:31:00,160 Speaker 2: is that people have just kind of assumed somebody else is 603 00:31:00,200 --> 00:31:03,360 Speaker 2: dealing with it, and everybody assumes somebody else is doing 604 00:31:03,520 --> 00:31:05,560 Speaker 2: the security work, and that turns out not to be true. 605 00:31:06,360 --> 00:31:08,880 Speaker 1: You know, I think that really gets to the core of 606 00:31:08,880 --> 00:31:12,400 Speaker 1: what a lot of this is. Because if somebody sees, 607 00:31:13,360 --> 00:31:16,760 Speaker 1: XZ Utils, there was a potential security flaw in that, okay, 608 00:31:16,760 --> 00:31:17,880 Speaker 1: well, I don't care about that. 609 00:31:17,920 --> 00:31:18,280 Speaker 3: What's that? 610 00:31:18,400 --> 00:31:20,760 Speaker 1: Oh, well, you know, it's involved with OpenSSH. Well, 611 00:31:20,800 --> 00:31:22,160 Speaker 1: I don't use that either. I don't have that app 612 00:31:22,160 --> 00:31:23,600 Speaker 1: on my phone. I don't know what you're talking about. 613 00:31:24,080 --> 00:31:29,600 Speaker 1: And in this weird way, I feel like the more 614 00:31:29,680 --> 00:31:34,120 Speaker 1: and more technology actually starts to become just magic, that 615 00:31:34,200 --> 00:31:37,480 Speaker 1: things just work, yeah, we are less and less actually 616 00:31:37,520 --> 00:31:42,200 Speaker 1: tech literate.
All the stuff that was science fiction even 617 00:31:42,240 --> 00:31:46,600 Speaker 1: ten years ago, two years ago, frankly, is just 618 00:31:46,680 --> 00:31:47,240 Speaker 1: normal now. 619 00:31:47,560 --> 00:31:47,760 Speaker 4: Yeah. 620 00:31:47,880 --> 00:31:50,680 Speaker 1: And so we're able to do so much with technology, 621 00:31:50,760 --> 00:31:53,000 Speaker 1: just regular-people things we do with our phones 622 00:31:53,040 --> 00:31:57,680 Speaker 1: every day, that we've become really removed from the technology itself, 623 00:31:57,680 --> 00:31:59,920 Speaker 1: and so fewer and fewer 624 00:32:00,160 --> 00:32:02,520 Speaker 1: of us actually know how to use a computer. Yeah, and 625 00:32:02,560 --> 00:32:05,240 Speaker 1: so this feels totally removed from us. This is like, oh, 626 00:32:05,280 --> 00:32:07,360 Speaker 1: this is some weird nerd shit. I'm like, I don't 627 00:32:07,440 --> 00:32:10,080 Speaker 1: use that nerd program. Doesn't affect me. 628 00:32:10,440 --> 00:32:13,040 Speaker 2: Yeah. No, you're totally right. I mean, I tell my 629 00:32:13,080 --> 00:32:15,520 Speaker 2: Stanford students, security is one of the best fields to 630 00:32:15,560 --> 00:32:18,040 Speaker 2: get into professionally because it's the only part of computers 631 00:32:18,040 --> 00:32:21,720 Speaker 2: that gets worse every year. Everything else magically gets better, man. 632 00:32:22,000 --> 00:32:24,640 Speaker 2: So you could find yourself in any other field being 633 00:32:24,680 --> 00:32:27,880 Speaker 2: made irrelevant. But if you get into security, you have 634 00:32:28,000 --> 00:32:30,000 Speaker 2: job security for life, because every year 635 00:32:29,880 --> 00:32:32,160 Speaker 4: I've been in it, it's gotten worse. 636 00:32:32,560 --> 00:32:35,920 Speaker 2: And one of the reasons is because, like you say, it's nerd shit. 637 00:32:35,960 --> 00:32:39,800 Speaker 2: But even the nerds, the normal median nerd, 638 00:32:40,040 --> 00:32:43,920 Speaker 2: gets further and further away from the truth, the reality 639 00:32:44,000 --> 00:32:46,960 Speaker 2: of what's going on on computers. So when I learned 640 00:32:47,000 --> 00:32:51,040 Speaker 2: how to program, I learned assembly language, right, I learned 641 00:32:51,320 --> 00:32:55,480 Speaker 2: how to write, like, the lowest-level languages. And then, 642 00:32:55,720 --> 00:32:59,120 Speaker 2: you know, they stopped teaching assembly language unless you took 643 00:32:59,120 --> 00:33:02,280 Speaker 2: special classes, and you learn in, like, Python, right, 644 00:33:02,320 --> 00:33:04,960 Speaker 2: a very high-level language where you don't even, you know, 645 00:33:05,040 --> 00:33:07,520 Speaker 2: you don't learn how to do memory management. 646 00:33:07,800 --> 00:33:09,200 Speaker 3: Right, I mean Python. 647 00:33:09,200 --> 00:33:10,840 Speaker 1: And just to break this down: Python, 648 00:33:11,000 --> 00:33:13,800 Speaker 1: for a casual person, you can look at it. You 649 00:33:13,800 --> 00:33:15,920 Speaker 1: can kind of tell what's going on. It basically looks 650 00:33:15,960 --> 00:33:19,800 Speaker 1: like English. Yeah, assembly is letters and numbers. 651 00:33:19,640 --> 00:33:22,440 Speaker 2: Right, right. But the nice thing about assembly is it's 652 00:33:22,480 --> 00:33:23,480 Speaker 2: the truth of the matter. 653 00:33:23,680 --> 00:33:23,880 Speaker 3: Right.
654 00:33:23,960 --> 00:33:26,600 Speaker 2: It has a one-to-one mapping to what the 655 00:33:26,600 --> 00:33:29,640 Speaker 2: processor itself is doing. And from a security perspective, if 656 00:33:29,640 --> 00:33:31,200 Speaker 2: you look at it, the reality of what a 657 00:33:31,240 --> 00:33:33,240 Speaker 2: security flaw is is seen in the assembly. 658 00:33:33,440 --> 00:33:33,560 Speaker 3: Right. 659 00:33:34,080 --> 00:33:37,440 Speaker 2: In Python, you get further, you get abstracted away, you 660 00:33:37,480 --> 00:33:39,680 Speaker 2: get further from the reality of what's actually going on 661 00:33:39,680 --> 00:33:43,200 Speaker 2: on the computer. Now, what you see is incredibly powerful, 662 00:33:43,200 --> 00:33:46,040 Speaker 2: it's incredibly cool, and so I'm not gonna 663 00:33:46,200 --> 00:33:48,120 Speaker 2: crap on it, because I think it's an incredibly good 664 00:33:48,120 --> 00:33:50,760 Speaker 2: thing for people. But you look at, like, Claude three 665 00:33:50,760 --> 00:33:54,080 Speaker 2: point seven code, you know, this new Claude model, and 666 00:33:54,080 --> 00:33:56,960 Speaker 2: you see people on Twitter who don't know anything about 667 00:33:56,960 --> 00:33:59,360 Speaker 2: computers and they're able to program now, because they can 668 00:33:59,360 --> 00:34:01,560 Speaker 2: go in there and they can say, build me software 669 00:34:01,560 --> 00:34:04,000 Speaker 2: that does X. And that is going to be terrible 670 00:34:04,040 --> 00:34:07,640 Speaker 2: for security. It's super cool for people's economic opportunities, because 671 00:34:07,680 --> 00:34:10,520 Speaker 2: anybody can become a programmer right now. But man, 672 00:34:10,560 --> 00:34:13,600 Speaker 2: are people in security gonna love it, because now you 673 00:34:13,600 --> 00:34:16,400 Speaker 2: don't need to know anything about how computers work and 674 00:34:16,440 --> 00:34:18,120 Speaker 2: you're just gonna ask the AI system to build it 675 00:34:18,160 --> 00:34:18,400 Speaker 2: for you. 676 00:34:18,440 --> 00:34:19,440 Speaker 4: And I see it with my students. 677 00:34:19,440 --> 00:34:22,279 Speaker 2: Stanford students, like, one of the top computer science programs 678 00:34:22,360 --> 00:34:26,560 Speaker 2: in the world, and you can graduate and not actually 679 00:34:26,560 --> 00:34:30,080 Speaker 2: really understand how operating systems work. I apologize to the Stanford 680 00:34:30,080 --> 00:34:33,080 Speaker 2: Computer Science department, right, but really, like, you can have 681 00:34:33,120 --> 00:34:36,799 Speaker 2: a totally productive career in Silicon Valley and not really 682 00:34:36,920 --> 00:34:39,919 Speaker 2: understand what's going on three or four layers down. In fact, 683 00:34:39,960 --> 00:34:42,720 Speaker 2: it's better for you not to, right? It's better because 684 00:34:42,800 --> 00:34:45,440 Speaker 2: you're at the high level where you're much more productive. 685 00:34:46,120 --> 00:34:48,120 Speaker 2: You're much more productive having the AI do the work 686 00:34:48,160 --> 00:34:50,719 Speaker 2: for you. You're much more productive having GitHub Copilot help 687 00:34:50,760 --> 00:34:53,000 Speaker 2: you rewrite stuff. You're much more productive using all the 688 00:34:53,040 --> 00:34:56,680 Speaker 2: cloud intermediation layers.
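To make that abstraction gap concrete, here is a minimal, hypothetical sketch (not something from the episode): the Python function reads almost like English, while the standard-library dis module shows the lower-level instructions the interpreter actually executes underneath it. Real machine assembly, the one-to-one view of the processor Alex describes, sits another layer further down still.

    # A readable, high-level line versus the lower-level instructions beneath it.
    import dis

    def add_interest(balance, rate):
        # High-level and close to English: what most programmers see and reason about.
        return balance * (1 + rate)

    # Peek one layer down: the bytecode the CPython interpreter runs for that function.
    # (A compiled language would go further down, to actual processor assembly.)
    dis.dis(add_interest)

    print(add_interest(100.0, 0.05))  # 105.0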
And so that's one of the reasons 689 00:34:56,719 --> 00:34:59,160 Speaker 2: why security gets worse every single year is that we 690 00:34:59,200 --> 00:35:02,120 Speaker 2: add these layers of abstraction that make things easier for people. 691 00:35:02,400 --> 00:35:05,960 Speaker 2: And AI is the ultimate abstraction layer, because now you 692 00:35:06,000 --> 00:35:08,920 Speaker 2: can talk to computers in plain English and have them 693 00:35:08,960 --> 00:35:10,320 Speaker 2: do incredibly complex things. 694 00:35:12,719 --> 00:35:16,600 Speaker 1: The thing about this whole story, I mean, I'm thinking 695 00:35:16,600 --> 00:35:18,960 Speaker 1: about, you know, we're in a time right now where 696 00:35:19,320 --> 00:35:25,359 Speaker 1: anything bad happens, or almost happens: Netflix documentary, Hulu documentary, 697 00:35:25,560 --> 00:35:28,200 Speaker 1: a true crime podcast at some point. I don't 698 00:35:28,239 --> 00:35:32,600 Speaker 1: see that happening with this. This is something that, as 699 00:35:32,640 --> 00:35:38,040 Speaker 1: you were saying, truly could have been catastrophic. Yeah, 700 00:35:38,160 --> 00:35:40,600 Speaker 1: but it's also kind of boring. 701 00:35:41,000 --> 00:35:42,640 Speaker 2: Well, you don't think I 702 00:35:42,640 --> 00:35:44,719 Speaker 2: could sell ten episodes to Netflix on this? 703 00:35:45,719 --> 00:35:48,239 Speaker 1: If you can, hire me as a producer, I'd love 704 00:35:48,280 --> 00:35:51,720 Speaker 1: to help. But you see what I'm saying, it takes 705 00:35:51,719 --> 00:35:54,799 Speaker 1: a while to even explain what the heck we're talking about. Yeah. 706 00:35:55,000 --> 00:35:56,839 Speaker 1: And I think that comes back to some of this: 707 00:35:57,520 --> 00:36:01,600 Speaker 1: in the same way that this vulnerability was introduced via 708 00:36:01,680 --> 00:36:04,040 Speaker 1: social engineering, a lot of this is social. I mean, 709 00:36:04,080 --> 00:36:05,920 Speaker 1: a lot of your work, you probably think about this. 710 00:36:06,560 --> 00:36:08,759 Speaker 1: How do you get people to care about something like this? 711 00:36:09,200 --> 00:36:11,239 Speaker 2: I mean, so that's a challenge. That's one of 712 00:36:11,239 --> 00:36:14,160 Speaker 2: the biggest challenges. If you're, like, a chief information security officer, 713 00:36:14,480 --> 00:36:16,320 Speaker 2: one of your big jobs is getting the rest of 714 00:36:16,360 --> 00:36:20,120 Speaker 2: the company to care about security. CISOs, we have a 715 00:36:20,160 --> 00:36:22,600 Speaker 2: reputation of being the people who say no all the time. 716 00:36:24,000 --> 00:36:26,760 Speaker 2: So I was the CISO of Facebook, and I once 717 00:36:26,840 --> 00:36:29,400 Speaker 2: walked into a meeting with a bunch of other VPs 718 00:36:29,840 --> 00:36:31,960 Speaker 2: and somebody literally said, like, oh shit, security's 719 00:36:32,040 --> 00:36:34,200 Speaker 2: here. I was like, hey guys, I 720 00:36:34,080 --> 00:36:37,200 Speaker 4: can hear you. I can hear you. And, like, no, no, 721 00:36:37,239 --> 00:36:37,680 Speaker 4: it's not you. 722 00:36:37,840 --> 00:36:40,080 Speaker 2: It's just, like, whenever you come, it's just because you're 723 00:36:40,080 --> 00:36:42,759 Speaker 2: telling us, like, there's a coup in Turkey or something terrible. 724 00:36:43,040 --> 00:36:45,000 Speaker 2: Like, because I was just the bearer of bad news, right.
725 00:36:45,040 --> 00:36:47,080 Speaker 2: But this is a real challenge for my colleagues, 726 00:36:47,360 --> 00:36:49,200 Speaker 2: and it's a real challenge for us as a society. 727 00:36:49,560 --> 00:36:54,480 Speaker 2: People don't want to think that the systems that they 728 00:36:54,560 --> 00:36:57,719 Speaker 2: rely upon are fragile, and I think that's, like, a 729 00:36:57,880 --> 00:37:02,520 Speaker 2: real problem. 730 00:37:02,560 --> 00:37:04,759 Speaker 3: What do we learn from this? What is it? 731 00:37:05,000 --> 00:37:07,000 Speaker 1: Let me just say, because I personally don't 732 00:37:07,000 --> 00:37:09,000 Speaker 1: think, just being out and talking to people, if I 733 00:37:09,040 --> 00:37:11,640 Speaker 1: try to tell somebody, hey, yeah, man, 734 00:37:11,680 --> 00:37:13,160 Speaker 1: what do you think about the XZ Utils thing? 735 00:37:13,200 --> 00:37:14,040 Speaker 3: Have you? 736 00:37:14,040 --> 00:37:15,600 Speaker 4: You know, hey buddy, what's up? 737 00:37:15,960 --> 00:37:18,720 Speaker 3: Yeah? Has it changed anything about how? Yeah? 738 00:37:18,719 --> 00:37:20,680 Speaker 1: Has that changed anything about how you go about your life? 739 00:37:20,920 --> 00:37:23,200 Speaker 1: People would tell me no. So I got to ask 740 00:37:23,239 --> 00:37:27,279 Speaker 1: somebody who's actually closer to this. Has this changed how 741 00:37:27,400 --> 00:37:30,440 Speaker 1: you approach things? Has this changed how the industry approaches things? 742 00:37:30,640 --> 00:37:33,839 Speaker 1: Has this changed how? I mean, the theory that you're 743 00:37:33,880 --> 00:37:35,879 Speaker 1: putting out is that this is a state actor. Has 744 00:37:35,920 --> 00:37:39,640 Speaker 1: this changed how national security is being looked at? 745 00:37:41,440 --> 00:37:44,520 Speaker 2: So for companies that know what they're doing, it has 746 00:37:44,640 --> 00:37:48,520 Speaker 2: changed how they approach open source. For a handful of 747 00:37:48,600 --> 00:37:52,480 Speaker 2: really big, you know, like the Googles, the Metas, the Amazons, 748 00:37:52,480 --> 00:37:55,239 Speaker 2: the Microsofts, the really big tech companies that do a 749 00:37:55,280 --> 00:37:58,880 Speaker 2: lot of open source work, they are looking more carefully 750 00:37:58,880 --> 00:38:01,560 Speaker 2: at open source. For security companies that do this work, 751 00:38:01,800 --> 00:38:05,040 Speaker 2: we're investing in software and AI that can do this 752 00:38:05,160 --> 00:38:09,080 Speaker 2: work for us. But it has not changed anything massively. Right, 753 00:38:09,120 --> 00:38:12,400 Speaker 2: we're still running Linux, we're still all pulling in fifty 754 00:38:12,440 --> 00:38:16,640 Speaker 2: thousand packages. We have these humongous dependency graphs. The truth is, 755 00:38:16,800 --> 00:38:18,759 Speaker 2: you can't just pivot all these things, right. It has 756 00:38:18,760 --> 00:38:21,080 Speaker 2: made us more concerned about these problems. When you talk 757 00:38:21,120 --> 00:38:24,160 Speaker 2: to CISOs, my colleagues and I, we're all more concerned. 758 00:38:24,400 --> 00:38:27,600 Speaker 2: But we can't magically pivot off of the infrastructure we 759 00:38:27,640 --> 00:38:30,360 Speaker 2: have built over a decade.
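As a rough, hypothetical illustration of those "humongous dependency graphs" (not something from the episode), this short Python sketch counts how many packages are installed in the current environment and how many declared requirements they pull in. On a typical server image the numbers climb quickly, which is part of why nobody can simply pivot off this infrastructure.

    # Count installed distributions and the direct requirements they declare.
    from importlib.metadata import distributions

    total_packages = 0
    total_requirements = 0
    for dist in distributions():
        total_packages += 1
        # dist.requires is a list of requirement strings, or None if none are declared.
        total_requirements += len(dist.requires or [])

    print(f"{total_packages} installed packages declaring "
          f"{total_requirements} direct requirements")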
I do not think we've 760 00:38:30,440 --> 00:38:34,160 Speaker 2: dealt with the fact that if you get on the 761 00:38:34,160 --> 00:38:36,799 Speaker 2: subway in the morning and you look around, most of 762 00:38:36,800 --> 00:38:40,080 Speaker 2: the people on that train, in their pocket, XZ Utils 763 00:38:40,280 --> 00:38:44,640 Speaker 2: is in their pocket. For every single person in there, hundreds 764 00:38:44,640 --> 00:38:47,680 Speaker 2: of copies of their Social Security number are sitting on 765 00:38:47,719 --> 00:38:50,399 Speaker 2: servers that would have been backdoored by this attack. That's 766 00:38:50,400 --> 00:38:56,040 Speaker 2: how you can think of it, right? So that's how close we came. 767 00:38:55,800 --> 00:38:57,839 Speaker 3: Man. 768 00:39:01,160 --> 00:39:04,839 Speaker 1: So just some closing thoughts here. Again, the reason that 769 00:39:04,920 --> 00:39:07,800 Speaker 1: most people don't know about what was almost the biggest 770 00:39:07,840 --> 00:39:10,680 Speaker 1: hack in the history of the Internet is because this 771 00:39:10,719 --> 00:39:13,440 Speaker 1: is really hard to describe to a non-technical audience. 772 00:39:13,480 --> 00:39:16,319 Speaker 1: I mean, when you say XZ Utils or Linux or 773 00:39:16,360 --> 00:39:19,400 Speaker 1: OpenSSH, people's eyes just roll into the back of their heads. 774 00:39:19,760 --> 00:39:21,080 Speaker 3: But we can't 775 00:39:20,800 --> 00:39:24,120 Speaker 1: allow tech literacy to be a barrier to understanding how 776 00:39:24,160 --> 00:39:27,319 Speaker 1: the world works. And the truth is, even beyond all 777 00:39:27,360 --> 00:39:30,040 Speaker 1: the tech jargon, a lot of these things are very 778 00:39:30,160 --> 00:39:33,759 Speaker 1: human, and they're not so hard to understand. And so 779 00:39:33,800 --> 00:39:35,160 Speaker 1: that's one of the things that we're really trying to 780 00:39:35,200 --> 00:39:37,680 Speaker 1: do here on kill Switch as we keep doing these episodes, 781 00:39:38,320 --> 00:39:40,239 Speaker 1: is to open it up so that more people are 782 00:39:40,320 --> 00:39:43,319 Speaker 1: able to feel like they're part of the conversations that 783 00:39:43,520 --> 00:39:52,239 Speaker 1: affect all of us. And that is it for this one, 784 00:39:52,440 --> 00:39:55,080 Speaker 1: for real. Thank y'all so much for listening to kill Switch. 785 00:39:55,440 --> 00:39:57,879 Speaker 1: You can hit us up at kill Switch at Kaleidoscope 786 00:39:57,880 --> 00:39:59,960 Speaker 1: dot NYC if you've got any thoughts or if there's 787 00:40:00,040 --> 00:40:02,040 Speaker 1: anything you want us to cover in the future, and 788 00:40:02,080 --> 00:40:04,239 Speaker 1: you can get me at dexdigi, that's d 789 00:40:04,360 --> 00:40:07,839 Speaker 1: e x d i g i, on Instagram or Bluesky 790 00:40:07,880 --> 00:40:10,560 Speaker 1: if that's more your thing. And if you like the episode, 791 00:40:10,640 --> 00:40:12,719 Speaker 1: you know, take that phone out of your pocket and 792 00:40:12,800 --> 00:40:15,200 Speaker 1: leave us a review. It helps people find the show, 793 00:40:15,280 --> 00:40:18,319 Speaker 1: which in turn helps us keep doing our thing. And 794 00:40:18,640 --> 00:40:22,520 Speaker 1: this thing is hosted by me, Dexter Thomas. It's produced 795 00:40:22,520 --> 00:40:26,920 Speaker 1: by Sena Ozaki, Daryl Luck Potts, and Kate Osborne.
Our 796 00:40:26,920 --> 00:40:29,480 Speaker 1: theme song is by Kyle Murdoch, who also mixes the 797 00:40:29,520 --> 00:40:34,240 Speaker 1: show. From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh 798 00:40:34,239 --> 00:40:38,800 Speaker 1: Hattikudur, and Kate Osborne. From iHeart, our executive producers 799 00:40:38,880 --> 00:40:40,680 Speaker 1: are Katrina Norville and Nikki 800 00:40:40,840 --> 00:40:42,439 Speaker 3: Ettor. That's it for this time. 801 00:40:42,760 --> 00:41:01,120 Speaker 1: Catch you on the next one.