Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. How the tech are you? So, I got got. By that I mean I regurgitated some tech news that I saw without actually taking the steps to make sure there was something substantial behind that tech news. And that's a failure on my part. And it shows that critical thinking is a skill that you have to actively practice. It's not something that just magically happens. So what got me? Well, recently in a tech news episode, I talked about how the FBI and the FCC, both official US government agencies, issued new warnings about using public charging ports for your mobile devices, you know, ports in places like hotels and airports and coffee shops, that kind of thing, and that hackers could have potentially compromised those ports so that while it appears that you're charging your phone or your tablet or whatever, some nefarious criminal, probably wearing a hoodie, sitting in the dark someplace in front of a monochromatic computer monitor that can only show green text, that person is actively hijacking
your phone and stealing all of your moneys and stuff. In the parlance of cool security kids, this practice is called juice jacking, as in you have jacked the charging station to spread malicious code. While this warning was recent, there have been multiple incidents of authorities warning people against using these kinds of charging ports over the years. Back in twenty nineteen, the District Attorney's Office of Los Angeles County issued a similar advisory. Their warning read, quote, "travelers should avoid using public USB power charging stations in airports, hotels, and other locations because they may contain dangerous malware," end quote, and the office went on to suggest that people use AC power outlets and an adapter in order to charge their devices and to avoid the possibility of someone compromising them. And that wasn't the first such warning either. They go back quite a ways, because there were demonstrations as far back as around twenty eleven, twenty twelve at events like DEF CON that showed how a hacker could potentially compromise a device using a malicious charging station. But here's the reason I say that I got got.
There are no records of someone falling victim to juice jacking out in the wild. There are no cases in which hackers have taken over a public charging station, or created a fake charging station that looks like the real thing, and then used that to implant malware on devices in the general public. Such a thing is technically possible, but it's tricky on several levels. For one thing, a hacker would need physical access to the area, and they would also need to avoid notice while installing a malicious charging station. For stuff that's built into things like tables and seating areas, and I'm thinking about, like, the long line of seats in an airport waiting area, where you typically will have a little station in between every couple of seats or so, that's hard to do. If you're in a high-traffic environment like an airport, it's tricky to get that access. It's not impossible with a little social engineering. You might pose as, like, an IT maintenance person, and you're there to repair or to upgrade a system, and maybe after a brief conversation no one gives you much thought and you can go about doing it.
So it's not like it's impossible. You could do it with a bit of effort. But that's the first barrier, and there are other challenges as well. Malware can spread via USB. That is true. Like, you could connect two devices via USB and send malware from one to the other. However, malware is not magically universal. It's not like a skeleton key that works on every device. This is one of those things that really gums up the plot to Independence Day. Right? Like, in the movie Independence Day, you have Jeff Goldblum creating malware on a Mac of all things, and then using that malware to transfer over to the aliens' computer systems to bring down the alien defenses. Well, without knowing how the alien computer systems work, and building your own system that works exactly that way, you can't do this. You cannot create malware that just magically works with whatever hardware and operating system environment it encounters. Malware is not that adaptable.
Of course, you could argue, and I have heard this argument, that Mac computers are not actually the outpouring of work from Apple, but in fact trace their lineage back to the crashed Roswell alien ship, and that all computer systems were really just built on alien technology. But then, how did you figure out how the alien technology worked in the first place? We're going down a rabbit hole I don't need to. The point being that malware does not magically adapt to its environment. It needs to be designed for that environment. So for malware to work, a hacker has to design it for a particular operating system, and malicious code that works on Windows machines generally won't work on, say, Android devices or iOS devices. So creating the back end that is responsible for injecting malware into the targets is also hard to do. Your average hacker isn't going to have access to the tools to build something that is effective against multiple platforms. Such things do exist, but they are expensive to develop and to deploy.
We're talking like stuff that state-backed hackers are getting paid tens of thousands or hundreds of thousands of dollars in order to develop and deploy. Your average hacker just doesn't have access to this kind of stuff. Now, maybe the hacker really is just thinking, oh, most folks have an Android device, so I'm just going to build something for that, and I'm not going to worry about the iOS users. Or maybe they're thinking, my preferred targets are iPhone users because they're known to use their devices to do things like make purchases more frequently, and they might target iOS devices and not worry about the Android users. So, in other words, they're just narrowing their pool of targets from the get-go. That's possible, but it gets even more granular than that, because a lot of attacks aren't just operating-system specific. They are hardware specific, at least for attacks that are really good at infiltrating and converting target devices into hacker devices. So let's say you have a Pixel phone and the person next to you has a Samsung Galaxy phone, and you both plug into one of these compromised charging stations.
Well, the malware might work on them, but not on you, because the malware needs to exploit specific quirks of the Galaxy phone, the hardware itself, that the Pixel just doesn't have. The tech can be really impressive in this case. There are examples of security experts who have created devices that can turn a particular model of phone into a hacker's plaything in just seconds, but only under those specific parameters. It's not like if you plugged it into a different phone it would still work. No, it only works on that one device. Compromising or creating a public charging station in the hopes that someone with that specific model of phone plugs into that specific charging station? That's the sort of gamble that's just not likely to pay off. But there are even more technical challenges we need to talk about. Modern smartphone operating systems will alert users to a requested file transfer when you connect to another device like a computer, and users then have to acknowledge and allow that file transfer to actually happen.
So if you were to plug your device into one of these hacker-controlled charging stations and their intent was to inject malware on your device, well, your phone would essentially say, hey bud, listen, that thing you got me plugged into? It wants to send a file your way. You cool? And you just choose, nah bra, I ain't cool, and then boom, no malware gets pushed to your device. Now, could hackers figure out a way to bypass this? Well, technically yes. There could be a zero-day vulnerability that no one but the hackers know about. There are tools in the cybersecurity field that do this sort of thing, but they typically require extended contact with the device. As Ars Technica points out, there's the security tool Grayshift, which is designed to access locked devices in order to pull data. It's the kind of thing that law enforcement agencies would end up purchasing and using. This tool can require up to three days with a device in order to actually be able to pull data from it.
So unless you're posted up at this charging station for several days in a row, you're not likely to have that problem. Now, I will say that this does add more complexity to the operation. It's also really expensive. The Grayshift tool costs around thirty grand to use. Plus, security updates from companies like Google or Apple can shut down the methodology that Grayshift uses to pull data in the first place, which means it's back to the drawing board. This will only work as long as the vulnerability exists, and if the vulnerability is patched, well, then that door is shut. Right? So, as operations get more complex and expensive, the likelihood of encountering them begins to approach zero. Okay, we're on the precipice, so we're going to take a quick break. When we come back, I will talk about why I say I got got.

Okay. Before the break, I said that as the complexity and expense of an operation increases, the likelihood that you're going to encounter it approaches zero. As it turns out, zero is where we end up.
At least, that's where we are when it comes to hijacked recharging stations that are out in public. There is a total lack of documented cases in which it has happened. Now, the article that I used for this is from Ars Technica, which I mentioned earlier. It was written by Dan Goodin. He went so far as to really look into this, rather than do what I did, which was just repeat an advisory without looking into it further. So Goodin did the right thing. Then I screwed up. And when we look back at that twenty nineteen advisory that was issued in Los Angeles, we see a similar issue. So Goodin's talking about the more recent FBI and FCC joint advisory. But if we look at the Los Angeles one, and you just do a quick search on it, well, it brings up a Snopes page. It has been so long since I've been on Snopes. I used to go to that website all the time. And the Snopes judgment comes down to this being a quote-unquote mixture of truth and falsehood.
So the Snopes page mentions that TechCrunch had previously reached out to the Los Angeles County chief prosecutor's office to ask about any cases involving juice jacking. They said, well, how many instances have you encountered of this practice? And the office said that they hadn't; they had no documented cases. And so then TechCrunch said, so what's up with pushing out an advisory for something that you don't even have evidence of happening? And they said, well, it is really part of an awareness-building campaign for security. And again, this gets back to the fact that it is technically possible to do, and maybe a user would even click through an acknowledgment of a file transfer thinking it was just another step toward charging, and thus allow their phone to be compromised. But the fact remains, there still are no documented cases of juice jacking out there. And when you think about the difficulty of the task and the small number of successful hits that you're likely to get as a hacker, you start to see why it's not really a thing. It could be a thing, but so far it's not.
And let's take it from the hacker point of view. Let's say that your goal is to infect as many devices as possible, for whatever reason. Maybe you're trying to get your malware to a specific target, but you don't have access to that target, so instead you're thinking, well, I'll just infect as many devices as I can so that someone somewhere possibly passes this malware to my actual target. It's kind of a long shot, sort of a long-play strategy, but it's also something that we have seen in the past, particularly with state-backed malware attacks. Stuxnet was technically this kind of approach, although it went through secondary targets. We assume it was the United States and Israel targeting some specific companies that were supplying software to the Iranian nuclear power program, and that because they could not access the nuclear program itself, the US and Israel, we assume, targeted the suppliers for that nuclear power program. That's how they got the malware to the targeted destination. So that is something that does happen.
But more likely, the hacker's motive is just to infect devices in order to harvest data, or perhaps get access to stuff like bank account information, that kind of thing. So how do you go about doing this if you're the hacker? Well, one thing you could do is run a really broad phishing campaign. You're casting a really wide net. This approach requires a little work on the back end, but not a whole bunch. It's relatively light work compared to other methods, and it can touch a large number of people. Like, you can't really predict who's going to see it necessarily, but you can target millions if you want. And of all the people who encounter the effort, maybe only a few are likely to fall for it. But still, if your attack is seen by tens of millions of people and you get just a couple of percentage points' worth of victims out of it, that's still a lot of victims. Another thing that you could do is you might set up at a public space and either compromise the local public Wi-Fi, or you create a hotspot of your own and you pose as public Wi-Fi.
Now, all the wireless traffic going through your hotspot is yours for the sniffing, and you don't have to tamper with any physical hardware in the area to do it. I mean, you could, if you wanted to, try and actually compromise the physical routers and such of the space. But if you wanted to just set up an alternative that looks official, you don't even have to touch any of the infrastructure that's already there. Now, the downside to this approach is that you are very location-based, so your pool of targets is much smaller. But the good news is you're likely to have a much higher percentage of hits within that small pool, so your goal is still met. Both of those methods are lower risk and higher reward than hijacking a public charging station, and you don't have to worry so much about the devices themselves giving you away by alerting the user that something hinky is going on.
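That phishing math, by the way, is easy to sanity-check. Here's a minimal back-of-the-envelope sketch; the reach and hit-rate numbers are illustrative assumptions, not figures from the episode:

```python
# Back-of-the-envelope phishing yield.
# Both numbers below are hypothetical assumptions for illustration.
reach = 20_000_000   # people who actually see the phishing message
hit_rate = 0.02      # "a couple of percentage points" fall for it

victims = int(reach * hit_rate)
print(victims)       # prints 400000
```

Even a tiny success rate over a huge reach yields hundreds of thousands of victims, which is why a broad, cheap campaign beats a hardware attack that only pays off when one specific phone model meets one specific kiosk.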
So, from a return-on-investment perspective, it makes way more sense to take a different approach than to rely on a physical connection between your malware injection system and target devices. Unless, and this is a big exception, we're talking about targeting a specific person or specific group. So if you're the equivalent of James Bond, and it's your job to figure out how, you know, you're going to compromise the phone belonging to, say, the ambassador of Freedonia, that's a made-up nation in the Marx Brothers movie Duck Soup, well, maybe you do install some hardware in an attempt to access that device. Maybe you do create some fake charging stations. You get access to a space that the ambassador of Freedonia is going to be in, and you put these compromising devices in that space.
So if we're talking about a potentially high-value target, like a politician or the CEO of a prominent company, or maybe a really high-profile journalist or something like that, well, then they might encounter something that is similar to a fake charging station. But most of us won't, because the trouble of making it, and of keeping it up to date as devices get security updates, is just way too much work for way too little reward. So it sounds like the threat of juice jacking is similar to what we hear every year as Halloween rolls around, or at least I used to hear it, that parents need to check every single piece of candy to make sure some malicious person hasn't hidden poison or razor blades in there. Now, in the case of poison, there's no documented evidence of that ever happening, but there have actually been a few cases of people putting pins or other sharp objects into stuff like apples at Halloween, though it often ends up turning out that it was a kid doing it purposefully to drum up drama.
So, in other words, the awareness campaign becomes a self-fulfilling prophecy that convinces people to do the thing that the warning is about. So, end result here: the type of attack mentioned in the FBI and FCC warnings is technically possible, but it's not practical. Not as a wide deployment, and not as a way to create a wide-spectrum attack that's going to hit a lot of targets. It's technically difficult to pull off. It requires access to infrastructure that isn't necessarily easy to get. As security patches go out, the methods become ineffective. It's expensive to develop and deploy. So is it possible someone could do this and that people could fall victim to it? Yeah, but it's so impractical that it's extremely unlikely to happen. There are lots of other security threats that are far more pressing, so it's kind of weird that we even get these warnings. I'm actually not sure what's driving the push for that, unless there is some top-secret, highly classified document containing countless cases of juice jacking that for some reason are not allowed to be acknowledged.
That's the only way I could figure that this is a real problem. It doesn't look like it actually is. So my apologies for being part of this machine of spreading a message that really isn't that important. I should have checked further into it. I could give you all excuses, like how I'm the only person writing and researching this show, but that's kind of lame. So instead, I'll just remind you: critical thinking is important, even when I forget to do it myself. That's all. I hope you're all well, and I'll talk to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.