Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech, and it is time for the tech news for Thursday, March eleventh, twenty twenty-one. Let's get to it. Last week I told you about how the social network site Gab, known for the right-wing political philosophy of most of its users, was the victim of a data breach, as a hacker accessed Gab's systems and stole around seventy gigabytes' worth of data, including three million private posts. Well, earlier this week, Gab was hacked again and even went offline temporarily as the administrators of the site conducted an investigation into the security vulnerability that made this possible. Now, during the downtime, users who were trying to go to Gab got an error message, but the site is back up as of this recording. A hacker using the handle JaXpArO, that's J-a-X-p-A-r-O, as in Captain Jack Sparrow, claims responsibility for both hacks, though who knows if that handle represents just one person or it's being used by a group working together. The hacker used authentication tokens that they gathered during the first hack, and they used those to carry out the second one. They were able to access the systems because those authentication tokens were still good, and that showed that Gab failed to reset those security tokens after the first breach.
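To make the token mistake concrete, here is a minimal sketch, in Python, of one common way a service can invalidate every outstanding token after a breach: keep a per-user revocation timestamp and reject any token issued before it. To be clear, this is a hypothetical illustration, not Gab's actual code; the store and field names are assumptions.

```python
import time

# Hypothetical in-memory store; a real service would persist this.
# Maps user_id -> timestamp before which all tokens are invalid.
token_floor: dict[str, float] = {}

def revoke_all_tokens(user_id: str) -> None:
    """After a breach, bump the floor so every previously issued
    token for this user is rejected on its next use."""
    token_floor[user_id] = time.time()

def is_token_valid(user_id: str, issued_at: float, expires_at: float) -> bool:
    now = time.time()
    if now >= expires_at:
        return False  # expired normally
    # The check Gab reportedly skipped: tokens issued before the
    # revocation floor should be refused even if not yet expired.
    if issued_at < token_floor.get(user_id, 0.0):
        return False
    return True
```

Under a scheme like this, bumping every user's revocation timestamp immediately after the first breach would have left the stolen tokens useless for the second one.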
Speaker 1: The hacker also left a message claiming that they gave Gab CEO Andrew Torba an ultimatum: cough up eight bitcoins, worth about four hundred fifty thousand dollars at the time, and the hacker would return at least some of that stolen data. Torba apparently refused, which, for the record, is what security experts say is the right call. Capitulating to hacker demands typically just reinforces that hacker's decisions, and it also encourages other hackers to follow suit, because if you pay the ransom, people know that there's money to be made, and it just makes things worse. Plus, there's never actually a guarantee that you'll ever get anything back anyway, or that there won't be copies of the stuff you get back floating around. Now, I have to say, Gab definitely comes out looking like a terrible steward of user security and data. Like, absolutely awful. The company has completely failed to protect user assets. And just to be clear, while I am about as far away in political ideology as you can get from the typical Gab user, I also don't really care for hackers who compromise sites and then issue ransom notices. I don't think that's a valid approach either. I don't think anyone comes out of this situation looking like a good guy, and that's gonna be a kind of common message in today's news items, I think. My guess is that the leaders at Gab are kind of in over their heads. They got the boot from various other platforms due to the actions of their users and different violations of terms of service, so they're kind of forced to take whatever they can get, which is something we've also seen with Parler.

Speaker 1: Moving on to a different hacking story. I've covered the SolarWinds hack numerous times already, about how it looks as though Russian-backed hackers, I mean, it's all but confirmed that they are Russian-backed hackers, compromised SolarWinds' software build system, and that let the hackers push out software updates as if they were legitimate updates. But those updates really carried malware to SolarWinds customers, and that gave the hackers the ability to infiltrate thousands of computer systems, and they could then follow up on that attack. This was a supply chain attack, where targeted systems accepted the malware because it was coming from a previously trusted source, that being SolarWinds. Well, now cybersecurity researchers have published a report that shows hackers backed by China were targeting SolarWinds at the same time in a separate operation, one that I have previously conflated with the Russian hackers. So it's time to set the record straight.
Speaker 1: Based on what these researchers found, while the Russian hack used the method I just described a moment ago, it appears that it was the Chinese hackers who were specifically targeting a software product called Orion, and these hackers, according to the researchers, were really going after a specific SolarWinds customer, and they installed a malware shell called Supernova around that Orion software. So this sounds like it was a much more targeted attack. The Russian approach was different. It let hackers blast out malware to thousands of potential targets, and then the hackers could follow up with the specific hits that they wanted to. They could look at who they were able to capture with that first blast and follow up for further infiltration. The Chinese approach, by contrast, went after one specific target and headed right for them, as opposed to doing that blast attack.
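To see why a supply chain attack like this is so effective, here's a minimal, hypothetical sketch of the trust check a typical auto-updater performs before installing anything. None of this is SolarWinds' actual update code; the file names and manifest format are assumptions for illustration.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash the downloaded update file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def install_update(update_path: str, vendor_manifest: dict[str, str]) -> None:
    # The client only trusts updates whose hash matches what the
    # vendor's build pipeline published.
    expected = vendor_manifest["sha256"]
    if sha256_of(update_path) != expected:
        raise RuntimeError("update rejected: hash mismatch")
    # The catch: when the build system itself is compromised, the
    # malicious update is hashed and signed by the vendor's own
    # pipeline, so this check passes and the client installs malware
    # from a "previously trusted source."
    print("hash verified; installing", update_path)
```

In other words, every client did exactly what it was supposed to do; the trust anchor itself was poisoned, which is what makes supply chain attacks so hard to defend against.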
Speaker 1: And we are not done with the hacking stories just yet. Verkada is a Silicon Valley startup that sells security cameras with cloud-based services, and they were hacked earlier this week. Now, I say hacked, but when I tell you how it actually happened, I think you'll agree that hack might be a bit of a generous term for this particular thing. Verkada practices terrible security, and for a company that's in the security business, that's not great. So the hackers responsible belonged to a group called APT-69420 Arson Cats. APT, by the way, stands for Advanced Persistent Threat. It's a common term in the cybersecurity world. Anyway, these hackers discovered that Verkada had an Internet portal for the company's internal development system, a way to log in to make changes to code in various Verkada products. You know, it's the way that the developers can access stuff. Now, that makes sense, particularly in a world where presumably a lot of the people working for Verkada are doing so remotely. But what does not make sense is that the hackers say that this was essentially a publicly accessible portal. There were no login credentials required to get to that portal system, so you could get there without having to first verify that you work for the company. And on that landing page, or that landing site, they found the login credentials to get what they called super admin level access to Verkada's systems. So as a result, they were able to access more than one hundred thousand cameras, and the number I frequently see is one hundred fifty thousand camera feeds, belonging to various Verkada customers. They also got a list that was around twenty-four thousand entries long that names those customers, and they include businesses, churches, health care facilities, jails, and the PGA, of all things. Tesla is one of their customers, though Tesla says that the hack really just gave the hackers a view into one of Tesla's suppliers' sites, but not the main manufacturing facility in Shanghai. A software developer named Tillie Kottmann gave details about the hack, presumably having played some part in carrying it out. Now, would we call this a hack? I would argue that finding login credentials on a publicly accessible landing page is pretty much like walking up to someone's password-protected computer and seeing that they put the login and password on a sticky note and stuck it to the monitor. It's not exactly safe. I mean, there's no point in having a locked door if you're hanging a key off the doorknob.
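As a hedged illustration of the class of mistake involved, here is a small Python sketch of the kind of external check a team can run against its own infrastructure: request internal-only pages with no credentials attached and flag anything that answers successfully. The URLs are placeholders, and this is a generic sanity check, not a description of Verkada's actual setup.

```python
import requests

# Hypothetical internal endpoints that should demand authentication.
ENDPOINTS = [
    "https://dev.example.com/portal",
    "https://dev.example.com/admin",
]

def find_unauthenticated_endpoints(urls: list[str]) -> list[str]:
    exposed = []
    for url in urls:
        # Deliberately send no cookies, tokens, or credentials.
        resp = requests.get(url, timeout=10, allow_redirects=False)
        # Anything that serves content instead of a 401/403 or a
        # redirect to a login page deserves a closer look.
        if resp.status_code == 200:
            exposed.append(url)
    return exposed

if __name__ == "__main__":
    for url in find_unauthenticated_endpoints(ENDPOINTS):
        print("WARNING: reachable without credentials:", url)
```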
Speaker 1: Kottmann says that part of the reason that the hacker group published the information and shared video from various locations, you know, Verkada customers, is that they wanted to point out how widely distributed Verkada's systems actually are and how inherently unsafe they are. Now, it's hard for me to disagree with those points. I would argue this kind of puts the hackers into a sort of gray hat area when it comes to the spectrum of hackers. We typically describe them by hats. So you've got white hat hackers. These are hackers who probe systems. They look for vulnerabilities, but the intent is to tell the respective system administrators about those vulnerabilities, hopefully before someone else can exploit them. So the whole goal is to find weaknesses in systems and then say, hey, you need to fix this. But then you've got your black hat hackers. These are the people who are trying to profit from being able to access systems and exploit them in some way. Gray hats are somewhere in the middle. The hackers didn't just go to Verkada to alert the company of the mistake. They didn't go to say, hey, you've got this massive security vulnerability, you need to fix it right now. They went public with this revelation, which definitely makes the company look bad. And honestly, I think you can make a good argument that there's some merit in that, considering that the whole value proposition for a security company is that it's safe. This is also a good reminder that security systems that include ways to access a camera feed remotely represent a potential security vulnerability. It's always a good idea to do your research before you choose a security solution. On a related note, Jason Koebler and Joseph Cox over at Vice.com pointed out that another really big issue with this hack is that Verkada offers facial recognition solutions in their security camera technologies, and that means that hackers weren't just able to look at video feeds, they could potentially identify the people who showed up in those video feeds. And as they pointed out, quote, "the breach shows the astonishing reach of facial recognition enabled cameras in ordinary workplaces, bars, parking lots, schools, stores, and more," end quote.
Speaker 1: I think that's putting it lightly, because honestly, while this is all about security cameras and a company that has this kind of proprietary approach to facial recognition, we have to remember that just about everybody carries a camera with them, and depending upon the apps being used, a lot of these companies are able to take advantage of massive amounts of data and do facial recognition on their own. So yeah, this is a very acute case that we can point to, but it's by no means an outlier. It is easy to imagine a scenario in which malicious hackers would not only breach a system like Verkada's, but also keep it quiet, right? They might never come forward letting people know that they got access, and then in the meantime they could use these surveillance cameras for their own purposes, and perhaps even spy on and potentially blackmail specific people. The facial recognition genie is out of the bottle, and the fact that there are numerous big companies making use of the technology in a widely distributed way means that whenever you're on camera, you are potentially identifiable in real time, and you're pretty much always just moments away from being on camera if you're out and about. So, fun times.

Speaker 1: So hey, let's stay on this topic for a little bit. The United States Army Research Laboratory has been working on image recognition AI applications that will be able to identify faces even if those images were taken in darkness. So this research team took half a million pictures of three hundred ninety-five people, which is a pretty small sample size, believe it or not. Now, some of those photos were taken in normal lighting conditions using a standard camera. Others were taken in low light conditions, and some with thermal cameras.
Speaker 1: I think it's pretty obvious to see where the benefits are, from a military perspective, of having technology that can identify a person even in low lighting conditions, but there is no denying the idea is more than a little creepy. That being said, half a million photos, like I said, is a very small sample size, and the team says they're making progress, but they are nowhere near a level where anything is sophisticated enough for deployment. The system is still struggling to identify images in low lighting conditions. Thermal cameras produce very different kinds of photos than our normal cameras do, and the computer systems haven't quite figured out how to reliably map those thermal images to specific people. Also, they said that just a small change in the camera's viewing angle, like the angle between the person's face and the camera, can make it a lot harder for a computer to suss out who it is looking at. I've often talked about image recognition by using coffee mugs as kind of an example. If you were to feed millions of images of red coffee mugs to a computer, but every single one of those images showed the red coffee mug having its handle pointing off to the right side in some way, then the computer might balk at seeing that same red coffee mug but with the handle pointed to the left side. That could be enough to throw the computer off. Computers are remarkable, but they can still be pretty dumb in some ways.
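The standard mitigation for that kind of pose bias is data augmentation, where the training pipeline randomly transforms images so the model also sees left-facing handles, odd angles, and dim lighting. Here's a minimal sketch using torchvision; the specific transform choices are illustrative assumptions, not a description of the Army lab's actual pipeline.

```python
from torchvision import transforms

# Randomly flip, rotate, and dim training images so the model doesn't
# latch onto accidental regularities like "handles always point right."
train_transforms = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),   # left- vs right-facing
    transforms.RandomRotation(degrees=15),    # small viewing-angle shifts
    transforms.ColorJitter(brightness=0.5),   # simulate low light
    transforms.ToTensor(),
])
```

Augmentation helps with superficial variation, but it doesn't solve the harder thermal-to-visible mapping problem the researchers describe, since a thermal image differs from a regular photo in more than just orientation or brightness.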
Speaker 1: Still, this area of research is a bit scary, particularly since we already live in a world where lots of entities, like law enforcement agencies, rely on facial recognition technologies. And that's without even getting into the existing problems we already have with facial recognition, like racial and gender bias that can lead to inaccurate results. And gosh, I wish I had a happier story I could segue to, but I should also mention that the Los Angeles Times published an article titled "Clearview AI uses your online photos to instantly ID you. That's a problem, lawsuit says." And yeah, the headline pretty much tells the story. So you've got this company, Clearview AI, and they have an enormous image database, and they created it by scraping various websites and services, particularly social networking platforms like Facebook and Twitter, but also other types of services like Google and Venmo, and they started gathering photos that way. According to the LA Times, that means the company has a database of more than three billion photos and has software that creates a digital faceprint of each person based on those photos, which allows for faster facial recognition identification if the system encounters a new image of someone who is already in the database somewhere.
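A faceprint system like the one described generally works by converting each face into a numeric embedding vector and then matching new images against the database by vector similarity. Here is a minimal, hypothetical sketch of that matching step; the embed_face function is a stand-in for whatever proprietary model a company like Clearview actually uses.

```python
import numpy as np

def embed_face(image) -> np.ndarray:
    """Placeholder for a face-embedding model that maps a face image
    to a fixed-length vector. Clearview's actual model is proprietary."""
    raise NotImplementedError

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(new_image, database: dict[str, np.ndarray],
             threshold: float = 0.8) -> str | None:
    """Compare a new face against every stored faceprint and return
    the best match above the threshold, if any."""
    query = embed_face(new_image)
    best_name, best_score = None, threshold
    for name, faceprint in database.items():
        score = cosine_similarity(query, faceprint)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Because the expensive step, computing the embedding, happens once per scraped photo, matching a new image later is fast, which is exactly why a pre-built database of billions of faceprints is so powerful and so concerning.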
Speaker 1: So all those photos that people share on different sites without really thinking about it, and I include myself in this group of people, could make you part of that massive database. That means this company could potentially be using those photos to make it possible for Clearview customers, like, once again, law enforcement agencies, to use them in real time with their various solutions. Now, this prompted some civil liberties activists to file a lawsuit against the company in California. They said that the company's practices violate privacy and create a chilling effect on protected activities such as the right to assembly. Now, right now we're in a pandemic. People shouldn't really be assembling in public in big numbers anyway. But they're specifically talking about things like the Black Lives Matter movement. The people who have had their accounts scraped were never given any notice that that was happening, let alone ever given the chance to give consent for it. Now, the flip side of that argument is that you could say someone posting to Facebook or whatever is doing so in a semi-public way, right? If they haven't protected their account in any way, then anyone could potentially see those photos. But then you have to ask, what about people who don't have a Facebook account? Maybe their friends have taken photos of them and put those photos up on their own account. Maybe those photos even have identifiable information in them. And then you've got someone who doesn't even have a Facebook account who had their picture scraped for this kind of thing. The semi-public argument doesn't hold up in that case. So the plaintiffs are focusing just on California and California's citizens for this particular lawsuit, and they are seeking an injunction that would force Clearview AI to stop collecting biometric information on people in California. But they also want the company to delete all the biometric data that it has already collected. Clearview AI faces a similar lawsuit in Illinois. That one is being brought against it by the American Civil Liberties Union. I'm sure I will be following up on this story later on in the year as it plays out. It really is indicative of how people are becoming aware that the early choices we made when we were building out stuff like social networking sites were rather shortsighted and have had bigger consequences than we anticipated back then. I mean, heck, Facebook started as a website for students to rate how hot each other happened to be. So that's kind of creepy all on its own, but it doesn't have the same sort of weight as "this is a site that some company is going to use later down the road to build out an identification engine that law enforcement agencies might be using, ethically or otherwise, in the future." And because of those consequences, we now have to revisit those early decisions we made and ask questions like, how can we address this? Are there any ways to fix it?
Speaker 1: Will it require us to create a ban on the private use of facial recognition technologies? Honestly, those are big questions that we don't have answers to yet, and I don't know what answers we will arrive at, but I will continue to cover the cases. That's it for today's news. I hope you guys stay safe and healthy. If you have anything you want to share with me, the best place to do it is over on Twitter. The handle I use is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.