Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Thursday, October seventh, twenty twenty-one. Before I get to the news, I need to issue a correction, because I totally bungled something in yesterday's episode. If you've already listened to that one, you know I was talking about space navigation, and I made a dumb mistake, totally on me, and then I doubled down on it, which made it worse. Fortunately, Twitter user Charlie Tango Bravo pointed this out to me. The issue was that I was describing the inverse square law, and what I said was that the intensity of a signal goes down by half upon the square of the distance. That was a complete misunderstanding on my part of the inverse square law. I mean, yes, the strength of a signal decreases over distance, but that's not the relationship. I could have avoided this entirely by taking a little more time to make sure I understood the inverse square law before I included an explanation of it in my podcast. So again, this is all on me, and that stinks. I never want to get something wrong, and it's even worse when I could have avoided it by just being a little more careful, so my apologies to all of you for that. It also means that the example I gave was fundamentally wrong. I have recorded an updated segment that Tari is putting into yesterday's episode. That doesn't solve the problem for everybody who's already listened to it, but it does mean that anyone going back and pulling up that episode in the future won't get the wrong definition and explanation. Thanks again to Charlie Tango Bravo for the heads up, and I'll try to do better. I'm gonna make mistakes.
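For the record, the standard statement of the inverse square law for a signal radiating from a point source is

$$ I(d) = \frac{P}{4\pi d^2}, \qquad \frac{I(2d)}{I(d)} = \frac{1}{4}, $$

where P is the transmitted power and d is the distance from the source (the symbols here are just for illustration). In other words, double the distance and the received intensity drops to one quarter, not to one half.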
Speaker 1: I just hope they're not as embarrassing as that one. But let's get to the news. First up, a hacker pulled more than a hundred gigabytes of data down from Twitch, the video streaming service that Amazon owns and that caters primarily to gamers. That data included source code for the platform itself, like the actual code that Twitch runs on. It included records showing how much top streamers were making on the platform, so we're talking about top performers making millions of dollars. Not all of that information was necessarily new, but it was all in one place, so a lot of people have been shocked by that. It also included stuff like user data, potentially including encrypted passwords. Now, passwords being encrypted is a good thing; it means you can't just immediately read them. However, with enough time and effort and a sufficiently powerful computer system, you can break encryption. How good the encryption is just determines how long it will typically take to break. And for that reason, a lot of folks, including myself, are suggesting that anyone who has a Twitch account should go in and change their password. Hopefully you're not using the same password on Twitch as you are for other sites, because that could mean all of those accounts are now vulnerable too. If the hackers, and all the people purchasing this information on the digital black market, know the password you use everywhere, well, you've handed a skeleton key to people who just need to try it in all the different locks. While you're at it, while you're changing your password on Twitch, you should probably also go ahead and activate two-factor authentication. That way, should someone ever get your password, they would still need your phone before they could access your account. So it's a good thing to have that active.
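To make the skeleton key point concrete, here is a minimal, purely illustrative Python sketch of an offline dictionary attack against a leaked password hash. The hash function, the password, and the tiny wordlist are all made up for the example; this is not drawn from the Twitch data, and real services should be storing passwords with salted, slow hashes like bcrypt or Argon2, which make this kind of guessing far more expensive.

```python
import hashlib

# Purely illustrative: an unsalted SHA-256 digest standing in for a leaked password hash.
leaked_digest = hashlib.sha256(b"correcthorse").hexdigest()

# Tiny made-up wordlist; real attackers use huge dictionaries and breach corpora.
wordlist = ["password", "123456", "qwerty", "correcthorse", "letmein"]

for guess in wordlist:
    if hashlib.sha256(guess.encode()).hexdigest() == leaked_digest:
        # Once one reused password is recovered, every site sharing it is exposed.
        print("Cracked:", guess)
        break
else:
    print("No match in this wordlist")
```

The takeaway is that a leaked hash can be attacked offline at the attacker's leisure, which is why changing the password, and not reusing it anywhere else, matters.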
Speaker 1: It appears that the hacker was taking advantage of a vulnerability that was created when Amazon was doing some reconfiguration of Twitch's servers. We've seen a couple of examples recently of server reconfigurations leading to big issues. In this case, it created an opportunity for a hacker to attack. In Facebook's case, it led to a situation where the internet essentially forgot that Facebook and all of its services existed for about six hours. So it really does tell us that these processes, even when they are routine and mundane, have to be done with a lot of care, or else you can introduce some pretty big problems. Facebook founder Mark Zuckerberg attempted to downplay statements made to the United States Senate by the whistleblower and former Facebook employee Frances Haugen. Zuckerberg said that the research Haugen was citing had been taken completely out of context, and that it was painting a misleading portrait of what Facebook is. And he should know, because he and other Facebook executives have made it kind of an art form to present, let's call it, a highly curated image of Facebook, highlighting stuff that appears positive and downplaying or even ignoring stuff that's negative, all while claiming that the company operates in a transparent way. At one point, when The New York Times published an article showing how Facebook was being selective about what kind of data to report and what data not to report, a Facebook spokesperson brushed it off and said, we're guilty of cleaning up our house a bit before we invited company. Okay, so here's the thing, though: that's not what transparency means. Transparency doesn't mean let's just show you the pretty stuff. That's not transparency, Facebook. Anyway, I suspect we're going to see a lot more scrutiny of the company in the weeks to come, and probably more examples of spin and damage control from the company as well.
Speaker 1: But I think tolerance for Facebook's shenanigans is at kind of a low point right now. I'm not saying it can't go lower; it might. But yeah, I think Facebook is treading on thin ice at the moment. I think the government of the United States in particular is gearing up to lay the smack down on Facebook, and the same is true in other parts of the world, by the way. All right, let's get back to Haugen's testimony for a second. At one point, she talked about how Facebook's algorithms have exacerbated xenophobic rhetoric and made dangerous situations a whole lot worse in different parts of the world. One of the things she referred to was the military coup in Myanmar, and how Facebook's algorithm pushed posts that turned the dial up with calls for things like ethnic violence within that country. And she also warned that the same thing is kind of unfolding now in Ethiopia. Researchers with the human rights organization Global Witness back up that statement. They conducted a study on the Myanmar case. They looked at a Facebook page dedicated to Myanmar's military, and that page didn't have any content on it that violated Facebook's policies. They then liked that page to see what would happen next, and Facebook started to suggest other pages they might want to follow. Among those suggested pages there were a bunch that had abusive content in them, stuff that was calling for violence against specific ethnic minority groups. So even if you start from a place that doesn't violate Facebook's terms of service, it does not take long for stuff that is not playing by the rules to pop up, promoted by Facebook itself. I mean, you wouldn't necessarily even know it existed except for the fact that Facebook's algorithm is suggesting it to you.
Speaker 1: So this can quickly lead to a situation where a person sees frequent posts calling for violence or discrimination or promoting harmful and hateful ideologies, and it gets reinforced with every visit to Facebook. Now, I don't think anyone would go so far as to say that Facebook is the root cause of these problems. That is far too simplistic; it's just not reflective of the truth. But I think it's fair to say that Facebook is acting like an amplifier. It's taking a signal and boosting it. US Senator Elizabeth Warren and House Representative Deborah Ross have introduced proposed legislation that they're calling the Ransom Disclosure Act. They're calling for companies to have a legal obligation to reveal when they have paid off a ransom as a result of a ransomware attack. Now, if you've been listening to my show for a while, you know I've beaten this dead horse so many times: I always say never pay the ransom, because paying a ransom sends the signal that this criminal activity is profitable, and that encourages future attacks, both against you and against other entities. Plus, you can never guarantee that the attackers will actually return to you whatever it is that they have locked down. Just in case ransomware is something you're not really familiar with: typically, this involves hackers gaining access to a system and then encrypting large amounts of data and file folders and things like that within the system. Without a decryption key, without a way to reverse that process, all that data becomes unusable. It's gibberish, so you can't do anything with it. Now, there are some different variations on this attack, but they all basically boil down to a hacker trying to make critical systems or data inaccessible to the rightful owners. And then the hackers say, fork over the money, usually in the form of cryptocurrency, and then we'll hand you the decryption key so that you can get all your stuff back.
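As a rough sketch of what that encryption step amounts to, here is a minimal Python example using the third-party cryptography package. The file name is hypothetical and this is just generic symmetric encryption, not code from any actual ransomware family; the point is simply that without the key, the overwritten data is unreadable.

```python
from cryptography.fernet import Fernet  # third-party package: pip install cryptography

# The attacker generates a key and never gives it to the victim.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical file name; real attacks walk entire directory trees.
path = "quarterly_report.xlsx"
with open(path, "rb") as f:
    plaintext = f.read()

# Overwrite the file with ciphertext. Without the key, the contents are gibberish.
with open(path, "wb") as f:
    f.write(cipher.encrypt(plaintext))

# Recovery is only possible with the attacker's key:
# plaintext = cipher.decrypt(open(path, "rb").read())
```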
Speaker 1: So this bill, if passed into law, would require companies to disclose any ransom payment they made to hackers within forty-eight hours of having made that payment, including how much they paid and in what form they paid it. I imagine that if this bill does become law, it will discourage companies from trying to quietly handle these sorts of matters in the hopes that no one finds out, because if the government does find out that a company paid a ransom and did not comply with the rules, there are going to be some pretty serious consequences. Of course, this has not been passed into law, and it may never make it into law, but it's interesting to see the proposal. Over in Europe, a majority in the European Parliament voted for a ban throughout the EU on police using facial recognition surveillance to identify people who are not suspected of committing a crime. As we've seen many times through lots of different studies, facial recognition technologies are incredibly prone to bias, and that bias tends to disproportionately affect people in ethnic minority groups. The European Parliament also released a statement that said, at least in part, quote, to respect privacy and human dignity, MEPs ask for a permanent ban on the automated recognition of individuals in public spaces, noting that citizens should only be monitored when suspected of a crime. Parliament calls for the use of private facial recognition databases, like the Clearview AI system which is already in use, and predictive policing based on behavioral data to be forbidden, end quote. We've seen growing concern around the world regarding how various authorities, particularly in law enforcement capacities, have been relying upon facial recognition technologies, and how the technology can cause harm to innocent people. Even if everyone's using the tech correctly, the tech itself can just be wrong. And that's assuming you're using it correctly, which is a big assumption.
Speaker 1: There are a lot of cases where people are just not using tech the right way. I'm sure you've encountered this in general. Well, when you're talking about people in positions of authority who are using that tech as part of a surveillance package on citizens, that becomes an incredibly dangerous situation, one that can lead to an authoritarian police state and a lot of human rights being violated in the process. So I think this is an important point, and something that I'm seeing pop up in other places around the world. Well, we have a few more stories to go through, but before we get to any of those, let's take a quick break. BlackBerry, which is a name I've not heard in a long time. A long time. Anyway, BlackBerry has a research team that reports it discovered a Chinese state-sponsored hacker group that was using three different phishing schemes, all targeting companies and individuals in India. The BlackBerry researchers say it looks like the hackers were operating both as spies, conducting espionage on behalf of the Chinese government, and also pursuing quote financially motivated operations end quote. So, you know, tricking people out of money and stealing, and also spying on behalf of an authoritarian government. The usual. The name of this hacker group is APT41, and they lured in targets by sending out messages claiming to be related to official matters like taxes or COVID-19 measures. These are common tactics, right? You set the bait by picking something that is likely to get a reaction from your target, something they would be concerned about. If you send them a message saying, hey, we found some money that we owe you in taxes because you overpaid, a lot of people are gonna think, oh, awesome, and just open that up. Very common tactic. Or sometimes they prey on fear.
Speaker 1: Right? They say it looks like you underpaid, and if you don't pay this fee, then you're going to end up facing jail time, and you're scaring them into opening up the attachment, which typically carries some form of malware, or it lures them into sharing information they really shouldn't. The researchers showed that these hackers weren't being particularly careful about disguising the fact that a single group was behind the different phishing schemes. They were using some of the same assets across the three different types of attacks. And they also said there's not a whole lot that can be done in these cases, because it's an organization within China, so it's kind of untouchable. You can start to try and block IP addresses and stuff, but there are ways around that. So the best thing to do is to raise awareness in as many people as possible to try and decrease the number of successful attacks. If you can get to the point where hackers just aren't getting that many hits, you can get to a point where the return on investment is so low that there's no point in even bothering. It's unlikely to ever reach that point, but it's a good goal to strive for. In past episodes, I've talked about the Pegasus software, which is really malware. It's software that uses a zero-click attack through iOS, and iMessage in particular. Apple includes iMessage by default on all iPhones. You cannot uninstall it, at least not under normal conditions, and iMessage will automatically accept messages sent from other iOS devices to that phone's specific phone number. So really, you just need your target's phone number, and you need an iOS device that has the Pegasus software on it, and you can send an attack that effectively turns your target's phone into a spying device.
Speaker 1: It can give you access to stuff like that phone's camera and microphone. Essentially, you can operate the phone as if you were in direct possession of it. It's a powerful malware tool. This product comes from an Israeli company called NSO Group, and they say that the purpose of the malware is to give governmental authorities, their customers, a tool to infiltrate criminal and terrorist organizations, something you would use when you're doing a sting operation. But it doesn't really matter what the company says the tool was intended to do; what matters is how people really use the tool. So the whole point of that setup is this: a UK judge has said that Sheikh Mohammed bin Rashid Al Maktoum, the ruler of Dubai, used Pegasus to infect the phone belonging to his ex-wife, Princess Haya bint al-Hussein. And I should also add that, according to the judge, he targeted her entire legal team. This was all during a very acrimonious custody battle between the Sheikh and the Princess over their two children. The UK judge, Andrew McFarlane, factored this into his ruling on that case. Now, that whole judgment was handed down a year ago, but it was held in private; it was under lock and key for a full year before being published this year. The Sheikh has subsequently denied the allegations and also argued that the court didn't actually have the authority to share that kind of information anyway, and that heads of foreign states are exempt from inquiries into the legality of their actions. That doesn't really say to me, hey, I totally didn't do that thing you accused me of doing. And the story actually gets worse from there, but it also gets away from the tech angle, so I'll just say there's a lot more to it. But it's another example of how a tool could be made to do one specific thing, and maybe that effort was sincere.
Speaker 1: But if people start turning it to another use, that's still a bad thing, right? You still have to look at the company that's making the tool and say, hey, you are propagating a piece of malware that is causing an enormous amount of harm, and it doesn't really matter what your intent was. Anyway, have you ever found yourself wading into a flame war on Twitter? Maybe you got your dander up and you jumped into a hotly contested thread before you really thought it over. Maybe you even did it by accident; you were just replying to someone cheekily and then it blew up in your face, and maybe you regretted it afterwards. Maybe you've got all these different replies and retweets and stuff, and maybe just going on Twitter now is stressful and upsetting. Well, now Twitter is rolling out a feature to folks using the Android and iOS Twitter apps that could help prevent this from happening. The apps will now occasionally show prompts when it looks like you might be engaging in a Twitter thread that appears to be quote unquote intense. One example they gave was a prompt that reads, let's look out for each other, followed by some of the values that make Twitter better. And then it includes some reminders to maybe convince you to act like a compassionate human being. Like, it says that chances are the person on the other end of that Twitter handle is a person, which ignores the fact that there's a rampant bot problem on Twitter, but you get it. It also says, hey, we shouldn't ignore facts. Facts are important. Facts are facts, and even if they're inconvenient to our own perspective, we cannot just dismiss a fact. It also says that having different perspectives is a good thing; you can get people who have different perspectives having a conversation, and new ideas can develop, and people can be opened up to other points of view, which is sometimes true.
Speaker 1: Essentially, what Twitter is trying to do is remind us not to go nuclear on the platform, and I think that is good advice. But I also think this is important for Twitter, because social media platforms can really come under fire if it looks like they're facilitating stuff like hate speech and misinformation. Apple has established a new policy that app developers will need to follow starting January thirty-first of next year, at least any app developers who release apps that require users to create an account of some sort. Apple wants all of those kinds of apps to include an option to delete user accounts if the user wants to do that. So, like, if you just delete an app off your phone, that doesn't delete your account. The account is still sitting there on the servers of whatever developer created the app, and it's still holding all that data. So you might want to close an account out entirely. Apple wants that to be built into the apps themselves, so that you're not just saying, I'm not using this app anymore; you're saying, I don't want to have an account anymore. The Verge has pointed out that Apple's policy has some wiggle room in it. For example, there's nothing that would stop a company from routing any sort of cancellation request to a customer service agent who then tries their best to talk you out of canceling your account, which reminds me of every experience I've ever had while trying to cancel cable service. But I think that, generally speaking, this is a good move. It's not going to solve every problem, of course, but it at least gives users a chance to make a cleaner break when they decide they no longer want to rely on a specific service. Speaking of Apple, I've talked before about how a US judge has ruled that Apple must allow developers who want to use a different in-app purchase option besides Apple's official one to be able to do so. That's what the judge said.
Speaker 1: The judge said that these developers, if they want to sell stuff within their apps, you know, like a video game app offering things like character skins, are not required to go through Apple's own system to do that, and that Apple should not be allowed to require it. This is at the heart of how Apple makes a ton of revenue through the App Store. It's not by creating apps, but rather by taking a cut of these in-app purchases. And a Dutch antitrust authority has made a similar judgment against Apple. The authority said that the rules Apple had in place are anti-competitive, and that Apple must allow developers to offer their own in-app purchasing options if they want to. So it looks like there's a growing movement to push back against Apple's policies. South Korea made a similar ruling, which affects not just Apple but also Google, because Google does the same thing. So we're starting to see more countries say, you can't do this; it's anti-competitive, and it's harmful to developers who are really reliant upon those in-app purchases to generate the revenue they need to stay in business. Finally, someone managed to get access to the Facebook profile page for the US Navy ship the USS Kidd. That person has used the Facebook profile to stream game sessions of Age of Empires, a real-time strategy computer game. Vice.com reports that whoever is responsible might be good at guessing the Navy's passwords, but they are not a good Age of Empires player. I checked the profile just before I started recording this episode, and at least when I checked it, it looked like the Navy had not re-established ownership of the page. But then again, nothing has been posted to that page since October fourth.
Speaker 1: However, all those gaming sessions were still up on the profile at the time of this recording, which suggests to me that the Navy has not regained control of that page yet, and that makes me wonder what's going on. I would think that Facebook would respond to that. Maybe there's just too much going on right now; maybe they just haven't sent the request yet. Well, that's the news for Thursday, October seventh, twenty twenty-one. And yes, I did just have to look at a calendar, because I had already forgotten what day it was. If you have suggestions for topics I should cover in future episodes of TechStuff, please reach out to me. The best way to do that is on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.