Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, August third, twenty twenty three. Over in the UK, the Competition and Markets Authority, or CMA, has re-entered negotiations with Microsoft regarding the long-awaited acquisition of Activision Blizzard. Now, you might recall that a lot of regulators around the world initially resisted this deal, but over time most of them came around to giving the acquisition the green light after the companies made multiple assurances that they weren't going to freeze out competitors like Sony from having access to stuff like Call of Duty, specifically that franchise. Here in the United States, a judge denied the Federal Trade Commission's attempt to secure an injunction against the deal, and the FTC subsequently withdrew its case. But it could still file that case again in the future, so it's not necessarily over here in the US.
Speaker 1: But right now, the one remaining hurdle for these two companies is in the UK. Microsoft has committed to a ten-year licensing arrangement with Sony regarding the Call of Duty franchise in an effort to remove concerns about the deal. The CMA is seeking comment from companies and individuals, leaving the option to respond open until the end of the day tomorrow. It's unknown if the CMA will reverse its decision, but if it does, that would be the first time the CMA has done so since Brexit. According to the Financial Times, the CMA has until August twenty-ninth to rule on this matter, so by the end of this month we should know whether this acquisition can move forward, at least until the FTC makes its next move. Dan Goodin over at Ars Technica has a great piece about how Microsoft is reeling after security experts blasted the company's security practices, or lack thereof, for its cloud-based products like Azure.
Speaker 1: In case you're not familiar with Azure, it's Microsoft's cloud platform, and it lets customers use Microsoft's cloud services to develop apps and that kind of thing. So it's similar in many ways to Amazon Web Services or Google Cloud or countless other cloud platforms. Anyway, Goodin quotes Amit Yoran, the CEO of internet security firm Tenable, who called Microsoft, quote unquote, grossly irresponsible, which doesn't seem to be mincing many words. And the complaints aren't unfounded. Earlier this year, Chinese-backed hacker groups penetrated Microsoft's systems and stole boatloads of emails from Microsoft customers who were using Azure. There are open questions as to how that actually happened, because it would have meant the hackers somehow gained access to what should have been a heavily protected encryption key. Yoran published a very lengthy LinkedIn post describing how Microsoft is continuing to fail its customers.
Speaker 1: In his opinion, while the company initially claimed to have fixed the security issue, Yoran says that his team was able to gain access to, quote, authentication secrets to a bank, end quote. Yeah. All that definitely doesn't sound like things were fixed. In cybersecurity terms, this is very bad for Microsoft and also for its customers. Heck, it's also bad for cloud computing in general, because the promise of cloud computing is that a company can benefit from some other company's investment in computing and storage capabilities. Right? There's no need for you to build out your own server farm and databases, or to heavily invest in compute power, if instead you can offload those responsibilities onto a cloud-based product that you purchase or subscribe to. But that only works if customers are confident that the system is secure and that the provider will respond quickly and sufficiently to any vulnerabilities in the system. Failing to do that essentially boils down to a breach of trust.
Speaker 1: Microsoft reps have said that the threat of sophisticated attacks makes cybersecurity more challenging than ever, which is undoubtedly true, but Microsoft's critics say this doesn't excuse the company's response to this particular breach and how it's been handled so far. In other Microsoft hacking news, the company warns that a Russian-backed hacking group is trying to steal login credentials at specific organizations by sending fraudulent Microsoft Teams messages while posing as IT support, which is of course an age-old trick. We call it social engineering. It's where you convince someone to hand over the keys to a system, and it's one of the most effective tools in hackers' arsenals, because it's usually way easier to trick someone into being an unwitting accomplice who gives you access to a system than it is to brute-force your way in. Anyway, Microsoft says the attacks are, quote unquote, highly targeted, with the hackers sending messages to people working at fewer than forty unique global organizations. Fewer than forty. This isn't a wide phishing net, in other words; this isn't just trying to see what you can catch.
Speaker 1: This is more like spear phishing, where you have specifically identified the targets you're interested in. Now, the name of this hacking group is APT twenty-nine, but it's better known as Midnight Blizzard, and US authorities say this group has links to Russia's intelligence service. So if you use Microsoft Teams and you happen to work for one of these unnamed organizations (I don't know which ones specifically were being targeted, but presumably a lot of them are government agencies, high-profile tech companies, that sort of stuff), I would recommend against responding to weird requests from IT, especially if they're asking you to share multi-factor authentication information. That should be a dead giveaway every single time. If someone asks you to share your multi-factor authentication stuff, like, "Hey, I need to log into your account, can you give me the six-digit number so I can authenticate?", obviously your answer should be no, every single time. No, no, no, no, no. But, you know, again, it's a trick that works.
Speaker 1: So Microsoft has raised the flag on that one. Speaking of Russia, a court in Moscow has fined Apple four hundred thousand rubles, which is, let me do the quick calculation here, around two hundred and seventy dollars. The reason is that Russia says Apple failed to delete what the court called inaccurate content, specifically about the ongoing, quote unquote, special military operation in Ukraine. Apple, which by the way stopped all product sales in Russia more than a year ago after Russia first invaded Ukraine, has not commented on the fine. I mean, two hundred and seventy dollars isn't even enough money to register as a thing to Apple; that's nothing. Really, it's another example of how Russian institutions are trying to define the narrative on Russia's war on Ukraine and punish anyone who doesn't toe the line. Although, four hundred thousand rubles, I mean, why would you even bother? Anyway, Apple joins Wikimedia as one of the companies that Russia has declared in violation for disseminating incorrect information.
Speaker 1: TorrentFreak dot com reports that Putin's government is trying to lock down the internet within Russia even more than it already has. A new law that goes into effect later this year will make it illegal for any Russian internet platform to let someone sign up for its services using a foreign email system. For example, if someone has a Gmail account, they cannot use that email account to sign up for one of these Russian internet platforms; it would be against the law. Further, these platforms will not be allowed to offer services to customers until those customers first go through a verification process to prove their identities. Now, obviously that means anything those customers then access or post is linked directly to them. There's no anonymity here; there's a trail that leads right back to them, assuming the person who created the account is the one actually using it. So that's a real big concern too. It's a form of government tracking, as I've talked about in previous episodes of this show.
Speaker 1: Russia has also cracked down on virtual private networks, or VPN services, within the country. Outside VPN services have had real problems working within Russia because Russia wanted to crack down on them internally. Some VPN services originating out of Russia chose to just shut down operations in the country, because the only way they're allowed to operate there is to agree to certain criteria set by Russia's government; those are the only VPN services allowed to operate within Russia. So that raises the question of whether the Russian government can demand that a VPN service within Russia share the logs showing what customers were doing on the VPN, which kind of defeats at least one of the purposes of using a VPN in the first place. Right? A lot of the reason to use a VPN is to make sure that snoops don't see what you're doing. Maybe you need to handle some sensitive information, or maybe you're a whistleblower, or maybe you're a journalist or whatever, and so you don't want to be tracked down.
Speaker 1: That's obviously a big danger to you and your work. But if the government can come to the VPN and demand, "Hey, you've got to hand over all your logs and show us who was accessing what," then the VPN doesn't really serve its purpose. Anyway, Russian law also makes it illegal to talk about using VPNs and other methods, like Tor, to circumvent Russian controls. So not only are they limiting things, you can't even talk about the possibility of using tools to get around these controls, because that in itself is illegal. So yeah, they're really cracking down over there. Okay, I've got a bunch of other news items we're going to cover today. Let's take a quick break to thank our sponsors.

Speaker 1: We're back, and now for some "the left hand doesn't know what the right hand is doing" news. The Washington Post reports that the FBI has actually discovered who was at least partly responsible for directing a contractor to purchase a spy tool from the Israeli company NSO. Now, you might remember NSO is the company that was behind Pegasus.
Speaker 1: That was a tool that could turn a targeted iOS device like an iPhone into an espionage gadget. You could activate the microphone on the phone and listen in on conversations happening in the room, you could activate the camera, you could look through the phone's information. It really was an incredibly dangerous espionage tool that NSO sold to lots of different customers. Anyway, back in April, the New York Times found that a contractor had bought and used a tool from NSO, and had done so on behalf of a US government agency, though it wasn't known which agency; just that this contractor had purchased the tool from NSO. However, the Biden administration had put NSO on a blacklist, saying no US government office or company or anything like that should purchase products from NSO. So the White House directed the FBI to track down who the heck authorized a contractor to purchase this tool from NSO. And the FBI says it was the FBI. Sad trombone.
Speaker 1: But the FBI says they totally didn't know that the invasive surveillance they were relying upon was thanks to a tool from the NSO Group, so it's not really their fault. The contractor, Riva Networks, secured the use of a tool from NSO called Landmark. This tool allows for geo-tracking targets without their knowledge, and it was specifically used on people in Mexico, people connected to drug cartels, spooky stuff like that. And the FBI says that once it found out Riva Networks had been naughty and purchased this tool from NSO, the FBI totally terminated its contract with Riva Networks. Now, I might just be a humble technology podcaster, but I find the FBI's excuses to be not fully satisfactory. Firstly, the FBI had already used Riva Networks in the past to purchase tools from NSO, including Pegasus. That was before NSO was on the blacklist, so arguably it was kind of in the clear, but the point is this was not an unprecedented event. The FBI had worked with Riva Networks in the past, and Riva Networks had been using NSO products as part of the contract work it was doing with the FBI.
Speaker 1: Secondly, I would argue that whether the tool came from NSO or from Riva Networks itself or from some other developer, the issues that concern me are really more about what this tool does than where it came from. The lack of oversight and accountability for the FBI and for its various contract partners raises some really troubling questions about the FBI's authority. I mean, part of the reason NSO Group is on the blacklist is not the incredibly invasive tools it makes. You would think that's the case, but that's not really why it's on the blacklist. The reason is that the company counts among its most loyal customers some of the most dangerous dictators and authoritarians in the world, who subsequently use those tools to target people like journalists and activists. And it would look really, really bad if the US government said, you know what, it's okay for us to do business with them. So you could argue that optics are a really big part of why NSO is on that blacklist.
Speaker 1: However, if we were to be more generous and say maybe the US government said, no, this is a step too far, this is too invasive, the surveillance is too dangerous, there are too few checks and balances on authority and overreach. Well, if we were to say that, a lot of US authorities say right now that using geo-tracking technology doesn't technically violate the executive order against NSO, so no harm, no foul, if you find some other place that makes the same sort of tools. Oh, brave new world. On a different note, researchers at Technical University Berlin, along with a researcher named Oleg Drokin, demonstrated that they could exploit the infotainment system on recent Tesla vehicles to gain root access to the vehicle's systems. This would let someone activate features that Tesla normally reserves for paying customers, meaning you could activate stuff that's already available in your vehicle, and you could do it for free rather than have to pay Tesla to unlock it. Things like, you know, heated seats and full self-driving, that kind of stuff.
Speaker 1: They were also able to do even more, like remove geolocation restrictions on full self-driving, and they were able to transfer a driver's profile to a different Tesla vehicle using this method. However, it's not easy to do. It actually requires some electrical engineering, some soldering; it's not just software, hardware is involved too. Still, if a Tesla owner has the knowledge and skill and about one hundred bucks' worth of equipment, they could potentially do the same thing. The researchers also said that bad actors could possibly use the same method to gain access to things like the data logs in a Tesla, which could include the owner's private information, or to otherwise tamper with someone's vehicle. It would not be easy; again, it would require physical access to the vehicle, but it is possible. They plan to present their findings at Black Hat USA, which happens next week in Las Vegas. For what it's worth, they actually praised Tesla's security measures; they said Tesla is really a leader in the automotive space on that front.
Speaker 1: This week, the US finally, officially banned incandescent light bulbs. And you might be thinking, I thought we already did that. But no, what we did do is announce the plan to ban incandescent light bulbs; we hadn't actually put it into action. The decision actually traces its history all the way back to two thousand seven, during George W. Bush's administration, when the White House called for a twenty-five percent improvement in efficiency for light bulbs. In twenty seventeen, the Obama administration planned on phasing out incandescent bulbs by twenty twenty. Trump reversed that decision, and Biden reversed the reversal. Although, technically the rule doesn't target incandescent bulbs specifically. Rather, it says, hey, light bulbs need to be able to produce forty-five lumens, which is a measurement of brightness, per watt of electricity, and incandescent bulbs just can't do that; they max out at around fifteen lumens per watt. So they're banned effectively, just not by name.
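To put numbers on that cutoff, the rule boils down to simple division: lumens emitted over watts drawn. Here's a tiny sketch of that check; the forty-five-lumens-per-watt floor is the figure described above, while the sample bulb numbers are typical round figures for illustration, not measurements from the rule or from any specific product:

```python
def meets_efficacy_floor(lumens: float, watts: float, floor_lm_per_w: float = 45.0) -> bool:
    """True if a bulb emits at least `floor_lm_per_w` lumens per watt it draws."""
    return lumens / watts >= floor_lm_per_w


# Illustrative round numbers:
print(meets_efficacy_floor(800, 60))  # classic 60 W incandescent, ~800 lm (~13 lm/W) -> False
print(meets_efficacy_floor(800, 9))   # typical 9 W LED, ~800 lm (~89 lm/W) -> True
```

An incandescent bulb's thirteen-or-so lumens per watt falls well under the floor, which is why the ban works without ever naming the technology.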
Speaker 1: Also, there are a lot of exceptions to this rule; a lot of incandescent bulbs get an exception, like black lights, so your wicked band poster can still light up in the dark. Yay. Okay, I've got two article recommendations for y'all before I sign off. The first was published yesterday on Ars Technica, and it was written by Jon Brodkin. The article is titled "Internet providers that won FCC grants try to escape broadband commitments." As the headline indicates, the story covers various ISPs that had agreed to participate in the US program to provide broadband to rural communities, but now they either want more money to do it or they just want out, because I guess it's just too darn hard. The second article I recommend is from the Washington Post. It was written by Joseph Menn, and the headline is "Hacking group plans system to encrypt social media and other apps." We live in a world where lots of places are trying to chip away at privacy and encryption, and it's not just authoritarian states like Russia.
Speaker 1: I'm looking at you, UK, and also parts of the United States. Anyway, this article talks about how the hacker group called Cult of the Dead Cow is working to create encrypted alternatives to social media, and it's well worth a read. But that's it for the tech news for Thursday, August third, twenty twenty three. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.