Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland, and how the tech are you? It's time for the tech news for Tuesday, May two thousand twenty-two, and it's also time for an update to the Elon Musk slash Twitter saga. When last we checked in, I mentioned that the US Securities and Exchange Commission was investigating Musk for allegedly filing his intent to purchase Twitter at least ten days too late, which in turn had a potentially massive impact on other investors. But since then, a lot of other stuff has happened. Namely, last Friday, Elon Musk tweeted that the deal to purchase Twitter was on hold, at least temporarily, because Twitter reported that it estimated fewer than five percent of Twitter accounts are bot accounts, i.e., fake accounts that are driven by automated processes. Now, Musk wasn't terribly clear at first about what the actual problem was. Presumably the problem was that he didn't believe Twitter and felt that the real number is much higher. But here's the thing.
Speaker 1: He had already put in the offer to buy the company at forty-four billion dollars, and he repeatedly boasted that one of the main things he wanted to do was purge Twitter of bot accounts and institute a verification system to make certain that every account belongs to an actual human being. So naturally this led to people like me saying, well, if part of your reasoning to purchase Twitter is to get rid of the bots on it, why does this particular estimate upset you? And that really comes down to money. I guess my assumption is that Elon Musk's argument is that Twitter is misrepresenting how many accounts on the platform are bots, that it did so when filing with the SEC, and that this, in turn, has inflated Twitter's stock price. It amounts to fraud, either intentionally or otherwise. It also means that Twitter is overvalued, and Elon Musk's initial offer was based off Twitter's stock price, which has subsequently slipped quite a bit, largely due to Musk kind of waffling about the deal.
Speaker 1: Now, I can see how that could be a legit argument, except you would think that anyone willing to plop down forty-four billion dollars to buy a company would already have done, you know, some pretty thorough investigation to determine whether the value represented by the stock price actually reflected the real worth of the company, you know, before making the offer. Anyway, Musk also, in a reply to someone, mentioned that Twitter's supposed methodology for determining how many bots are on the platform is to take a random sample of just one hundred accounts, investigate to see how many of those one hundred are bot accounts, and then extrapolate from that. That, if true, is, I mean, to call it too small a sample size is an understatement. That's nothing. You're talking about a platform that boasts a couple of hundred million accounts on it. To take just one hundred and then try to extrapolate anything from one hundred random accounts, that's nothing. So if that is true, that is a laughable approach to making those kinds of estimates.
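For context, here's a back-of-the-envelope way to see what a random sample of a given size buys you, using the standard normal approximation for an estimated proportion. This is a toy calculation of my own, not Twitter's actual methodology:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Approximate 95% margin of error for a proportion p_hat
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Suppose a sample suggests 5% of accounts are bots.
for n in (100, 1_000, 10_000):
    moe = margin_of_error(0.05, n)
    print(f"n={n:>6}: 5% ± {moe * 100:.1f} percentage points")
```

On a claimed five percent rate, a hundred-account sample leaves an uncertainty nearly as large as the estimate itself (roughly plus or minus four percentage points), and the margin only shrinks with the square root of the sample size.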
Speaker 1: But then Musk said that Twitter accused him of breaking a non-disclosure agreement, or NDA, about all this, and at a tech conference in Florida, Musk said that the real percentage of bot accounts is likely twenty percent or higher, possibly far higher, though I think that is ludicrously high, and I really doubt it. I think the overwhelming majority of accounts on Twitter belong to real people. Some of those real people are real jerks, but they're still people. Musk then said that this deal will not move forward unless Twitter can prove that less than five percent of the accounts on the platform are in fact bots or fake or spam or whatever. Musk also said that if he could arrive at a lower deal for Twitter, a purchase might not be out of the question. Now, I should add, there is a ton of speculation online that Musk is also scrambling because maybe the funding he secured to purchase Twitter is no longer sufficient to cover that forty-four billion dollar price tag; some of the funding depends on Musk's shares in Tesla as collateral, at least according to some reports.
Speaker 1: But Tesla stock prices have also dropped significantly, by around thirty percent, because, you know, they've been dropping ever since this deal to buy Twitter was first announced, and Tesla shareholders have been worrying that they might be saddled with footing the bill for Musk's desire to own Twitter, which has nothing to do with their interest in Tesla as a company. Well, if that is in fact true, if, you know, the things that he put up as collateral have dipped in value by like thirty percent, that could mean he might need to secure some additional funding sources to cover the forty-four billion dollars. Now, yes, Elon Musk is a multibillionaire. In fact, he's the richest person in the world, with an estimated wealth of around two hundred nineteen billion dollars. But we have to remember, the vast majority of that wealth isn't liquid.
Speaker 1: It's wrapped up in stuff like his stock in Tesla or in cryptocurrency, and if Musk were to sell off assets for the sake of liquidity, it would affect the value of those assets, which means he would get less for them than what they are currently worth. That really just nails down how insane wealth is, and how most rich people really just leverage their assets to get hold of cash when they need it rather than sell stuff off. Because if you sell off a lot of it, well, it becomes like supply and demand, right? If you're flooding the market with supply because you're offloading your massive shares in something, the demand goes down and the price goes down, so you get less than what you would have gotten if you hadn't sold it off. It's like this weird catch-22, right? Anyway, my point is that even though Musk is the wealthiest person in the world, he does not have forty-four billion dollars in cash just lying around, so he had to secure additional funding, some of which has depreciated in value since the time of the announcement.
Speaker 1: So some people assume that Musk is really looking for a way to back out of the deal that will save face, or, alternatively, a way to renegotiate the deal so that he can purchase Twitter at a much lower price. I think that's something Twitter's board might resist, and certainly Twitter's shareholders would probably be less interested in it. You know, if they're being told they're going to get bought out at fifty-four dollars a share and then they're told, no, I'm sorry, it's going to be closer to, you know, forty dollars a share, that's a big step down, right? And I'm just using those as hypothetical figures, but the point being that shareholders are probably not going to be as thrilled at the thought of getting a smaller payout, so it could put the deal at risk. Now, if Musk does back out of the deal on his terms, it will cost him a cool billion dollars to do it, because that was part of the negotiation: if either party called off the deal, it would come at a cost of a billion dollars.
Speaker 1: Now, could it be that Musk will ultimately spend a billion dollars to not buy Twitter? Meanwhile, Twitter CEO Parag Agrawal dismissed Musk's objection, saying that he would discuss the issue of spam accounts, quote, "with the benefit of data, facts and context," end quote. So the story is not over yet.

Speaker 1: A security consultant named Sultan Qasim Khan created a hack that exploits the keyless operation mode for certain automobiles, such as Teslas, which is why I'm mentioning this story right after the Elon Musk story. Now, according to Khan, his method works on other types of vehicles too, but he specifically demonstrated it on Tesla vehicles; it works on things like the Model S and the Model Y. His hack would allow him to not only unlock a vehicle's doors, but also start the electric motor and drive off in the car. So, in other words, you could totally steal a Tesla using this technique. Now, to pull it off requires a couple of pieces of hardware. One piece is essentially a Bluetooth sniffer device, something that can detect Bluetooth signals being broadcast from a source.
Speaker 1: Now, Bluetooth signals have a relatively short range, so that means you would have to get this sniffer device fairly close to your target, within about fifteen feet or so. That target might be a smartphone running a specific app that interacts with the Tesla or other vehicle, or it might be a keyless fob, or whatever. The sniffer picks up the signal being sent from that device, the signal that would normally go to the vehicle, and then relays it to a laptop that is presumably close to the target vehicle, because, again, Bluetooth doesn't have a very long range. Khan could then use the laptop that was receiving the signal from the sniffer to broadcast a similar signal to the car, unlock it, and turn on the electric motor. So this method is not super easy to pull off. It does require some hardware and some maneuvering in order to get it to work, but the point is, it is possible, and there's no defense against it.
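The sniffer-plus-laptop setup described above is a classic relay attack: the attacker never breaks the cryptography, they just carry it farther than it was meant to travel. Here's a toy model of the idea; the class names and the XOR challenge scheme are invented for illustration and are nothing like Tesla's or any real vehicle's actual protocol:

```python
import secrets

class KeyFob:
    def __init__(self, secret):
        self.secret = secret

    def respond(self, challenge):
        # Real systems use proper cryptography; XOR stands in for a MAC here.
        return bytes(a ^ b for a, b in zip(challenge, self.secret))

class Car:
    def __init__(self, secret):
        self.secret = secret

    def unlock_attempt(self, respond):
        # The car challenges whoever is nearby and checks only the answer,
        # not how far away the answering device actually is.
        challenge = secrets.token_bytes(8)
        expected = bytes(a ^ b for a, b in zip(challenge, self.secret))
        return respond(challenge) == expected

secret = secrets.token_bytes(8)
car, fob = Car(secret), KeyFob(secret)

# Legitimate unlock: the fob is in range.
assert car.unlock_attempt(fob.respond)

# Relay attack: the sniffer/laptop pair simply forwards the challenge to the
# distant fob and returns its answer. The cryptographic check passes anyway.
def relay(challenge):
    return fob.respond(challenge)   # forwarded over the attacker's link

assert car.unlock_attempt(relay)
print("car unlocked via relay")
```

The point the toy makes is the same one Khan made: because the protocol verifies only the answer and not latency or distance, relaying valid traffic is enough.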
Speaker 1: Khan says that he contacted Tesla and other companies and made them aware of this vulnerability, but fixing this particular problem would require a fairly substantial overhaul of the keyless entry system as well as the hardware it runs on. Furthermore, the risk of someone actually developing and then using this hack is fairly low, so there do not appear to be any plans to address the issue, because the likelihood of it becoming a thing out in the wild is fairly limited. Khan also revealed that Kwikset smart locks are vulnerable to this kind of attack, but that people who use their iPhones to open their locks have some additional security measures they can take advantage of. For one, they can enable two-factor authentication, which, in general, you should pretty much always do for any system that offers it, and the iPhone-compatible locks also have a timeout that makes them harder to hack. The Android version of the locks lacks these security features, though Kwikset plans an upgrade later this year that will likely address that.
Speaker 1: We have lots more stories to cover, including some more hacking stories as well as a lot of other stuff. But before we get to any of that, let's take a quick break.

Speaker 1: We're back. Cal Newport posted on his blog a really interesting piece about a scientific study titled "Taking a One-Week Break from Social Media Improves Well-Being, Depression and Anxiety." Well, the title of the paper pretty much tells you what you need to know. The researchers recruited volunteers and randomly divided them into two groups. One group was the control group. They were given no instructions; they just went about living their normal lives. The experiment group was told to take a week-long break from social media, including popular platforms like Instagram, Facebook, TikTok, and Twitter. After a week, the researchers found that the experiment group reported less anxiety and depression and an improved sense of well-being. The improvements were significant even when adjusting for variables like age and gender, so the effect didn't just disappear once you started to factor those variables in.
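A randomized design like that, a control group versus an abstaining group compared on self-reported scores, boils down to a two-sample comparison. A minimal sketch with made-up numbers, not the study's actual data or methods:

```python
import math
import random
import statistics

random.seed(42)
# Hypothetical well-being scores for 75 people per group.
control = [random.gauss(50, 10) for _ in range(75)]
abstainers = [random.gauss(55, 10) for _ in range(75)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples:
    mean difference divided by its standard error."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (statistics.mean(b) - statistics.mean(a)) / se

print(f"t = {welch_t(control, abstainers):.2f}")  # larger |t| means stronger evidence
```

A real analysis would also adjust for covariates like age and gender, which is the part Newport's caution about fuzzy methodologies is aimed at.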
Speaker 1: Newport points out that we should always look at these kinds of scientific studies with some skepticism, largely because methodologies can get a bit fuzzy, especially when we're talking about things like mental health. There are so many variables that impact our mental health that it can be pretty much impossible to narrow in on one or two variables and eliminate all the rest, right? Still, the study seems to support earlier hypotheses that a dependence on social media tends to bring with it an increased sense of depression and anxiety. Anecdotally, I have cut way back on my social network presence, having deactivated my Facebook account. I don't go to Instagram, I don't have TikTok anymore, and I only use Twitter for this show; I don't access my personal Twitter feed anymore. Now, I can't say that I have had a massive improvement in my mental health, but I do feel that I'm not getting worse, which wasn't necessarily the case when I was still on everything. But again, that's anecdotal evidence, which, really, is not evidence at all. We all know that, right? Anecdotal evidence is not actual evidence.
Speaker 1: And goodness knows, I would much rather feel like I was a totally new, well-adjusted person. I'm just not there yet. Still, if you do find yourself feeling overwhelmed, it might be worth considering a week-long vacation from social media. It could help.

Speaker 1: Now, I'm pretty sure I talked about this next story in an earlier episode: Apple has rolled out changes that allow app developers to alter subscription prices and charge users more money automatically, as long as certain conditions are met. Previously, developers were required to send a notification to users alerting them of an upcoming price hike for their services, and the user would then have the option to opt in to paying the higher price, or they could discontinue their subscription. But Apple said that this method had the unintended consequence of leading to interruptions in service, because users were missing or ignoring notifications, so they didn't take the action to opt in, and their service would get discontinued once their subscription was over and the price hike had taken effect, and it would cause some friction.
Speaker 1: So now the policy has changed. Developers still have to send out a message alerting users ahead of time if they're going to change the price, but the price change gets applied automatically and there's no opt-in required. So if the user doesn't actively deactivate their account, they get charged more. If you miss or ignore a message, then you might discover, when you're looking at your bank statement, that what used to cost X per month now costs X plus something else per month. Apple does have a few other requirements that developers have to meet in order to qualify for this feature. Developers cannot hike a price more than once per year. Any increase to a monthly subscription cannot exceed five dollars or fifty percent of the current subscription price, and if it's an annual subscription, the max hike is fifty dollars or fifty percent of the subscription price. The increase also cannot break any local laws.
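As I understand those conditions, they could be checked with something like the following sketch. This is my own function, not Apple's actual API, and it treats the dollar cap and the percentage cap as both applying, which is one reading of the rule:

```python
def auto_increase_allowed(old_price, new_price, annual, hikes_in_past_year):
    """Rough check of whether a price hike qualifies for the automatic,
    no-opt-in treatment, per the rules as described in this episode."""
    if hikes_in_past_year >= 1:          # at most one hike per year
        return False
    increase = new_price - old_price
    cap_abs = 50.0 if annual else 5.0    # $50 annual / $5 monthly
    cap_pct = 0.5 * old_price            # 50% of the current price
    return 0 < increase <= cap_abs and increase <= cap_pct

print(auto_increase_allowed(4.99, 7.99, annual=False, hikes_in_past_year=0))   # False: $3 is ~60% of $4.99
print(auto_increase_allowed(9.99, 12.99, annual=False, hikes_in_past_year=0))  # True: $3 is ~30% of $9.99
```

A hike that fails the check isn't forbidden; it just falls back to the old opt-in flow, as described next.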
Speaker 1: Now, if developers fail to meet some of those qualifications, they can still increase their subscription prices, but they will have to adhere to the old process of giving users the opportunity to opt in, or otherwise their service will be discontinued.

Speaker 1: Darren Allan of TechRadar has an article titled "Bad news, Nvidia: AMD's new GPUs are in good stock and priced strictly at MSRP." In case you're not familiar with the term MSRP, it stands for manufacturer's suggested retail price. In other words, it's how much the company that makes the thing tells stores they should charge to sell that thing. It's their suggested retail price. Stores aren't obligated to sell things at MSRP, but it is what the manufacturers are suggesting, and obviously, if some stores charge way more than that while other stores charge closer to MSRP, that creates competition in the market, yada yada yada. Well, when it comes to GPUs, or graphics processing units, we've had a long run of graphics cards being incredibly difficult to find and grossly overpriced above MSRP.
278 00:17:50,560 --> 00:17:54,000 Speaker 1: One really big reason for that was that crypto minors 279 00:17:54,119 --> 00:17:57,800 Speaker 1: for currencies like ethereum, we're scooping up all the available 280 00:17:57,800 --> 00:18:01,760 Speaker 1: GPUs before they could hit consumer mark markets and plugging 281 00:18:01,800 --> 00:18:05,800 Speaker 1: them into mining machines, or they were buying GPUs at 282 00:18:05,880 --> 00:18:09,639 Speaker 1: way above market price and then putting them into mining 283 00:18:09,680 --> 00:18:13,520 Speaker 1: machines UM. And this gave a few enterprising folks the 284 00:18:13,600 --> 00:18:16,560 Speaker 1: chance to buy up cards very quickly and then hike 285 00:18:16,640 --> 00:18:20,120 Speaker 1: the price up to obscene levels before offering them up 286 00:18:20,119 --> 00:18:26,240 Speaker 1: on after market platforms like eBay. Uh. Then the you know, 287 00:18:26,480 --> 00:18:29,600 Speaker 1: you got to take into account the semiconductor chip shortage 288 00:18:29,640 --> 00:18:33,480 Speaker 1: that we're currently in, and there are all the factors 289 00:18:33,480 --> 00:18:36,199 Speaker 1: you really need right. You have limited supply, you have 290 00:18:36,320 --> 00:18:40,520 Speaker 1: high demand that fuels crazy high prices. Well, Alan Over 291 00:18:40,560 --> 00:18:42,440 Speaker 1: at tech Radar has pointed out that a m D 292 00:18:43,000 --> 00:18:46,399 Speaker 1: has several GPUs that appear to be in stock on 293 00:18:46,480 --> 00:18:50,119 Speaker 1: sites like new Egg, and more importantly, that these in 294 00:18:50,280 --> 00:18:52,919 Speaker 1: stock cards are pretty much priced where they should be. 295 00:18:53,440 --> 00:18:56,320 Speaker 1: Prices have been coming down a little bit recently, in 296 00:18:56,400 --> 00:19:00,359 Speaker 1: large part because the value of cryptocurrency has plummeted in 297 00:19:00,440 --> 00:19:05,680 Speaker 1: recent weeks. 
Speaker 1: Obviously, if it costs more money to run mining operations than you get out of successfully mining cryptocurrency, you stop doing it, right? Because otherwise you're operating at a net loss, and you want to make profits. So if it costs you more to do the thing than you get out of the thing, you stop doing it. That also means we don't quite see the same rush to snap up all available GPUs. Now, don't get me wrong, there is still a rush; it's just not quite as vicious as it used to be, and I suspect we'll see GPU availability and pricing improve, assuming there's not a huge surge in the crypto market, specifically with Ethereum. Also, we have to remember, Ethereum is currently running its proof-of-stake blockchain in parallel with its proof-of-work blockchain, and proof of stake is not going to require all that computational effort to verify transactions the way that proof-of-work approaches do. Right now, though, the actual transactions are still happening on the proof-of-work blockchain.
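The computational effort on the proof-of-work side comes from hash grinding: trying nonce after nonce until a block hash clears a difficulty target, which is exactly the kind of parallel busywork GPUs excel at. A toy sketch with a miniature difficulty, nothing like Ethereum's real parameters:

```python
import hashlib

def mine(block_data: bytes, difficulty_prefix: str = "0000"):
    """Grind nonces until the SHA-256 hash starts with the target prefix."""
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine(b"example block")
print(nonce, digest[:12])  # how many tries it took, and the winning hash prefix
```

Proof of stake replaces this brute-force search with validator selection weighted by staked coins, which is why the hardware demand falls away once the transition happens.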
316 00:20:09,720 --> 00:20:13,760 Speaker 1: But once things merge and Ethereum has transitioned over to 317 00:20:13,800 --> 00:20:17,000 Speaker 1: proof of steak, the demand for all those GPUs will 318 00:20:17,080 --> 00:20:20,359 Speaker 1: drop again. I mean, other cryptocurrencies will still use proof 319 00:20:20,359 --> 00:20:26,160 Speaker 1: of work, but they typically are valued much lower than Ethereum, 320 00:20:26,160 --> 00:20:29,239 Speaker 1: which in turn is valued much lower than bitcoing. So 321 00:20:29,320 --> 00:20:32,280 Speaker 1: it could mean that you know, the people who end 322 00:20:32,359 --> 00:20:35,680 Speaker 1: up seeking out and acquiring those cards end up being 323 00:20:35,720 --> 00:20:38,479 Speaker 1: you know, gamers who want to run their PCs so 324 00:20:38,520 --> 00:20:41,359 Speaker 1: that they can be able to to run the latest 325 00:20:41,400 --> 00:20:45,520 Speaker 1: titles at the highest settings. Wouldn't that be nice? Earlier 326 00:20:45,560 --> 00:20:49,160 Speaker 1: this year, I did an episode about the company San Sui, 327 00:20:49,480 --> 00:20:53,159 Speaker 1: once famous for producing high end audio equipment, and that 328 00:20:53,200 --> 00:20:56,160 Speaker 1: company is now really no more than a brand name, 329 00:20:56,760 --> 00:20:59,639 Speaker 1: something that's happened to a lot of companies over the years. 330 00:21:00,200 --> 00:21:02,240 Speaker 1: And we have another company that we can kind of 331 00:21:02,280 --> 00:21:04,760 Speaker 1: add to that list, and this one is on Kyo, 332 00:21:04,920 --> 00:21:08,840 Speaker 1: oh in k y O. Unlike san Sui, on Kyo's 333 00:21:08,880 --> 00:21:12,760 Speaker 1: target market was more in the mid range for audio 334 00:21:12,800 --> 00:21:16,520 Speaker 1: equipment and video equipment. 
The company produced AV equipment 335 00:21:16,520 --> 00:21:19,280 Speaker 1: that was a little more modest than the brands that 336 00:21:19,280 --> 00:21:23,159 Speaker 1: were targeting, you know, the wealthy audiophile crowd. 337 00:21:23,560 --> 00:21:26,800 Speaker 1: But Onkyo hit on hard times. It got delisted 338 00:21:26,840 --> 00:21:30,080 Speaker 1: from the Stock Exchange last year, and now the company, 339 00:21:30,160 --> 00:21:35,920 Speaker 1: which was originally founded way back in, has gone bankrupt. However, 340 00:21:36,920 --> 00:21:42,040 Speaker 1: last year, Sharp and the American Premium Audio Company formed 341 00:21:42,040 --> 00:21:47,000 Speaker 1: a partnership and purchased the Onkyo Home Entertainment division. So 342 00:21:47,520 --> 00:21:50,879 Speaker 1: while Onkyo, the company that created that division, is 343 00:21:50,920 --> 00:21:55,200 Speaker 1: no more, the assets, the brand, the technology, and all 344 00:21:55,240 --> 00:21:58,919 Speaker 1: the stuff that the company actually made now belong to 345 00:21:59,000 --> 00:22:02,200 Speaker 1: other companies that are actually in a much better position. 346 00:22:03,160 --> 00:22:05,640 Speaker 1: So essentially it means we should still expect to see 347 00:22:05,640 --> 00:22:08,840 Speaker 1: Onkyo products on the market. It's just that the 348 00:22:08,880 --> 00:22:12,480 Speaker 1: company that started it all is no more. I might 349 00:22:12,520 --> 00:22:14,480 Speaker 1: have to do a full episode about this in the future. 350 00:22:14,960 --> 00:22:17,480 Speaker 1: In fact, leave me a talk back message on the 351 00:22:17,480 --> 00:22:20,040 Speaker 1: I Heart Radio app if you agree. If you don't 352 00:22:20,040 --> 00:22:22,240 Speaker 1: know what that is.
On the I Heart Radio app, 353 00:22:22,600 --> 00:22:25,200 Speaker 1: you will see a little microphone icon on the Tech 354 00:22:25,280 --> 00:22:28,200 Speaker 1: Stuff page, either on the page itself or on the 355 00:22:28,240 --> 00:22:31,000 Speaker 1: actual entry for this episode, and if you tap that, 356 00:22:31,440 --> 00:22:34,000 Speaker 1: you can leave a voice message up to thirty seconds 357 00:22:34,040 --> 00:22:37,159 Speaker 1: long and let me know what you think. All right, 358 00:22:37,200 --> 00:22:39,680 Speaker 1: we've got some more stories to get through before we 359 00:22:40,160 --> 00:22:51,360 Speaker 1: wrap up this episode, but first let's take another quick break. Well, 360 00:22:51,440 --> 00:22:54,480 Speaker 1: the power grid in the U.S. state of Texas 361 00:22:54,920 --> 00:22:58,320 Speaker 1: is back in the news again. So last year, during 362 00:22:58,359 --> 00:23:02,359 Speaker 1: a particularly harsh winter storm, thousands of people in the 363 00:23:02,400 --> 00:23:05,880 Speaker 1: state of Texas experienced power outages, and it really put 364 00:23:05,880 --> 00:23:11,080 Speaker 1: a spotlight on that state's unique situation. See, unlike most 365 00:23:11,119 --> 00:23:12,920 Speaker 1: of the other states in the US, which have power 366 00:23:12,960 --> 00:23:16,000 Speaker 1: grids that interconnect with one another, Texas decided to go 367 00:23:16,200 --> 00:23:19,320 Speaker 1: in house with its power grid. And you might wonder 368 00:23:19,359 --> 00:23:24,679 Speaker 1: why. Well, being interconnected means being partly dependent upon and 369 00:23:24,760 --> 00:23:28,119 Speaker 1: partly accountable to other parties, and gosh dang it, that 370 00:23:28,200 --> 00:23:31,960 Speaker 1: just ain't a Texan thing to do.
More seriously, being 371 00:23:32,040 --> 00:23:34,960 Speaker 1: part of the interconnected system would bring Texas's power 372 00:23:35,000 --> 00:23:38,840 Speaker 1: grid within the jurisdiction of federal regulations, and that darn 373 00:23:38,920 --> 00:23:42,240 Speaker 1: sure ain't a Texan thing to do. The state has always 374 00:23:42,240 --> 00:23:46,200 Speaker 1: been fiercely independent, even when it meant that citizens were 375 00:23:46,240 --> 00:23:49,320 Speaker 1: bound to suffer as a consequence. There's also a lot 376 00:23:49,359 --> 00:23:52,720 Speaker 1: of allegations that the various politicians who have maintained the 377 00:23:52,800 --> 00:23:57,280 Speaker 1: status quo have done so at considerable personal benefit, meaning 378 00:23:57,600 --> 00:24:00,760 Speaker 1: there's a lot of alleged greasing of political wheels 379 00:24:00,840 --> 00:24:04,120 Speaker 1: going on here, if you believe the accusations. But let's 380 00:24:04,119 --> 00:24:08,320 Speaker 1: talk about the impact on the average person. Back in 381 00:24:09,840 --> 00:24:13,639 Speaker 1: around folks lost their lives because they didn't have power 382 00:24:13,880 --> 00:24:18,520 Speaker 1: during this incredible winter storm, and now the Texas power 383 00:24:18,520 --> 00:24:21,840 Speaker 1: grid is struggling with electricity demands as the state experiences 384 00:24:22,160 --> 00:24:25,880 Speaker 1: high temperatures, not an unusual thing in Texas. Uh. It's 385 00:24:25,880 --> 00:24:29,000 Speaker 1: pretty hard to live in Texas without air conditioning, or 386 00:24:29,040 --> 00:24:31,800 Speaker 1: at least without a lot of fans. And it's definitely 387 00:24:31,920 --> 00:24:34,479 Speaker 1: dangerous to do without. So a lot of folks are 388 00:24:34,520 --> 00:24:37,480 Speaker 1: running their air conditioning when the weather gets hot, and 389 00:24:37,520 --> 00:24:41,240 Speaker 1: it's already really hot.
Some parts of the state hit 390 00:24:41,320 --> 00:24:45,000 Speaker 1: a hundred five degrees Fahrenheit, or forty point six degrees 391 00:24:45,000 --> 00:24:50,560 Speaker 1: Celsius, yesterday. But this overtaxes the state's power grid, 392 00:24:50,680 --> 00:24:53,720 Speaker 1: which is incapable of tapping into the rest of the 393 00:24:53,840 --> 00:24:56,760 Speaker 1: United States' systems, which would allow it to help 394 00:24:56,840 --> 00:25:00,200 Speaker 1: shoulder the load. It can't do that because it's an 395 00:25:00,200 --> 00:25:04,199 Speaker 1: independent little island, and as a result, power stations have 396 00:25:04,320 --> 00:25:07,600 Speaker 1: had outages recently, and the state has urged citizens to 397 00:25:08,000 --> 00:25:11,439 Speaker 1: limit their power usage, asking people to set their thermostats 398 00:25:11,480 --> 00:25:14,160 Speaker 1: to seventy eight degrees Fahrenheit, and for those of y'all 399 00:25:14,160 --> 00:25:17,000 Speaker 1: who use the more civilized Celsius, that's about twenty five 400 00:25:17,040 --> 00:25:20,600 Speaker 1: point six degrees Celsius. They also have been asked not 401 00:25:20,640 --> 00:25:23,399 Speaker 1: to run any big appliances during the peak hours of 402 00:25:23,480 --> 00:25:26,560 Speaker 1: three PM to eight PM. And that's from ERCOT, 403 00:25:26,600 --> 00:25:32,160 Speaker 1: the ironically named Electric Reliability Council of Texas. It seems 404 00:25:32,160 --> 00:25:34,520 Speaker 1: to me like that reliability part of the name should 405 00:25:34,520 --> 00:25:37,560 Speaker 1: really be revisited. All right, let's get back to hacking 406 00:25:37,880 --> 00:25:42,160 Speaker 1: and exploiting hardware. Ars Technica reports that hackers have discovered 407 00:25:42,320 --> 00:25:45,520 Speaker 1: how to install malware on an iPhone that can run 408 00:25:45,640 --> 00:25:48,320 Speaker 1: even if the phone itself is powered off.
And you 409 00:25:48,400 --> 00:25:51,000 Speaker 1: might think, well, how is that possible? Well, it's because 410 00:25:51,000 --> 00:25:54,920 Speaker 1: when you turn your iPhone off, it's not really totally 411 00:25:54,920 --> 00:25:57,520 Speaker 1: powered down. There are certain functions that the phone needs 412 00:25:57,560 --> 00:26:00,919 Speaker 1: to be able to do even if the phone itself 413 00:26:00,920 --> 00:26:04,800 Speaker 1: has been turned off through normal operation. So there are 414 00:26:04,800 --> 00:26:07,720 Speaker 1: certain features that run in a low power mode, and 415 00:26:07,760 --> 00:26:10,479 Speaker 1: they just sip as little juice as they can in 416 00:26:10,560 --> 00:26:13,800 Speaker 1: order to continue to function even with the phone turned off, 417 00:26:14,160 --> 00:26:17,359 Speaker 1: and the hackers figured out how to exploit this mode 418 00:26:17,640 --> 00:26:20,840 Speaker 1: so that malware could run on a device even after it has 419 00:26:20,880 --> 00:26:25,480 Speaker 1: been powered down. The key is Bluetooth, just like with 420 00:26:25,560 --> 00:26:30,200 Speaker 1: our keyless entry conversation. Bluetooth was a big part 421 00:26:30,200 --> 00:26:33,120 Speaker 1: of that as well. The Bluetooth chip in an iPhone 422 00:26:33,200 --> 00:26:37,640 Speaker 1: lacks any digital signature methodology or encryption on its firmware, 423 00:26:38,040 --> 00:26:41,320 Speaker 1: which gave the hackers (and I keep saying hackers, but they 424 00:26:41,320 --> 00:26:45,000 Speaker 1: were security researchers at the University of Darmstadt in this case) 425 00:26:45,359 --> 00:26:48,639 Speaker 1: the opportunity to develop malware that could do 426 00:26:48,720 --> 00:26:52,000 Speaker 1: stuff like track the phone's physical location even when it's 427 00:26:52,040 --> 00:26:55,240 Speaker 1: turned off.
That's one of the low power processes that 428 00:26:55,359 --> 00:26:57,840 Speaker 1: remain supported even when a phone is powered down. It's 429 00:26:57,880 --> 00:27:01,640 Speaker 1: the Find My feature, right? So if you had turned 430 00:27:01,680 --> 00:27:04,480 Speaker 1: your phone off and you left it somewhere, you want 431 00:27:04,520 --> 00:27:07,920 Speaker 1: that Find My feature to work. Well, this is an 432 00:27:07,960 --> 00:27:11,600 Speaker 1: exploit of that. And considering that state backed companies like 433 00:27:11,640 --> 00:27:15,600 Speaker 1: the NSO Group have previously targeted iOS devices to convert 434 00:27:15,680 --> 00:27:20,480 Speaker 1: those devices into surveillance tools, this new development is pretty concerning. 435 00:27:20,520 --> 00:27:23,560 Speaker 1: I mean, imagine being able to exploit a target device 436 00:27:23,920 --> 00:27:27,280 Speaker 1: and turn it into at least a partial surveillance tool, 437 00:27:27,359 --> 00:27:29,760 Speaker 1: even if it had been turned off. Even if the 438 00:27:29,760 --> 00:27:33,080 Speaker 1: owner thinks they're being savvy by turning their phone off 439 00:27:33,320 --> 00:27:36,680 Speaker 1: as they travel to some sensitive location, they could potentially 440 00:27:36,680 --> 00:27:39,440 Speaker 1: be tracked by someone running malware and using a process 441 00:27:39,520 --> 00:27:44,639 Speaker 1: similar to Find My iPhone, which is not good.
442 00:27:45,560 --> 00:27:48,720 Speaker 1: Security researchers and hackers in our last story, they're working 443 00:27:48,720 --> 00:27:51,720 Speaker 1: to improve security, but obviously there are other hackers who 444 00:27:51,760 --> 00:27:55,560 Speaker 1: have more extreme agendas. Such is the case with Conti, 445 00:27:55,920 --> 00:27:59,640 Speaker 1: a gang of Russian speaking hackers who not long ago 446 00:27:59,800 --> 00:28:02,639 Speaker 1: infiltrated computer systems belonging to the government of 447 00:28:02,680 --> 00:28:06,439 Speaker 1: Costa Rica and then locked down those computer systems in 448 00:28:06,480 --> 00:28:11,080 Speaker 1: a ransomware attack. They initially demanded a ransom of 449 00:28:11,080 --> 00:28:15,880 Speaker 1: ten million dollars. They increased that, in fact, 450 00:28:15,880 --> 00:28:19,080 Speaker 1: to twenty million, possibly because Costa Rica just installed a 451 00:28:19,160 --> 00:28:23,879 Speaker 1: new president, Rodrigo Chaves, and then that president held a 452 00:28:23,920 --> 00:28:27,320 Speaker 1: press conference and hypothesized that the attack involved people inside 453 00:28:27,359 --> 00:28:32,199 Speaker 1: Costa Rica cooperating with the presumably Russian hacker group. And 454 00:28:32,520 --> 00:28:35,240 Speaker 1: that's something that the hackers themselves have said is true. 455 00:28:35,280 --> 00:28:38,280 Speaker 1: They claim that there are insiders in the government who 456 00:28:38,320 --> 00:28:41,240 Speaker 1: are working with them in order to compromise the systems. 457 00:28:41,440 --> 00:28:44,400 Speaker 1: The hackers have been implying that their goal is to 458 00:28:44,440 --> 00:28:47,920 Speaker 1: overthrow the government, but most researchers say that's likely just 459 00:28:48,040 --> 00:28:52,240 Speaker 1: posturing, and that really all they want is cash. Cash money.
460 00:28:52,360 --> 00:28:55,280 Speaker 1: And I do talk a lot about hackers on Tech Stuff, 461 00:28:55,440 --> 00:28:58,080 Speaker 1: but one thing I frequently don't have information on is 462 00:28:58,200 --> 00:29:01,960 Speaker 1: who is actually developing the malware that's being used out 463 00:29:02,000 --> 00:29:05,240 Speaker 1: there in the real world. Some hacker groups rely on 464 00:29:05,320 --> 00:29:08,080 Speaker 1: tools that are made by people from outside the group itself. 465 00:29:08,360 --> 00:29:12,480 Speaker 1: They're deploying tools that were developed somewhere else. In fact, 466 00:29:12,560 --> 00:29:16,440 Speaker 1: there are plenty of hackers who do that exclusively, and 467 00:29:16,480 --> 00:29:19,080 Speaker 1: in hacking communities they, at least, used to be 468 00:29:19,120 --> 00:29:23,120 Speaker 1: referred to as script kiddies: people who, you know, were 469 00:29:23,160 --> 00:29:26,520 Speaker 1: taking the programming code that someone else made and just 470 00:29:26,680 --> 00:29:29,280 Speaker 1: using it. But they don't develop code themselves. They don't 471 00:29:29,280 --> 00:29:34,160 Speaker 1: even necessarily understand how to code well. Anyway, the U 472 00:29:34,320 --> 00:29:37,640 Speaker 1: S Department of Justice claims it has identified someone responsible 473 00:29:37,680 --> 00:29:41,800 Speaker 1: for developing two different malware suites that have been used 474 00:29:41,840 --> 00:29:46,160 Speaker 1: in ransomware attacks. That person is a cardiologist living in 475 00:29:46,280 --> 00:29:50,760 Speaker 1: Venezuela named Moises Luis Zagala Gonzalez. Now, according to 476 00:29:50,840 --> 00:29:54,600 Speaker 1: the DOJ, Zagala is the developer responsible for malware 477 00:29:54,640 --> 00:29:59,480 Speaker 1: called Jigsaw and another one called Thanos. Now.
Apparently, one 478 00:29:59,480 --> 00:30:02,640 Speaker 1: of Zagala's relatives tipped off authorities that this 479 00:30:02,760 --> 00:30:07,000 Speaker 1: doctor was moonlighting as a sort of cyber warfare arms dealer, 480 00:30:07,360 --> 00:30:11,280 Speaker 1: supplying hacker groups with tools, providing technical support, and 481 00:30:11,320 --> 00:30:16,160 Speaker 1: getting relatively modest payouts in the process. But Zagala was 482 00:30:16,240 --> 00:30:19,360 Speaker 1: using this relative to kind of funnel the money through 483 00:30:19,480 --> 00:30:21,760 Speaker 1: so it didn't go directly back to him. And then the 484 00:30:21,840 --> 00:30:25,960 Speaker 1: relative got investigated as part of this overall investigation into 485 00:30:26,000 --> 00:30:29,960 Speaker 1: the hack and tipped the authorities off to Zagala. One 486 00:30:30,000 --> 00:30:33,080 Speaker 1: of the groups that Zagala was allegedly helping was a 487 00:30:33,120 --> 00:30:37,800 Speaker 1: state backed Iranian hacking group that targeted Israeli companies. So 488 00:30:37,920 --> 00:30:41,080 Speaker 1: Zagala is now charged with two counts of attempted computer 489 00:30:41,160 --> 00:30:45,440 Speaker 1: intrusions and conspiracy to commit computer intrusions. He has not 490 00:30:46,200 --> 00:30:49,440 Speaker 1: been arrested. He's charged with this by the United States, 491 00:30:49,480 --> 00:30:52,200 Speaker 1: but he's living in Venezuela and has not been arrested. Now, 492 00:30:52,240 --> 00:30:56,880 Speaker 1: should he be arrested, and should he be, uh, extradited 493 00:30:57,000 --> 00:30:59,760 Speaker 1: and stand trial for those charges, he could face five 494 00:30:59,840 --> 00:31:02,920 Speaker 1: years in prison for each of the charges. It seems 495 00:31:02,920 --> 00:31:06,600 Speaker 1: that Zagala is a self-taught hacker as well.
And 496 00:31:06,720 --> 00:31:08,120 Speaker 1: it's really going to be a heck of a thing 497 00:31:08,160 --> 00:31:11,480 Speaker 1: for a cardiologist to devote spare time toward learning how 498 00:31:11,520 --> 00:31:15,440 Speaker 1: to code and then creating malware, particularly when you consider 499 00:31:15,520 --> 00:31:20,400 Speaker 1: that medical facilities are frequently the prime target for ransomware attacks. 500 00:31:21,240 --> 00:31:24,840 Speaker 1: Kind of disturbing. Finally, have you ever asked yourself, how 501 00:31:24,880 --> 00:31:29,040 Speaker 1: many times are companies sharing my personal data with one another? Like, 502 00:31:29,160 --> 00:31:33,120 Speaker 1: how many times are the various entities out there, like, 503 00:31:33,320 --> 00:31:37,480 Speaker 1: say, a data broker, making transactions where your personal information 504 00:31:37,560 --> 00:31:40,120 Speaker 1: is part of the deal? Well, if you live in 505 00:31:40,160 --> 00:31:44,600 Speaker 1: the EU, the answer is probably around three hundred seventy six times a 506 00:31:44,720 --> 00:31:48,280 Speaker 1: day on average. So that's how many times companies are 507 00:31:48,320 --> 00:31:52,880 Speaker 1: sharing your personal information with entities like, you know, advertising companies. 508 00:31:53,400 --> 00:31:56,840 Speaker 1: But you know, y'all in the EU can rest easy. 509 00:31:57,280 --> 00:31:58,840 Speaker 1: Those of us in the good old U S of A? 510 00:31:59,440 --> 00:32:04,080 Speaker 1: It's a whopping seven hundred forty seven times per day 511 00:32:04,120 --> 00:32:07,280 Speaker 1: on average. This is according to the Irish Council for 512 00:32:07,360 --> 00:32:11,000 Speaker 1: Civil Liberties, or ICCL. Now, keep in mind, 513 00:32:11,520 --> 00:32:16,160 Speaker 1: these companies aren't singling you out personally. Your data gets 514 00:32:16,240 --> 00:32:19,120 Speaker 1: lumped in with the data from millions of other people.
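To put that United States figure in perspective, here is a quick back-of-the-envelope calculation using only the seven hundred forty seven per day average quoted above:

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

us_shares_per_day = 747  # the ICCL's reported U.S. daily average
gap_seconds = SECONDS_PER_DAY / us_shares_per_day
print(f"one broadcast roughly every {gap_seconds:.0f} seconds")  # about every 116 seconds
```

In other words, on average your data is being shared roughly once every two minutes, around the clock.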
515 00:32:19,600 --> 00:32:21,600 Speaker 1: And one way this works is that when you visit 516 00:32:21,600 --> 00:32:25,120 Speaker 1: a web page, your browser exchanges information with that page. 517 00:32:25,240 --> 00:32:28,080 Speaker 1: The page bundles that information up with all the data 518 00:32:28,080 --> 00:32:30,040 Speaker 1: from folks who also went to that page at that 519 00:32:30,160 --> 00:32:33,560 Speaker 1: same time, and this becomes a block of info that 520 00:32:33,600 --> 00:32:36,920 Speaker 1: then gets auctioned off at a market, and the winning 521 00:32:36,960 --> 00:32:41,040 Speaker 1: bidder gets the privilege of serving their ad to you. 522 00:32:41,360 --> 00:32:44,160 Speaker 1: This whole process takes less than a second to happen. 523 00:32:44,240 --> 00:32:47,760 Speaker 1: That's just the little delay that happens before an ad 524 00:32:47,880 --> 00:32:51,720 Speaker 1: loads onto your page, and it's happening all the time. 525 00:32:52,160 --> 00:32:55,480 Speaker 1: The process is called Real-Time Bidding, or RTB. 526 00:32:56,360 --> 00:32:58,280 Speaker 1: You can read the full report on the 527 00:32:58,520 --> 00:33:02,160 Speaker 1: ICCL website. The report is titled The Biggest Data 528 00:33:02,240 --> 00:33:05,200 Speaker 1: Breach, and it really pulls back the curtain on the 529 00:33:05,280 --> 00:33:09,120 Speaker 1: data trading industry. This is something that more governments around 530 00:33:09,160 --> 00:33:12,680 Speaker 1: the world are starting to look into and consider regulating, 531 00:33:12,960 --> 00:33:15,920 Speaker 1: so it's possible that this rampant exchange of people's personal 532 00:33:15,960 --> 00:33:19,479 Speaker 1: information will eventually get reined in. And that's it for 533 00:33:19,520 --> 00:33:22,880 Speaker 1: the Tech News for May seventeen, two thousand twenty two.
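The auction flow described above can be modeled in a few lines. This is an illustrative sketch only: real RTB exchanges speak the OpenRTB protocol over HTTP, and the names below (the bid request fields, run_auction, the sample bidders) are invented for the example. The toy uses a second-price rule, which many ad exchanges have historically used, where the winner pays the runner-up's bid.

```python
# The "bid request" stands in for the bundle of visitor info the page shares
# with the exchange; in real RTB it is forwarded to every participating bidder.
bid_request = {"url": "example.com/article", "geo": "US", "interests": ["tech", "audio"]}

def run_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winning bidder, clearing price) under second-price rules."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    # Second price: the winner pays the runner-up's bid (or their own if unopposed).
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

bids = {"ad_network_a": 2.50, "ad_network_b": 3.10, "ad_network_c": 1.75}
winner, price = run_auction(bids)
print(winner, price)  # ad_network_b wins and pays 2.50
```

In reality the whole round trip, request, auction, and ad delivery, completes within that sub-second delay before the ad appears.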
534 00:33:23,440 --> 00:33:25,240 Speaker 1: As a reminder, if you want to reach out to 535 00:33:25,320 --> 00:33:27,240 Speaker 1: Tech Stuff, there are a couple of ways to do it. 536 00:33:27,280 --> 00:33:29,640 Speaker 1: One way is to use the I Heart Radio app. 537 00:33:30,160 --> 00:33:32,960 Speaker 1: Navigate to the Tech Stuff page and use that little 538 00:33:33,000 --> 00:33:36,680 Speaker 1: microphone icon to leave a talkback voice recording of up 539 00:33:36,720 --> 00:33:42,000 Speaker 1: to thirty seconds to make suggestions for topics or comment on episodes. 540 00:33:42,640 --> 00:33:44,680 Speaker 1: If you go into a specific episode and you leave 541 00:33:44,680 --> 00:33:47,280 Speaker 1: a talk back there, it will be tagged with that episode, 542 00:33:47,320 --> 00:33:49,560 Speaker 1: so I'll know that that's the episode you're referring to. 543 00:33:50,280 --> 00:33:52,800 Speaker 1: It's a pretty cool feature and you should check it out. 544 00:33:53,240 --> 00:33:55,360 Speaker 1: Or, if you prefer, you can always reach out to 545 00:33:55,400 --> 00:33:57,960 Speaker 1: me on Twitter. The handle for the show is Tech 546 00:33:58,000 --> 00:34:02,360 Speaker 1: Stuff HSW. I'll talk to you again really soon. 547 00:34:03,160 --> 00:34:10,239 Speaker 1: Tech Stuff is an I Heart Radio production. For 548 00:34:10,320 --> 00:34:13,279 Speaker 1: more podcasts from I Heart Radio, visit the I Heart 549 00:34:13,360 --> 00:34:16,520 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 550 00:34:16,560 --> 00:34:17,280 Speaker 1: favorite shows.