Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. This time for the tech news for Tuesday, November 9, 2021. Let's get to it.

Speaker 1: In the United States, lawmakers in the House of Representatives have introduced a bill that would require companies that use algorithms to determine what users see to offer up an alternative, algorithm-light version. So, in other words, a company like Meta slash Facebook would have to give people the option to opt out of the algorithm-driven methodology that the company relies upon. Now, the idea behind this is that it would let users choose a path less likely to fill up their news feeds with the type of material that quote unquote drives engagement, since we've seen time and time again that a lot of bad actors have taken advantage of that approach to spread misinformation and hate speech and other awful stuff. This comes on the heels of the Frances Haugen hearings, in which the former Facebook employee shared her observations about the company's practices, and she asserted that these practices are harmful on a really broad scale, not just to people individually, but potentially to big stuff like the concept of democracy itself. The bill is worded in a way that says companies would need to create a quote input-transparent algorithm end quote, meaning that the platform wouldn't use any user data to determine what that user sees, so theoretically you'd get a more generic experience in that regard. One way this might manifest is a true reverse chronological order of posts. That is, you would see the most recent posts at the top of your feed, and then you would work down to progressively older posts, you know, the way a lot of folks preferred Facebook before the company made that choice harder and harder to find and ultimately watered it down with an algorithmic approach.
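To make that distinction concrete, here is a minimal sketch of the two feed philosophies. This is my own illustration, not language from the bill; the Post class and the engagement formula are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float         # seconds since epoch
    engagement_score: float  # platform-computed signal (clicks, shares, dwell time)

def engagement_ranked_feed(posts, user_affinity):
    # Algorithmic feed: the ordering depends on per-user data (here, a dict
    # of how much this user interacts with each author), so what you see
    # is a function of what the platform knows about you.
    return sorted(
        posts,
        key=lambda p: p.engagement_score * user_affinity.get(p.author, 1.0),
        reverse=True,
    )

def input_transparent_feed(posts):
    # "Input-transparent" feed in the bill's sense: no user data at all.
    # Newest posts first, progressively older posts as you scroll.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)
```

The second function is the whole point: there is nothing for bad actors to game, because the only input is the clock.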
Speaker 1: If this bill becomes law, it could have a massive impact on Facebook's business. The company depends heavily on promoting material that gets a lot of engagement, because that keeps people on the platform longer, and therefore Facebook has more time to sell more ads, essentially to display more ads to users, and thus they make way more money through advertising. Any regulations that would impact that could potentially lead to people spending less time on the platform, and then Facebook would make less money, which I assure you the company is not super keen on doing. So if this bill does pass, I can almost guarantee that, unless it is otherwise mandated by the law, Facebook will make sure that this option to switch to the transparent algorithm approach will be buried deep in options so that the average person isn't likely to go looking for it. Anyway, I think this is an interesting approach, and one I happen to think is a pretty decent idea, at least on the surface. I also imagine it won't stop Facebook's critics from advocating for the company to get broken up. That is a conversation I suspect will continue.

Speaker 1: The US Federal Trade Commission, or FTC, will soon send around sixty million dollars to more than a hundred forty thousand drivers for Amazon. So why is that happening? Well, this is because Amazon was illegally withholding tips that the drivers had earned from customers between 2016 and 2019. The FTC sued Amazon earlier this year over the matter, saying that the company was regularly holding back tips that customers had opted to give drivers through, you know, apps and web-based services. Further, the FTC said that Amazon only stopped doing this once they became aware that the FTC was investigating them. Amazon ultimately settled this case out of court and agreed to return all the money it had stolen from drivers, and they would also stop obfuscating how much the drivers were making.
Speaker 1: Also, the company can't change how tips are factored into driver compensation without first acquiring drivers' informed consent on the matter, due to terms of the settlement. According to the FTC, the largest payout to an individual driver, and I can't believe this, is in excess of twenty-eight thousand dollars. Now, the average payout is closer to four hundred twenty-two dollars. But let's go back to that twenty-eight large for a second. Imagine, just for a moment, that you found out your employer had effectively stolen twenty-eight thousand dollars from you, that you had earned that money, but your employer took it for themselves. That is beyond unacceptable. I mean, it's absolutely crazy, right? Anyway, drivers who received checks, and that would be 139,507 of the drivers, they should deposit or cash those checks before January 7, 2022. The remaining drivers are actually receiving their payouts via PayPal, so they don't have to worry about that. If a driver receives more than six hundred dollars, they will also receive an IRS Form 1099 to fill out, because they have to declare that on taxes, but that should actually affect fewer than twenty thousand of the more than 140,000 drivers. Anyway, I am glad to hear that the drivers will finally be getting the money they earned, and glad that for once we're seeing a big company held accountable for screwing over the workers that that company employs.

Speaker 1: Speaking of companies that are being held accountable, it's time to talk about Tesla again. So you might remember that last month the company rolled out an update to Tesla owners who are participating in the beta program for the so-called Full Self-Driving feature. And once again I remind you that Full Self-Driving is in fact a misleading phrase, because it is not actually a fully autonomous mode.
Speaker 1: Anyway, that rollout ended up causing dangerous problems, as Tesla vehicles were mistakenly interpreting a potential frontal collision even if there were no cars or any other obstacles in front of the vehicles. They were saying, whoa, we're about to collide with something, and there was nothing there. That then prompted the cars to engage the brakes to avoid this so-called collision, and it made the Teslas practically undrivable. Owners reported this issue. Some people even took videos of what it was like trying to drive the car with this mode enabled, and they showed how the car would spontaneously detect a potential collision in perfectly safe conditions and then slam on the brakes, and obviously that act would then constitute an actual dangerous situation, potentially leading to rear-end collisions if someone were following that Tesla and they were going at a good clip. The company hastily rolled back this update, and Elon Musk tweeted out the equivalent of, hey, these things happen, it's software, it's a beta program, and essentially brushed it off. But Tesla, the company, subsequently filed a recall notice with the National Highway Traffic Safety Administration, or the NHTSA, a government authority that obviously oversees stuff like safety issues with highways and roads. So why did Tesla actually issue the recall if they had already rolled back the update? Well, that was because the NHTSA has been taking a more aggressive approach in investigating and regulating self-driving vehicles recently, and Tesla has been the subject of some scrutiny. The NHTSA had already reprimanded Tesla, saying that the company has to actually follow protocol. It has to operate within the boundaries of laws and regulations for vehicles instead of just playing fast and loose like a typical Silicon Valley company. If a software company rolls back a patch, that might be frustrating, right, if a patch ends up making things worse for your software and then they roll it back.
Speaker 1: But for car companies this is really a matter of safety. I mean, if my copy of Diablo gets rolled back, I might lose some data, but with cars, we're talking about life or death kind of stuff. Anyway, Tesla, the company, has started to change its ways, conforming to these requirements, but not without Elon complaining about it and siccing rabid Tesla fans on regulators. Classy guy, that Elon Musk. Also classy folks, those rabid Tesla fans. Sheesh. I suspect this isn't going to convince the government to adopt a more lenient approach to Tesla, because Elon Musk is really good at riling people and organizations and governments up.

Speaker 1: And while we're talking about Musk, let's also mention his ongoing complaints about, you know, having to pay taxes like a common person. You know, it's that thing that billionaires really hate to do. There's been an increased push to increase taxes on billionaires. Those are the people who can obviously afford to pay taxes the most, but are also the least inclined to do so. And Musk is no exception to the rule that these ultra rich folks really just don't want to part ways with even a fraction of their billions. Anyway, Musk tweeted out a poll recently, and he wrote, quote, much is made lately of unrealized gains being a means of tax avoidance, so I propose selling ten percent of my Tesla stock. Do you support this? End quote. Nearly fifty-eight percent of the respondents said yes. No word yet on if Elon Musk has followed through with this. He said he would, but at least when I have gone to record, that had not yet been reported. However, if he did sell that off, based upon the closing price for Tesla's stock on Friday, it would have meant he would have been selling around twenty-one billion, with a b, dollars worth of stock. Yowza. Also, the whole thing appeared to affect Tesla's stock value, because the price of Tesla's stock declined by five percent.
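For a rough sense of where that twenty-one billion figure comes from, here's the back-of-the-envelope arithmetic. Both inputs are approximations I'm assuming for illustration, not numbers from the episode.

```python
# Rough check on the ~$21B estimate; both inputs are approximate assumptions.
musk_shares = 170_500_000   # roughly Musk's Tesla holdings at the time
friday_close = 1_222.09     # approximate TSLA closing price, Friday Nov 5, 2021

shares_to_sell = 0.10 * musk_shares
proceeds = shares_to_sell * friday_close
print(f"~{shares_to_sell:,.0f} shares -> ~${proceeds / 1e9:.1f} billion")
# ~17,050,000 shares -> ~$20.8 billion, in line with the ~$21B figure
```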
Speaker 1: I think this is yet another example of Elon Musk's messaging having a potentially massive impact on the market. We've seen it before, where he tweets something and then we see a massive move in the market. I think that says a lot about Musk's influence. It also says a lot about his lack of concern about how much influence he has, and it's probably a pretty decent argument about, you know, why the stock market is sometimes just a big old dumb psychological experiment. Now, granted, over time, the market typically will correct itself, as long as there is enough time for it to do so and there's a lack of interference, but really this does nail home that we humans are emotional and often irrational creatures that can do a lot of harm without a lot of effort.

Speaker 1: The Washington Post reports that the Israeli military has been employing facial recognition technology in a widespread surveillance campaign targeting Palestinians in the West Bank region. It's part of an initiative called Blue Wolf, and according to the Post, it involves the Israeli military taking photos of Palestinians, and those photos go into a large database of images. The Blue Wolf system looks for matches in its database and then sends a signal to the soldier's phone, and the phone will flash a color to the soldier indicating whether the person in the image should be left alone, or if the soldiers should detain that person or even arrest them. Apparently, the military created incentives for soldiers to participate in this program, getting them to take thousands of photos of Palestinians in the process. There were even competitions for soldiers where they could win prizes if they took the most photos within a given amount of time. The system also interconnects with closed-circuit security cameras found throughout the West Bank region. Some of these cameras, in fact, aim at or into homes of Palestinians. So to call this invasive would be understating things to the extreme.
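The Post piece doesn't publish Blue Wolf's internals, but match-and-flag systems like the one it describes generally boil down to comparing face embeddings against a database with a similarity threshold, and that threshold is exactly where mistakes creep in. Here's a generic sketch of that pattern, assumptions only, not the actual system:

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face embeddings (lists of floats).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(query, database, threshold=0.8):
    """Return the database identity most similar to the query embedding,
    or None if nothing clears the threshold. Any choice of threshold trades
    false positives (flagging the wrong person) against false negatives
    (missing a real match); neither error rate is ever zero."""
    best_id, best_score = None, threshold
    for person_id, embedding in database.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```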
Speaker 1: The fact that there were incentives for soldiers to be active participants in this really makes me think of the awful stop-and-frisk policy that we used to see in New York City not that long ago, in which police officers effectively had quotas they were expected to meet when it came to just stopping random citizens and frisking them, which led to disproportionate targeting of non-white citizens. And as always, we have to remember that facial recognition technology is far from perfect. So even if you roll it out in a way that isn't, you know, authoritarian and scary, the technology still makes mistakes, and when it comes to stuff like deciding if someone should be detained or arrested, that really becomes a dystopian nightmare. After that one, I think we all could use a little bit of a break, so we will be back after these short messages.

Speaker 1: We're back. So the brokerage company Robinhood, which is known primarily as a fee-free company that allows the average person to invest in, you know, the stock market, has recently announced that hackers had breached corporate systems and gained access to the personal information of around five million customers. For most of them, it was just their email addresses. However, the hackers were also able to see the full names of around two million Robinhood customers, and apparently for a lucky three hundred Robinhood customers, the hackers also saw more personal information, like their zip code and their birth date and other stuff. Now, Robinhood says that the more sensitive information, stuff like Social Security numbers or bank account numbers or debit card numbers, those were not part of this breach, so it definitely could have been a lot worse. And apparently the hackers were able to gain access through the tried and true method of social engineering, which essentially is a fancy way of saying they tricked someone into letting them in.
Speaker 1: So in this case, it seems like they fooled a customer support representative at Robinhood into granting them access to the system. The classic way of doing this, by the way, is that you pose as IT and say that you need access to a system in order to install an update or otherwise do some sort of maintenance, and you trick someone into giving you that level of access. You know, obviously from a hacker standpoint, what you're aiming for is administrator-level access, but sometimes, you know, you just take what you can get. That seems to be the case with Robinhood in this instance. So that's a classic way of getting access to a system. It doesn't require you to sit at a keyboard and just randomly guess at passwords, which is how we typically see it in, like, Hollywood productions. You don't often see the social engineering side. Some shows and movies do that, but most of the time it's the whole, no, it's not that, no, it's not that, oh hey, you got it. Which, I mean, if it were that easy, then nothing would be safe. It's not that easy.

Speaker 1: By the way, you might also remember Robinhood as the company that was heavily criticized when individual investors wanted to invest in GameStop stock as part of a hedge fund squeeze campaign, and then they found that Robinhood, the supposed brokerage for the people, had put the brakes on that activity, saying that they didn't want to encourage market volatility and such, whereas the critics were saying that Robinhood had corporate ties with individuals who had a financial stake in GameStop stock going down. If you don't remember that story, GameStop stock had been trading at around twenty dollars a share for a long time, and then, after months of enthusiasm, we saw more investors, individual investors, start to buy up GameStop stock, which drove the price up.
Speaker 1: Last I looked, it was trading at somewhere around two hundred nineteen dollars, so, you know, ten times as valuable as it was a couple of years ago. Pretty incredible. But the whole reason for that, or at least one of the reasons for that, was that there were these hedge funds that had recommended short selling the stock, which hinges on a stock price going down. With the stock price going up, it ended up putting the squeeze on these hedge funds. Anyway, just another fun story in the history of Robinhood.

Speaker 1: In other hacker news, a group called fail0verflow announced it had uncovered the root keys for PS5 encryption, that is, the Sony PlayStation 5. Now what does that actually mean? Well, let's suss this out. Consoles, when you boil them down, video game consoles are just specialized computers. Not that I recommend you boil a video game console, or any other computer for that matter, because that will definitely void your warranty. But computers run software, right? However, video game consoles typically have protections in place, so you can't just run any software you like on them. You can only run the stuff that the company allows you to run on them, so you can run specific software like games. It's a gated community, in other words, and the company behind the video game console controls gate access. But the encryption keys, the root keys, are essentially a way to fool the gate into thinking you're an authorized identity that can come and go as you please. That means you could potentially run other types of software on the console once you unlock it. Now, to me, it sounds like the fail0verflow group is really your classic hacker group. These are people who wonder how systems actually work, and then they figure it out, and then they learn how they can exploit those systems. But by exploit, I really mean they just use it in a way that was not the way the creators intended.
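To make the gate metaphor a little more concrete: consoles typically refuse to run code that doesn't carry a valid cryptographic signature, and whoever holds the signing key can mint signatures the console will accept. Here's a heavily simplified sketch of that idea. Real consoles use asymmetric signatures, so the verification key on the device can't be used to sign anything, and Sony's actual scheme isn't public; the HMAC below is just a stand-in to show the shape of the check.

```python
import hashlib
import hmac

ROOT_KEY = b"not-the-real-key"  # on real hardware, this never leaves the chip

def sign(binary: bytes, key: bytes) -> bytes:
    # Whoever holds the key can produce a tag the console will accept.
    return hmac.new(key, binary, hashlib.sha256).digest()

def console_will_run(binary: bytes, tag: bytes) -> bool:
    # The "gate": refuse to run anything whose tag doesn't verify.
    expected = sign(binary, ROOT_KEY)
    return hmac.compare_digest(expected, tag)

game = b"officially signed game code"
print(console_will_run(game, sign(game, ROOT_KEY)))     # True: signed by the key holder
print(console_will_run(b"homebrew app", b"\x00" * 32))  # False: unsigned code is rejected
```

Leak the key, and that second check stops failing for anyone who has it, which is why uncovering root keys is such a big deal.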
Speaker 1: So running homebrew software on a console doesn't have to involve stuff like pirating games, for example. And in fact, fail0verflow has made statements that make me think that the hacker group really doesn't want to encourage piracy at all, just the opposite, in fact. However, the group also acknowledges that the majority of people who would actually use root keys would likely be doing so in order to pirate games. Perhaps for that reason, the group is not publishing the root keys. Also, they have participated in the past in finding bugs in Sony platforms and earning money that way, because a lot of companies will pay out, essentially, a bug bounty. In this case, I think it's a reminder that security systems are not perfect and they're not infallible. And this would be an argument that Sony had put in maybe not a weak encryption scheme, but not the strongest that they could have. Like, if it is something that is breakable, that's not great. It shows a weakness, and the point is not necessarily to exploit that weakness, but rather perhaps to indicate to the company, hey, the way you did this was not very good. You need to do better.

Speaker 1: Ransomware attacks have been on the rise for the last couple of years, and the hacker group REvil is one of several that have been in the news lately. The US Department of Justice recently announced that it had arrested someone alleged to be part of the REvil gang. This would be a Ukrainian man named Yaroslav Vasinskyi. Polish authorities detained Vasinskyi in October because US authorities had issued an indictment against Vasinskyi way back in August, and Vasinskyi now faces extradition hearings which could see him transferred to the United States to stand trial for cyber crimes. In addition, the DOJ announced that it had seized more than six million dollars in assets believed to be linked to REvil's ransomware activities. Now, this money did not come from Vasinskyi's accounts.
Speaker 1: Instead, it came from a Russian named Yevgeniy Polyanin. And I know I've butchered the name. You don't have to let me know. I apologize. But it was a different member of REvil, or alleged member, I should say, of REvil, and he was also indicted in August. He remains at large, so he has not been taken into custody. Vasinskyi is the third of REvil's alleged members to be arrested. They could potentially face a prison sentence of up to one hundred years if they were convicted on all counts against them. And I think this is really a campaign that's a message to ransomware gangs that if they are caught, they face a price that's much higher than what they can extort through ransomware attacks. But whether that acts as an actual deterrent remains to be seen.

Speaker 1: Recently, the House of Representatives passed the Infrastructure Investment and Jobs Act, which most folks have heard of as the $1.2 trillion infrastructure bill here in the United States. Now, there is a ton of stuff in that bill, and we're going to talk about a couple of tech-related pieces. One of those is a sixty-five billion dollar package to increase broadband internet access in the US. Most of that would go to subsidies to internet service providers that commit to building out broadband infrastructure into underserved areas. Some of the money will also go to subsidy programs that will help individual households so that they can afford broadband access plans and thus offset the cost that they would otherwise have to pay. These measures are a good step, but they represent a massive cut to what was originally in the bill. The original version of the bill set aside one hundred billion dollars to improve broadband access, so more than a third of that got taken out by the time it finally was passed.
Speaker 1: Still, a little improvement is better than the status quo by definition, but it now falls to the various ISPs and the FCC to make sure that this plan actually leads to action and increased access, once President Biden actually signs it into law, that is. So while the law is a good step in the right direction, it's not as big a step as what people were hoping for. And of course the execution is really what matters, right? If these companies end up taking subsidies and then really drag their feet on actually building out the infrastructure, it doesn't really do anyone any good. And we have seen that kind of stuff happen in the past. Hopefully that's not how it's gonna turn out this time, but, you know, it's too early to say. We have a couple more stories that we're going to cover, but before we get to that, let's take another quick break.

Speaker 1: So I mentioned that there were a couple of pieces in that infrastructure bill that related to tech. Well, another element is one that actually concerns me quite a bit, and I've talked about it in a previous Tech Stuff tech news episode. So wrapped up in that infrastructure bill is a mandate that car manufacturers will have to incorporate technology that can passively monitor drivers and identify whether or not the driver is operating a vehicle while impaired, and then go on to limit or prevent vehicle operation if the vehicle says, yeah, this person is kind of messed up. Now, let me be clear, if the technology we used to do this were amazingly accurate, and the implementation in systems that would limit motor vehicle operation were proven to be both effective and safe, I would be all for this. I don't want intoxicated or otherwise impaired drivers to operate vehicles, potentially putting themselves and others in severe danger. I don't want that to happen.
Speaker 1: I think driving while intoxicated is, honestly, I think of it as an unforgivable act, because I think it shows a flagrant disregard for human life and safety, and it's so alien to me that I absolutely condemn it. However, my big problem is that the tech we have today isn't totally reliable. Now, car companies will have to start implementing this within the next few years, so it still gives us some time. But you also have to remember that incorporating this takes years, right? The whole process of developing a car can be a two-year process, so it's really not that far out. When you start thinking, oh, we have to incorporate this in just a few years, that really means that car companies have to actively start working on implementation almost right away. And the fact that the tech is not fully reliable is a potentially enormous problem. Like, you could have a car mistakenly identify someone as being intoxicated, and then that person would suddenly find that they can't drive their car. That would be a problem, right? Even if they aren't actually impaired in any way, they wouldn't be able to use their car because the car had mistakenly identified them as being intoxicated. That's bad. But if the system fails to detect that someone is intoxicated when they actually are, that's like having no system in there at all. Like, it's essentially the same as if you didn't have a system. So if it doesn't have a high enough level of accuracy in detecting it, you might as well not even have it. On top of that, we've seen with Tesla's Autopilot and its Full Self-Driving systems that features that are meant to take over for a human driver can sometimes be dangerous themselves. Now, I'm not saying that these systems are more dangerous than an inebriated driver, right? That would be ridiculous for me to suggest. However, they haven't been proven to actually be safe.
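To put some illustrative numbers on that reliability worry (and I want to stress these rates are assumptions for the sake of argument, not measurements of any real system): even a detector that sounds accurate produces a lot of false alarms when almost every driver it screens is sober.

```python
# Base-rate arithmetic; all four inputs are assumed for illustration.
daily_trips = 1_000_000      # trips screened per day in some region
impaired_rate = 0.01         # assume 1% of trips involve an impaired driver
false_positive_rate = 0.02   # sober driver wrongly flagged 2% of the time
false_negative_rate = 0.05   # impaired driver missed 5% of the time

impaired_trips = daily_trips * impaired_rate
sober_trips = daily_trips - impaired_trips

false_alarms = sober_trips * false_positive_rate  # sober drivers locked out
missed = impaired_trips * false_negative_rate     # impaired drivers waved through

print(f"{false_alarms:,.0f} sober drivers blocked per day")   # 19,800
print(f"{missed:,.0f} impaired drivers not caught per day")   # 500
```

Under those assumed rates, the system wrongly blocks nearly twenty thousand sober drivers a day while still letting five hundred impaired drivers through.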
Speaker 1: And so if you say, let's replace this one definitively very unsafe act with another that has yet to be proven to be safe, that doesn't seem like it's a good move to me. I feel like this is an example of people who are relying upon technology to solve a problem that the tech just isn't up to solving yet. And again, I do get that this is a very real problem that needs a solution. We need to address the problem of drivers who drive under the influence, because they stand to be an enormous danger to others, and it's awful that there's not some, you know, easy way to do this that doesn't involve severely restricting someone's freedom. I'm just worried that technology, at least as it is right now, is not the right solution, because, you know, it's not able to do what we are depending upon it to do. And if we dust off our hands and say, all right, we sorted that out, now let's go on to the next problem, what we're really doing is just setting ourselves up for tragedy in the future, when the tech falls short and people are still being people. That is, we still have some people who get inebriated and then insist on driving. And I'll also point out that there will be people who argue that this sort of thing kind of infringes on freedoms and whatnot. For the record, that's not my objection. I'm not one of those people who says this tech is infringing upon my freedom, because I don't believe in freedom extending to the point where you can put other people in danger. My argument is more along the lines of: if I drink poison, getting someone to put a band-aid on my finger is not solving the problem, right? It's not addressing the actual thing. The solution is not really a solution. And that's what I worry about when it comes to tech as a way of preventing drunk driving. I just am not convinced that it actually is doing anything helpful.
Speaker 1: Now, hopefully, by the time car companies have to, you know, work within this mandate, because it is law now, maybe that will all be sorted out, and hopefully it means that we'll see a drastic reduction of injuries and deaths due to drivers who would otherwise be operating a vehicle while under the influence of alcohol or other intoxicating effects. I just worry that it's not gonna be a real solution. All right, I'm done. I know I went on about that for way too long, but it's something that I recognize in a lot of places, where people, you know, see a problem and they just say, tech will sort that out, and they walk away without really coming up with an effective approach. The same thing, by the way, happens with climate change. I know tons of people who said we'll engineer our way out of climate change without actually proposing any actionable engineering approaches to dealing with climate change, which is the same thing as saying that's a future Jonathan problem. It never works out well for future Jonathan. Future Jonathan invariably hates present-day Jonathan and really hates past Jonathan. I can say that with certainty.

Speaker 1: All right. A while back, in fact quite a ways back, I did a series of episodes about the company General Electric, a.k.a. GE. In fact, I plan on running those episodes next week, because I'm gonna be on vacation next week, so we're gonna rerun the GE episodes. But I've got an update to the GE story, and that's that the company recently announced it is going to split into three separate companies over the near future. One is going to focus on energy, one company will focus on the healthcare industry, and the third company will focus on aviation. Now, this split, again, it's not happening immediately. It's not like, you know, they just did a reverse Voltron.
Speaker 1: Instead, GE announced that it will spin off the healthcare organization sometime in the beginning of 2023, and then the energy division will spin off sometime in early 2024. The idea is that each individual company will be more focused, with its own specific leadership, in order to direct company efforts to excel in those specific industries without having to worry about the other divisions. Current GE CEO Lawrence Culp said he will lead the aviation division, and that's the one that is going to hold on to the GE name.

Speaker 1: And finally, Nintendo has given video game console fans something to look forward to. The company announced in an earnings call its plans to release a brand new video game console. So the Switch, which is Nintendo's most recent console, launched in 2017, and Nintendo gave the Switch a little bit of a glow-up earlier this year, releasing an OLED model that had some modest tweaks, the biggest one being a larger, brighter OLED screen, but nothing revolutionary. It was an incremental step, not like a full premium upgrade. But there also had not been any word of a successor to the Nintendo Switch, so this is really big news. When can you expect the as-yet-unnamed Nintendo console? Well, it will happen sometime between today and December 31, 2099. Yep. According to the slide deck in the earnings call, we should expect a new console sometime in the year 20XX, so that means sometime in the next seventy-two years or so we should get a new Nintendo console. Now, sure, you could argue that's not really news at all, because surely Nintendo would release something between now and the end of the century. But to you I say, come on, get off my back. I wanted something silly to end this episode with, and this one won out. All right, that is the tech news for Tuesday, November 9, 2021.
Speaker 1: If you have suggestions for topics I should cover in future episodes of Tech Stuff, reach out to me. The best way to do that is with Twitter. Use the Twitter handle TechStuffHSW and that'll get in touch with me. Also, as a reminder, I am on vacation next week. We will be rerunning the GE story episodes, and on Thursday of next week, I believe, we will have a Smart Talks with IBM episode, and then I'll be back the following week, and we will be back to business as usual. And until that time, I'll talk to you again really soon, like tomorrow.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.