Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Okay, hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on March 22, 2024. Last week I dedicated an entire episode to how the US House of Representatives voted in favor of a bill to force TikTok's parent company, ByteDance, to either sell TikTok off or face a national ban for the app. And I've got a couple of follow-ups on that story. To be clear, that particular bill still has to pass in the US Senate, which isn't a guarantee. In fact, it's not even a guarantee that the bill will come up for a vote in the Senate before it could be signed into law, so it's not a done deal. So first up: in that episode, I heavily criticized the US government for being far too myopic in focusing just on TikTok instead of looking at the larger picture, in which data brokers collect and sell information from a host of different services, devices, and platforms.
Speaker 1: I argued that TikTok doesn't need to have a parent company that merits the quote-unquote "foreign adversary" label in order for it to be a threat or a concern. I guess some folks in the House were thinking the same thing, because this week the House of Representatives voted to pass a bill called the Protecting Americans' Data from Foreign Adversaries Act. This bill would make it against the law for data brokers to sell American citizens' data to foreign adversaries such as China, Russia, or North Korea, or to entities that work within those countries. So it wouldn't matter where this information came from, whether it came from TikTok or Facebook or Google or Apple or any other entity that regularly trades in the zeros and ones that relate to American citizens' information. If this is passed into law, the Federal Trade Commission, or FTC, will be able to penalize data brokers that are found to have violated the law. And I think this is a modest improvement. It removes that laser focus from really a single company, that being TikTok. However, all that being said, it's still a pretty modest bump.
Speaker 1: Personally, I don't think foreign adversaries are really the only entities we should be concerned about when it comes to protecting our privacy and personal information. I think domestic companies, like Meta or Google, can be just as dangerous. That being said, The Verge reports that lawmakers behind this most recent act are hopeful that the recent groundswell of support these measures have had, at least in the House of Representatives, could build momentum toward a more comprehensive set of privacy protection laws down the road here in the United States, which, you know, I sure hope is the case, because it's long overdue. Meanwhile, over at Gizmodo, Thomas Germain has a piece titled "Politicians Who Voted to Ban TikTok May Own as Much as $126 Million in Tech Stocks." Germain points out that the stock ownership could indicate a conflict of interest with regard to the TikTok brouhaha: the companies that some of these politicians have heavily invested in are ones that could potentially benefit if something were to, you know, happen to TikTok. Sure would be a shame.
Speaker 1: Could be that these politicians are not even entirely motivated out of concern for protecting US citizens. As hard as that is to believe, they might not be acting in our best interests; instead, maybe they're hopeful that by slapping a Chinese-owned company around, they could see their own stock portfolios flourish. I don't know if that's the case, but I will say that, once again, at the end of the day it really doesn't even matter what the truth is. As long as there is a perception that someone is acting more out of their own self-interest than out of a genuine desire to be a leader and to protect their constituents, well, that's enough to undermine confidence in that person. In fact, Germain's piece in Gizmodo really explores that issue thoroughly. I recommend checking it out. It's a well-written piece, and it quotes a lot of different people who are saying, "I'm not saying that these politicians are trying to engage in some form of insider trading. What I'm saying is this isn't a good look." And unfortunately it does end up hurting confidence in those leaders.
Speaker 1: It raises questions about why they are doing what they're doing, and they need to address that. Now let us turn our attention to the US government targeting another big tech company, this one firmly rooted in America itself. I'm talking about Apple. The US Department of Justice has brought a lawsuit against Apple, alleging that the company has made a monopoly out of the Apple iPhone ecosystem, and that the company has leveraged its near-total control over that ecosystem to drive corporate valuation at the expense of pretty much anyone who wasn't directly an Apple stakeholder, including folks like customers and developers. And from a very high level, I don't think you can really argue against that. It's pretty blatantly clear that Apple has done that. They have nearly always been a buttoned-down company that practiced jealous control over all aspects of its technological ecosystem. Now, I say nearly always, because there was a period of time, when Steve Jobs had been banished from Apple, where things were a bit different, but upon his return they kind of came back to it.
Speaker 1: This particular lawsuit cites a myriad of Apple decisions that support the legal accusation of the company being anticompetitive, including the infamous Apple Messages issue. The difference between those green and blue bubbles in Apple Messages is one of many choices that the Department of Justice says exists solely to pressure customers into buying into, and then staying within, the Apple iPhone world. But that's just the tip of the iceberg. The DOJ says that Apple has also purposefully limited interoperability with technologies from other companies, like smartwatches or digital wallets, which discourages iPhone owners from branching outside the Apple world and kind of corrals them into buying and using more Apple products. Now, I've said for years that Apple is very good at building stuff that works great with other Apple stuff, but not so great at working with other technologies. The iPod is a great example. Early on, if you didn't have a Macintosh computer, there was no reason to get an iPod, because it wasn't compatible with other types of computers.
Speaker 1: And even when it was compatible, the Windows version of iTunes, say, was so bloated and clunky that it would discourage you from using it. It almost felt like Apple was intentionally trying to drive people to abandon whatever other technologies they used in favor of Apple technologies, because then everything worked really well. And so what the DOJ is saying is that the company has continued to do that, and that it's doing it specifically in order to decrease competition in the space. Now, there's a lot more to the lawsuit than what I've already said, and I am no legal expert, but on a surface level it's actually really hard for me to find much fault in the argument. Now, that does not mean it's going to stand up in court. It might not, and Apple says it vehemently disagrees with the premise of the lawsuit, disputing that this constitutes a monopoly in the first place, and the company says it is going to fight the lawsuit.
Speaker 1: Of course, the fact that Apple recently had to comply with EU regulations that deal with similar issues could mean that Apple is already on its back foot on this matter. Now, I do have another Apple-related story I wanted to mention before we move on to other news. Ars Technica's Dan Goodin has a piece titled "Unpatchable Vulnerability in Apple Chip Leaks Secret Encryption Keys," and as I'm sure you can imagine, that is not a good thing. The chips in question are Apple's M series of processors. I highly recommend reading the piece if you want all the technical details; it goes into great technical detail, which I'm not going to do for this news episode. But the challenge of fixing this problem is that the vulnerability is actually built into the architecture of the chips themselves, as in the physical layout of the chip's design. That kind of thing is not easy to fix. You know, you can't just take the chip in to get an alteration.
Speaker 1: You would either need to swap the chip out for a different one with a completely different architecture, and that would still need to be able to work with whatever device you're swapping it out of. Right? Like, a different architecture doesn't necessarily mean it's going to be compatible with the computer you're using. Or, you know, you have to wait for a new generation of devices with a brand-new chip in them that doesn't include the flaw. Or you've got to find some way to patch the issue and work around it, and likely any patch you create would really impact chip performance, because the chip would have to find a way to work that is counter to the way it had been designed to work. So there are no great solutions to this kind of a problem. And the vulnerability could mean that malicious hackers could steal cryptographic keys from Macintosh computers under the right circumstances, which is not cool. But again, you should check out the article in Ars Technica to get all the technical details.
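To give a flavor of why this class of flaw is so nasty without wading into the chip-level details (the real vulnerability lives in the processor's hardware, not in software like the sketch below): side-channel attacks in general work because the amount of work a system does can depend on secret data. Here's a minimal, purely illustrative Python sketch of that idea, contrasting an early-exit comparison, whose work count leaks where a guess first diverges from a secret, with a constant-time version that always does the same amount of work:

```python
# Illustrative sketch only: this is NOT the actual M-series flaw, just a
# classic software example of how data-dependent behavior leaks a secret.
# An attacker who can measure "work done" (loop iterations standing in
# for elapsed time) learns where the first mismatching byte is.

def leaky_compare(secret: bytes, guess: bytes) -> tuple[bool, int]:
    """Early-exit comparison: iteration count depends on the secret."""
    steps = 0
    for s, g in zip(secret, guess):
        steps += 1
        if s != g:
            return False, steps
    return len(secret) == len(guess), steps

def constant_time_compare(secret: bytes, guess: bytes) -> tuple[bool, int]:
    """Inspects every byte regardless, so work no longer depends on the data."""
    if len(secret) != len(guess):
        return False, 0
    diff = 0
    steps = 0
    for s, g in zip(secret, guess):
        steps += 1
        diff |= s ^ g
    return diff == 0, steps

secret = b"hunter2"
# The leaky version does more work the more prefix bytes match...
_, early = leaky_compare(secret, b"xxxxxxx")  # mismatch at byte 1
_, late = leaky_compare(secret, b"huntexx")   # mismatch at byte 6
print(early, late)  # 1 6
# ...while the constant-time version always does the same amount.
_, a = constant_time_compare(secret, b"xxxxxxx")
_, b = constant_time_compare(secret, b"huntexx")
print(a, b)  # 7 7
```

Defensive crypto code uses the constant-time style for exactly this reason, and part of what reportedly makes the M-series issue so hard to patch is that the hardware itself can reintroduce data-dependent behavior that careful software tried to remove.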
Speaker 1: Okay, we've got more news to go through. Before we get to that, let's take a quick break to thank our sponsors.

Speaker 1: We're back. So I've got more government-versus-tech stories to cover. Senator Elizabeth Warren is calling on the Securities and Exchange Commission, or SEC, in the US to investigate Elon Musk and Tesla. This would not be the first rodeo between the SEC and Tesla, if this investigation does in fact launch. There have been a few tussles between Musk and the SEC in the past, as Musk has shown quite a few times that apparently he sees rules as something that happens to other people. Anyway, what is the reasoning behind this current push for an investigation? Well, Warren says she is concerned there could be a quote "possible misappropriation of Tesla resources and conflicts of interest arising from Mr. Musk's dual role at Tesla and X (renamed from Twitter)" end quote.
Speaker 1: This is something that some Tesla shareholders have been arguing for some time. At the very least, they have said that Musk dividing his attention between his electric vehicle company, which has been down a pretty bumpy road for a couple of years now, and a trash fire of a social media platform isn't really in the best interest of Tesla shareholders. I mean, this was the reason why Musk named a new CEO for X, even though Musk has not really stepped away from X all that much. So Warren alleges that the board of directors over at Tesla is essentially stocked with Musk's buddies and cronies, that it's a Musk-controlled board of directors, and that this is good for Elon Musk but not good for shareholders or for the company itself. One might argue that Musk's compensation package serves as evidence that this accusation has some merit to it.
Speaker 1: Warren also cites Elon Musk himself, who has stated on X and elsewhere that his desire to gain twenty-five percent voting power at Tesla is a very real concern of his, and that if he doesn't get what he wants, he's gonna make them sorry. Not in so many words, but you know that's the gist of it. Musk has responded in his own typical way, by lobbing criticisms and accusations and insults through his X platform. So will we see the SEC go after Musk yet again? And if so, will the SEC again slap Musk with more penalties? I would say probably, but that's largely because Musk makes it really hard not to go after him. He just seems intent on escalating situations. All right, but how about some positive news related to one of Musk's endeavors? For real, I'm not being facetious here. Noland Arbaugh is the first human recipient of a Neuralink brain-computer interface system.
Speaker 1: So Neuralink is one of Elon Musk's companies, and Arbaugh is a quadriplegic who opted into the surgery to have a Neuralink interface implanted. He reports that he can now control a cursor on a computer screen just by imagining the cursor moving around, and he's actually likened it to using the Force in a Star Wars movie. It's given him the ability to play Civilization VI, which is a turn-based strategy game and one of his favorite video games, and I think that's pretty awesome. Now, Arbaugh says that the technology hasn't been flawless, it's not without its issues, but he's been given a ton more agency than he experienced in the past. And I think any technology that increases accessibility and gives folks more autonomy is something we should really be proud of, and I think it's super cool. We also have to remember that there are still concerns about Neuralink, including ethical concerns about how the company has experimented on animals in the process of developing this technology, and that perhaps it hasn't done so responsibly. So we can't ignore that.
Speaker 1: But I do think it's important that, you know, I acknowledge the stuff that is legitimately really cool, and giving folks the ability to have computer access even if they don't have command of their limbs, I think that's really awesome. Back to various US government agencies that are trying to hold tech companies to task. Next up, I want to talk about the Federal Communications Commission, or FCC, as well as the cable industry. I'm largely drawing from an article written by Jon Brodkin for Ars Technica, titled "FCC Bans Cable TV Industry's Favorite Trick for Hiding Full Cost of Service." You might remember this has been an ongoing issue. The FCC has argued that cable companies regularly hide tons of fees in monthly bills, so that the amount you have to pay is significantly higher than the amount the cable companies advertise.
Speaker 1: So the company might say (and this is just a hypothetical example) that monthly service would cost forty-nine dollars, but that's before all these odd fees start to pop up and balloon that figure to much higher levels every billing cycle. So now the FCC has mandated that cable companies include the all-in price when they advertise services. The cable company can't cite some stripped-down version only to pull the old switcheroo when it's time to actually bill customers. Instead, they have to say what the full cost is going to be, rather than saying, "Oh, sorry, that was without the broadcast TV fee included," or whatever. Cable companies are ticked off. No big surprise there. Brodkin quotes a lobby group for the industry that says, quote, "Micromanagement of advertising in today's hypercompetitive marketplace will force operators to either clutter their ads with confusing disclosures or leave pricing information out entirely," end quote. And I just want to say, all right, first of all, what do you mean by "hypercompetitive marketplace"? That is laughable in the United States.
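To put rough numbers on the all-in pricing idea: the forty-nine dollar rate is the episode's own hypothetical, and every fee name and amount below is invented purely for illustration. The rule is about making the bottom line, not the teaser rate, the advertised figure:

```python
# Hypothetical cable bill: advertised teaser rate vs. the all-in price.
# The $49 base comes from the episode's hypothetical; the fee names and
# amounts here are invented for illustration only.
advertised = 49.00
hidden_fees = {
    "Broadcast TV fee": 24.95,
    "Regional sports fee": 13.35,
    "Equipment rental fee": 9.00,
}

all_in = advertised + sum(hidden_fees.values())
print(f"Advertised: ${advertised:.2f}")
print(f"All-in:     ${all_in:.2f}")  # $96.30, nearly double the teaser rate
```

Under the FCC's mandate, it's that second number, the all-in total, that would have to appear in the ad.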
Speaker 1: There are tons of households in the US that have little to no choice in providers when it comes to things like terrestrial television service. So where is this competition you're talking about, unless you mean competing with other forms, like streaming services? And yes, cable TV is dying out, but that's largely because of the predatory practices and terrible service associated with cable companies. Anyway, these new rules won't take effect for several more months, and there are a lot more processes to follow, so you can easily imagine the cable industry is likely to contest this, and to contest the FCC's authority to even pass this in the first place. So, fun times. But I recommend the article in Ars Technica if you want to know more of the details. Alexandra Alper of Reuters reported on the United Nations adopting a global resolution that focuses on artificial intelligence and how countries must commit to protecting human rights while making sure AI doesn't, you know, go all Terminator on us. Not in those exact words, mind you.
Speaker 1: So the US is actually the country that proposed this resolution, China co-sponsored it, and those are, you know, the two leading superpowers advancing AI technology in the world today. And we've seen numerous times in the recent past that the way AI has been advanced hasn't always been in the most peaceful or non-disruptive ways. So this resolution has passed, but it's also a non-binding resolution. So you could make an argument, and I certainly do, that there is a lot of pomp and circumstance going on here, but nothing really of substance. It's kind of like leaders saying, "Yes, these are things we should definitely be concerned about," and then that's the extent of it, and everything just keeps on going the way it's been going, which is kind of frustrating, if you ask me. I guess once we're all reduced to hiding out in bunkers as the robots rampage overhead, we can scold the various world leaders for not actually doing anything substantial about safeguarding against the risks associated with AI beyond, you know, saying we probably should. It's just... ah, man, politics.
Speaker 1: I get why everyone hates it when I bring it up. It's unfortunate. I don't want to either. It's just the world we live in. So, John Ray, the guy who was brought in to help dismantle the bankrupted cryptocurrency company FTX (y'all remember that one, right?), was tasked with returning as much value as he could to investors and former customers of this cryptocurrency exchange, and he has now weighed in on Sam Bankman-Fried's current fate. So Bankman-Fried, or SBF as he was known in the crypto community, is coming up for sentencing soon. He was found guilty on multiple charges of fraud earlier, and now he faces up to a maximum of one hundred and ten years in prison, though it could be significantly fewer than that; it could be, like, five years if the judge so decides. And SBF has essentially argued for clemency. He previously wrote letters that state that his crimes resulted in essentially no losses or harm to customers and investors, et cetera. So John Ray wrote in to say, au contraire, mon frère.
He says that stakeholders 322 00:18:41,240 --> 00:18:45,520 Speaker 1: have filed claims that collectively amount to twenty three point 323 00:18:45,600 --> 00:18:53,600 Speaker 1: six quintillion dollars. That is beyond a princely sum. The 324 00:18:53,640 --> 00:18:58,080 Speaker 1: actual estimated loss is somewhere around ten billion dollars, still 325 00:18:58,320 --> 00:19:02,320 Speaker 1: an astronomical amount of money, but not in the quintillion range, 326 00:19:02,480 --> 00:19:05,199 Speaker 1: so there's some discrepancy here. And Ray says that, like, 327 00:19:05,320 --> 00:19:07,879 Speaker 1: in order to address each claim, there is a significant 328 00:19:07,920 --> 00:19:10,080 Speaker 1: amount of work that goes into it. So clearly there 329 00:19:10,200 --> 00:19:13,880 Speaker 1: is a real cost associated with SBF's crimes here. And 330 00:19:14,160 --> 00:19:17,760 Speaker 1: Ray does not mince words in his letter. He says 331 00:19:17,760 --> 00:19:19,760 Speaker 1: that while his team has worked very hard to get 332 00:19:19,760 --> 00:19:23,240 Speaker 1: as much back as possible, it did so while SBF 333 00:19:23,400 --> 00:19:27,040 Speaker 1: was casting aspersions on their efforts. And moreover, there are 334 00:19:27,119 --> 00:19:30,520 Speaker 1: plenty of cases where there's just no way to reclaim 335 00:19:30,600 --> 00:19:35,280 Speaker 1: what had been lost, including quote, the bribes to Chinese 336 00:19:35,320 --> 00:19:39,520 Speaker 1: officials or the hundreds of millions of dollars he, meaning 337 00:19:40,000 --> 00:19:44,440 Speaker 1: SBF, spent to buy access to or time with celebrities 338 00:19:44,560 --> 00:19:48,720 Speaker 1: or politicians, or investments for which he grossly overpaid having 339 00:19:48,760 --> 00:19:54,159 Speaker 1: done zero diligence. The harm was vast. The remorse is nonexistent. 340 00:19:54,640 --> 00:19:59,359 Speaker 1: End quote. Yowza. That is spilling the ding dang darn tea, y'all.
341 00:19:59,560 --> 00:20:02,320 Speaker 1: But beyond that, Ray also outlined five times he 342 00:20:02,359 --> 00:20:04,840 Speaker 1: says SBF lied in an effort to ask for a 343 00:20:04,920 --> 00:20:08,680 Speaker 1: lighter sentence. So John Ray is essentially saying, throw the 344 00:20:08,720 --> 00:20:11,920 Speaker 1: book at this guy, that the crimes that SBF committed 345 00:20:12,000 --> 00:20:14,960 Speaker 1: and his continued quest to shift blame to other people 346 00:20:15,240 --> 00:20:18,320 Speaker 1: show that he doesn't deserve a lighter sentence. So we'll 347 00:20:18,320 --> 00:20:21,120 Speaker 1: have to see what actually gets handed down. Prosecutors are 348 00:20:21,119 --> 00:20:23,879 Speaker 1: calling for a pretty hefty sentence of around like forty 349 00:20:24,000 --> 00:20:27,480 Speaker 1: or fifty years, so I'm not feeling too optimistic for 350 00:20:27,600 --> 00:20:30,399 Speaker 1: SBF's chances of getting out with a light one in 351 00:20:30,480 --> 00:20:32,960 Speaker 1: this particular case. I think the judge might be a 352 00:20:33,000 --> 00:20:36,080 Speaker 1: little more inclined to lean a bit heavier in this 353 00:20:36,280 --> 00:20:40,720 Speaker 1: particular example. Okay, I've got some more news stories to 354 00:20:40,760 --> 00:20:42,720 Speaker 1: get through. Before we get to those, let's take another 355 00:20:42,800 --> 00:20:54,760 Speaker 1: quick break. All right, now we've got a couple of 356 00:20:54,840 --> 00:20:57,720 Speaker 1: hacking stories that we should cover. So first up, some 357 00:20:57,800 --> 00:21:03,280 Speaker 1: researchers at Colorado State University uncovered vulnerabilities in electronic logging 358 00:21:03,320 --> 00:21:07,000 Speaker 1: devices, or ELDs, that are used by the trucking industry.
359 00:21:07,440 --> 00:21:11,120 Speaker 1: So the purpose of an ELD is to keep track 360 00:21:11,160 --> 00:21:14,520 Speaker 1: of stuff like the number of miles and hours driven 361 00:21:14,920 --> 00:21:18,720 Speaker 1: for a run, of vehicle performance, and stuff like that, right? 362 00:21:18,800 --> 00:21:22,359 Speaker 1: It's meant to be part of fleet maintenance and oversight for 363 00:21:22,440 --> 00:21:26,280 Speaker 1: things like drivers and individual vehicles. But the researchers found 364 00:21:26,400 --> 00:21:29,400 Speaker 1: a problem with at least one type of ELD, and there are only 365 00:21:29,440 --> 00:21:32,280 Speaker 1: a few that are used by the whole industry. These 366 00:21:32,320 --> 00:21:36,159 Speaker 1: ELDs are components that are part of larger systems, and 367 00:21:36,240 --> 00:21:39,080 Speaker 1: while there are a ton of these systems that are 368 00:21:39,080 --> 00:21:42,000 Speaker 1: in use in the trucking industry, there are only, you know, 369 00:21:42,200 --> 00:21:47,280 Speaker 1: a few ELDs that are produced. So at least one, 370 00:21:47,720 --> 00:21:52,719 Speaker 1: maybe more, of these ELDs has a severe vulnerability that 371 00:21:52,760 --> 00:21:57,200 Speaker 1: can be exploited via Bluetooth or Wi-Fi. And it's 372 00:21:57,240 --> 00:22:01,080 Speaker 1: even possible to pull off a compromising attack on one of 373 00:22:01,119 --> 00:22:04,600 Speaker 1: these ELDs if you're just driving relatively close to a 374 00:22:04,720 --> 00:22:08,760 Speaker 1: truck that's in motion that has a system with 375 00:22:08,800 --> 00:22:12,320 Speaker 1: this ELD in it. And once it's compromised, the attacker 376 00:22:12,480 --> 00:22:15,480 Speaker 1: can do stuff like send a signal that 377 00:22:15,520 --> 00:22:18,000 Speaker 1: would force a truck driver to pull off the road.
378 00:22:18,640 --> 00:22:21,440 Speaker 1: They can even infect the truck so that it in 379 00:22:21,480 --> 00:22:25,439 Speaker 1: turn will infect other trucks if they come within driving range. 380 00:22:25,840 --> 00:22:28,600 Speaker 1: And it's like turning a truck into patient zero for 381 00:22:28,640 --> 00:22:31,800 Speaker 1: a pandemic, only it's a computer virus rather than a 382 00:22:31,800 --> 00:22:35,000 Speaker 1: biological one. And obviously this vulnerability could lead to a 383 00:22:35,000 --> 00:22:39,480 Speaker 1: situation where an entire fleet could become compromised and then manipulated. 384 00:22:39,840 --> 00:22:43,240 Speaker 1: It's a huge danger not only to drivers on the road, 385 00:22:43,320 --> 00:22:46,919 Speaker 1: but also to supply chains in general. So, according to the researchers, 386 00:22:46,960 --> 00:22:51,000 Speaker 1: the manufacturer behind the particular ELD they tested, and they 387 00:22:51,040 --> 00:22:53,959 Speaker 1: didn't reveal which one it was, that company 388 00:22:54,000 --> 00:22:56,800 Speaker 1: is working on a firmware update that could address this problem. 389 00:22:57,040 --> 00:23:00,960 Speaker 1: But the researchers say other ELDs might have similar vulnerabilities, 390 00:23:00,960 --> 00:23:04,600 Speaker 1: and that all manufacturers should be investigating this so that 391 00:23:04,640 --> 00:23:08,000 Speaker 1: they can avoid, say, a series of attacks that could 392 00:23:08,080 --> 00:23:10,840 Speaker 1: have a severe impact on the shipping industry in the 393 00:23:10,920 --> 00:23:14,880 Speaker 1: United States. Next up are some research hackers who disclosed 394 00:23:15,000 --> 00:23:19,560 Speaker 1: that electronic locks from Dormakaba have a massive security flaw 395 00:23:19,680 --> 00:23:23,399 Speaker 1: in them. These locks use RFID technology.
Lots of 396 00:23:23,400 --> 00:23:27,520 Speaker 1: hotels use this particular brand for their door locks, and 397 00:23:27,600 --> 00:23:31,280 Speaker 1: the hackers showed that using an NFC-capable device, like 398 00:23:31,840 --> 00:23:36,520 Speaker 1: even an Android phone with NFC (that's near-field communication) abilities, 399 00:23:36,800 --> 00:23:40,000 Speaker 1: you can clone hotel key cards and use the cloned 400 00:23:40,080 --> 00:23:43,160 Speaker 1: card to then access a room. Like, you can make 401 00:23:43,320 --> 00:23:46,679 Speaker 1: a master key this way. All you really need is 402 00:23:46,760 --> 00:23:50,520 Speaker 1: an actual key card that's compatible with these systems 403 00:23:50,840 --> 00:23:53,879 Speaker 1: and an NFC-enabled device to send a signal to 404 00:23:54,000 --> 00:23:56,960 Speaker 1: overwrite the information on that key card. You know, these 405 00:23:57,000 --> 00:23:59,760 Speaker 1: cards are meant to be wiped and then reused, right? 406 00:24:00,119 --> 00:24:02,359 Speaker 1: So if you can do that and you overwrite the 407 00:24:02,359 --> 00:24:05,399 Speaker 1: info onto the key card, then you can create essentially 408 00:24:05,440 --> 00:24:08,760 Speaker 1: a skeleton key that'll work on these locks. That includes 409 00:24:08,800 --> 00:24:11,440 Speaker 1: being able to open deadbolts, because while you can 410 00:24:11,480 --> 00:24:14,640 Speaker 1: close a deadbolt on your hotel room door, typically 411 00:24:14,720 --> 00:24:17,719 Speaker 1: these deadbolts are also connected to electronic systems, so 412 00:24:17,760 --> 00:24:21,280 Speaker 1: that housekeeping, for example, can open the door even if 413 00:24:21,320 --> 00:24:23,280 Speaker 1: you've closed the deadbolt.
So really the only thing 414 00:24:23,359 --> 00:24:26,200 Speaker 1: you're left with that can keep your door locked would 415 00:24:26,200 --> 00:24:28,160 Speaker 1: be something like a chain, if it has one, 416 00:24:28,560 --> 00:24:30,639 Speaker 1: or something similar along those lines. You know, something that 417 00:24:30,680 --> 00:24:34,960 Speaker 1: you manually use to secure the door. And even then, 418 00:24:35,200 --> 00:24:37,639 Speaker 1: you know, it's just another level of protection. It's not 419 00:24:37,800 --> 00:24:42,479 Speaker 1: meant to keep everyone out forever. So the researchers had already 420 00:24:42,480 --> 00:24:45,520 Speaker 1: worked with Dormakaba and told them about this vulnerability, and 421 00:24:45,680 --> 00:24:48,240 Speaker 1: the company has done an impressive amount of work to 422 00:24:48,320 --> 00:24:51,040 Speaker 1: replace more than a third of the locks that were 423 00:24:51,359 --> 00:24:53,639 Speaker 1: out there in the world that are vulnerable to this. 424 00:24:54,040 --> 00:24:56,040 Speaker 1: But that means you still have around two thirds of 425 00:24:56,080 --> 00:24:58,600 Speaker 1: the locks that are still vulnerable, and it could take 426 00:24:58,960 --> 00:25:02,600 Speaker 1: years to replace most of those, because apparently it's not 427 00:25:02,720 --> 00:25:06,560 Speaker 1: just replacing the locks, it's updating the entire system, and 428 00:25:06,600 --> 00:25:10,280 Speaker 1: that includes key cards and the encoders and everything 429 00:25:10,359 --> 00:25:12,959 Speaker 1: like that. The researchers say that, as far as they 430 00:25:13,000 --> 00:25:16,440 Speaker 1: can tell, the exploit is not currently active in the wild, 431 00:25:16,680 --> 00:25:19,520 Speaker 1: so that's something at least. But yeah, if you're going 432 00:25:19,560 --> 00:25:21,320 Speaker 1: to be staying in a hotel, just, you know, keep 433 00:25:21,359 --> 00:25:22,960 Speaker 1: that in mind.
You might want to make use of 434 00:25:23,000 --> 00:25:26,000 Speaker 1: that safe in that hotel room if you can. Now, 435 00:25:26,080 --> 00:25:29,440 Speaker 1: let's end with some fun news. There's a rumor circulating 436 00:25:29,560 --> 00:25:33,159 Speaker 1: that Sony is preparing an upgraded PS five to launch 437 00:25:33,200 --> 00:25:36,119 Speaker 1: in time for this year's holiday season. So this comes 438 00:25:36,119 --> 00:25:38,600 Speaker 1: as Sony has said that the PS five is kind 439 00:25:38,640 --> 00:25:41,640 Speaker 1: of entering the latter half of its life cycle, which 440 00:25:41,680 --> 00:25:43,520 Speaker 1: is pretty wild to me. I still don't own a 441 00:25:43,520 --> 00:25:46,280 Speaker 1: PS five or an Xbox Series X for that matter. 442 00:25:46,600 --> 00:25:49,560 Speaker 1: The rumored PS five Pro should be able to deliver 443 00:25:49,920 --> 00:25:53,600 Speaker 1: upscaled graphics and provide better performance and frame rates, but 444 00:25:53,680 --> 00:25:56,320 Speaker 1: there's no confirmation on this story yet, so there's also 445 00:25:56,400 --> 00:26:00,960 Speaker 1: no firm details to give about this console's stats or 446 00:26:00,960 --> 00:26:03,679 Speaker 1: how much it's gonna cost. My guess is it'll be 447 00:26:03,800 --> 00:26:06,960 Speaker 1: fairly expensive because Sony has stated that it plans to 448 00:26:07,000 --> 00:26:09,959 Speaker 1: focus on profit for the last half of the PS 449 00:26:09,960 --> 00:26:13,200 Speaker 1: five's life cycle. So it sounds to me like Sony's 450 00:26:13,200 --> 00:26:15,879 Speaker 1: goal is not just to entice folks like me who 451 00:26:16,000 --> 00:26:18,960 Speaker 1: never adopted the current generation of consoles, but to also 452 00:26:19,040 --> 00:26:22,600 Speaker 1: convince people who already own a PS five to upgrade 453 00:26:22,640 --> 00:26:25,960 Speaker 1: to the new model now. 
I think to do that, 454 00:26:26,080 --> 00:26:30,520 Speaker 1: they're gonna have to really firmly demonstrate why this new 455 00:26:30,560 --> 00:26:34,119 Speaker 1: PS five is going to be desirable. I don't think 456 00:26:34,359 --> 00:26:37,760 Speaker 1: a boost in graphics or performance is going to be enough. 457 00:26:38,200 --> 00:26:40,959 Speaker 1: Like, just saying existing titles will run better and 458 00:26:41,000 --> 00:26:43,159 Speaker 1: look better on this, I don't think that's gonna be 459 00:26:43,240 --> 00:26:45,520 Speaker 1: quite enough to convince a lot of folks to upgrade. 460 00:26:45,600 --> 00:26:50,480 Speaker 1: They're gonna need to have a decent library of launch 461 00:26:50,520 --> 00:26:54,600 Speaker 1: title games that really demonstrate the value of the PS 462 00:26:54,640 --> 00:26:57,600 Speaker 1: five Pro, if in fact this is their plan, because 463 00:26:57,760 --> 00:27:00,000 Speaker 1: without those titles, there's not much of a sales pitch 464 00:27:00,080 --> 00:27:03,400 Speaker 1: there, unless it's for someone like me who just never 465 00:27:03,520 --> 00:27:06,119 Speaker 1: got one of the PS fives in the first place. 466 00:27:06,160 --> 00:27:08,280 Speaker 1: And if I'm going to buy one, I'm more likely 467 00:27:08,320 --> 00:27:12,160 Speaker 1: to go for, you know, the best version that's out there, 468 00:27:12,240 --> 00:27:15,679 Speaker 1: rather than buy something that's said to be at 469 00:27:15,760 --> 00:27:18,119 Speaker 1: the end of its life cycle. And Sony is not 470 00:27:18,160 --> 00:27:22,160 Speaker 1: the only company that's planning new hardware launches. Microsoft had 471 00:27:22,800 --> 00:27:26,480 Speaker 1: a leak, a document leak, that showed that a code 472 00:27:26,560 --> 00:27:30,920 Speaker 1: named device called Brooklyn is future Xbox hardware. It's 473 00:27:30,960 --> 00:27:35,080 Speaker 1: like cylindrical in shape, which is interesting.
And Nintendo reportedly 474 00:27:35,200 --> 00:27:37,240 Speaker 1: is preparing to launch a follow up to the highly 475 00:27:37,280 --> 00:27:40,639 Speaker 1: successful Switch handheld console, so we should be looking at some 476 00:27:40,800 --> 00:27:44,640 Speaker 1: new video game hardware in the not too distant future. Finally, 477 00:27:44,680 --> 00:27:47,520 Speaker 1: I have a recommended article for y'all. It's from Time 478 00:27:47,720 --> 00:27:51,920 Speaker 1: dot com. It's by Will Henshall and it's titled Nobody 479 00:27:52,040 --> 00:27:56,280 Speaker 1: Knows How to Safety-Test AI, which, you know, is 480 00:27:56,280 --> 00:27:58,600 Speaker 1: a bit of a problem. We all recognize that 481 00:27:58,640 --> 00:28:02,680 Speaker 1: AI can pose potential threats, that's clear, and those threats 482 00:28:03,160 --> 00:28:07,120 Speaker 1: range in severity from worrying about something that could cost 483 00:28:07,160 --> 00:28:11,640 Speaker 1: a company millions of dollars, or potentially replace people so 484 00:28:11,680 --> 00:28:14,159 Speaker 1: that folks are out of a job, or all the 485 00:28:14,160 --> 00:28:17,480 Speaker 1: way up to, like, AI-powered weaponized systems that could 486 00:28:17,520 --> 00:28:21,679 Speaker 1: potentially misidentify targets. I mean, there are pretty terrifying ways that 487 00:28:21,720 --> 00:28:25,840 Speaker 1: AI could potentially harm us. But it's also hard to 488 00:28:25,840 --> 00:28:27,800 Speaker 1: figure out: how do we test for that? How do 489 00:28:27,880 --> 00:28:32,520 Speaker 1: we discover how AI could potentially cause problems before we 490 00:28:32,680 --> 00:28:35,119 Speaker 1: deploy that AI in the real world, rather than just finding 491 00:28:35,160 --> 00:28:38,960 Speaker 1: out through experience? That is an issue.
So I recommend 492 00:28:39,040 --> 00:28:41,680 Speaker 1: checking out that article if you're interested in learning more 493 00:28:41,760 --> 00:28:45,440 Speaker 1: about that. That's it for this week's news. I hope 494 00:28:45,560 --> 00:28:48,560 Speaker 1: all of you out there are well, and I will 495 00:28:48,600 --> 00:28:57,959 Speaker 1: talk to you again really soon. Tech Stuff is an 496 00:28:58,000 --> 00:29:03,520 Speaker 1: iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, 497 00:29:03,680 --> 00:29:06,840 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.