Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending December sixth, twenty twenty four. And my apologies, but we have some heavy stories at both the beginning and the end of this one because of the way the news unfolded this week.

Speaker 1: One of the top stories this week was that UnitedHealthcare CEO Brian Thompson died after being attacked by a gun-wielding assailant in New York City, in what was clearly a premeditated act of violence. Now, this is not a true crime podcast, so I'm not going to dwell on the extraordinary facts about this case, the ones that have been shared with us so far. You can read up all about that if you like. Rather, what I wanted to talk about is how the reaction online to this act of violence has been pretty brutal, and not just among your typical Internet trolls. Over on Reddit, moderators had to shut down a thread in which doctors were criticizing Brian Thompson and the insurance industry as a whole for practices that prioritize profits and shareholder returns over human life. The participants in a thread in the subreddit r/medicine took UnitedHealthcare to task for allegedly denying coverage to patients, all in an effort to squeeze out more value for shareholders. It got so spicy that moderators chose to nuke the thread from orbit, because it's the only way to be sure. I've been seeing a lot of response to this killing online, and there's a very dark core of "eat the rich" fueling a lot of these reactions. And y'all, don't get me wrong, I'm very much in the "eat the rich" camp, in that I think there's a very small number of incredibly privileged people who wield far too much wealth and power in the United States and beyond, and that strikes me as wrong.
Speaker 1: However, I'm also very much in the "let's not kill anybody" camp when it comes to these sorts of things. I have deep compassion for the countless patients who found themselves lacking coverage despite having insurance. That should never happen; that's why we have insurance. Obviously, denying patients in need is wrong, and it's a problem that does need to be rectified. But I also think, at the same time, that killing someone is wrong as well. Both things can simultaneously be true. Anyway, I guess this is where the compassion part of my philosophy of compassion and critical thinking comes in. I don't think those who create and foster systems that profit off the suffering of others should get away with it. They should be held accountable and be required to participate in the fixing of those systems. However, I also think gunning someone down in cold blood isn't a reasonable solution to those kinds of problems.

Speaker 1: One other thing I want to mention about this is that there's a lot of misinformation going around regarding the attack. I have seen numerous posts quoting people, primarily Elon Musk, about this incident. In fact, I saw one post getting a lot of traction on Facebook, shared by multiple friends of mine, and in that post it appeared that Elon Musk was tweeting out, essentially, that CEOs keep the world going round and that without CEOs everything falls apart. The only thing is, I could find no evidence that he ever tweeted that message. And you know, he tweets a lot, and there was no record of him actually tweeting that out. There were photos of a tweet, but they looked weird: the font wasn't right, the kerning wasn't right on the typeface. So, in other words, at least to my eyes, it appeared to be fabricated in order to support a narrative.
Speaker 1: That narrative, apparently, is that Elon Musk is a billionaire with no consideration or awareness of what normal folks go through, and that he has an overinflated opinion of himself. Which, you know, I agree with that perception; I think it's accurate. But I don't think that means people should go around inventing quotes. The guy has plenty of real quotes that give us insight into his beliefs and thoughts. There's no need to make up new ones just to support a particular point of view. So yeah, I guess ultimately what I'm saying in bringing up this whole story is, one, try and keep compassion in mind. I understand if your capacity for compassion is stretched due to the impact the insurance industry has had on people at large, particularly the people who need it most. But also, keep using critical thinking when you're seeing memes and stuff shared about this, because often the people who are creating them have no real consideration for, you know, reality. They might be saying things that are accurate depictions but not actual quotations, and I think it's important to make that distinction.

Speaker 1: By the way, to stick with the tech theme, one story that has resurfaced in the wake of the killing of Brian Thompson is how some ultra wealthy folks have been worrying about, and preparing for, a tumultuous upheaval brought about by one of many potential things, like climate change, or a social revolution, or just war in general, or another pandemic, that kind of thing. Douglas Rushkoff wrote about this for The Guardian back in twenty twenty two, and even then he was referencing an event that had happened years earlier. The article is titled "The super-rich preppers planning to save themselves from the apocalypse."
Speaker 1: Now, in that piece, Rushkoff explains he was lured to speak at an event, only to discover the event was actually a group of billionaires who wanted to know about the practical considerations they should be making in designing, constructing, maintaining, and operating a shelter or bunker, one designed to keep their precious bodies safe while the rest of us suffer out in the hellscape that is the apocalypse. And he claims that at one point the fat cats asked him how best to keep control over their security details. After all, the people working security would be coming from the commoners, and they might share some sentiments with the struggling and the dying, dirty people outside. Rushkoff suggested that they actually extend a hand and become friends with their staff, you know, before the world falls apart, because human cooperation is necessary for survival. But the rich folks, according to Rushkoff, favored approaches like controlling the food supply, or even requiring their security detail to wear electric shock collars. And I'm not being dramatic here; you should read the article.

Speaker 1: Now, the reason I bring up this article, which is more than two years old at this point, is that new articles are popping up about very similar topics in the wake of Thompson's killing. Natasha Tiku of The Washington Post has an article titled "Fearful of crime, the tech elite transformed their homes into military bunkers," published just yesterday. In it, she talks about a Silicon Valley startup that designs security systems for the uber rich, and the company has a name that pretty much tells you everything you need to know: Sauron. I wish I were making that up. It's already insane to me that there's a tech company out there with the unironic name Palantir.
Speaker 1: And here you have companies catering to the ultra wealthy or the defense industry eagerly adopting names from J. R. R. Tolkien's works, particularly The Lord of the Rings series, and these are often names associated with the bad guys. So they're saying the quiet part out loud, y'all. Anyway, if you want to get angry at rich people who seem to think that it's better to live in a world that's falling apart, as long as they personally continue to accumulate wealth, than to live in a world where they don't accumulate quite so much wealth and the rest of the world starts to patch itself up (the attitude being, "Oh well, no, I want to have money, so why would I deny myself all that money just so the world could potentially fix some problems? Let the world sort itself out. I need mine."), well, these two articles are a pretty darn good start for your journey of decrying the rich. It's interesting, because there's definitely been a growing sentiment, in the world in general, or at least the world I'm familiar with, that has been anti-wealthy because of these sorts of issues. And when I say wealthy, I mean the ultra wealthy, the people who are benefiting from others, you know. And it sounds like folks are worried that it's coming to a head. I don't think it's coming to a head yet, but pressure does seem to be building, and I expect that's going to continue over the next few years.

Speaker 1: Another big story here in the United States involves more details around the Chinese hacking operation known as Salt Typhoon. Now, I'm pretty sure we talked about Salt Typhoon months ago, when the discovery and naming of this operation came out earlier this year, but more details have emerged recently that indicate the operation is far broader in scope than first imagined. According to the FBI, Chinese hackers have infiltrated telecommunications systems here in the United States, and we're talking the big guys, like AT&T, Verizon, and T-Mobile, although T-Mobile claims there's no evidence hackers accessed customer information on its network.
Speaker 1: Further, these hackers have accessed call information and texts, that kind of thing. The matter is severe enough that the Senate had a classified briefing on it this past Wednesday. According to Reuters, Senator Rick Scott was angry that the briefing didn't detail why the FBI didn't catch the intrusion and what could have been done to prevent it. Well, I think I'll handle this one, Senator Scott. I think it's entirely possible that the manner in which the hackers intruded upon these systems is not yet known, and if you don't know how someone got into your house, you can't really address how you would have prevented it from happening, right? So I would argue that the more germane question, at least for the immediate future, is how do we oust the hackers from these systems and then shut off access so they can no longer spy on people here in the United States. However, I admit I am no politician. Anyway, this sort of thing, state-sponsored hackers from China infiltrating telecommunications systems on a widespread basis, is one of the big reasons why several politicians voted to ban TikTok in the United States unless its Chinese parent company, ByteDance, divests itself of TikTok. And I understand that line of reasoning, but I'll point out once again that the issue of China spying on the United States is way, way bigger than TikTok, and there's a real danger that banning TikTok amounts to patching a very small hole while ignoring a huge crater. If we focus exclusively on TikTok, we're going to ignore the larger problem that continues to be an issue.

Speaker 1: Okay, we've got more news stories to get through before we move on. Let's take a quick break to thank our sponsors.

Speaker 1: We're back, and you know what? Here's another story that involves tech, the United States, and China. We left off with one of those, so let's pick up with a different one. This one revolves around rare earth metals and other raw materials.
Speaker 1: Also, just as a reminder, "rare earth metals" tends to refer to a class of metals that aren't necessarily actually rare. They might be plentiful in the Earth's crust. The problem is they don't typically appear in concentrated forms, so getting hold of a lot of them is hard. It requires a lot of work, and often work that is environmentally dangerous, as well as dangerous to the health of the people doing it; you're using a lot of caustic chemicals to leach the metals away from other material. So it's dangerous stuff, but the word "rare" is a little bit misleading. Anyway, these are materials that are very important for different aspects of the tech sector, and recently the United States instituted new restrictions on products containing US-made semiconductor chips being sold to Chinese businesses, essentially saying China is desperate to get its hands on US-designed semiconductors in order to steal the technology and repurpose it for other stuff. This has been an ongoing concern for many years now, and so one way the United States tries to address it is by banning the sale of certain materials and certain products to China, so that they can't just harvest the semiconductors and use those to create knockoffs. But in retaliation for these increased restrictions, China has announced that it is going to ban the export of several rare earth metals and other minerals and materials that are used in industries including military applications, as well as the tech sector in general. You may be aware that many of our devices rely in part on stuff that requires this kind of material, stuff like batteries, for example.
Speaker 1: So this is a case of two superpowers squaring off against each other in an attempt to deny access to technology, or to the materials used to build technology, and I suspect we're going to see more of these kinds of conflicts during the Trump administration, assuming the tariffs that Trump has promised actually come to pass. To read more about this, I recommend Ashley Belanger's article in Ars Technica. It is titled "China hits US with ban on critical minerals used in tech manufacturing."

Speaker 1: Oh hey, let's get another Tolkien reference in here. Why not? I mean, I'm leaving the show in a month, so I might as well sneak in as many as I can. I've got a Tolkien tattoo on my arm, for goodness' sakes. Actually, in this case, I'm not really sneaking anything in; the tech sector has me covered. Palmer Luckey, the guy who created the Oculus VR headset and subsequently made headlines with some very edgelord activities, also has a military defense company, because of course he does. It's called Anduril, which at least is not a reference to a villain in The Lord of the Rings books; instead, it's the name of a sword. Anyway, Anduril has announced a strategic partnership with OpenAI, you know, the company behind ChatGPT. And if you're thinking, "Well, that sounds bad," then you and I are on the same page. And if you're also thinking, "Wait a minute, I thought OpenAI was founded to be an organization dedicated to the peaceful, responsible development of artificial intelligence," well, my dear sweet summer child, I've got some bad news for you, because that version of OpenAI hasn't really been a reality for quite some time at this point. Anyway, Rob Thubron of TechSpot has a piece on this titled "OpenAI partners with Palmer Luckey's defense firm, paving the way for AI-driven military tech." And yeah, I don't really see a positive way to spin this.
Speaker 1: Apparently the initial focus of the partnership will be to develop military platforms that can identify, target, and destroy unmanned drones. But apparently they also mentioned that the same technology could be used to deal with quote unquote "legacy threats." "Legacy threats" is a euphemism; it's a nice way to obfuscate the fact that what they're talking about are aerial vehicles with an actual human crew, you know, like aircraft. And this would be a case of robots killing human beings. And yes, we have all seen this movie, and yes, we do know it turns out poorly, and yet here it is happening anyway.

Speaker 1: Early this week, Intel announced that CEO Pat Gelsinger is retiring from the company, but as Steven J. Vaughan-Nichols of Computerworld puts it, that's a line that very few in the tech sector are actually buying. Vaughan-Nichols has an article titled "Intel's CEO Pat Gelsinger retires. Riiiiight." That's actually the name of the article; the "right" has like five i's in it. Anyway, it indicates there's a certain level of skepticism about how much of this was actually Gelsinger's own decision versus a corporate move in an attempt to right the course of Intel. As Vaughan-Nichols writes, Intel's board of directors had grown impatient and allegedly gave Gelsinger the choice to either retire or face being outright canned. Vaughan-Nichols has a great treatment of what has happened at Intel since two thousand and seven, which was when Intel fumbled a deal that would have seen Intel chips inside Apple iPhones; that deal instead ended up going to Samsung at the time. Vaughan-Nichols makes the case that this was sort of the beginning of a decline for Intel, says the company has been on that downward trajectory ever since, and notes the board apparently feels that Gelsinger didn't do enough to reverse the company's fortunes.
Speaker 1: Whether anyone else can succeed in that job remains to be seen, although, again, Steven Vaughan-Nichols at the very least is not optimistic about it, and I am inclined to agree with his assessment.

Speaker 1: Hailey Welch achieved Internet celebrity with her response to a man-on-the-street interview that, well, let's just say if you don't know Hailey Welch, if you don't know who that is, I'm not going to go into it here, because this is a family show and I'm an old man and I just don't have the energy, y'all. But yeah, she gave a flippant, funny little answer to a question and then became an Internet celebrity. And apparently she wanted to leverage her newly won celebrity status by lending her brand to a meme coin cryptocurrency called $HAWK. But that meme coin experienced a dramatic decline in value, or, as Meredith Clark of The Independent put it, $HAWK quote "crashed by more than ninety percent just hours after it hit the market," end quote. And in the wake of that glorious failure, Welch is now facing accusations that she was involved in what was essentially a pump-and-dump rug pull scam. A rug pull, in case you're not familiar with the term, is a scheme in which someone puts forward an asset and invites people to invest in it. That asset could be anything; it doesn't have to be digital. The scam artist owns a stake in this asset, and the asset could effectively be worthless, but the goal is to build up excitement, to get people to buy in, which drives up the selling price of the asset. Then the scam artist sells off their stake, essentially making money off of nothing at all, and then typically they head for the hills. That's your classic rug pull: you pull the rug out from under the investors. You build up excitement around something, you get people eager to buy in out of fear of missing out, and then you cash out before the whole thing comes crashing down.
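To make those mechanics a little more concrete, here is a toy sketch of how a large dump into a thin liquidity pool craters a price. It assumes a constant-product market maker, the x * y = k design used by many decentralized exchanges where meme coins trade; the pool sizes and the sell amount are invented purely for illustration, and none of this is a claim about how the actual $HAWK pool was configured.

```python
# Toy constant-product AMM (x * y = k). All numbers are invented;
# this is NOT a model of the actual $HAWK liquidity pool.

def sell_tokens(token_reserve: float, usd_reserve: float, tokens_sold: float):
    """Sell tokens into the pool; return the new reserves and the USD received."""
    k = token_reserve * usd_reserve           # invariant before the trade
    new_token_reserve = token_reserve + tokens_sold
    new_usd_reserve = k / new_token_reserve   # preserve x * y = k
    return new_token_reserve, new_usd_reserve, usd_reserve - new_usd_reserve

# Pool starts with 1,000,000 tokens and $100,000, so the spot price is $0.10.
tokens, usd = 1_000_000.0, 100_000.0
price_before = usd / tokens

# A holder with a huge allocation dumps 4,000,000 tokens in one trade.
tokens, usd, proceeds = sell_tokens(tokens, usd, 4_000_000.0)
price_after = usd / tokens

print(f"price before: ${price_before:.4f}")    # $0.1000
print(f"price after:  ${price_after:.4f}")     # $0.0040
print(f"seller pockets: ${proceeds:,.0f}")     # $80,000
print(f"price drop: {1 - price_after / price_before:.0%}")  # 96%
```

The takeaway from the sketch is just that when one party holds most of a token's supply, a single sell can erase more than ninety percent of the quoted price in one transaction, which is part of what makes these launches so easy to abuse, whoever is doing the abusing.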
Speaker 1: Now, Welch has said her meme coin was never intended to be a rug pull; that was not her aim. That, however, has not stopped an early investor from filing a complaint against Welch with the US Securities and Exchange Commission, or SEC. Welch maintains she never tried to pull a pump and dump, and in fact says her stake in $HAWK is something she's not even allowed to sell for a year. And if that is true, then this just might be a case of snipers and such taking advantage of a meme coin launch, with everyone else left holding the bag, including Welch. Which stinks, but it's also a pretty predictable outcome, if I'm being totally honest. It's not like this is unprecedented; it is very much precedented.

Speaker 1: Waymo has announced that it is bringing its driverless cab service to Miami, and that the driverless vehicles will start rolling around the streets in twenty twenty five. In fact, the company says it will quote "begin reacquainting Waymo's all-electric Jaguar I-PACEs to Miami's streets" end quote, per a recent press release. Now, the actual driverless cab service would wait until twenty twenty six to get into business. And I'm sure there are many people in Miami who could benefit from improved transportation options. I'm thinking of, like, seniors in Miami who might not have easy ways to get around, and I am glad that they will have more options. However, I'm also worried that adding more cars to the streets isn't really the ideal solution. But what do I know? That hasn't stopped us from doing it in other markets, even though I think ultimately there are other solutions that could not only give more options to those who need transportation, but also have a smaller impact on overall traffic in cities.
Speaker 1: And, I don't know, I think adding more cars does not really address the traffic issue. In fact, and this might be ignorance speaking, it seems to me like it just exacerbates the problem. All right, we've got more headlines to get through. Before we get to those, let's take another quick break to thank our sponsors.

Speaker 1: Google has also proclaimed, through DeepMind, the AI research arm of Google, that it has developed a weather-predicting AI model called GenCast that is more accurate than existing models, the models that meteorologists typically depend upon, and that this model can better predict extreme weather events up to fifteen days out, and in less time than a traditional model would take. This is interesting, and if it is true, it could potentially lead to a rather drastic improvement in weather forecasting and disaster preparedness. Now, I'm sure it's going to take a lot more time to analyze GenCast's systems and results to make certain that what appears to be an improvement really is one, right? Because it could just be that it looks like an improvement, but maybe over the long run, when you get a larger sample size, those advantages wash out. We don't know. I hope they don't wash out, but you need a lot of testing to make sure these things are actually more accurate. Still, this is the sort of AI application I can actually get behind. If we can develop AI that is more effective than we are at analyzing data and patterns in order to do stuff like forecast the weather, I think that's great. I think if people have an accurate, or at least a pretty accurate, forecast, then when there are signs of severe weather on the horizon, they can have more time to prepare. I think that could save countless lives. But we also have to understand that weather is an extremely complex phenomenon, and even a really effective model is going to get things wrong sometimes.
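On that "larger sample size" point, here is a minimal sketch of how a forecast comparison is typically sanity-checked: score both models on the same events with the Brier score (the mean squared error of a probability forecast, where lower is better), then bootstrap the difference to see whether the apparent edge survives resampling. The data and the two "models" below are synthetic stand-ins invented for illustration; they have no connection to GenCast's actual evaluation.

```python
# Sketch: does model A's apparent edge over model B survive a bootstrap test?
# All data is synthetic; the two "models" and their skill levels are invented.
import random

random.seed(42)

def brier(forecasts, outcomes):
    """Mean squared error between predicted probabilities and 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Simulate 500 days with a 10% base rate of "extreme weather".
n = 500
truth = [1 if random.random() < 0.10 else 0 for _ in range(n)]

def noisy_forecast(outcome, skill):
    """A fake forecaster: the base rate nudged toward the truth, plus noise."""
    p = 0.10 + skill * (outcome - 0.10) + random.gauss(0, 0.05)
    return min(1.0, max(0.0, p))

model_a = [noisy_forecast(t, 0.5) for t in truth]  # slightly sharper
model_b = [noisy_forecast(t, 0.4) for t in truth]

# Bootstrap the Brier-score difference (A minus B; negative favors A).
diffs = []
for _ in range(2000):
    idx = [random.randrange(n) for _ in range(n)]
    diffs.append(brier([model_a[i] for i in idx], [truth[i] for i in idx])
                 - brier([model_b[i] for i in idx], [truth[i] for i in idx]))
diffs.sort()
lo, hi = diffs[int(0.025 * len(diffs))], diffs[int(0.975 * len(diffs))]

print(f"Brier A: {brier(model_a, truth):.4f}  Brier B: {brier(model_b, truth):.4f}")
print(f"95% CI for A minus B: [{lo:.4f}, {hi:.4f}]")
```

If that confidence interval straddles zero, the edge hasn't been established yet, and with only a handful of extreme events it usually does straddle zero. That, in miniature, is why a larger sample is needed before declaring one model more accurate, though real forecast verification uses far more sophisticated methods than this.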
Speaker 1: I know people who get mad when really dramatic, severe weather has been predicted and then it turns out to not be so bad, and they get angry about it. And I'm like, y'all are angry about the wrong things. I would be angry if I were told it was just going to be a little bit of a sprinkle outside and then a tornado tore my house down. That's when I would get mad. I don't get mad when the weather's not as bad as I was told it was going to be. I don't know, I'm probably a weirdo.

Speaker 1: Bill Nelson, the administrator of NASA, announced that the agency's Artemis program is going to experience a bit of a delay, and I think this comes as a surprise to absolutely no one. I'm pretty sure I've mentioned a few times this year that I thought NASA's earlier plan for Artemis was overly ambitious. But right now, the next Artemis mission, which is to send astronauts on a trip that will take them around the back side of the Moon and then back to Earth (not landing on the Moon, just going around it), has been pushed back to April twenty twenty six. The Moon landing mission, which would return astronauts to the lunar surface for the first time since nineteen seventy two, has now been pushed back to twenty twenty seven. Further, Trump's pick for the new head of NASA is Jared Isaacman, a filthy rich businessman who back in twenty twenty one paid SpaceX an undisclosed amount of money, though I imagine it was a princely sum, to fly aboard a SpaceX rocket in an all-civilian space tourist mission. He would later fly on a second space tourist mission and become the first space tourist to participate in an EVA. That's an extravehicular activity, or spacewalk, to you and me.

Speaker 1: And in very sad news, we learned not too long ago that Marshall Brain, the founder of HowStuffWorks.com, committed suicide in his office on the campus of North Carolina State University.
Speaker 1: Brain founded HowStuffWorks in the nineteen nineties. He used his passion for learning and communicating to build a site that aimed to explain how everything works. He would later launch a podcast called BrainStuff, which became the first of many Stuff shows, including TechStuff. This story obviously has a big personal impact on me. I literally would not have the career I'm in right now without Marshall Brain. I wouldn't be here; I wouldn't be doing this podcast. I met Marshall Brain several times, but we never actually worked together on any projects. Some of my colleagues spent much more time with him, and all of us from the Stuff era are still processing this and grieving for his family. I have no real details to share about his story and what led up to him deciding to take his own life. There are pieces in outlets like Ars Technica that shed a little more light on what was going on leading up to his death, but I don't feel like this is the right place to dive into that. I would just like to say that if you are struggling with thoughts of suicide, reach out for help. There's no shame in reaching out for help, and I, for one, would prefer a world that has you in it.

Speaker 1: That's a very heavy note to end on, but I do have a couple of additional reading suggestions for all of y'all before I head out. First up is a piece in The Verge by Elizabeth Lopatto titled "Stop using generative AI as a search engine." Lopatto details incidents in which people have used generative AI to answer questions, only to receive incorrect answers, often with invented details describing things that just never happened in the real world. Which is really similar to what I found when I used ChatGPT to quote unquote "write" an episode of TechStuff. If you haven't listened to that episode, you should, because I bring up some big concerns I have as a result of what I got when I used ChatGPT to do that.
Speaker 1: But check out Lopatto's piece for more insight into that issue. And finally, I recommend reading Tim Cushing's piece in Techdirt titled "Federal court says dismantling a phone to install firmware isn't a search, even if it was done to facilitate a search." Essentially, what the story is saying is that a court has decided law enforcement officials do not need to secure a search warrant before attempting to compromise a phone or another device, like a laptop. But once they do get access to that device, they then need to get a search warrant in order to actually search it; they wouldn't be allowed to search the device after breaching it unless they had the warrant. They just don't need a warrant to start the process of breaching the device. Moreover, they can essentially hold on to the locked or defunct or broken device for as long as they need before they're able to crack it open. So check out that article. Cushing brings up some pretty interesting arguments about the good and the bad of that decision, and he advocates for more clarity around this from the courts, which I think makes sense. We live in a world where this is a very real possibility, and the laws that guide us around illegal search and seizure were not written with these kinds of scenarios in mind, so we really do need more clarification there.

Speaker 1: That's it for this episode of the tech news on TechStuff. I hope all of you out there are doing well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.