Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the news for Thursday, October 26, 2023, and let's start off with some news about hackers and vulnerabilities, shall we? Ars Technica has an article titled "Pro-Russia hackers target inboxes with 0-day in webmail app used by millions." The webmail app in question is Roundcube's webmail application. Many organizations, like a lot of universities, rely upon Roundcube and reskin it as a different email service, but they use that for their staff and students. It's released under the GNU General Public License; it's available for anyone to download, deploy, and distribute. But it turns out that until very recently it had a cross-site scripting vulnerability that gives hackers the chance to compromise email servers and user computers and then intercept communications sent across that machine's email. So the attack is pretty insidious. The hackers hide the attack inside an email.
The only thing that the victim has to do in order for the attack to launch is to view the email. They don't have to click on any links. This isn't like a phishing attack, or at least not your traditional kind. Just opening the email will do it, because it initiates a pretty diabolical sequence of events. The hackers create an attack encoded in JavaScript, and that attack is triggered if the target computer detects an error. So you might think, well, if there's no error detected, then this attack goes untriggered. However, the tag itself in the email contains an error, so viewing the email creates the error report, which then initiates the actual attack included within the coding of that email. And the attack is invisible to the victim. It results in the hackers gaining access, essentially, to the victim's emails. The hacker group responsible for exploiting this vulnerability is called Winter Vivern. In the past, they've mostly focused on targeting government officials in the United States, particularly government officials who showed support for Ukraine, which again points to this being a Russian-backed hacker attack.
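The article doesn't publish the exploit itself, but the pattern it describes, malicious script in an email body that fires the moment an error-handler attribute triggers, is the classic stored cross-site scripting class of bug. As a hedged sketch of the defensive side (this is my own illustrative Python, not Roundcube's actual patch, and the function names are made up), a webmail renderer can strip event-handler attributes and script tags from incoming HTML before displaying it:

```python
from html.parser import HTMLParser

# Minimal, illustrative sanitizer: strips on* event-handler attributes
# (e.g. onerror, onload) and <script> bodies from an HTML email before
# rendering it. A sketch only, not a substitute for a vetted sanitizer.
class EventHandlerStripper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.out = []
        self._in_script = False

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self._in_script = True  # drop script content entirely
            return
        # keep only attributes that are not event handlers
        kept = [(k, v) for k, v in attrs if not k.lower().startswith("on")]
        attr_text = "".join(f' {k}="{v}"' for k, v in kept if v is not None)
        self.out.append(f"<{tag}{attr_text}>")

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_script = False
            return
        self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self._in_script:
            self.out.append(data)

def sanitize(html: str) -> str:
    parser = EventHandlerStripper()
    parser.feed(html)
    return "".join(parser.out)

# A broken tag whose error handler would fire as soon as the email is
# viewed; after sanitizing, the onerror attribute is gone.
malicious = '<img src="does-not-exist.png" onerror="stealMail()">Hi!'
print(sanitize(malicious))
```

The point the code makes is the same one the story makes: no click is required, because merely rendering the tag produces the error that runs the handler, which is why the rendering layer has to remove it first.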
Fortunately, the security firm ESET detected the attacks a day after they had first started to launch and then immediately reached out to the developers over at Roundcube. So three days after the attacks began, Roundcube issued a patch. Now, the patch does require server admins to install it, and end users who are on Roundcube-based webmail should make sure that the software they're running is a patched version, or else you run the risk of becoming one of the targets.

Over on the Apple side, Ars Technica has another article. Ars Technica is just a great site, y'all. Again, I have no connection to Ars Technica; they just do great work. Anyway, they have another article titled "Hackers can force iOS and macOS browsers to divulge passwords, and much more." So, unlike the first story today, as far as we know, this particular exploit has not been used in the wild.
Instead, some security researchers discovered it on iOS and macOS devices running on hardware with Apple's A- or M-series CPUs, so it doesn't affect every Apple product, just Apple products that have an A- or M-series CPU. They've called their exploit iLeakage, in a cute little nod to Apple's naming conventions. Now, according to Ars Technica, this attack is not a simple one. It actually requires a fairly significant familiarity with, and understanding of, Apple hardware in order to pull it off. But the attack targets what is called a side channel, and you can kind of think of a side channel as looking not at the data itself, but instead at evidence that the data was there. I'm talking about stuff like electromagnetic emanations, things like that. It's like seeing evidence that something has been there and then reverse engineering it. So it is really complicated stuff.
But the researchers showed that, through a process called speculative execution (which has been at the root of a lot of exploits in the last few years) and by using a malicious website, they could execute a JavaScript application that would give them the ability to open a new window running on the target device and access stuff as if they were the user. So, for example, let's say that I use this attack and I exploited your Apple device. It would open up a window on my screen where I could, say, navigate to YouTube and look at your YouTube view history. Or I could navigate to a login page, and if you had enabled autocomplete or autofill for passwords, I could get access to your account through that. Potentially I could even do things like figure out what your password was and then maybe change it, so that I control whatever account I was snooping on. So yeah, it's a pretty serious vulnerability here.
But again, it is a very sophisticated and difficult attack, so it's not something that is, as far as we know, readily active out in the wild. Apple has said that they are aware of the vulnerability and they're working on addressing it. And because there are no known live attacks using this method, and because there's such a high learning curve as to how to use it, for the meantime Apple users don't necessarily have anything to worry about. Obviously, you always have to worry about the types of websites you visit. That's just a given; you should not just be going willy-nilly to websites you don't know, if you can manage that. But apart from that, it's not likely to be an active attack, and Apple, as I said, is aware of it and attempting to mitigate the problem.

I mentioned in an earlier news episode that hackers gained access to user data on 23andMe, the genetic testing service, and the company maintains that the hackers didn't get access on the corporate side, like they didn't get access into 23andMe's systems through some sort of breach attack.
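The company's claim, no breach of its own systems, just replayed logins from other leaks, is the classic credential-stuffing pattern. Here's a toy sketch of why password reuse is the failure point (every account, email, and password here is hypothetical, and a real service would use bcrypt or Argon2 rather than a bare SHA-256):

```python
import hashlib

def _hash(pw: str) -> str:
    # Stand-in for a real password hash; use bcrypt/argon2 in practice.
    return hashlib.sha256(pw.encode()).hexdigest()

# Credentials leaked from some unrelated Service A (hypothetical data).
leaked_from_service_a = {
    "alice@example.com": "hunter2",
    "bob@example.com": "correct horse battery staple",
}

# Accounts on the target service: alice reused her password, bob did not.
target_service_accounts = {
    "alice@example.com": _hash("hunter2"),
    "bob@example.com": _hash("unique-pw-for-this-service"),
}

def stuff_credentials(leak, target):
    """Return the accounts on the target service that the leak unlocks."""
    return [
        user for user, pw in leak.items()
        if user in target and target[user] == _hash(pw)
    ]

# Only the reused password falls to the attack.
print(stuff_credentials(leaked_from_service_a, target_service_accounts))
```

The attacker never touches the target service's infrastructure; they just replay pairs that already leaked elsewhere, which is why unique passwords per service (and a password manager to keep track of them) defeat this whole class of attack.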
Instead, they say that the hackers essentially just used usernames and passwords from other data leaks and then tried them to access 23andMe, and sometimes it worked, which again reminds us that we need to have unique passwords for every service we use, even though that's a pain in the tuchus. Anyway, I mentioned in that news item that there were rumors the hackers were targeting specific ethnic groups, particularly Jewish people. Now NBC reports that some hackers have published a database on dark web forums that lists one person shy of one million people, so 999,999 people, who share an Ashkenazi heritage, so Ashkenazi Jews. And considering historical threats of antisemitism in general, and that there's been a rise in those kinds of sentiments across the world recently, this is a pretty alarming incident. Whether the hacker or hackers actually harbor antisemitic views themselves, or they just callously wish to profit off others who do have those feelings,
I don't know the answer to that. But it is generally a very bad thing whenever anyone starts to make long lists of people who share a particular background. Also, I should add that I believe this to be true no matter what the background. I try to use critical thinking and compassion as my guiding principles in general, and as such I object to violence acted out against innocent people, innocent civilians, no matter what their ethnicity, nationality, or religion happens to be. I also know I am vastly undereducated in things like Middle Eastern culture and politics, and that of Israel and Palestine, so I know I am not qualified to make any sort of judgment about what is going on beyond the need to end attacks against civilians, whether in Israel or Gaza or beyond. That, I think, is just true as a blanket statement. But beyond that, I admit I am too ignorant to weigh in on anything more substantive. I just think it's a darn good start to stop attacks on innocent civilians.

All right, I need to preface this next story with a content warning.
I'm about to talk about how people are using generative AI to create images depicting sexual violence against children, which is obviously a deeply disturbing topic, and I wanted to give you listeners the opportunity to skip ahead or to stop listening. I think it's something we can't ignore, but I also think that your mental health is important. So if you feel the need to just nope out of this next story, no judgment here; I totally understand. The Guardian posted a story about how the Internet Watch Foundation, or IWF, in the UK found almost three thousand instances of AI-generated child abuse images, at least according to this group. The organization says that such material is poised to, quote, "overwhelm the Internet," end quote. I don't know how true that is, but if in fact it is easy to produce such content, then it does stand to reason that we're going to encounter more of it in the future.
So, according to the group, the images people are making with AI are an indication that the AI models in question must have had real-life images of abuse to use as training material, because generative AI can't make stuff out of a vacuum; it has to be trained first. And that's also really disturbing. The organization also said that people were using this technology to take pictures of clothed children and then alter those pictures to try and show what those children might look like without clothing. So it's truly abhorrent stuff. And while the threat of a flood of AI-generated abuse material is already horrifying, the IWF worries that this is going to make real-world instances of abuse harder to detect and thus make it more difficult to rescue child victims from abusive situations. The IWF specifically identified a generative AI tool called Stable Diffusion, from the company Stability AI, as the tool that these folks were using to create the images.
Stability AI had a representative who told The Guardian the company, quote, "prohibits any misuse for illegal or immoral purposes across our platforms, and our policies are clear that this includes CSAM," end quote. CSAM, by the way, stands for child sexual abuse material. It is unclear to me if this means Stability AI has actually built in guardrails and those guardrails have somehow failed, or if the company is just attempting to distance itself from the ways its customers are using its products.

Okay, we're going to now take a quick break to thank our sponsors. We'll be back with some more tech news in a moment.

We're back. So here in the United States, a collection of more than thirty attorneys general across the nation have filed a lawsuit against the company Meta, aka the former Facebook. The attorneys general argue that the company has knowingly implemented harmful features, including ones meant to addict users to staying with the product for as long as possible, and that the company has targeted young people, including children, in these efforts.
The attorneys general argue that the company's practices have led to harm by promoting material by algorithm, and that some of this material contributes to mental health issues and other problems like eating disorders. The lawsuits argue that Meta has encouraged and profited off of hurting young people and contributed to a decline in mental health. Andy Stone, whom I think of as the mouth of Meta, said, "We're disappointed that, instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path." And I imagine Meta is very disappointed. Now, y'all, I have been extremely critical of Meta, as well as other platforms, and I do believe these companies profit off of misery in lots of different ways. But I would suggest that the extent to which these companies actually cause misery is not fully known or understood. I'm not sure that we can draw firm conclusions between correlation and causation here. I've said this before:
there's sometimes a tendency to say that people encounter problems with mental health after they spend increasingly long hours on social media. However, it could be that people who already have mental health issues are more prone to being active on social media for longer hours, rather than the other way around. I don't know where the truth lies. I'm sure it's somewhere in the middle; I doubt it's as simple as one extreme or the other. But my point is, this is a really tricky issue. Clearly, we do have an obligation to protect younger generations. That is important. We should be prioritizing the health and wellness of people in general. But we also need to make sure whatever measures we take to do that actually address the issue and not just a symptom. And I am not sure that we've identified the issue. Right? I mean, the symptom is clearly there. It is undeniable that Meta's algorithms have promoted things that are not healthy to various users. I've seen it myself. I have seen, when logging into Facebook, promotions of stuff that is clearly not healthy.
And so that's obvious. But whether that's actually contributing to harm, that's harder to say. I'm not saying that it doesn't exist, but that we kind of need more information about it before we can make definitive conclusions. If it does mean that we cut back on the harmful stuff, or the stuff that scams people or promotes things that are bad, I'm all for it. I would like to see less of that anyway. I just don't know that you can justify it as "someone needs to protect the children." It's a complicated issue. And I normally keep article recommendations for the end of episodes, but in this case I actually have one that is relevant to this story, I think. It's by Caitlin Vogus of the Freedom of the Press Foundation, so clearly there is a particular perspective being brought here. The article is titled "Kids Online Safety Act Will Censor the News." This piece is all about how a particular piece of legislation could have a chilling effect on speech in general, even in ways that go beyond the intended purpose of the law. And again, I think this is a really complicated issue.
It's one that is highly charged emotionally because child safety is a factor, right, and that's obviously important. But it's also important to consider all the factors and consequences of our response to problems, because we may just exacerbate problems or create all new ones in the process. So, yeah, an important read, I think. I don't think it's the definitive answer either. I just think it's something that has to be brought into consideration, and I can't pretend that I have the answer; I certainly don't. I just think it's really important to consider different perspectives before you make any conclusion.

Now we move on to X, the service formerly known as Twitter. Just a couple of little stories to talk about today, so I'm not going to do my normal Twitter dump. First up, X is rolling out new features on the platform that support audio and video calls. Before I started recording, I checked to see if the rollout had reached me, but no dice. However, I think this particular feature may only be applicable for people who are using the X app on their phones, and I don't do that.
I uninstalled Twitter ages ago and never put it back on when it became X, so I don't have the app; I only check the service using the web browser version. So I suspect that I wouldn't have access to this one way or the other, because I think it only goes to people who have the app version. But other people report that upon opening their app they see a notification announcing the arrival of audio and video calls. These work within the direct messages system, so if you open a DM with a user, you can select a little phone icon to try and initiate a call with that person. But you can't just call anyone, which I think must be a huge relief to a lot of the folks who are still using X. The feature first requires that users turn it on with a toggle in the settings for Twitter, or X in this case, so you would be prompted to go into your settings and toggle on the support for audio and video calls.
Then you can also choose the types of users that you will engage in calls with, so that might just be people who are on your contact list, it might be people that you follow, or it might only be verified users, or some combination of the three. And I'm very relieved to hear that, because I'm sure there's no shortage of folks who still use that service who otherwise would be flooded with call requests whenever they open the app, and that would be miserable. According to The Verge, it is not yet clear if any X user will be able to access this feature once it is fully rolled out, or if this will be kept just for those who subscribe to the premium service. If I had to guess, I'd say it'll be the second category. I think Musk has been pushing really hard to create features for premium users to encourage more subscriptions. I don't know how well it's going, but I suspect that's the case. Also, and this is related to X:
Slack has sunset its integration with that platform. Once upon a time, Slack users could incorporate Twitter, and then later X, into their workspace and access features of Twitter through Slack. But due to changes in X's application programming interface, or API, Slack has chosen to end that support. The company has also announced it is retiring its status account on X, so it's closing out its own Twitter profile. It sounds like Slack and X are kind of going their separate ways.

In a previous news episode, I talked about how there's a growing concern among US lawmakers that algorithmically driven systems that set things like housing rental prices can lead to an anticompetitive situation, one in which landlords are effectively engaging in collusion, even if they're doing so unknowingly by relying on an app to help them set rental prices. Essentially, software that's intended to help landlords factor in how much they should charge for rent based upon the competitive market, when it's distributed across enough landlords in a region, transforms into a price-fixing scheme.
Again, that 324 00:21:17,920 --> 00:21:21,399 Speaker 1: might be unintentional, but effectively that's what it's become. You 325 00:21:21,440 --> 00:21:24,000 Speaker 1: get enough people using this tool, the tool starts to 326 00:21:24,600 --> 00:21:28,040 Speaker 1: end up manipulating prices across the entire region, and next thing, 327 00:21:28,080 --> 00:21:32,359 Speaker 1: you know, prices creep up, and you've got yourself an 328 00:21:32,440 --> 00:21:36,440 Speaker 1: anti-competitive situation. So now the US Department of Justice 329 00:21:36,440 --> 00:21:40,280 Speaker 1: has issued a notice of potential participation, which is a 330 00:21:40,320 --> 00:21:44,880 Speaker 1: sort of preamble to a possible prosecution against the company 331 00:21:45,040 --> 00:21:48,160 Speaker 1: that makes the software. It's called RealPage, by the way, 332 00:21:48,720 --> 00:21:52,080 Speaker 1: and the filing states, quote, the government has a particularly 333 00:21:52,119 --> 00:21:55,760 Speaker 1: substantial interest in addressing the proper application of Section one 334 00:21:56,000 --> 00:22:00,359 Speaker 1: of the Sherman Act, fifteen USC Section one, to the 335 00:22:00,480 --> 00:22:04,560 Speaker 1: use of algorithms by competitors to help set pricing. Companies' 336 00:22:04,640 --> 00:22:07,280 Speaker 1: use of algorithms in price setting, often in an effort 337 00:22:07,320 --> 00:22:11,840 Speaker 1: to increase pricing, has become more prevalent in the modern economy. 338 00:22:12,000 --> 00:22:14,520 Speaker 1: As a result, the issues involved in this case are 339 00:22:14,520 --> 00:22:18,440 Speaker 1: of increasing significance to the application of antitrust laws across 340 00:22:18,440 --> 00:22:23,080 Speaker 1: the economy. End quote.
Now, the DOJ plans to first 341 00:22:23,119 --> 00:22:26,800 Speaker 1: observe how an ongoing civil case related to this matter 342 00:22:26,960 --> 00:22:30,600 Speaker 1: plays out before it decides on making a move. So 343 00:22:30,640 --> 00:22:33,360 Speaker 1: there's no guarantee yet that the US government will get 344 00:22:33,440 --> 00:22:36,560 Speaker 1: more heavily involved in this matter, but it is a 345 00:22:36,600 --> 00:22:41,399 Speaker 1: distinct possibility. Last year, General Motors and Honda announced a 346 00:22:41,480 --> 00:22:44,640 Speaker 1: project to create an EV platform that would ultimately lead 347 00:22:44,640 --> 00:22:47,840 Speaker 1: to cheaper electric vehicles in markets like the United States, 348 00:22:48,320 --> 00:22:50,600 Speaker 1: North America in general, South America, that kind of thing. 349 00:22:51,680 --> 00:22:54,680 Speaker 1: But now that project is on the rocks. The companies 350 00:22:54,720 --> 00:22:57,960 Speaker 1: have pulled out of it and stated, quote, after extensive 351 00:22:58,000 --> 00:23:01,080 Speaker 1: studies and analysis, we have come to a mutual decision 352 00:23:01,119 --> 00:23:05,440 Speaker 1: to discontinue the program. Each company remains committed to affordability 353 00:23:05,480 --> 00:23:09,560 Speaker 1: in the EV market end quote. So it appears that 354 00:23:09,800 --> 00:23:12,840 Speaker 1: one of the major hurdles that faced the project has 355 00:23:12,920 --> 00:23:17,399 Speaker 1: been in the field of battery production. GM developed a 356 00:23:17,400 --> 00:23:22,000 Speaker 1: battery called Ultium, and this battery costs less to produce 357 00:23:22,119 --> 00:23:27,760 Speaker 1: than other older types of EV batteries. 
However, GM ran 358 00:23:27,800 --> 00:23:32,560 Speaker 1: into production bottlenecks all along the supply chain, and those 359 00:23:32,600 --> 00:23:36,199 Speaker 1: bottlenecks meant that there were delays in manufacturing, which of 360 00:23:36,240 --> 00:23:40,160 Speaker 1: course means that critically, there were delays in delivering vehicles 361 00:23:40,200 --> 00:23:44,600 Speaker 1: to customers, and that's a huge problem. Meanwhile, vehicles that 362 00:23:44,640 --> 00:23:49,160 Speaker 1: were built using older, more expensive batteries didn't have these 363 00:23:49,160 --> 00:23:52,760 Speaker 1: same production issues. They've got a production line that is mature, 364 00:23:53,359 --> 00:23:57,440 Speaker 1: so that wasn't interrupted. The cheaper one was the one 365 00:23:57,440 --> 00:23:59,720 Speaker 1: that just could not get established, and for those of 366 00:23:59,760 --> 00:24:04,119 Speaker 1: us awaiting EV models that are, say, within our price range, 367 00:24:04,760 --> 00:24:08,920 Speaker 1: this is a pretty disappointing setback. All right, we're gonna 368 00:24:08,960 --> 00:24:11,440 Speaker 1: take another quick break. When we come back, I've got 369 00:24:11,680 --> 00:24:24,440 Speaker 1: a few more news items I want to cover. We're back. 370 00:24:25,240 --> 00:24:27,359 Speaker 1: So Sergey Brin, one of the co-founders 371 00:24:27,359 --> 00:24:30,520 Speaker 1: of Google, has a big old airship that now has 372 00:24:30,560 --> 00:24:34,280 Speaker 1: the green light to go fly. Brin founded a 373 00:24:34,320 --> 00:24:39,119 Speaker 1: company a few years back named LTA Research. LTA stands 374 00:24:39,119 --> 00:24:43,000 Speaker 1: for Lighter than Air, so the goal was to build 375 00:24:43,080 --> 00:24:46,800 Speaker 1: lighter than air airships that could transport stuff like cargo 376 00:24:47,160 --> 00:24:52,800 Speaker 1: and supplies, specifically for humanitarian aid missions.
The first of 377 00:24:52,880 --> 00:24:56,919 Speaker 1: the company's airships is called the Pathfinder one, and now 378 00:24:57,200 --> 00:25:00,000 Speaker 1: it can start to have test flights. It has received 379 00:25:00,119 --> 00:25:04,800 Speaker 1: clearance from the US Federal Aviation Administration or FAA, and 380 00:25:04,840 --> 00:25:08,080 Speaker 1: the Pathfinder one is, according to the IEEE, 381 00:25:08,920 --> 00:25:12,760 Speaker 1: the I triple E, or I-E-E-E as I used to say, 382 00:25:12,840 --> 00:25:18,080 Speaker 1: the largest aircraft to fly since the Hindenburg. However, 383 00:25:18,640 --> 00:25:24,439 Speaker 1: unlike that tragic dirigible, which famously caught fire and exploded, 384 00:25:26,160 --> 00:25:31,680 Speaker 1: it does not rely upon hydrogen as a lifting gas. Instead, 385 00:25:32,480 --> 00:25:36,680 Speaker 1: it relies on the non-combustible gas helium to provide lift. 386 00:25:37,320 --> 00:25:40,800 Speaker 1: So again, the Hindenburg relied on hydrogen, which is highly 387 00:25:40,840 --> 00:25:44,240 Speaker 1: combustible and explosive. You know, you get hydrogen and oxygen 388 00:25:44,800 --> 00:25:46,879 Speaker 1: in an environment and you add a flame and you 389 00:25:46,960 --> 00:25:51,240 Speaker 1: get a big flash. Don't do it, by the way, 390 00:25:51,359 --> 00:25:57,080 Speaker 1: it's very dangerous. Helium, however, doesn't do this. And the 391 00:25:57,200 --> 00:26:01,240 Speaker 1: reason why hydrogen was used was not because people just 392 00:26:01,280 --> 00:26:04,960 Speaker 1: thought it would be okay. It's because hydrogen is a 393 00:26:05,000 --> 00:26:10,199 Speaker 1: better lifting gas. It's about eight percent more efficient to 394 00:26:10,640 --> 00:26:14,960 Speaker 1: lift a weight using hydrogen than helium, at least according 395 00:26:15,000 --> 00:26:18,080 Speaker 1: to one estimate I saw.
Hydrogen is capable of lifting more 396 00:26:18,200 --> 00:26:21,040 Speaker 1: mass than helium is, so in order to lift the 397 00:26:21,080 --> 00:26:24,440 Speaker 1: same structure with helium, you have to have much more 398 00:26:24,440 --> 00:26:30,240 Speaker 1: helium. Now, to get around this deficit in lifting capacity, 399 00:26:30,720 --> 00:26:35,439 Speaker 1: LTA Research relied heavily on using very lightweight materials like 400 00:26:35,840 --> 00:26:40,840 Speaker 1: polymer tubes reinforced with carbon fiber, and titanium in an 401 00:26:40,840 --> 00:26:45,280 Speaker 1: effort to build out a frame and an airship that was 402 00:26:45,600 --> 00:26:49,840 Speaker 1: liftable with helium. A single pilot is all that's needed 403 00:26:49,840 --> 00:26:52,520 Speaker 1: to control this airship. The company does plan to use 404 00:26:52,560 --> 00:26:56,880 Speaker 1: two pilots for their tests. The control systems have redundancy, 405 00:26:57,080 --> 00:26:59,920 Speaker 1: which is good. You always want redundant systems for safety. 406 00:27:00,720 --> 00:27:05,280 Speaker 1: The gondola attached to the airship was created by the 407 00:27:05,480 --> 00:27:09,440 Speaker 1: Zeppelin Company. That's a throwback to the old dirigible days, 408 00:27:09,520 --> 00:27:13,000 Speaker 1: right? Zeppelin. The gondola can hold up to fourteen people, 409 00:27:13,040 --> 00:27:16,600 Speaker 1: but LTA Research stresses that no passengers will actually be 410 00:27:16,640 --> 00:27:20,320 Speaker 1: allowed on the test flights, only the pilots, in order 411 00:27:20,359 --> 00:27:24,119 Speaker 1: to maintain safety and to truly test the vehicle without 412 00:27:24,160 --> 00:27:29,040 Speaker 1: putting people at risk unnecessarily. The airship measures one hundred 413 00:27:29,400 --> 00:27:34,080 Speaker 1: twenty four meters in length. That is a monster.
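That "about eight percent" figure is easy to sanity-check from standard sea-level gas densities. Here's a quick back-of-the-envelope sketch; the density values are textbook figures at zero degrees Celsius and one atmosphere, not numbers from LTA Research:

```python
# Quick sanity check on hydrogen's lifting advantage over helium.
# Densities in kg per cubic meter at 0 degrees C and 1 atmosphere.
AIR_DENSITY = 1.293
H2_DENSITY = 0.0899
HE_DENSITY = 0.1786

# Net lift per cubic meter of gas is the mass of the air displaced
# minus the mass of the lifting gas itself.
h2_lift = AIR_DENSITY - H2_DENSITY  # ~1.20 kg of lift per cubic meter
he_lift = AIR_DENSITY - HE_DENSITY  # ~1.11 kg of lift per cubic meter

advantage = h2_lift / he_lift - 1.0
print(f"Hydrogen provides about {advantage:.1%} more lift per cubic meter")
# prints: Hydrogen provides about 8.0% more lift per cubic meter
```

In other words, the counterintuitive part isn't the chemistry. A helium ship simply needs more envelope volume for the same payload, which is why Pathfinder one leans so heavily on lightweight framing materials.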
It 414 00:27:34,119 --> 00:27:38,680 Speaker 1: will initially conduct tests while remaining anchored to a point 415 00:27:38,720 --> 00:27:41,959 Speaker 1: on the ground on a mast, essentially, so they're not 416 00:27:42,000 --> 00:27:44,960 Speaker 1: going to be free flying the tests, at least not initially. 417 00:27:45,680 --> 00:27:49,400 Speaker 1: LTA Research already has an even larger airship in production, 418 00:27:49,640 --> 00:27:53,440 Speaker 1: which is called the Pathfinder three. When complete, it will 419 00:27:53,440 --> 00:27:57,199 Speaker 1: measure an incredible one hundred and eighty meters long. So 420 00:27:57,280 --> 00:28:00,879 Speaker 1: the plan is to use these airships to deliver aid 421 00:28:01,119 --> 00:28:04,640 Speaker 1: to remote locations that are difficult or impossible to access 422 00:28:05,080 --> 00:28:08,320 Speaker 1: via land or sea. So I really like that it's 423 00:28:08,359 --> 00:28:12,880 Speaker 1: not like meant to be some sort of billionaire pleasure 424 00:28:13,080 --> 00:28:18,320 Speaker 1: craft where you can just casually fly across the region 425 00:28:18,560 --> 00:28:24,240 Speaker 1: the country below you and enjoy, you know, expensive luxuries. 426 00:28:24,320 --> 00:28:26,480 Speaker 1: That's not what this is intended to do. It's intended 427 00:28:26,520 --> 00:28:31,400 Speaker 1: to get relief to places that otherwise would be very 428 00:28:31,440 --> 00:28:33,760 Speaker 1: hard to reach. I don't know if this is the 429 00:28:33,760 --> 00:28:36,240 Speaker 1: most efficient way to do that. I don't know if 430 00:28:36,280 --> 00:28:40,040 Speaker 1: it's a better alternative than other plans, which would include 431 00:28:40,040 --> 00:28:43,280 Speaker 1: things like, you know, drones or whatever. But I like 432 00:28:43,760 --> 00:28:46,800 Speaker 1: that that is the application for this technology. 
We'll have 433 00:28:46,840 --> 00:28:49,080 Speaker 1: to wait and see if it turns out to be 434 00:28:49,480 --> 00:28:53,040 Speaker 1: a practical one. It may not be, but I think 435 00:28:53,080 --> 00:28:58,200 Speaker 1: it's cool to at least have the potential for dirigibles 436 00:28:58,240 --> 00:29:02,120 Speaker 1: to take flight once again for more than just promotional purposes 437 00:29:02,160 --> 00:29:06,320 Speaker 1: at sporting events. Tech Radar reports that Microsoft is very 438 00:29:06,440 --> 00:29:11,360 Speaker 1: hurt when new Windows users launch Microsoft Edge only to 439 00:29:11,440 --> 00:29:14,360 Speaker 1: download the installer for Google Chrome, which of course is 440 00:29:14,400 --> 00:29:18,160 Speaker 1: a rival web browser to Edge, and Microsoft is so 441 00:29:18,400 --> 00:29:21,960 Speaker 1: hurt they want to know why you did it. Apparently, 442 00:29:22,520 --> 00:29:26,160 Speaker 1: using Edge to download the installer for Google Chrome now 443 00:29:26,200 --> 00:29:30,080 Speaker 1: prompts a pop up that contains a poll asking you, 444 00:29:30,400 --> 00:29:33,360 Speaker 1: why are you doing this? So the reasons you can 445 00:29:33,400 --> 00:29:38,920 Speaker 1: pick include: I can't search Google easily, I can't access 446 00:29:38,960 --> 00:29:43,200 Speaker 1: my Google documents, I don't have my favorites or passwords 447 00:29:43,200 --> 00:29:47,080 Speaker 1: here, too many ads and pop ups, I don't like 448 00:29:47,160 --> 00:29:52,080 Speaker 1: the news feed, it's too slow, my websites don't work 449 00:29:52,080 --> 00:29:56,800 Speaker 1: on Microsoft Edge, and my reason is not listed. I'm 450 00:29:56,880 --> 00:29:59,840 Speaker 1: personally a little disappointed that they don't have it's not you, 451 00:30:00,120 --> 00:30:03,440 Speaker 1: it's me, or I'm washing my hair. I think those 452 00:30:03,440 --> 00:30:07,160 Speaker 1: should have made the cut.
Kristina Terech at Tech Radar 453 00:30:07,240 --> 00:30:10,440 Speaker 1: points out that using Edge does mean that the user 454 00:30:10,480 --> 00:30:12,360 Speaker 1: often has to deal with a lot of like pop 455 00:30:12,480 --> 00:30:16,800 Speaker 1: ups and suggestions that primarily direct people to other Microsoft products. 456 00:30:17,360 --> 00:30:20,239 Speaker 1: Same thing's true of Windows. Heck, this morning, when I 457 00:30:20,400 --> 00:30:23,360 Speaker 1: was launching my computer, I got a whole bunch of 458 00:30:23,400 --> 00:30:28,560 Speaker 1: suggestions for Windows-related stuff that just popped up unprompted. 459 00:30:28,960 --> 00:30:33,160 Speaker 1: Fun times. She also points out that Edge's newsfeed doesn't 460 00:30:33,160 --> 00:30:35,720 Speaker 1: have a really good reputation. A lot of people complain 461 00:30:35,880 --> 00:30:39,120 Speaker 1: that the articles and elements included in the news feed 462 00:30:39,600 --> 00:30:46,480 Speaker 1: are lower quality. They're not properly factual or useful. But Microsoft, 463 00:30:46,720 --> 00:30:51,120 Speaker 1: you're coming across a little bit clingy, if you ask me. 464 00:30:52,000 --> 00:30:55,440 Speaker 1: Joe Rossignol over at MacRumors dot com wrote a 465 00:30:55,560 --> 00:30:59,960 Speaker 1: piece yesterday about an upcoming Apple online event which is 466 00:31:00,080 --> 00:31:04,080 Speaker 1: scheduled for this coming Monday, and the rumor is that 467 00:31:04,120 --> 00:31:06,840 Speaker 1: this event will at least in some part focus on 468 00:31:07,080 --> 00:31:12,240 Speaker 1: high end gaming on Apple devices, including on Mac computers. 469 00:31:12,680 --> 00:31:15,200 Speaker 1: So Apple, of course, has had a very long history 470 00:31:15,240 --> 00:31:18,520 Speaker 1: with gaming, sometimes in a positive way, other times in 471 00:31:18,560 --> 00:31:21,520 Speaker 1: a negative way.
Some of the earliest computer games I 472 00:31:21,560 --> 00:31:24,560 Speaker 1: ever played were on an Apple IIe computer, although 473 00:31:24,600 --> 00:31:27,760 Speaker 1: technically I think my very first computer game was Hunt 474 00:31:27,800 --> 00:31:31,720 Speaker 1: the Wumpus on the TI ninety nine four A home computer. 475 00:31:32,800 --> 00:31:36,160 Speaker 1: And then there was a time where game developers largely 476 00:31:36,400 --> 00:31:41,560 Speaker 1: ignored Apple platforms because Apple had a fairly small percentage 477 00:31:41,600 --> 00:31:44,880 Speaker 1: of the overall market share for desktop computers and it 478 00:31:44,960 --> 00:31:48,360 Speaker 1: just didn't make sense to pour the resources into creating 479 00:31:48,480 --> 00:31:51,719 Speaker 1: games for a platform that had very few people using it. 480 00:31:52,160 --> 00:31:54,520 Speaker 1: So there was a reputation for a while that Apple 481 00:31:54,560 --> 00:31:57,880 Speaker 1: computers just were not good for gaming, not because they 482 00:31:57,920 --> 00:32:02,520 Speaker 1: weren't powerful enough, but because game developers weren't making stuff 483 00:32:02,640 --> 00:32:06,520 Speaker 1: for them. But if Rossignol is right, it seems as 484 00:32:06,520 --> 00:32:10,000 Speaker 1: though Apple is now poised to really push Mac computers 485 00:32:10,040 --> 00:32:14,880 Speaker 1: as a serious gaming platform, and that the Mac computer 486 00:32:15,160 --> 00:32:18,200 Speaker 1: is the perfect machine to run triple A titles on 487 00:32:18,360 --> 00:32:21,800 Speaker 1: high settings while incorporating features like ray tracing, for example, 488 00:32:22,040 --> 00:32:26,200 Speaker 1: which deals with how a computer handles lighting in graphics. 489 00:32:26,800 --> 00:32:30,240 Speaker 1: I'm sure I will report on this next week, which will actually 490 00:32:30,320 --> 00:32:32,520 Speaker 1: be on Thursday next week.
We will not have a 491 00:32:32,600 --> 00:32:36,440 Speaker 1: news episode on Tuesday because we have another Smart Talks 492 00:32:36,480 --> 00:32:39,760 Speaker 1: with IBM episode scheduled for that day, but on Thursday 493 00:32:39,800 --> 00:32:42,320 Speaker 1: I will try and follow up with this and see 494 00:32:42,320 --> 00:32:46,080 Speaker 1: whether or not Apple really did emphasize gaming in that 495 00:32:46,360 --> 00:32:51,400 Speaker 1: presentation on this upcoming Monday. Now, I previously mentioned one 496 00:32:51,480 --> 00:32:54,680 Speaker 1: article recommendation earlier in this episode. Before I conclude, I 497 00:32:54,720 --> 00:32:58,440 Speaker 1: do have one other recommendation. This is a piece that 498 00:32:58,640 --> 00:33:03,280 Speaker 1: is on BBC dot com. It's by Victoria Woollaston and 499 00:33:03,320 --> 00:33:07,720 Speaker 1: it's titled The Surprisingly Subtle Ways Microsoft Word Has Changed 500 00:33:07,720 --> 00:33:12,200 Speaker 1: the Way We Use Language. Now, I'm always fascinated by 501 00:33:12,240 --> 00:33:15,320 Speaker 1: how we shape tech and how tech in turn shapes us. 502 00:33:16,240 --> 00:33:21,040 Speaker 1: And sometimes these changes are not really happening consciously. Like, 503 00:33:21,080 --> 00:33:24,000 Speaker 1: it's not an intentional thing. It's just something that kind 504 00:33:24,040 --> 00:33:28,560 Speaker 1: of happens through use and adoption, and it can lead 505 00:33:28,560 --> 00:33:32,240 Speaker 1: to really unanticipated consequences. Not necessarily bad ones, sometimes 506 00:33:32,280 --> 00:33:35,120 Speaker 1: just interesting ones. So I think this article is a 507 00:33:35,120 --> 00:33:39,160 Speaker 1: great example of detailing how that happened with Microsoft Word. 508 00:33:39,440 --> 00:33:42,520 Speaker 1: So check it out. And that's it for the news 509 00:33:42,560 --> 00:33:46,440 Speaker 1: for today, Thursday, October twenty sixth, twenty twenty three.
I 510 00:33:46,560 --> 00:33:49,360 Speaker 1: hope you are all well and I will talk to 511 00:33:49,400 --> 00:33:59,960 Speaker 1: you again really soon. Tech Stuff is an iHeartRadio production. 512 00:34:00,360 --> 00:34:05,360 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 513 00:34:05,480 --> 00:34:07,479 Speaker 1: or wherever you listen to your favorite shows.