Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on Friday, September thirteenth, twenty twenty four. Happy Friday the thirteenth, everybody. I'm wishing all of you the best of luck and hopefully a notable absence of a hulking, undead, hockey-mask-wearing psychopath. We start off this episode with some stories about AI, as is our wont. First up, Kylie Robinson of The Verge. Robison, perhaps? My apologies, Kylie. Kylie of The Verge has an article titled OpenAI releases o1, its first model with reasoning abilities, which sounds pretty significant, and Kylie explains that o1 is the first of several planned sophisticated AI models that should be capable of answering complex queries. While in development, this was known to the AI crowd as the Strawberry model. That's actually the first I've heard of that code name, but then again, I'm not exactly brought in to hear about all things AI. The model is said to be able to tackle multi-step problems, which is something that earlier AI models have had trouble doing, and doing this comes at a couple of costs. There's the cost of time, because it does take a little longer for this model to create an answer, although it typically is still faster than humans would be able to do it. But then there's also the financial cost. These models aren't free to use, and they come with a fairly hefty fee for access. That fee is like three to four times more expensive than using the current AI models for other purposes. Now, according to OpenAI representatives, the o1 model is less prone to hallucinations than earlier models are, but it is still not immune to them. So the challenge of creating an AI model that you know is dependable and accountable, without having to worry that it's just making stuff up? Well, that problem persists.
Speaker 1: It's also apparently pretty good at coding. It can't outperform the best of the best of human coders, at least not yet, but I feel like if OpenAI can convince companies that, you know, their AI models are equivalent to exceptional coders, maybe not the best, but really good coders, that's a pretty big sales pitch. Coders who don't have salaries, they don't have healthcare plans, they don't have stock options or retirement accounts or anything like that. You know, I know that AI ideally is meant to augment humans so that they can do their jobs better and they can offload the really tedious work to automated processes. But I still have concerns that, at least in the short term, there are going to be business leaders out there who will view AI as kind of a shortcut to downsizing staff and outsourcing work to the robots. Marie Boran of Newsweek wrote an article this week titled AI-generated junk science is flooding Google Scholar, study claims, and assuming that the study is correct, it illustrates one of the many concerns I have about generative AI. Essentially, the study found that AI-generated science papers, fake science papers in other words, are showing up in Google Scholar right next to legitimate, you know, scholarly articles within search results. So Google has been indexing pages that are made by AI bots, you know, like ChatGPT. The Harvard Kennedy School Misinformation Review published this study and identified one hundred and thirty-nine papers that are likely generated by AI, and more than half of those identified papers focused on topics that relate closely to stuff like climate change, health concerns, and other policy-relevant subject matter. And the danger here is obvious.
Speaker 1: The papers may appear to be from legitimate researchers who followed rigorous scientific practices in order to, you know, draw their conclusions, but in reality, they could just be propaganda that's pushing a specific point of view, disguised as a legitimate scientific paper. And considering a lot of people quote unquote research by just looking for passages in papers that appear to confirm what they already believe, you know, just cherry-picking, in other words, this could lead to situations in which folks are citing fraudulent papers because those papers support specious arguments. Now, y'all already know that I call out for critical thinking regularly on this show, and this trend really drives home how important critical thinking is. I am just as guilty as anyone else of grabbing onto a source that appears to confirm my biases. I actually have to remind myself to take a closer look to, one, make sure that the source is legitimate, and then make double sure that what it's actually saying is what I think it says, because that's not always the case. I have been known to misinterpret stuff; that has happened. So I have to be careful about this too, and I don't always succeed, but it's good to keep it in mind. Like, this is something to strive for: use critical thinking. And it sounds to me like this is just going to get more challenging to do as time goes on. One thing that could help is if Google developed some tools to make Google Scholar more reliable. You know, knowing that these things are going to exist, how can Google Scholar better differentiate between legitimate papers and those that were generated by AI? That might involve improving transparency regarding how Google Scholar indexes and ranks scholarly results in the first place, or giving more tools so you can filter stuff out, like making sure that the articles you get are from legitimate, peer-reviewed sources. I suppose I should also throw in this potential pitfall.
Speaker 1: If future generations of AI models train themselves on fake scientific papers that were created by earlier AI models, then the quality of content on the Internet will further decline. Reasoning AI models like the aforementioned o1 would be more likely to create incorrect solutions if the training material they used included this sort of stuff, these fraudulent scientific papers, because AI can't just magically know which information is relevant and dependable and which is just invented in order to push forward a narrative. Emma Roth of The Verge has an article that's titled Google is using AI to make fake podcasts from your notes, and it kind of makes me think of a recent episode I did on Tech Stuff. I titled it This Episode Was Written by AI, Sort Of. So in that episode, I used ChatGPT to create a tech podcast episode, not of Tech Stuff, just a tech podcast, and it was supposed to trace the history and the technology of airbags. And I read the entire generated episode out in that podcast episode of Tech Stuff. I then spent the rest of the episode fact-checking and critiquing the AI's work. Now, the most disturbing thing I encountered with this experiment was that the AI kept inventing fake experts to deliver various bits of information. Sometimes that information was wrong, and the experts didn't really exist anyway. In Roth's article, it unfolds that Google created a feature in its NotebookLM app that will take notes that you have written down. It will then generate this, you know, AI-created podcast hosted by a couple of AI bots posing as the hosts, and the podcast has hosts having a discussion about whatever the research topic is. And it uses your notes to create a conversation between these two bot hosts, and it kind of riffs off the information that you have gathered. It sounds like the AI is careful to only draw information from your notes, so the output you get should reflect the input that the bots relied upon.
Speaker 1: So in other words, if something is wrong in the episode, it would be because your notes have wrong information in them, or incomplete information. The AI wouldn't necessarily be hallucinating or confabulating, you know, drawing from some source you've never seen either. And that's a good thing. It does remind me, in the case of the AI episode that I generated, ChatGPT was actually unable to share with me what the sources were that it was pulling information from. I asked it to, and it couldn't. Instead, it gave me a list of sources that the information might have come from, but there was no guarantee that any of the information used in the episode actually came from those sources. Now, in this case with Google, the sources are you, or at least the notes that you've taken. And I think this approach is interesting. It doesn't strike me as quite as off-putting as what I experienced with ChatGPT. For one thing, to me, this feels more like a study tool. I mean, we all know that people have different learning styles, right? So I think this tool could potentially be good for someone who does take meticulous notes, but that doesn't really help them absorb the information, right? They don't really understand; they've got the notes, but it hasn't kind of sunk in. So I think this kind of approach could create a way of synthesizing and contextualizing the information that could be more impactful depending upon the subject matter and, you know, the learning style of the person involved. So as a studying tool, I think it's a pretty neat idea. Now, I also wonder if this will ultimately lead to people using this tool to create podcasts that are hosted by AI, which could be a problem. I mean, especially considering that it's going to be based on whatever notes were made to create the podcast, so you could do it to make them say whatever you wanted. Or, you know, not make them say, but they would say things drawn from your own perspective in the notes you created.
Speaker 1: Now, that could be funny if you were to create really weird notes about obviously fake stuff, not in an effort to mislead listeners, but rather as a way to entertain them. And I'm thinking about shows that are something along the lines of existing fictional podcasts out there, stuff like Welcome to Night Vale or Old Gods of Appalachia or my friend Shay's podcast Kadi Womple with the Shadow People. And yes, that last one is real. Kadi Womple with the Shadow People is a real podcast, and yes, I am plugging my friend's podcast, sort of. So if Southern gothic fantasy with a healthy dose of feminism is your kick, you should check it out. In fact, you should just check it out anyway. Give an episode a listen, because Shay is a great storyteller, and you know, maybe it's your jam, maybe it's not. But yeah, I could see this tool being used for that kind of thing. Arguably, that does bring into question the artistry, although you would still need to put in the work to create the source notes that the hosts are drawing from. So it's a gray area for me. Like, generally I tend to be pretty negative, or pretty critical at least, about generative AI. But if it comes down to something like this, where you have done an enormous amount of work to build the source material, I'm not quite as adamant that generative AI is a bad tool to use in this context. But maybe I just need to think on it more. Now back to artificial intelligence in general. So on ScienceAlert, David Nield has an article that's titled AI chatbots have a political bias that could unknowingly influence society. Now, I don't think this should come as a surprise, because bias has been a big issue in AI for decades.
Speaker 1: Some experts have argued that not all bias is bad, right? Like, you might build an AI model that is quote unquote biased to pick out instances of, say, medical images that could indicate a health hazard, like the presence of, say, a tumor, for example. But unintended bias is bad, and we've seen lots of examples of that with AI, like with facial recognition technologies and the like. Nield cites a computer scientist named David Rozado from Otago Polytechnic in New Zealand who uses various political questionnaires to test different AI chat models to see where they fall on the political spectrum based upon their responses to these questionnaires. And according to his results, the models all fell somewhere left of center on political matters, and they tended toward a more libertarian point of view rather than an authoritarian point of view. None of the models were coming out as, like, hard-left evangelists or anything like that, but the bias was present and it was significant. Rozado doesn't believe that the bias was intentional, but rather that it's sort of an emergent quality. And why is it emerging at all? Well, the best guess is that the material used to train the AI models skews left more than it does right. Not all of it, but overall, when taken as, you know, a whole, it skews more left. That there are more pieces written from a left-of-center perspective than right-of-center. This in turn imbalances the material, and that leads to a bias in the models. And it kind of makes me think about how matter and antimatter are. So when matter and antimatter come into contact with one another, they annihilate each other. So if there had been a perfect balance of matter and antimatter at the dawn of time, there would be no universe to speak of, because it would have all blowed up before it could even get started. But for some reason, there was a little bit more matter than there was antimatter, and we got the universe.
Speaker 1: So with these AI models, the training material had more left-of-center perspective material than otherwise. Which, y'all, you know I lean left so hard I walk around at a forty-five-degree angle. But I don't think having a biased perspective in the tools themselves that are meant to provide and contextualize information is a good thing, even though that bias kind of leans toward the way that I view the world. I don't think a bias is good in that respect. It needs to be as objective as it possibly can, in my opinion. So if the bias means we can't rely on the results provided, that ends up being a big problem, especially considering how gung ho everybody is on AI now. Despite these findings, we have also seen examples of generative AI engaged in recreating some really ugly biases as well. I'm thinking primarily of image-generating models that tend to be guilty of perpetuating racial stereotypes. So I guess you could see this issue as a wake-up call regarding our own prejudices and biases, on top of the issue we have with AI. Okay, we've got a ton more news to get through. Let's take a quick break to thank our sponsors. We're back, and we're not done with AI just yet. Jess Weatherbed of The Verge has a piece titled Meta fed its AI on almost everything you've posted publicly since two thousand and seven. And yeah, that article starts off with a whammy. In fact, I'm just going to quote Weatherbed at the beginning of the article, who writes, quote, Meta has acknowledged that all text and photos that adult Facebook and Instagram users have publicly published since two thousand and seven have been fed into its artificial intelligence models, end quote.
Speaker 1: Now, this is significant for many reasons, one of which is that Meta executives had previously sort of denied that this was the case when asked by Australian legislators if user data was being exploited in this way, but ultimately they did cop to the practice when lawmakers really cornered them with pretty direct questions that they couldn't just deflect. So essentially, unless users had set their content to something other than public, you know, like friends only or private or whatever, then that content was up for grabs, and Meta grabbed it for the purposes of training AI. Meta didn't go so far as to explain if there's a cutoff for when the data scraping happened. So, for example, assuming that it does go all the way back to two thousand and seven, can the bots scrape everything that was ever posted to the platform, at least publicly? And could that be the case even if the person who posted that stuff was a minor at the time, so all of those posts, including images, could be up for grabs? Now, Meta has said it does not scrape profiles of users who are under the age of eighteen. Fine, but what about users who today are adults but have been on Facebook long enough that the earliest days of their Facebook use were when they were under the age of eighteen? Did the data scraping include their data from that time? I mean, yeah, today they're adults, but when they posted those things back in, say, two thousand and seven, they weren't. Well, that question is more murky, and to be truthful, the Meta representatives didn't really have an answer for it, and that's very concerning. Now, Meta does allow users to opt out of this data scraping practice if those users happen to live in the European Union, where regional laws mandate that Meta create this option to opt out. Likewise, laws in Brazil required Meta to cut out the data scraping for AI there as well. But everywhere else in the world, it's fair game, baby.
Speaker 1: If it ain't expressly against the law, Meta is doing it, which might be food for thought for all the other countries out there, at least the ones with any interest at all in protecting citizen data from massive corporations. Meta is also in the news here in the United States, as Republican Congressman Tim Walberg has some pretty harsh words for the company after it responded to concerns over how it has hosted ads for illegal drugs, which I talked about in an earlier Tech Stuff episode. So to kind of summarize, legislators had sent an inquiry to Meta following reports from the Tech Transparency Project as well as the Wall Street Journal that detailed how advertisements for illegal drugs were appearing on Facebook and Instagram, both prescription drugs and recreational drugs. Like, log on and suddenly there's an ad for cocaine on your Facebook feed. And this means that Meta was not only providing a platform that these illegal ads got to use, but Meta itself was profiting from these illegal advertisements. Now, Meta didn't create the ads, they're just hosting them, but they are profiting from them, because the advertisers have to pay Meta to have that ad space, right? So the lawmakers sent more than a dozen questions to Meta to really get down to how big an issue this is, how prevalent the problem is, and what the heck Meta is doing about it. And Meta essentially responded by saying, and to be clear, I am paraphrasing like crazy here, but they said essentially, like, yeah, you know, that's crazy. We agree that's crazy. But Meta is all about doing its part to fight illegal activity. This is a big issue beyond any one platform. This is major. This isn't just us; this is a big problem. Now, Walberg was not buying this and called the response unacceptable. I agree with him.
Speaker 1: He went on to say, quote, Meta's response not only ignores most of the questions posed in our letter, but also refuses to acknowledge that these illicit drug ads were approved and monetized by Meta and allowed to run on their platforms, end quote. The director of the Tech Transparency Project, Katie Paul, also accused Meta of deliberately sidestepping questions of accountability in an effort to deflect and to claim that this is just a bigger issue, much bigger than Meta and its platforms. And CEO Mark Zuckerberg recently said in a podcast interview that he thinks in the past he has made the mistake of accepting responsibility for stuff that, subsequently, he believes his company wasn't actually guilty of doing. Like he said, we've got to stop saying we're sorry for stuff that isn't our fault. Essentially, that's how it came across to me, anyway. But this particular subject seems pretty cut and dried to me, because either Meta was selling ads to clients who ultimately made those ads about illegal drugs, or Meta did not do that. So either Meta profited off of illegal advertisements or it didn't. There's no gray area here, you know. And sure, Meta could argue that the scale of its business is such that it can't police every ad or ensure that an ad that was sold actually ends up being for whatever it was sold to be. But shouldn't they? Because advertising is their business. That's what Meta does. It's where the vast majority of the company's revenue comes from. So it seems to me that the company absolutely should prioritize ensuring that its core business is legal. Maybe I'm being unreasonable here. I don't think so, but maybe. David Shepardson of Reuters has a piece titled TikTok faces crucial court hearing that could decide fate in the US.
Speaker 1: So you might remember that lawmakers here in the United States decided that TikTok would have to divorce itself from its parent company, ByteDance, which is headquartered in China, if it is to be allowed to continue to operate in the United States. And subsequently, TikTok has argued that such a separation is technologically impossible. Now, that's a claim I personally find hard to believe, though I do think any separation would require huge changes to how TikTok operates. It also said it's legally impossible and financially impossible. Well, next week, the US Court of Appeals for the District of Columbia will hear arguments about that legal side from TikTok's legal team, and they're saying that this law is unconstitutional, that it violates the First Amendment, which, you know, protects the freedom of speech. Well, no matter what the outcome is of this particular case, I think chances are pretty good it's going to get pushed upward to the Supreme Court. Both the US Department of Justice and TikTok's lawyers have asked the Court of Appeals to render a decision by no later than December sixth, and that will be just a little over a month before the nationwide ban is to go into effect, which is on January nineteenth. That's assuming that, you know, there's no challenge to this, and it provides very little time for the Supreme Court to get involved. As for what decision might come once it gets to the Supreme Court, that beats me, because, I mean, Donald Trump appointed three Supreme Court justices during his presidency, and at that time, Trump was actively trying to ban TikTok by executive order. That was a big deal. However, since then, Trump has flip-flopped about TikTok. Whether that has anything to do with a billionaire TikTok investor who made significant campaign donations to the GOP, I can't say. But certainly Trump's own explanation of, well, you know, kids really like it, doesn't seem to be a compelling reason for his one-eighty-degree turn on TikTok.
Speaker 1: Anyway, I have no clue if the current Supreme Court would side more with what past Trump said back when he was president or what he says now, when he's trying to be president again. But that decision will happen after the election, so maybe that could have an impact. I have no way of knowing. Yesterday, the Food and Drug Administration here in the United States issued a press release announcing that Apple AirPods Pro headphones have qualified to be labeled as over-the-counter hearing aid software devices, and that is a first. It's the first over-the-counter hearing aid device, or software device I guess, that has ever received that designation here in the United States. Previous hearing aids have not been over the counter. You had to go through a pretty lengthy sequence of visits with various doctors before you could get a medical device to aid your hearing. With Apple, users can customize the performance of their AirPods to meet their hearing needs, assuming that they have mild to moderate hearing impairment. Beyond that, they would still need to go through the medical pathway. But folks who have experienced mild to moderate hearing loss can order directly through Apple without first going through the whole medical system, so they don't have to seek out an examination from their doctor and then get referred to audiologists and all that kind of stuff. For people who have insufficient time or health coverage, this is a huge deal. The FDA's designation lends credibility to this technology. There's no shortage of tech out there that claims to be helpful in various ways in the medical field, but if the products lack the FDA designation, then there's no authority out there saying, yes, this stuff works for that intended purpose. Now, I am not an Apple user, but I do think this is a great day for folks who have mild to moderate hearing loss, and it gives them a lot more options.
Speaker 1: Maybe we'll get an Android-compatible candidate that also meets FDA requirements to receive this sort of designation. I would find that pretty helpful. I mean, I went to way too many loud music shows when I was in college, and I'm certainly paying for that now. This week, Sony announced that starting on November seventh, you can order yourself a brand new PS5 Pro. This model has more oomph than the previous PS5 consoles. So earlier, gamers had to make a choice. They could play games with the highest visual settings enabled, but they would do so while taking a hit on stuff like frame rate, so the performance of the game would take a bit of a hit, but it would look gorgeous. Or they could optimize for performance, which means the graphics wouldn't look as pretty, but the game would run much more smoothly. So this new PS5 Pro model is meant to eliminate that problem by providing enough power to run games at their higher visual settings without impacting the performance. And it would only set you back seven hundred US dollars, obviously priced differently in different regions. On top of that, however, this particular Pro model does not have a disc drive. It's digital only. So if you wanted a console that could also play discs, well, then you would need to buy an external drive for the PS5 and connect it to the PS5 Pro to get that capability. I think that's kind of rough for folks who depend upon a game console to be a multitasker. Now, a lot of game publishers have ditched physical media in favor of digital downloads, so for a lot of games, there is no physical disc to buy anyway. The only way to get the game is to download it digitally.
Speaker 1: But there's still a lot of us out there who are either still collecting physical media, like movies and TV shows on disc, or we have recently gone back to physical media after we got tired of streaming services dropping the films and TV shows we love from their respective libraries. I fall in that camp. For a while, I was digital only, but eventually I did get fed up with constantly having to play leapfrog to figure out which service has the movie that I wanted to watch on it. Now? Forget it. I'll just buy a copy of the movie so that way I always have it if I want to watch it. Well, consoles have obviously not served just as game centers; they've also served as physical media players, so ditching the drive is tough on those of us who want both. Now, I've heard that sales of PS5 external disc drives have spiked in the wake of this announcement, and also a lot of analysts have interpreted Sony's move to indicate that future consoles will likewise leave off the disc drive; it just won't be part of the system. Analysts are also cheesed off that the seven-hundred-dollar price tag is pretty hefty, considering you do not get a disc drive in that model. Now, you could argue, yes, the microchips are more advanced, they're more powerful, but it's still hard to feel like you're not paying more for diminishing returns, particularly if you're someone who can't really see the difference in the various graphics settings to begin with. Like me. I have trouble seeing much of a difference between the highest settings and the ones that allow you to play with little impact to gameplay. Now, I don't doubt that there is a difference. I'm sure there is. But my television isn't large enough, and I don't sit close enough to it, to be able to pick out those differences. So I suppose one argument supporting the PS5 Pro is that in the future there will be PS5 titles that will require that horsepower to run well.
Speaker 1: But then, assuming that there will be future PS generations, we're likely at the halfway point for the current console's life cycle, so there's a lot to balance out when making a decision as to whether you're going to buy one or not. Okay, I've got a few more news stories to get through. Let's take another quick break to thank our sponsors. So Microsoft has held another round of layoffs for its Xbox division. Reportedly, some six hundred and fifty employees are going to lose their jobs as part of these layoffs. That sucks. Sorry for anyone out there affected by that. That stinks. This is according to reporting from Matthew Schomer of Game Rant. He has an article titled Xbox has reportedly been told to go dark today. So by go dark, what Schomer means is that allegedly the Xbox division has been directed to say nothing on social media, and this is an effort to sidestep the reaction to this layoff decision. So, according to Microsoft Gaming CEO Phil Spencer, the layoffs are a continuation of the restructuring that has had to happen in the wake of Microsoft acquiring Activision Blizzard. You might recall that particular acquisition was a very lengthy process. It took way longer than what was anticipated, and it was not guaranteed to work out, because there were various regulators around the world who were raising concerns that the acquisition would lead to a decline in competition in various gaming markets, most notably in the cloud gaming market. You might also remember that Microsoft has already held rounds of layoffs that the company claimed to be connected to this restructuring in the wake of the acquisition. When Microsoft did this back in May, the company became the target for a lot of online criticism. Tom Warren of The Verge posted that Microsoft has directed employees to avoid posting on social media in order to try and prevent a similar online backlash situation this week. I'm not sure that's really going to work out for them.
Speaker 1: The gaming industry as a whole has been hit with a lot of layoffs in the last year and a half, and it concerns me, as I know there are thousands of talented people who are following their passion for video games and, you know, making a career out of that passion, and they have subsequently found themselves out of work, which again stinks. I really hope anyone affected by this lands on their feet very quickly. Boeing continues to get hit by bad news. Union workers who are machinists at Boeing have voted to authorize a strike after more than ninety-four percent of union members rejected a proposed contract agreement, which would have seen a twenty-five percent pay raise over the course of four years. Interestingly, the union leaders who were at the negotiating table with Boeing had prompted members to agree to this, but the union as a whole disagreed with the team that negotiated this agreement and said, no, this is not good enough. The strike effectively began this morning, one minute after midnight, and thirty-three thousand machinists are represented in this union, which means all work on things like Boeing aircraft has come to a halt. Now, a twenty-five percent raise ain't nothing, right? So you might be saying, what are the workers expecting? Well, they had been asking for a much more aggressive raise schedule. They wanted a forty percent increase in raises over the course of three years. So they wanted more money, and they wanted it on a shorter timeline. They say that the twenty-five percent is not enough to compensate for how employees have been made to make concessions regarding compensation and pensions since two thousand and eight. So they say that, you know, the previous sixteen years went with no raises at all, and that a twenty-five percent increase would not put them on equal footing with where they would be had they been getting year-over-year raises the way you would typically expect.
So what they're saying 562 00:34:05,040 --> 00:34:07,320 Speaker 1: is, this isn't good enough. It doesn't bring us to 563 00:34:07,320 --> 00:34:10,759 Speaker 1: where we should be, and it doesn't address the other 564 00:34:10,880 --> 00:34:15,280 Speaker 1: issues that we have. So we're going on strike. A pretty 565 00:34:15,400 --> 00:34:18,319 Speaker 1: rough situation. One last story, and then we're going to get 566 00:34:18,360 --> 00:34:22,800 Speaker 1: to some reading recommendations. Jared Isaacman and Sarah Gillis became 567 00:34:22,840 --> 00:34:26,640 Speaker 1: the first two private citizens to go on an EVA, 568 00:34:26,880 --> 00:34:32,440 Speaker 1: or extravehicular activity, in space. This is 569 00:34:32,480 --> 00:34:35,319 Speaker 1: also known as a spacewalk, and they did this 570 00:34:35,400 --> 00:34:39,200 Speaker 1: as part of Polaris Dawn, which is a SpaceX mission 571 00:34:39,280 --> 00:34:42,280 Speaker 1: that carried the two private citizens up to space along 572 00:34:42,320 --> 00:34:45,000 Speaker 1: with two other crew members, so four in total. The 573 00:34:45,280 --> 00:34:48,920 Speaker 1: pair each spent about eight minutes out there in space 574 00:34:49,080 --> 00:34:52,280 Speaker 1: in their space suits. They were not fully outside the capsule, 575 00:34:52,400 --> 00:34:57,080 Speaker 1: so they weren't, like, walking around or drifting around the capsule. 576 00:34:57,360 --> 00:35:00,760 Speaker 1: Their legs were still inside the capsule, so their upper 577 00:35:00,760 --> 00:35:03,240 Speaker 1: half was kind of poking out. They did try different 578 00:35:03,239 --> 00:35:08,759 Speaker 1: experiments to use tools and test their spacesuits' maneuverability, and 579 00:35:09,239 --> 00:35:12,800 Speaker 1: how useful the suits would be in the event of actually 580 00:35:13,040 --> 00:35:15,920 Speaker 1: doing a spacewalk where you're trying to perform some sort 581 00:35:15,960 --> 00:35:19,400 Speaker 1: of engineering task. Everyone obviously had to wear space suits, 582 00:35:19,400 --> 00:35:23,040 Speaker 1: because there's no airlock in this SpaceX Dragon capsule. The 583 00:35:23,080 --> 00:35:26,880 Speaker 1: whole cabin had to be depressurized to allow for this exercise. 584 00:35:27,239 --> 00:35:30,640 Speaker 1: But the exercise was a success, and it's a huge 585 00:35:30,640 --> 00:35:33,239 Speaker 1: achievement for all the people at SpaceX who have been 586 00:35:33,280 --> 00:35:36,200 Speaker 1: working for years to get to this point. I know 587 00:35:36,239 --> 00:35:40,040 Speaker 1: I get really critical of Elon Musk and his various antics, 588 00:35:40,239 --> 00:35:42,680 Speaker 1: as well as the companies he oversees, but there is 589 00:35:42,719 --> 00:35:45,759 Speaker 1: no denying that the folks at SpaceX have hit some 590 00:35:45,960 --> 00:35:51,800 Speaker 1: pretty impressive milestones. Those milestones were incredibly ambitious, and they were achieved. Hopefully, 591 00:35:52,040 --> 00:35:56,839 Speaker 1: one day people who aren't billionaires or SpaceX engineers will 592 00:35:56,840 --> 00:36:00,239 Speaker 1: get a chance to have a similar experience. Right now, 593 00:36:00,560 --> 00:36:05,360 Speaker 1: it remains well out of reach of, let's say, typical people. 594 00:36:05,440 --> 00:36:09,240 Speaker 1: I almost said ordinary, but that's making a judgment against 595 00:36:09,320 --> 00:36:12,680 Speaker 1: SpaceX engineers. I don't feel bad judging billionaires.
They can 596 00:36:12,719 --> 00:36:16,760 Speaker 1: afford it. I'll judge billionaires all day long and they'll 597 00:36:16,800 --> 00:36:21,160 Speaker 1: be just fine. Okay, now we're at recommended reading time. 598 00:36:21,200 --> 00:36:23,480 Speaker 1: I've actually got four articles I want to mention. There 599 00:36:23,520 --> 00:36:26,520 Speaker 1: was so much going on this week. Three of 600 00:36:26,560 --> 00:36:29,080 Speaker 1: the articles I have to mention are all from, and here's 601 00:36:29,080 --> 00:36:31,920 Speaker 1: no surprise, Ars Technica. Again, I have no connection to 602 00:36:31,960 --> 00:36:34,680 Speaker 1: Ars Technica. I'm just a fan. So first up, we've 603 00:36:34,680 --> 00:36:39,000 Speaker 1: got Kevin Purdy's Ars Technica article. It's titled The music 604 00:36:39,160 --> 00:36:43,640 Speaker 1: industry's nineteen nineties hard drives, like all HDDs, are dying. So 605 00:36:43,680 --> 00:36:47,440 Speaker 1: this piece details how a data storage company has discovered 606 00:36:47,480 --> 00:36:50,520 Speaker 1: that around twenty percent of the hard disk drives that 607 00:36:50,560 --> 00:36:54,600 Speaker 1: were sent to them by media companies are ultimately unreadable. 608 00:36:54,800 --> 00:36:57,439 Speaker 1: And that really makes it clear that porting data over 609 00:36:57,560 --> 00:37:00,440 Speaker 1: to other storage methods needs to be a priority for 610 00:37:00,480 --> 00:37:03,719 Speaker 1: anyone who's still relying on legacy hard disk drives from 611 00:37:04,400 --> 00:37:07,680 Speaker 1: decades earlier. As the equipment fails, it becomes harder and 612 00:37:07,760 --> 00:37:11,399 Speaker 1: sometimes impossible to retrieve the data that's been stored on them, 613 00:37:11,600 --> 00:37:14,880 Speaker 1: and so we run the risk of irretrievably losing some 614 00:37:14,960 --> 00:37:18,440 Speaker 1: of that information, which could include things like master tracks 615 00:37:18,640 --> 00:37:20,840 Speaker 1: for some of the most popular songs of the past. 616 00:37:21,360 --> 00:37:23,520 Speaker 1: This is actually reminding me that I should probably get 617 00:37:23,560 --> 00:37:26,560 Speaker 1: some cloud storage solutions for some media files I currently 618 00:37:26,640 --> 00:37:30,160 Speaker 1: have stored on an external HDD, but that's a me problem. 619 00:37:30,640 --> 00:37:34,160 Speaker 1: Next up, there's a piece by Rebecca Valentine of IGN 620 00:37:34,400 --> 00:37:37,560 Speaker 1: about how the entire gaming staff of a video game 621 00:37:37,600 --> 00:37:41,719 Speaker 1: development studio has recently resigned. That studio is Annapurna, 622 00:37:41,800 --> 00:37:45,840 Speaker 1: and the reasons behind the mass resignation are pretty interesting. 623 00:37:45,880 --> 00:37:50,000 Speaker 1: The article is titled Annapurna's entire gaming team 624 00:37:50,360 --> 00:37:53,919 Speaker 1: has resigned, so go check that out. Eric Berger, back 625 00:37:53,920 --> 00:37:57,560 Speaker 1: at Ars Technica, has an article titled The future of 626 00:37:57,640 --> 00:38:03,879 Speaker 1: Boeing's crewed, as in c-r-e-w-e-d, spaceflight program is muddy after 627 00:38:03,960 --> 00:38:08,160 Speaker 1: Starliner's return. It follows up on the tale of 628 00:38:08,239 --> 00:38:12,960 Speaker 1: the beleaguered Starliner spacecraft, which experienced malfunctions 629 00:38:13,000 --> 00:38:16,440 Speaker 1: as it was nearing the International Space Station.
It ultimately 630 00:38:16,480 --> 00:38:19,760 Speaker 1: returned to Earth safely, but without its human crew 631 00:38:19,880 --> 00:38:23,319 Speaker 1: aboard. They remain on the ISS for the time being, 632 00:38:23,719 --> 00:38:27,239 Speaker 1: so check that out. It kind of brings into question, 633 00:38:27,440 --> 00:38:31,160 Speaker 1: where does Boeing go from here? How does NASA handle this? 634 00:38:31,760 --> 00:38:36,160 Speaker 1: Will the two organizations be able to move forward, or 635 00:38:36,960 --> 00:38:41,200 Speaker 1: is the program really in limbo now? Finally, there's Jennifer Ouellette's 636 00:38:41,560 --> 00:38:45,120 Speaker 1: article in Ars Technica. It's titled Meet the winners of 637 00:38:45,200 --> 00:38:49,000 Speaker 1: the twenty twenty four Ig Nobel Prizes. Now, I did 638 00:38:49,000 --> 00:38:52,200 Speaker 1: a Tech Stuff episode about the Ig Nobel Prizes a 639 00:38:52,320 --> 00:38:55,600 Speaker 1: while back. If you're not familiar with the Ig Nobels, these 640 00:38:55,640 --> 00:39:00,600 Speaker 1: prizes celebrate weird and unexpected achievements in various fields, usually 641 00:39:01,120 --> 00:39:04,560 Speaker 1: in science and technology, but also other areas as well, 642 00:39:04,800 --> 00:39:07,719 Speaker 1: and the general philosophy of the prizes is that they 643 00:39:07,760 --> 00:39:11,520 Speaker 1: go to projects that first make you laugh, then they 644 00:39:11,560 --> 00:39:14,480 Speaker 1: make you think. So check that out as well. Maybe 645 00:39:14,480 --> 00:39:16,960 Speaker 1: I'll do a follow up to my Ig Nobels 646 00:39:17,160 --> 00:39:19,600 Speaker 1: episode to just talk about some of the stuff that won. 647 00:39:20,000 --> 00:39:22,400 Speaker 1: Often I feel like it's better for me to wait 648 00:39:22,760 --> 00:39:26,040 Speaker 1: and do those in roundups of multiple years, because often 649 00:39:26,560 --> 00:39:30,200 Speaker 1: a lot of those projects are only tangentially related to tech, 650 00:39:30,560 --> 00:39:33,919 Speaker 1: and while they are funny and interesting, they don't necessarily 651 00:39:34,200 --> 00:39:37,520 Speaker 1: meet the rubric of Tech Stuff. I've been listening to 652 00:39:37,560 --> 00:39:41,160 Speaker 1: a lot of the Besties podcast. Again, I have 653 00:39:41,160 --> 00:39:44,080 Speaker 1: no connection to the Besties, but it's a show about 654 00:39:44,239 --> 00:39:46,840 Speaker 1: video games and stuff, and they use the word rubric 655 00:39:46,880 --> 00:39:50,839 Speaker 1: a lot, especially in their Patreon episodes, so it's kind 656 00:39:50,840 --> 00:39:54,919 Speaker 1: of gotten stuck in my vocabulary recently, just from osmosis, 657 00:39:54,960 --> 00:39:58,800 Speaker 1: I guess. That's it for today's episode about tech news 658 00:39:58,800 --> 00:40:01,879 Speaker 1: for the week ending September thirteenth, twenty twenty four. 659 00:40:01,920 --> 00:40:05,080 Speaker 1: Happy Friday the thirteenth, everybody. Be safe out there. I 660 00:40:05,120 --> 00:40:07,800 Speaker 1: hope you're all well, and I'll talk to you again 661 00:40:08,560 --> 00:40:18,480 Speaker 1: really soon. Tech Stuff is an iHeartRadio production. For more 662 00:40:18,560 --> 00:40:23,279 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 663 00:40:23,320 --> 00:40:28,720 Speaker 1: wherever you listen to your favorite shows.