Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending July twelfth, two thousand twenty four. There was actually a ton of news this week, I guess making up for last week, so I'm just covering like half of it. Because there was so much, I had to choose which stories I was going to include. As it stands, this is going to be a long one, so let's get to it.

Speaker 1: Last week, the German newspaper Der Spiegel and a Russian independent news site called The Insider revealed that Russia's Foreign Intelligence Service, aka the SVR, has a program that they call Project Kylo. I don't know if that's named after Kylo Ren from the Star Wars films. Anyway, this project's aim is to spread misinformation and propaganda in support of Russia's goals, particularly with regard to the ongoing war with Ukraine. And I find it interesting that a lot of media sites reference this as a bombshell report linking Russia to these efforts. I think it's surprising, not because I don't think it's important. I do think it's important. I just figured we were all working under the assumption that this was in fact happening already. I guess the appearance of confirmation is really the story here. Anyway, the report says that the SVR's playbook involves establishing fake news websites and then flooding these websites with misleading content, often boosted through the use of generative AI. You can write a whole lot of stuff if you offload it to AI and you're not really concerned about the accuracy of it. I mean, if your goal is to create misinformation, accuracy really isn't among your top concerns. The agency also made heavy use of social networking and media platforms to spread these messages, so establishing a place to post everything and then a distribution method for getting it out in front of as many eyeballs as possible.
Speaker 1: So, according to these newspapers, the agency went to the trouble of actually hiring real world people to pose as protesters in various Western countries in an effort to amplify messaging, and then, beyond that, they helped distribute video footage of these protests in order to get a wider release of that messaging. This is interesting because here in the United States, actually not too far from the state where I live, there was a notable white supremacist march that took place in Nashville, Tennessee. And I'm curious as to whether or not that march was amplified by efforts like this, where people were specifically brought in to boost the numbers and make them larger. And then, obviously, video footage of that particular march got pretty wide circulation. And I mean it may not be at all related to the Russian efforts. I want to make that clear. This could be completely independent of that, but it does sort of fall in line with that same strategy.

Speaker 1: Also last week, Cowlan McGee, and I apologize for butchering the name, but McGee wrote a piece for iNews, which is an English news site, and in that piece, McGee writes that French officials identified Russian propaganda as pushing French citizens to support far right candidates during their parliamentary elections. Those took place just this past Sunday. Now, those efforts ultimately fell a little short. The far right did win quite a few seats, like one hundred and forty two seats in parliament, but that puts them in third place, behind the left-wing party, which won one hundred and eighty eight seats, and the centrist party, which won one hundred and sixty one seats. However, no single party secured two hundred and eighty nine or more seats, which means France's president now faces the tough challenge of appointing a prime minister who can count on the support of a parliamentary majority.
Speaker 1: Right, there's no party that holds a majority, and if the prime minister doesn't have the support of parliament, then the government kind of stops working. So what I'm saying is that Russia's campaign, assuming that it actually had an impact on the French elections at all, was not a total failure. Yeah, the far right didn't secure a majority, but if you think of it more as an effort to disrupt politics and to add some instability to Western governments, then there was success here, or at least it's in alignment with Russia's goals. Whether or not the efforts moved the needle at all, that's another question. McGee's piece in iNews lists out numerous examples of Russian attempts to misinform, from bot-controlled social accounts to fake news sites pushing lies to the public, and obviously France is just one nation that's targeted by Russia. The convergence of tools like AI and social networks, on top of how easy it is to just launch a website, has really created a perfect storm for the creation and distribution of propaganda. So, as always, I recommend viewing media through the lens of critical thinking. It's not a guarantee that you won't get misled. That can still happen, but using critical thinking really helps cut down on the frequency at which that can happen. Make sure you're paying attention to the source of that information and what their sources are, because if we're all looking at news that's drawing its information from a single source, and that source is compromised, well, then we can't trust any of the news that was reported from it. Here in the United States, the Department of Justice announced just this past Tuesday that it had identified and disrupted a Russian operation making use of bots and misinformation to sow discord here in the States, which, just as a point of reference, y'all, we do not need outside help on that. We're really good at sowing discord amongst ourselves.
Speaker 1: Anyway, the DOJ seized two domains and around one thousand social accounts believed to be part of this Russian operation. The associated accounts received a ban on X, formerly known as Twitter, and they had a history of posting content with a pro-Kremlin perspective.

Speaker 1: Apart from using AI to undermine governments and societies, it seems like there's not really enough of a use case for artificial intelligence to justify the truly insane amount of investment that's going into it. Now, that's not just my opinion, although it is my opinion. Goldman Sachs released a research newsletter warning businesses that there just might be a crazy, unsustainable bubble forming around AI. But to be clear, I am applying my own point of view to what Goldman Sachs found. They didn't use words like crazy, and in fact, they were pretty light with the use of the word bubble, although at least one Goldman Sachs rep did refer to it as a potential bubble. Not definitively a bubble, but it could be one. So the report revealed that around a trillion dollars is on track to pour into the generative AI space, and that includes stuff like building out and outfitting data centers, you know, the purchase of new hardware, research and development, all this kind of stuff. Goldman Sachs says there's also no killer application for AI that kind of justifies this crazy expenditure. So, in other words, the research newsletter says companies are rushing into the AI space without really a clear goal or any assurance that the investment they're making is going to pay off down the road. It could be that the companies spending billions of dollars on AI, generative AI specifically, are setting huge piles of cash on fire, a la the Joker in that Dark Knight movie. Jim Covello, the head of Goldman Sachs's global equity research, put it this way, quote, what one trillion dollar problem will AI solve?
Speaker 1: Replacing low wage jobs with tremendously costly technology is basically the polar opposite of the prior technology transitions I've witnessed in my thirty years of closely following the tech industry, end quote. So what he's saying is there's not a clear problem that AI is solving, especially not one that amounts to a one trillion dollar problem, and the ways we're seeing AI applied right now are aimed at low-cost tasks. And in fact, it could mean that you're spending more money with AI than you were with human beings, so you're actually costing yourself more money to fund that whole process. So it really sounds like shots fired across the bow of companies that have been downsizing staff in favor of relying more on generative AI, such as, oh, I don't know, HowStuffWorks.com, my former employer. We'll have more about the issue of AI displacing creatives later in this episode, but yeah, I'm still obviously salty about that, even though I wasn't directly affected, right? I had already moved on from HowStuffWorks, or really my part of the company moved on from HowStuffWorks. But I'm still upset that the site ended up laying off the editorial staff in favor of AI-generated articles. I think that was an enormous mistake. I plan on doing an episode in the near future about one of the many reasons I think that's a massive mistake, because it's not just that very talented, dedicated people lost their jobs over it. That's a big part of it. I mean, there's an emotional attachment, don't get me wrong. But I also think that there is a technological mistake being made, and I'm not the only one. It involves things like model collapse, as in large language model collapse, but we'll talk about that in a future episode not too long from now.
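To make "model collapse" a little more concrete ahead of that future episode, here's a minimal toy sketch in Python. It's a hypothetical illustration only, standing in a simple Gaussian distribution for a language model: each generation is fit purely on the previous generation's synthetic output, with no fresh human-made data ever re-entering the loop. The sample size and generation count are arbitrary choices to make the effect visible, not figures from the episode or from actual LLM training.

```python
# Toy model collapse: refit a simple "model" (a Gaussian) on data
# generated by the previous model, over and over.
import random
import statistics

SAMPLES = 8        # deliberately tiny samples exaggerate the effect
GENERATIONS = 240

# Generation zero trains on "real" data from a standard normal distribution.
data = [random.gauss(0, 1) for _ in range(SAMPLES)]

for gen in range(1, GENERATIONS + 1):
    mu = statistics.fmean(data)      # "train": estimate the distribution
    sigma = statistics.stdev(data)
    # The next generation sees only this model's synthetic output.
    data = [random.gauss(mu, sigma) for _ in range(SAMPLES)]
    if gen % 30 == 0:
        print(f"generation {gen:3d}: estimated spread = {sigma:.4f}")

# On most runs, the printed spread shrinks toward zero while the mean
# drifts: the lineage of models ends up confidently reproducing a narrow
# sliver of the distribution it started with.
```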
Speaker 1: Anyway, the report warns that any returns on investment are likely to be many years down the road, because it's going to take time to build the AI infrastructure that actually creates meaningful improvements in stuff like business performance and costs. So in the short term, it's far more likely that running AI operations is actually going to cost more than going the alternative route. And all of that obviously becomes dangerous for companies that are currently investing heavily in the space. They might find themselves either in need of a severe course correction, or they might have to convince stakeholders to, you know, patiently wait for things to pay off in the future. That's kind of what Meta has been doing, particularly with the Reality Labs division and metaverse stuff. Zuckerberg has to keep saying, this is going to cost us a lot of money in the short term, but in the long term, we believe it's the future of our business. I remain unconvinced that that's going to be the case, but that's the argument Zuckerberg has to make to investors pretty much every earnings call, because they have to grapple with the fact that they're spending billions of dollars in this division and they don't have a lot to show for it yet, and the stuff they have shown has not really created an amazing response among consumers or investors. I expect we're going to see very similar stories play out in the AI space.

Speaker 1: Speaking of AI displacing jobs, Intuit, a company that specializes in tax preparation software, announced a reorganization this week, and that's going to result in downsizing staff by around ten percent of the company. Company reps say this is part of a long term strategy in which Intuit will hire at least that same number of people that they're laying off this year. So they're saying, like, next year they're going to hire as many or more people as they are laying off right now.
Speaker 1: That number is around eighteen hundred staff, and this long term plan also involves incorporating artificial intelligence into Intuit's offerings. No big surprise there that they're saying AI is going to be a big part of their ongoing strategy. That is kind of the messaging we're hearing across all industries right now. Intuit's CEO further said that more than half of those eighteen hundred employees who are being let go were selected because they failed to meet company expectations, which is a big old yikes. Like, when arguably around five percent or more of your company isn't meeting expectations, to me, that's a management problem. That means that managers are not doing their job in making sure that employees have what they need and understand what they need to do in order to meet expectations. But I mean, what do I know? I'm not an expert in business or anything like that. It just seems that way to me. So I do think, for the record, that AI will have its place. I'm not totally against AI. I know I come down hard on AI a lot on this show. I do think there will be ways to integrate AI into business strategies that will be overall beneficial, and not just beneficial to the bottom line, which is kind of the cynical way of looking at it right now, but to people who are actually working at those companies. I do think there are ways to use AI to enhance people's work so that they can focus on the things that really matter, while AI handles little details that are important but are not necessary for humans to handle personally. I just think there's also a ton of company executives out there who are rushing into integrating AI without fully thinking about the consequences they're going to encounter and the lives they're going to impact. They're doing it quickly without fully thinking it through, and that's what leads to really big messes.
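As a quick aside, here's the back-of-the-envelope arithmetic behind that "around five percent" remark, since it goes by fast. This is just a sketch built from the reported figures; the implied total headcount is an inference from those figures, not a number stated in the episode.

```python
# Rough math behind "around five percent of your company isn't meeting
# expectations." The implied headcount is inferred, not reported directly.
layoffs = 1_800
layoff_share = 0.10                  # layoffs described as ~10% of staff
headcount = layoffs / layoff_share   # implies roughly 18,000 employees

underperformers = 0.5 * layoffs      # "more than half" of those let go
print(f"implied headcount: {headcount:,.0f}")
print(f"underperformers as a share of the company: {underperformers / headcount:.0%}")
# -> about 5%: one in twenty employees flagged as missing expectations,
#    which is why it reads more like a management problem than a staff one.
```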
Speaker 1: Okay, speaking of a big mess, I just made one by spilling my coffee. We're going to take a quick break, and I'll be right back.

Speaker 1: Okay, we're back. Coffee spillage has been contained. Cleaning efforts are ongoing, but let's get back into the episode. So, a hacker group called SiegedSec, that's s-i-e-g-e-d Sec, says it has breached the systems of an organization called the Heritage Foundation, stolen a large amount of data, and subsequently released that data online. The Heritage Foundation is a conservative political organization and think tank. It is credited with being responsible for a broad platform of policy proposals collectively referenced as Project Twenty Twenty Five. Now, to go into Project Twenty Twenty Five is well beyond the scope of this podcast. However, I can recommend you check out the June twenty eighth, twenty twenty four episode of Stuff They Don't Want You to Know. It is titled "Project Twenty Twenty Five: Should We Be Concerned?" Anyway, the SiegedSec hackers carried out the attack in part to ensure more quote "transparency to the public regarding who exactly is supporting Heritage" end quote. That quote comes from a hacker who uses the handle vio. This hacker spoke with the news site CyberScoop about the breach. The breach is apparently mostly a list of names of people who have been involved with or contributed to the Heritage Foundation. This is all part of a quickly developing story regarding Project Twenty Twenty Five, and there's a lot of information and misinformation out there about that as well. So again, use critical thinking when you're looking into this stuff. I do think, personally, that it's something to be concerned about, and that there are elements of it that are pretty darn scary. If you care about stuff like, say, LGBTQ rights or women's autonomy, that kind of stuff, it gets pretty frightening.
Speaker 1: But I recommend everyone do their research and learn about it, and, you know, mind where you're getting your information, so that you've got a reasonable handle on what's going on. It's always important to keep that in mind when you're engaging with material, particularly material that might be confirming any biases you happen to have. I have to remind myself of that all the time.

Speaker 1: Marco Risol of HackerDose.com has an article titled "Hackers May Use Telegram Video to Gain Full Control of Your Phone." That is a heck of a headline. So first off, in case you are not familiar with Telegram, that's a smartphone based messaging app. It's also cloud based messaging, and it offers end-to-end encryption. You can do voice calls, video calls, text messages. It has become a favored messaging app among those who value their privacy and don't want their communications to be snoopable. They want end-to-end encryption. But an X, or Twitter, user, depending on what you want to call that service, called Today Cyber News posted that a vulnerability in the Android version of Telegram allows an on-ramp for malware. The alleged exploit involves the target receiving a link to a video. If the target tries to launch the video, a pop-up message appears saying that Telegram's native video player is unable to play the media file in question, but don't worry, because you'll be prompted to go to a site that's claimed to be a third-party video service that will play the video. So it's saying, oh, Telegram can't play this, but just click this link and you'll go to a site that will play the video. Except it's all a ploy to trick the target into unknowingly consenting to downloading malware. The malware infects the phone and effectively hands control over to a remote operator who has essentially full administrative access to the device.
Speaker 1: At that point, they can access the camera on the phone, the microphone, call logs; they can see all the apps that are saved to the phone, all that kind of stuff. All of that allegedly becomes available to the hackers. That is, if these claims are in fact true, and they might not be. So the tweet included a video demonstration of this exploit in action. However, the resolution of the video was not great, and thus it was difficult to verify that this is a legitimate video of an actual exploit. Earlier this year, Telegram actually debunked a different vulnerability claim that was made against it. So there's already precedent for false claims against Telegram, and it could turn out that this new allegation is also a hoax. But I would suggest to anyone who's using Telegram, or really any messaging service for that matter, to think twice before clicking on links to stuff, particularly if it's in relation to a message that has a clickbaity kind of vibe to it. Right, like, if you get a video that's appealing to your paranoia, like it's a video saying, hey, is this you in this video? Or if it's something that's appealing to a salacious side of human nature. Anything like that should raise red flags, and you should be very cautious before following any links, because it's a tried and true method to trick people into a trap.

Speaker 1: If you were an AT&T cellular customer between May and November in twenty twenty two, I have some bad news. There's a really good chance that your phone number, and some information about who you texted and called, has been leaked in a massive data breach. AT&T says that essentially all cellular customers during that time period were involved in this data leak, in which some third party was able to download the information off of a cloud platform that AT&T uses.
Speaker 1: So this is another case where hackers target not the company itself, not AT&T, but a business-to-business operator that AT&T uses. You know, it's a provider that has this cloud services platform, and the hackers targeted that provider. AT&T says that names were not included in this data leak. It was just numbers, and then metadata about the text messages and phone calls those numbers were engaged in. However, AT&T also says it can be rather trivial to match a name with a phone number. There are a lot of databases out there that have information about phone numbers and who those numbers belong to. So all that info may already be out in the wild, and with just a little cross-referencing, a hacker can figure out who a given number belongs to. Some of the call and texting data related to some of the numbers was part of this attack, as I've mentioned. So it's not the contents of those phone calls or the texts. It's not like the hackers can read what someone has texted to someone else. Instead, it's a call log or a text log, so you can see which numbers have communicated with one another. According to AT&T spokesperson Alex Byers, this breach is unrelated to a different incident that the company acknowledged back in March. So AT&T has been the target of two massive hacker attacks and data breaches. Not great.

Speaker 1: Apple has assuaged the concerns of EU regulators, at least for now, regarding the company's tap-to-pay iPhone payment system. Regulators in the EU had accused Apple of jealously guarding that tap-to-pay system in order to maintain a monopoly over payment processing through the iPhone, and if Apple did not open that up to third parties, to other payment processing companies and such, Apple could face a pretty sizable fine, which is putting it lightly. By pretty sizable, I mean ten percent of the company's global revenue.
Speaker 1: Apple is a three trillion dollar company, so that is like billions of euros, or billions of dollars, worth of money if Apple failed to comply with these regulators' demands, unless the company were to cease operations in the EU to avoid it. So the services Apple is opening up, which Apple has committed to doing, include support for third parties that might have their own mobile wallet features as well as their own payment processing services, but it could also include things like using your mobile phone for all sorts of transactions, like as a hotel key, or as a ticket to an event, or as a corporate badge to get into a building, that kind of stuff. It all uses NFC, or near field communication, technology. So Apple appears to have committed to this. They have a deadline of July twenty fifth to open up their system to other companies. If they do so, there will be no further problems. If they don't, then they are facing the potential of a massive fine.

Speaker 1: Apple is not the only tech company that has had to contend with EU regulations recently. You know, the app formerly known as Twitter has been called to task for breaching online content rules under the EU's Digital Services Act, or DSA. EU regulators investigated X for seven months and found the company failed to comply with various rules requiring it to make available a searchable advertising repository, so that the regulators can ensure that the advertising policies on the platform are within the EU's laws. Worse, the regulators accuse X of denying them access to public data that they're supposed to be able to review. In addition, the regulators found that X's blue check mark system, which previously was all about verifying an account, now just means that you've paid for that service.
Speaker 1: That's now in opposition to industry practice and, as Reuters reporter Foo Yun Chee puts it, quote, "negatively affects users' ability to make free and informed decisions about the authenticity of the accounts they interact with," end quote. X could face a fine of up to six percent of its annual global revenue. That would definitely be a huge blow to the company. I mean, it's a company that has continued to struggle in an effort to keep advertisers on board. Like, revenue for Twitter/X has been a real sticky subject over the last couple of years. X will have a chance to defend itself, so this is not a cut-and-dried decision yet. X can come forward and attempt to either defend itself or make changes. But the European Commission says that if the findings of the investigation hold up to scrutiny, X will be fined and forced to change, should it continue to operate within the EU. So yeah, tough news for X.

Speaker 1: Moving on to a different Elon Musk-led company: according to Bloomberg, Tesla has had to make an adjustment to its schedule, which doesn't surprise me even a little bit. Several months ago, back in April, Elon Musk gave August eighth as a date at which time Tesla would unveil its robotaxi business. Now, this was during the same earnings call in which Musk also told everybody that Tesla would no longer work toward releasing a low cost electric vehicle, which was a big blow. But you know, the robotaxi thing was meant to kind of shift focus, I think, and to reframe Tesla as being more than an electric vehicle company. Like, Musk has said that you should think of Tesla as an AI and robotics company. Now Tesla says it has had to delay the event and push it back to October, because it turns out building autonomous taxis is actually really hard, which is something I think most people already understood.
Speaker 1: I mean, we've heard plenty of stories about how other robotaxi services have encountered issues, and goodness knows there's no shortage of issues with Tesla's own autonomous offerings. I'm speaking specifically about Autopilot and Full Self-Driving mode. So I wish to throw no shade at the engineers over at Tesla who've been working on this. I think instead this is more a case of Elon Musk proposing an overly aggressive timetable for a launch, something he's done a few times in the past, possibly as an effort to soften the blow of harder news, in this case canceling a low cost EV, something investors thought would really help push Tesla to the next level, and it clearly is not gonna happen. You could argue the August eighth thing was kind of a Hail Mary pass to get attention away from the low cost EV setback, and that now we're just seeing that, naturally, that was too aggressive a timetable. Or it might be much more innocent than that. I don't know. Well, we're going to take another quick break. We will be back with some more tech news.

Speaker 1: Okay, I'm back. So Kyle Barr of Gizmodo reports that analysts are ringing the death knell for Apple Vision Pro sales. That's Apple's mixed reality headset that they introduced earlier this year. The analysts are saying that pretty much everyone who was willing to fork over the cash for one of those thirty five hundred dollar headsets now has one, and that there just aren't other people who are willing to do that. So, according to the analysts that Barr cites, sales for the rest of twenty twenty four are likely to be pretty darn low, like by August you're looking at a seventy five percent drop-off in sales. An analyst firm called IDC reports that Apple has not yet sold one hundred thousand units of the ding dang darn thing. Current estimations say that's not really going to change for the rest of this year, and in my opinion, the issue is twofold.
Speaker 1: There are two main problems Apple faces with the Apple Vision Pro. First off, it is undeniably really expensive, and that is a big barrier to entry, right? Like, not a lot of people have thirty five hundred dollars to drop on a tech toy. Let's be honest, it is a toy right now. And that brings us to problem number two: there are a limited number of applications for this technology as it stands right now. Now, I personally have not had the chance to try out a headset. From what I hear, it's a really impressive experience. The problem is there's just not a whole lot you get to do. The things it does are allegedly really incredible, but they're also limited in number. Very few applications have actually been developed for the platform itself. That actually makes sense, because there's such a small user base. Now, I've said this many times before, but if you are a developer, you have to choose which platforms you're going to develop for, and from a financial standpoint, it only makes sense to aim at platforms that are likely to see a good return on investment. You want all that time and effort that you're pouring into this development process to pay off. So with that in mind, you want your work to reach as many potential customers as possible, or a group of customers who are more likely to spend money on the thing you have made. That's not really a thing if fewer than one hundred thousand people have even bought into the platform. Apple is reportedly working on a lower cost headset that would launch late next year. That could potentially change things, but the cost needs to reach a level where more curious people can actually afford it. So even if you cut the price in half, that's really not enough, right? Like, that's still like seventeen hundred and fifty dollars. That's still a lot of money.
Speaker 1: So even cutting the price in half is not likely to bring a huge rush of customers. I mean, you'll probably see more than you did at thirty five hundred, but I don't know. But this might be enough to be a tipping point for developers. We might see more developers willing to spend the time and effort it takes to create things specifically for the Vision platform, and maybe then we'll see some really innovative uses of augmented reality and virtual reality and have those find their way to the headset. I would actually love to see that happen, because as much as I have dogged on Apple for the Vision Pro, I do think AR could be super cool if it's done correctly. It's just really hard to do, because typically it means that people have to wear a bulky, power-hungry headset on their face, and that's a big ask. Like, that's asking a lot of folks, and as we've seen time and again, people are rarely willing to do that unless it's for something very specific, like gaming. Right, VR gaming is an exception, but most people, like when it came to three-D television, they weren't willing to take that step. So it's going to take some pretty compelling applications, I think, to win people over beyond just, you know, your pool of bleeding edge adopters.

Speaker 1: And now, for the cantankerous old coot section of our show, I bring you my "physical media is dying and I'm mad about it" rant. I'm not the only one doing this sort of ranting, by the way. Scharon Harding of Ars Technica has a piece titled "DVDs are dying right as streaming has made them appealing again," and that piece really goes into this. So, as Harding points out, the streaming landscape actually can make it harder if you are a fan of specific stuff, apart from original titles that spawned out of these streaming platforms. It can be hard to predict where stuff is going to end up.
Speaker 1: So you might start watching, you know, a series on one streaming platform, and then it jumps ship to a different streaming platform. If you're not a subscriber to this other service, you're out of luck. This has happened to me on multiple occasions. Worse yet, you can have a licensing agreement expire and no one has the thing you wanted to watch. So maybe you start a series, the licensing agreement expires, the series disappears from that platform, no one else has picked it up, and you're just left without being able to watch it. That's really frustrating. Buying digital copies isn't a guarantee to fix this, right, because there's always the chance that the provider will stop supporting the service or even go out of business. Then you're left without the media that you have purchased, or rather, the media that you've paid to watch. I always think about that too. Like, I have libraries of digital films that I like to go to every now and again, but there's always the chance that the company will just stop supporting that service, and even though I have paid to access that media, the service will no longer work and I won't be able to watch the thing I paid for. So you might turn to the solidity of physical media, like Blu-rays and DVDs, because then you can keep a copy for yourself and no one is going to take that away from you. Except that physical media is dying out too. Harding details how Redbox, a DVD rental service that operated red kiosks filled with DVDs across the United States, is shutting down. It went into bankruptcy, initially filing Chapter eleven; now it's shifting to Chapter seven, which includes liquidation of all assets. I'm sure Mike of Red Letter Media is really distressed to hear about this one, because Redbox is kind of his go-to for really trashy movies. The closure means that twenty four thousand kiosks are shutting down, as will the services that were operated by Redbox's parent company, Chicken Soup for the Soul Entertainment.
566 00:34:49,640 --> 00:34:54,360 Speaker 1: for the Soul Entertainment. That includes services like Crackle and 567 00:34:54,480 --> 00:34:57,640 Speaker 1: Popcornflix; they're also going bye-bye. So it's not 568 00:34:57,719 --> 00:35:00,520 Speaker 1: just the physical media. Some streaming media businesses are 569 00:35:00,560 --> 00:35:04,640 Speaker 1: going away too. Meanwhile, big box stores are cutting back 570 00:35:04,680 --> 00:35:08,800 Speaker 1: on DVD sales and repurposing that floor space for other products. 571 00:35:09,000 --> 00:35:11,800 Speaker 1: Target, for example, is going to shift to a 572 00:35:11,840 --> 00:35:16,000 Speaker 1: strategy where they only offer DVDs seasonally, and then the 573 00:35:16,040 --> 00:35:17,960 Speaker 1: rest of the time that space will be meant for 574 00:35:18,040 --> 00:35:21,640 Speaker 1: something else. Sony has announced it's going to stop producing 575 00:35:21,760 --> 00:35:25,280 Speaker 1: recordable Blu-ray discs in the near future, and Harding 576 00:35:25,280 --> 00:35:28,080 Speaker 1: does a really great job of explaining the reasoning behind 577 00:35:28,160 --> 00:35:31,520 Speaker 1: why all these shifts are happening, because the numbers don't 578 00:35:31,560 --> 00:35:35,439 Speaker 1: lie and physical media is far less profitable these days. 579 00:35:35,440 --> 00:35:39,320 Speaker 1: There are just fewer people buying physical media. But considering the 580 00:35:39,440 --> 00:35:42,600 Speaker 1: chaos that is streaming, this means consumers will have fewer 581 00:35:42,640 --> 00:35:45,840 Speaker 1: options when it comes to ensuring that they have access 582 00:35:45,840 --> 00:35:48,520 Speaker 1: to the media they love. Heck, over the last couple 583 00:35:48,560 --> 00:35:51,560 Speaker 1: of years, I've actually started collecting physical media again. I 584 00:35:51,640 --> 00:35:54,600 Speaker 1: had kind of shifted away from that for quite some time, 585 00:35:54,880 --> 00:35:57,320 Speaker 1: but I'm back to it now, and it's largely because 586 00:35:57,360 --> 00:36:00,520 Speaker 1: of this issue. I even belong to a monthly horror 587 00:36:00,560 --> 00:36:04,319 Speaker 1: movie club where I get four discs per month. And 588 00:36:04,360 --> 00:36:07,200 Speaker 1: for folks like me, this trend is pretty sad and 589 00:36:07,280 --> 00:36:10,640 Speaker 1: also frustrating. It's also totally understandable. It's not like I 590 00:36:10,880 --> 00:36:14,840 Speaker 1: fault the companies for doing this. I don't expect companies 591 00:36:14,880 --> 00:36:18,400 Speaker 1: to continue to manufacture physical discs at a loss. But 592 00:36:18,600 --> 00:36:21,520 Speaker 1: the fact is, we're seeing this move away from physical 593 00:36:21,560 --> 00:36:26,880 Speaker 1: media entirely while simultaneously dealing with the constantly shifting landscape 594 00:36:26,920 --> 00:36:31,080 Speaker 1: of streaming, and that stinks. Speaking of physical media, the 595 00:36:31,160 --> 00:36:34,759 Speaker 1: German Navy is looking to phase out a legacy technology 596 00:36:34,800 --> 00:36:38,360 Speaker 1: for its fleet of anti-submarine frigates. See, these ships 597 00:36:38,400 --> 00:36:41,400 Speaker 1: currently rely on a storage system that saves data to 598 00:36:41,880 --> 00:36:46,439 Speaker 1: eight-inch floppy disks. Yeah, floppy disks.
I wager there are 599 00:36:46,440 --> 00:36:49,920 Speaker 1: some of y'all out there who have never even used 600 00:36:50,000 --> 00:36:52,920 Speaker 1: floppy disks. I grew up using five-and-a-quarter 601 00:36:53,000 --> 00:36:55,719 Speaker 1: and then later three-and-a-half-inch floppies. There 602 00:36:55,760 --> 00:37:00,000 Speaker 1: are people older than me who used much larger floppy disks. 603 00:37:00,120 --> 00:37:03,320 Speaker 1: Apart from an escape room that I participated in recently, 604 00:37:03,360 --> 00:37:06,160 Speaker 1: I haven't touched a floppy disk in years. It's kind 605 00:37:06,200 --> 00:37:08,560 Speaker 1: of crazy to think that the German Navy is still 606 00:37:08,600 --> 00:37:12,480 Speaker 1: running equipment that relies on this legacy technology. Some would 607 00:37:12,520 --> 00:37:15,839 Speaker 1: call it an obsolete technology, but here we are. Moving 608 00:37:15,880 --> 00:37:20,120 Speaker 1: stuff to a new system isn't easy. Modernizing hardware requires 609 00:37:20,200 --> 00:37:23,680 Speaker 1: a lot of considerations, because changing one thing can sometimes 610 00:37:23,719 --> 00:37:27,600 Speaker 1: break something else. According to the site Tom's Hardware, the 611 00:37:27,680 --> 00:37:30,680 Speaker 1: German Navy is looking at an emulator to provide the 612 00:37:30,760 --> 00:37:34,239 Speaker 1: same kind of storage and data retrieval services as the 613 00:37:34,280 --> 00:37:37,880 Speaker 1: floppy disk system did, and that means the Navy won't 614 00:37:37,920 --> 00:37:41,319 Speaker 1: have to retool their entire process. The emulator would serve 615 00:37:41,360 --> 00:37:44,279 Speaker 1: as a substitute for the floppy drive, but otherwise things 616 00:37:44,320 --> 00:37:47,160 Speaker 1: would largely operate the same way they have for years (there's a rough sketch of the idea below). 617 00:37:47,560 --> 00:37:50,279 Speaker 1: Kind of wild that we're into twenty twenty four and 618 00:37:50,320 --> 00:37:54,680 Speaker 1: this is finally being addressed. Hurricane Beryl packed a heck 619 00:37:54,760 --> 00:37:57,760 Speaker 1: of a wallop when it landed in Texas earlier this week. 620 00:37:58,000 --> 00:38:00,640 Speaker 1: The storm knocked out power for nearly two point two million 621 00:38:00,719 --> 00:38:05,160 Speaker 1: Houston residents. Meanwhile, a massive heat wave created really dangerous 622 00:38:05,160 --> 00:38:08,200 Speaker 1: situations for folks living in the Houston area. Not only 623 00:38:08,200 --> 00:38:11,200 Speaker 1: did they have to deal with flooding and other damage 624 00:38:11,200 --> 00:38:14,880 Speaker 1: from the hurricane, but the temperatures and humidity meant that 625 00:38:15,040 --> 00:38:18,440 Speaker 1: going without power and access to air conditioning was 626 00:38:18,520 --> 00:38:22,560 Speaker 1: a real health hazard. Making matters more complicated than necessary 627 00:38:22,800 --> 00:38:25,840 Speaker 1: would be CenterPoint Energy, a utility company in the 628 00:38:25,880 --> 00:38:30,000 Speaker 1: Houston area. Texas, famously, is not part of the US 629 00:38:30,120 --> 00:38:34,240 Speaker 1: national power grid. Texas opted to go it alone, lone-wolf style, 630 00:38:34,440 --> 00:38:37,360 Speaker 1: which frequently does not work out so great for the 631 00:38:37,440 --> 00:38:40,680 Speaker 1: residents of Texas, but it seems to make business executives 632 00:38:40,680 --> 00:38:44,480 Speaker 1: and local politicians a whole lot of money anyway.
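To make that emulator idea from the German Navy story concrete: the Tom's Hardware report doesn't spell out the frigates' actual storage interface, so the following is only a minimal sketch, assuming a simple sector-level read/write API. Every name and number in it (sector size, disk capacity, file name) is illustrative, not taken from the real naval system.

```python
# Minimal sketch of a floppy-drive emulator: it answers the same
# sector-level read/write calls the legacy software already makes,
# but keeps the bits in an ordinary image file on modern storage.
# All constants here are assumptions, not the real system's specs.

SECTOR_SIZE = 128        # bytes; typical of early 8-inch formats
SECTORS_PER_DISK = 2002  # roughly a 250 KB single-density disk

class FloppyEmulator:
    """Stands in for the physical drive: same calls, new backing store."""

    def __init__(self, image_path: str):
        self.image_path = image_path
        try:
            with open(image_path, "rb"):
                pass
        except FileNotFoundError:
            # No disk image yet, so create a blank one.
            with open(image_path, "wb") as f:
                f.write(b"\x00" * SECTOR_SIZE * SECTORS_PER_DISK)

    def read_sector(self, index: int) -> bytes:
        if not 0 <= index < SECTORS_PER_DISK:
            raise ValueError("sector index out of range")
        with open(self.image_path, "rb") as f:
            f.seek(index * SECTOR_SIZE)
            return f.read(SECTOR_SIZE)

    def write_sector(self, index: int, data: bytes) -> None:
        if len(data) != SECTOR_SIZE:
            raise ValueError("data must be exactly one sector long")
        with open(self.image_path, "r+b") as f:
            f.seek(index * SECTOR_SIZE)
            f.write(data)

# The legacy software keeps issuing the same calls it always has;
# only the device behind them has changed.
drive = FloppyEmulator("frigate_disk.img")
drive.write_sector(0, b"sensor calibration data".ljust(SECTOR_SIZE, b"\x00"))
print(drive.read_sector(0).rstrip(b"\x00"))
```

The point is in those last few lines: the calls the old software makes never change, which is why an approach like this would let the Navy avoid retooling its whole process.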
CenterPoint 633 00:38:44,560 --> 00:38:48,440 Speaker 1: Energy doesn't have a dedicated app, so Houston citizens 634 00:38:48,480 --> 00:38:50,760 Speaker 1: began to turn to a different app in an effort 635 00:38:51,000 --> 00:38:54,759 Speaker 1: to get information about where power outages currently are in 636 00:38:54,800 --> 00:38:57,560 Speaker 1: the city. That app happens to belong to the fast 637 00:38:57,600 --> 00:39:02,200 Speaker 1: food chain Whataburger. Whataburger's app includes a map that 638 00:39:02,280 --> 00:39:06,400 Speaker 1: shows all local Whataburger locations within the Houston area, and 639 00:39:06,480 --> 00:39:09,040 Speaker 1: there's an icon that indicates whether or not the store 640 00:39:09,200 --> 00:39:12,280 Speaker 1: is open. So by opening up the map and seeing 641 00:39:12,320 --> 00:39:16,000 Speaker 1: where stores are closed and where they're open, Houston citizens 642 00:39:16,000 --> 00:39:18,600 Speaker 1: could actually figure out which regions were still affected by 643 00:39:18,600 --> 00:39:22,120 Speaker 1: power outages, because in the areas that have power outages, clearly 644 00:39:22,200 --> 00:39:25,240 Speaker 1: those restaurants aren't going to be open. So users shared 645 00:39:25,280 --> 00:39:29,000 Speaker 1: this discovery on X, and subsequently Whataburger's app saw a 646 00:39:29,080 --> 00:39:33,120 Speaker 1: flurry of downloads. It went from being fortieth on iOS's 647 00:39:33,160 --> 00:39:38,440 Speaker 1: App Store to sixteenth. Necessity is the mother of invention (more on that map trick in a moment). Whataburger, 648 00:39:38,480 --> 00:39:42,120 Speaker 1: for its part, has responded by saying they never intended 649 00:39:42,120 --> 00:39:43,960 Speaker 1: their app to be used this way, and they really 650 00:39:44,040 --> 00:39:47,440 Speaker 1: hope that people are safe. Like, they're not objecting to 651 00:39:47,440 --> 00:39:50,120 Speaker 1: the app being used this way; they're showing their concern 652 00:39:50,280 --> 00:39:54,600 Speaker 1: for the community. Researchers have created a urine-processing system 653 00:39:54,640 --> 00:39:57,440 Speaker 1: meant for space suits, similar to the stillsuits in 654 00:39:57,480 --> 00:40:01,239 Speaker 1: the science fiction series Dune. The system 655 00:40:01,280 --> 00:40:05,960 Speaker 1: includes a, quote, "vacuum-based external catheter leading 656 00:40:06,000 --> 00:40:10,000 Speaker 1: to a combined forward-reverse osmosis unit," end quote, in 657 00:40:10,080 --> 00:40:13,720 Speaker 1: order to take an astronaut's urine, remove all the yucky bits, 658 00:40:13,960 --> 00:40:16,920 Speaker 1: and purify it into drinkable water. And it's said to 659 00:40:16,960 --> 00:40:19,880 Speaker 1: be eighty seven percent efficient, so I assume that means 660 00:40:20,120 --> 00:40:22,920 Speaker 1: it can reclaim about eighty seven percent of the water 661 00:40:23,080 --> 00:40:25,960 Speaker 1: content in urine. That means you can pee and, in 662 00:40:26,000 --> 00:40:29,640 Speaker 1: about five minutes, enjoy a refreshing beverage made from your pee. 663 00:40:30,080 --> 00:40:32,600 Speaker 1: I'm being a little cheeky about this, because we all 664 00:40:32,640 --> 00:40:35,600 Speaker 1: need to remember all the water of our planet has 665 00:40:35,680 --> 00:40:39,840 Speaker 1: been here for millions of years, hundreds of millions of years. 666 00:40:40,040 --> 00:40:42,239 Speaker 1: For us, water is kind of like the force.
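On that Whataburger map trick from a moment ago: there's no public outage API involved here, just open/closed icons that people were eyeballing, but the inference Houston residents were doing is simple enough to sketch. Assuming you had somehow collected a list of store locations with open/closed flags (the data below is entirely made up), it would look something like this:

```python
# Hypothetical sketch of the inference Houston residents were doing by
# eye: closed restaurants cluster where the power is out. Store IDs and
# ZIP codes below are invented for illustration.

from collections import defaultdict

stores = [
    # (store_id, zip_code, is_open)
    ("WB-101", "77002", False),
    ("WB-102", "77002", False),
    ("WB-201", "77005", True),
    ("WB-202", "77005", True),
    ("WB-301", "77040", False),
    ("WB-302", "77040", True),
]

tally = defaultdict(lambda: [0, 0])  # zip_code -> [closed, total]
for _, zip_code, is_open in stores:
    tally[zip_code][1] += 1
    if not is_open:
        tally[zip_code][0] += 1

for zip_code, (closed, total) in sorted(tally.items()):
    if closed == total:
        verdict = "likely outage"
    elif closed:
        verdict = "possible outage"
    else:
        verdict = "probably has power"
    print(f"{zip_code}: {closed}/{total} closed -> {verdict}")
```

It's a crude proxy, of course: a store can be closed for plenty of reasons besides a blackout, which is why having several stores per area beats any single data point.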
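And one quick back-of-the-envelope check on that eighty seven percent figure from the urine-recycling story. The coverage doesn't spell out exactly how efficiency is defined, so this assumes it means eighty seven percent of the water content gets reclaimed, and it uses the commonly cited ballpark that urine is roughly ninety five percent water:

```python
# Rough arithmetic on the reported reclamation efficiency.
# Both fractions below are assumptions for illustration.

urine_volume_l = 1.0
water_fraction = 0.95  # assumed water content of urine (ballpark)
efficiency = 0.87      # efficiency figure quoted in the story

recovered_l = urine_volume_l * water_fraction * efficiency
print(f"~{recovered_l:.2f} L of drinkable water per liter of urine")
# Prints ~0.83 L: most of a liter back, reportedly within about five minutes.
```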
And like the force, it 667 00:40:42,320 --> 00:40:46,120 Speaker 1: binds us and penetrates us and moves through us. Sometimes 668 00:40:46,160 --> 00:40:47,879 Speaker 1: it moves through us in the middle of a Peter 669 00:40:48,000 --> 00:40:50,120 Speaker 1: Jackson movie. He really needs to make his films less 670 00:40:50,120 --> 00:40:53,120 Speaker 1: than four hours long. Anyway, I think most folks are 671 00:40:53,160 --> 00:40:56,520 Speaker 1: aware that water purification is a thing. We depend on 672 00:40:56,600 --> 00:41:01,080 Speaker 1: water purification already, right? Like, we process wastewater so 673 00:41:01,120 --> 00:41:04,840 Speaker 1: that we can then purify it and make it drinkable 674 00:41:04,880 --> 00:41:07,359 Speaker 1: and usable again. It's just a little freaky to think 675 00:41:07,400 --> 00:41:11,239 Speaker 1: of it as a miniaturized system that's wearable, and that 676 00:41:11,320 --> 00:41:14,720 Speaker 1: you could complete this purification process in less than ten minutes. 677 00:41:15,000 --> 00:41:17,840 Speaker 1: The system will undoubtedly be handy if we ever do 678 00:41:17,960 --> 00:41:20,840 Speaker 1: figure out a safe way to send astronauts on longer 679 00:41:20,880 --> 00:41:24,080 Speaker 1: space missions, like to Mars and such. This kind of 680 00:41:24,080 --> 00:41:26,520 Speaker 1: thing will be necessary in order to be able to 681 00:41:26,560 --> 00:41:29,680 Speaker 1: process the water that is brought on board and to 682 00:41:29,800 --> 00:41:33,560 Speaker 1: maximize it so it can be used as long as possible, 683 00:41:33,920 --> 00:41:37,600 Speaker 1: because every bit of weight that you add to a 684 00:41:37,680 --> 00:41:41,839 Speaker 1: space mission requires more energy to get that thing off 685 00:41:41,920 --> 00:41:46,319 Speaker 1: the planet. So these considerations are challenging. You have to 686 00:41:46,360 --> 00:41:48,840 Speaker 1: have ways to be able to process this stuff, and 687 00:41:48,880 --> 00:41:52,919 Speaker 1: doing so while it's worn within a suit means 688 00:41:52,920 --> 00:41:55,040 Speaker 1: that you can do it even if someone has gone 689 00:41:55,160 --> 00:41:58,640 Speaker 1: out on a spacewalk mission. You know, when you're on 690 00:41:58,719 --> 00:42:01,799 Speaker 1: board a spacecraft or in the space station, the 691 00:42:01,880 --> 00:42:04,879 Speaker 1: onboard facilities you use whenever you need the bathroom 692 00:42:05,360 --> 00:42:09,520 Speaker 1: have these sorts of reclamation systems built into them already. 693 00:42:10,120 --> 00:42:13,120 Speaker 1: But if you're doing something that's just in the suit, 694 00:42:13,600 --> 00:42:17,840 Speaker 1: then typically what you're really relying upon is essentially a diaper. 695 00:42:18,239 --> 00:42:22,480 Speaker 1: So this kind of system opens up the capacity for 696 00:42:22,920 --> 00:42:27,520 Speaker 1: recapturing water from waste even when you're out on a 697 00:42:27,560 --> 00:42:31,319 Speaker 1: spacewalk and you're not inside a spacecraft. Okay, I've got 698 00:42:31,360 --> 00:42:33,680 Speaker 1: a couple of article recommendations for you all this week. 699 00:42:33,719 --> 00:42:37,080 Speaker 1: First up is Samuel Axon's piece for Ars Technica. It 700 00:42:37,160 --> 00:42:41,800 Speaker 1: is titled "Shady company relaunches popular old tech blogs, steals 701 00:42:41,920 --> 00:42:49,000 Speaker 1: writers' identities." Yuck.
Okay, so apparently this web advertising company 702 00:42:49,280 --> 00:42:53,439 Speaker 1: has resurrected some dead blogs, blogs that used to be 703 00:42:53,560 --> 00:42:56,440 Speaker 1: active on the Internet but haven't been for years, and 704 00:42:56,719 --> 00:43:00,680 Speaker 1: is using AI to pose as the people who originally 705 00:43:00,680 --> 00:43:05,160 Speaker 1: contributed to those blogs, churning out garbage articles 706 00:43:05,400 --> 00:43:11,200 Speaker 1: with the bylines of people who aren't actually creating that work. Clearly, 707 00:43:11,680 --> 00:43:15,520 Speaker 1: this is unethical. Like, there's no question this is unethical. 708 00:43:15,800 --> 00:43:17,960 Speaker 1: If you were one of those writers and you started 709 00:43:17,960 --> 00:43:20,920 Speaker 1: to see junk getting pushed out with your name attached 710 00:43:20,920 --> 00:43:23,560 Speaker 1: to it, that would be bad, because imagine that you're 711 00:43:23,600 --> 00:43:26,880 Speaker 1: applying for a job and your potential employer has researched 712 00:43:26,920 --> 00:43:30,600 Speaker 1: you and just found all these articles that are just terrible. 713 00:43:30,680 --> 00:43:33,799 Speaker 1: They're subpar, and they have your name attached to them, 714 00:43:34,000 --> 00:43:37,040 Speaker 1: but they weren't written by you. That would be really awful. 715 00:43:37,280 --> 00:43:39,759 Speaker 1: As someone who used to write for a website, you know, 716 00:43:39,760 --> 00:43:42,760 Speaker 1: a website that notably dumped its editorial staff in favor 717 00:43:42,800 --> 00:43:46,200 Speaker 1: of AI-generated pieces, I definitely have concerns about this 718 00:43:46,320 --> 00:43:48,600 Speaker 1: kind of thing. So yeah, I think it's well worth 719 00:43:48,640 --> 00:43:51,760 Speaker 1: a read. Next up, there's a piece by Sky Jacobs 720 00:43:51,800 --> 00:43:54,319 Speaker 1: of TechSpot. It is titled "Why you should be 721 00:43:54,360 --> 00:43:59,319 Speaker 1: suspicious of that verified Amazon customer review." It examines something 722 00:43:59,360 --> 00:44:01,839 Speaker 1: I think a lot of people had already suspected: that 723 00:44:02,000 --> 00:44:05,080 Speaker 1: those reviews you see on Amazon listings, even from supposedly 724 00:44:05,160 --> 00:44:09,120 Speaker 1: trusted consumers, are sometimes just bought and paid for by 725 00:44:09,200 --> 00:44:12,760 Speaker 1: various merchants, meaning you can't trust the reviews, which strikes 726 00:44:12,840 --> 00:44:16,280 Speaker 1: right at the heart of one of Amazon's most valuable contributions. 727 00:44:16,280 --> 00:44:18,960 Speaker 1: In fact, I think customer reviews are part of what 728 00:44:19,120 --> 00:44:22,480 Speaker 1: helped Amazon navigate the dot-com bubble crisis in the 729 00:44:22,520 --> 00:44:26,600 Speaker 1: early two thousands. Anyway, both of those articles are well 730 00:44:26,640 --> 00:44:29,760 Speaker 1: worth your time, and you should check those out. That's 731 00:44:29,800 --> 00:44:32,440 Speaker 1: it for the news this week. I'll chat with you 732 00:44:32,480 --> 00:44:37,120 Speaker 1: next week, probably about large language model collapse and how 733 00:44:37,239 --> 00:44:41,319 Speaker 1: AI training itself on other AI ultimately leads to a 734 00:44:41,880 --> 00:44:46,600 Speaker 1: desolate landscape of garbage content. That's fun to talk about.
735 00:44:46,719 --> 00:44:48,480 Speaker 1: We'll chat about that next week, as well as some 736 00:44:48,520 --> 00:44:51,640 Speaker 1: other stuff. And I hope you have a fantastic weekend, 737 00:44:51,880 --> 00:45:01,160 Speaker 1: and I'll talk to you again really soon. Tech Stuff 738 00:45:01,239 --> 00:45:05,800 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 739 00:45:05,800 --> 00:45:09,360 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 740 00:45:09,400 --> 00:45:10,360 Speaker 1: your favorite shows.