Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer at iHeart Podcasts. And how the tech are you? It's time for the tech news for the week ending July nineteenth, twenty twenty four.

Speaker 1: The big story I have this week is how a product update for software called Falcon Sensor led to worldwide operating system failures, which in turn meant that important businesses, airlines for example, had to shut down or at least delay operations while folks tried to fix the problem. The heart of the matter lies with a cybersecurity company called CrowdStrike. CrowdStrike pushed out a content update to customers for the Falcon Sensor product, and Microsoft is such a customer. This update went out to Windows virtual machines, many of which handle mission critical operations for big customers, including some major companies all over the world. The update locked the machines into a recovery boot loop, which means the machines never really loaded in. They would get to a certain point of booting up, then they would reboot, and they would just keep doing this. And that's when the failures began. Here in the United States, Delta Airlines, American Airlines, and United Airlines all grounded their flights because their systems effectively went down. In other parts of the world, banks were affected, which severely slowed down operations there. One stock trader called it the mother of all global market outages. In the UK, Sky News went off the air; they were unable to broadcast. In Australia, emergency call centers reported disruptions due to this issue, which is pretty darn scary. Everyone was plagued by the dreaded blue screen of death.

Speaker 1: George Kurtz, CEO of CrowdStrike, took to X, formerly known as Twitter, and posted, quote, CrowdStrike is actively working with customers impacted by a defect found in a single content update for Windows hosts. Mac and Linux hosts are not impacted. This is not a security incident or cyberattack. The issue has been identified, isolated, and a fix has been deployed. End quote. Now, deploying a fix is great. However, it does not magically stop the already affected machines from being stuck in that reboot loop cycle. To do that, IT admins will actually have to stop the reboot cycle themselves. They'll have to boot into, essentially, safe mode, then navigate to the CrowdStrike file directory and delete the appropriate file manually.
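(For the IT folks following along, here's a minimal sketch of that manual cleanup step in Python, run from safe mode with admin rights. The directory path and the C-00000291*.sys channel file pattern come from widely circulated remediation guidance, not from this episode, so treat them as assumptions.)

    # Minimal sketch of the manual CrowdStrike cleanup, assuming the
    # widely reported details: boot into Safe Mode, then remove the
    # faulty channel file(s) from the Falcon Sensor driver directory.
    from pathlib import Path

    CROWDSTRIKE_DIR = Path(r"C:\Windows\System32\drivers\CrowdStrike")
    BAD_PATTERN = "C-00000291*.sys"  # reported faulty file pattern (assumption)

    def delete_bad_channel_files(dry_run: bool = True) -> None:
        # List matching files first; only delete when dry_run is False.
        for f in CROWDSTRIKE_DIR.glob(BAD_PATTERN):
            print(("would delete" if dry_run else "deleting"), f)
            if not dry_run:
                f.unlink()

    if __name__ == "__main__":
        delete_bad_channel_files(dry_run=True)  # flip to False to actually delete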
Speaker 1: And depending on the IT situation at each business, that could be a fairly easy thing to do or a not-so-easy thing to do. It's definitely a time-consuming thing to do, and in the meantime, a lot of companies are unable to function properly, and some can't really function at all. This mistake really points out how reliant the world is on a limited number of service providers, and I would argue it also reinforces to hackers the potential impact of targeting the supply chain, because it lets you hit a huge number of targets simultaneously and disrupt them. Though again, in this case, we're not talking about a hacker attack. This was just a mistake that CrowdStrike made, one that has affected a lot of people all around the world. And I bet there are going to be some uncomfortable meetings about it at CrowdStrike headquarters today. So yeah, yikes.

Speaker 1: Another obviously huge story this week was the assassination attempt on the life of former president and now GOP nominee Donald Trump. But this is not a politics or crime podcast, so why would I bring it up? Well, that's because the FBI took possession of the would-be assassin's phone and, within a couple of days, announced that they had successfully accessed the phone's contents. That raised a lot of questions in the tech community, as there is an ongoing concern in tech about the extent to which your privacy and security can be ensured by the hardware that you use. So if someone were to get hold of your device, would they be able to access it without your direct aid?
Speaker 1: And how did the FBI manage this? Well, the phone in question is a Samsung device, and it was locked when the FBI received it. Bloomberg reported that the FBI initially couldn't crack the phone's security, but then they worked with an Israeli digital intelligence company called Cellebrite to get access. Earlier this week, I saw online speculation that the at-the-time-unnamed Israeli firm was actually the NSO Group. That's the group best known for producing the spyware-on-steroids product called Pegasus. But no, these are two separate cyber intelligence companies. According to Bloomberg, Cellebrite used new software that's actually still in development, meaning it's not offered as a product yet, to crack the phone's security and give the FBI access to it. A piece in 9to5Mac points out that a data leak from within Cellebrite revealed its software is currently unable to crack iPhone security, at least on any devices running iOS seventeen point four or later. So this just shows a gap between its ability to crack security on iOS platforms versus Android platforms. And yeah, this is an ongoing discussion in the tech community: which operating system provides better security? Can you expect to keep your stuff on lock even if it were to fall into the hands of, say, the FBI? There are a lot of conversations in tech that follow from that, which is why I covered the story.

Speaker 1: On Thursday, OpenAI unveiled a slimmed-down AI large language model, so I guess a not-so-large language model, or LLM, and it's called, fittingly enough, GPT-4o mini. The o in this case is the lowercase letter o, not the number zero, or the numeral zero, or however you want to call it. I'm not a mathematician. Further, this model is slated to take over from GPT-3.5 Turbo for the purposes of powering ChatGPT, so it will be the language model behind the chatbot. It's currently available to consumer ChatGPT users, and it will spread to enterprise customers next week.
Speaker 1: For now, it's similar to earlier large language models, but OpenAI says this one will eventually be able to analyze and interpret images as well as generate its own, so not just text. Plus, it will also be able to analyze audio. It will not, however, have access to the latest breaking news; it will have a knowledge cutoff of October twenty twenty three. So what's the benefit of moving to a smaller large language model? Well, the big one is cost. They are less expensive to operate and thus less expensive for customers. They're still fairly robust, though they aren't as capable of performing in-depth analysis as their bigger siblings. They're good for smaller and more niche-oriented tasks, so the sort of stuff you would probably encounter in your typical apps that tap into generative AI. Because your apps usually aren't do-everything apps; they usually are keyed to a specific subset of features. So, for example, let's say you've made an app that is essentially a dynamic to-do list. Well, you won't need access to a massive large language model to handle any AI component of this implementation, right? That would be overkill. You don't need something that's great at everything, or even passable at everything. You need something that's good at, you know, a very small set of tasks. But on the flip side, if you did need some heavy lifting on the AI front, smaller models aren't necessarily going to be as good of a fit, because they just won't be capable of performing that in-depth analysis. You would end up spending more time, and possibly more money, to achieve the results you actually wanted. So there are trade-offs.
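(To make that concrete, here's a minimal sketch of wiring a small, niche task, like that to-do list app, to the smaller model. It assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment; the prompt itself is just illustrative, not from the episode.)

    # A small, niche task routed to the smaller model: tidying rough
    # notes into a to-do list. Assumes the OpenAI Python SDK.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # the slimmed-down model discussed above
        messages=[
            {"role": "system", "content": "Turn the user's rough notes into a tidy to-do list."},
            {"role": "user", "content": "dentist tuesday, buy cat food, finish news script"},
        ],
    )
    print(response.choices[0].message.content)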
Speaker 1: Plus, you are also still beholden to whatever material was used to train the model in the first place. That doesn't change with the size. Whether it's a small model or a large model, the training material needs to be good. If the training material is garbage, that's what you're going to get on the other end, right? Garbage in, garbage out. I talked about this earlier this week on Tech Stuff, about how bad training material leads to bad AI performance, and there's a real fear that AI models of the future are going to be training off of AI-generated content from today, and that will lead to what is called model collapse. That's a real thing. Listen to earlier episodes this week to learn more about that.

Speaker 1: Ars Technica's Ashley Belanger has an article titled Elon Musk's X may succeed in blocking California content moderation law on appeal, and I would like to touch on this. The matter at hand concerns a California law called AB 587, and it is concerned with the terms of service for social media companies. Specifically, the bill says that social media companies must make available their terms of service. They must define clearly what is and is not permitted under those terms, and then create a record of any and all cases in which users or their content are subjected to action from the company for violation of those terms, such as when a company deletes a post or bans a user, as well as a record of exactly what violation occurred. X's legal team argues that the law violates the company's First Amendment right to conduct content moderation: it forces the company to comment on specific controversial subjects, which they say is against the First Amendment, and to definitively state why someone or something was banned, which also sends the message that anything allowed to stay on the platform is permitted. In X's framing, the law is really a move to create backlash against X, with the State of California determined to just stir stuff up, so to speak, by making X essentially say, well, yeah, what this person said was awful, but it's not against our terms of service, or, we don't like what this person said, so we banned them.
Speaker 1: Belanger points out that the appeals court seems at least partly sympathetic to X, and I can understand why. I might not agree with stuff going on over at X, and in fact, I disagree a lot with stuff going on over at X, but you do get into a pretty thorny issue when a company has to first define what is and isn't acceptable. As one judge pointed out, X's argument seems to be that the State of California is trying to compel X to express company views on controversial subjects that range from stuff like hate speech to foreign political interference, and compelling someone to speak does seem like it is running afoul of First Amendment protections. So does this mean the appeals court will ultimately find in favor of X? Possibly, but it's not a done deal yet. Okay, I've got a few more news stories to cover. Before we get to those, let's take a quick break.

Speaker 1: We're back. So, in less serious but still irritating news, Netflix is increasing the cost for subscribers who want an ad-free experience on the streaming platform by getting rid of the existing lowest tier in that range. So the monthly fee for the lowest ad-free tier, which limited users to viewing Netflix on just one device at a time, as well as viewing content at a maximum resolution of 720p, has been twelve dollars a month. But now that tier is being phased out, and current customers at that level will get shifted to a different tier, one that's less expensive per month, but it is ad-supported. If they want to move to an ad-free version, they'll have to go up to fifteen dollars and forty-nine cents per month to do it. Now, this is technically a different tier, a different plan. It supports up to two devices and also a resolution of 1080p.
Speaker 1: Plus, customers at this tier are allowed to download content to a local device. So, like, if you're going on a plane or something and you're not going to have access to Wi-Fi, you download the episode or whatever to your phone and you can watch it then. According to Scharon Harding of Ars Technica, Netflix's strategy is to get more people to move to this lower, ad-supported tier, because ads are more lucrative as a revenue generator than subscriptions alone. So their hope is not to convince people to spend more per month and move to the new lowest ad-free tier, but rather to have them be content with the fact that they're also going to have to watch some ads, because that is a more lucrative revenue generation model. Of course, Netflix isn't the only streaming service that has recently bumped up prices. That's been true across the board. We've actually seen, you know, some streaming platforms bump up prices a couple of times in the span of a year. Heck, I just got an email earlier this week about how my Peacock subscription is getting more expensive, and I'm wondering, should I even bother keeping it? I mean, I only use it to watch things super rarely. But I think this also has to be expected, because obviously, early on in the streaming days, the game was all about getting as large a subscriber base as you possibly could, even if that meant you were losing money in the process. Like, you had these very attractive introductory offers, you had a lot of original material. But you can't operate like that forever. You know, you do have to find ways to make a profit, or at least to keep shareholders happy, since that's really what it's all about these days. And that means you can't just give stuff away for very low amounts of money. You have to figure out a method to cover the costs of operation at the very least, and preferably make a profit.
Speaker 1: Jess Weatherbed of The Verge has a piece about how Google is cleaning house on the app front. The company says it will be cracking down on apps that don't live up to Google's Play Store standards. So apps that have extremely limited functionality or content, and thus are seen as low-value apps, could find themselves on the chopping block. Let's say it's an app and all it is is, like, a wallpaper. That's it. That's the whole app. It's this one little thing of wallpaper. That's the sort of thing that Google says will not survive this change. Google already has policies in place for apps that are downright malicious or that introduce security vulnerabilities to hardware, which is a no-brainer. But this new rule says not only must your app be safe, it also has to be, you know, kind of good. The new rule takes effect at the end of August.

Speaker 1: There's a famous saying that is often attributed to Dostoevsky, and it goes something like this: the degree of civilization in a society can be judged by entering its prisons. Now, before I go any further, I should add that Princeton University's Russian literature professor Ilya Vinitsky has written extensively about how this quote is not, in fact, from Dostoevsky, and is actually in conflict with Dostoevsky's actual words on the subject of prisons. It's a great read, by the way. I love reading experts who are saying, yeah, everyone says this quote came from this place, but it really doesn't. But wherever the quote does come from, it has been repeated and reinforced by lots of others, including Nelson Mandela. And while the origins of the quote are perhaps unknown, and maybe even unknowable, I do think the intent behind the quote has merit, no matter who said it first.
271 00:15:55,920 --> 00:15:57,520 Speaker 1: Which is a long way to go to introduce this 272 00:15:57,600 --> 00:16:00,200 Speaker 1: next story, which is that the FCC has no now 273 00:16:00,240 --> 00:16:03,120 Speaker 1: closed a loophole that existed here in the United States 274 00:16:03,320 --> 00:16:08,160 Speaker 1: that allowed the various prison telephone service operators in this 275 00:16:08,240 --> 00:16:12,320 Speaker 1: country to place really high telephone rates on prisoners who 276 00:16:12,360 --> 00:16:16,160 Speaker 1: wanted to make calls, specifically intra state calls like within 277 00:16:16,360 --> 00:16:19,840 Speaker 1: a state. And most prisoners, like their families, live in 278 00:16:19,840 --> 00:16:23,160 Speaker 1: the same state that the prison that they're in is in, 279 00:16:23,440 --> 00:16:25,720 Speaker 1: so most of the calls they want to make are 280 00:16:25,760 --> 00:16:29,320 Speaker 1: within the state. And for a long time, those rates 281 00:16:29,800 --> 00:16:35,200 Speaker 1: were pretty darn high. And like a fifteen minute phone 282 00:16:35,240 --> 00:16:39,240 Speaker 1: call might cost a prisoner eleven dollars thirty five cents. Well, 283 00:16:39,480 --> 00:16:41,440 Speaker 1: that doesn't sound like a lot maybe to you, But 284 00:16:41,480 --> 00:16:44,160 Speaker 1: then you've got to remember these prisoners are making pennies 285 00:16:44,160 --> 00:16:47,840 Speaker 1: on the dollar in prison. Like there's a whole conversation 286 00:16:47,920 --> 00:16:51,760 Speaker 1: to be had about what ultimately amounts to slave labor 287 00:16:51,800 --> 00:16:54,080 Speaker 1: and the prison system of the United States. But anyway, 288 00:16:54,480 --> 00:16:57,880 Speaker 1: now that eleven dollars thirty five cents is going to 289 00:16:57,880 --> 00:17:00,640 Speaker 1: be closer to ninety cents at least that's what it 290 00:17:00,680 --> 00:17:03,400 Speaker 1: would be for very large prisons, like prisons that have 291 00:17:03,840 --> 00:17:07,199 Speaker 1: at least one thousand inmates, if not more. This is 292 00:17:07,240 --> 00:17:10,919 Speaker 1: significant because a previous challenge to the FCC's attempts to 293 00:17:11,080 --> 00:17:15,040 Speaker 1: curb telephone rates in prisons was defeated when the companies 294 00:17:15,040 --> 00:17:18,480 Speaker 1: that are actually providing these services argued that the FCC 295 00:17:18,640 --> 00:17:22,000 Speaker 1: lacked the authority to impose price caps in the first place. 296 00:17:22,240 --> 00:17:25,720 Speaker 1: But President Biden signed into law a bill that grants 297 00:17:25,800 --> 00:17:29,280 Speaker 1: such powers to the FCC. So now the FCC is saying, well, 298 00:17:29,480 --> 00:17:33,359 Speaker 1: Congress has granted us that authority. Now, so now you 299 00:17:33,520 --> 00:17:36,600 Speaker 1: have to reiin in your prices, and we've already analyzed 300 00:17:36,640 --> 00:17:39,360 Speaker 1: the numbers. We can prove that, based upon the information 301 00:17:39,600 --> 00:17:42,920 Speaker 1: provided by you, you'll still make a profit and cover 302 00:17:43,000 --> 00:17:45,879 Speaker 1: all costs of operations even with these lower price caps. 303 00:17:46,280 --> 00:17:49,800 Speaker 1: So there, so that's a good story. 
Speaker 1: This is significant because a previous challenge to the FCC's attempts to curb telephone rates in prisons was defeated when the companies actually providing these services argued that the FCC lacked the authority to impose price caps in the first place. But President Biden signed into law a bill that grants such powers to the FCC. So now the FCC is saying, well, Congress has granted us that authority, so now you have to rein in your prices. And we've already analyzed the numbers; we can prove that, based upon the information provided by you, you'll still make a profit and cover all costs of operations even with these lower price caps. So there. So that's a good story. I think it shows how, if you don't have regulations, there's no check-and-balance system for the most vulnerable populations, right? Powerful populations, ones that have a lot of sway, can push back, and that's substantial. But when you're talking about prisoners, who are arguably the most vulnerable in society, or at least among the most vulnerable populations in society, they have no political clout. So obviously, if any company decides it's going to price gouge, there are no checks and balances there. Now the FCC is saying, well, we've been granted that authority; you can't do it anymore. We'll see where it goes from here.

Speaker 1: To round out the show, for recommended reading this week I've got a couple of space-related pieces from Ars Technica, both by Stephen Clark, and I think these are worth your time. First up is his Rocket Report, which is a weekly roundup of space news, and it's incredible. Like, I don't know where he gets the time to do all this, because he does a great job. You can learn about how a space startup recently lost its CEO due to an inappropriate relationship with an employee of the company. You can learn about how SpaceX's Falcon 9 rockets are temporarily grounded after the second stage of a launch vehicle failed to deliver twenty Starlink internet satellites into their proper orbit last week. That being said, the Falcon 9 actually remains the most reliable launch vehicle in history. It has a ninety-nine point seven percent success rate, and this was its first failure in years, so it's an amazing piece of technology, and the FAA is likely to lift that grounded status within the next day or so. There are also lots more stories in that roundup. Next up is another piece by Stephen Clark at Ars Technica.
Speaker 1: It's titled NASA built a moon rover, but can't afford to get it to the launch pad, and it's the sad tale of VIPER, a project that had already seen four hundred and fifty million dollars of investment. But it sounds like the various delays in the project and the cost increases reached a point where NASA had to make a decision: they had to either cut the project, or they would have to cancel a bunch of other projects in order to find money in the budget to get it across the finish line. And last is a piece in The Guardian by Charlotte Thal titled Doomscrolling linked to existential anxiety, distrust, suspicion, and despair, study finds. I think Thal did a really good job pointing out that the study had some limitations. It does seem to confirm a lot of common-sense ideas about the nature of doomscrolling, like it's going to just sort of be, yeah, of course. But she also points out that, due to the scope and nature of the study, it is perhaps best viewed as a good reason to research the matter more thoroughly, rather than as something to draw conclusions from in itself. In other words, be careful not to fall victim to confirmation bias. And that's it for the tech news for the week ending July nineteenth, twenty twenty four. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.