Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on August ninth, twenty twenty four, and this week Google got some bad news. A US judge has ruled that Google violated antitrust laws by spending, let's see, says here, truckloads of cash in an effort to establish its search engine dominance in the global market. Google has invested billions of dollars forging partnerships with various other companies, like Apple, to make the Google search engine the default search tool on numerous devices and apps and sites. This, in turn, gave Google a near monopoly on search advertising. According to the US government, this means that Google could pretty much set whatever rules it liked without fear of competition. The ruling means that now the US government can pursue another case in an effort to rectify this situation. So this case just said, yes, Google is guilty of having done this thing.
The next one will be: this is what we're going to do about it. But how do you solve a problem like Google? That's a darn good question. I guess you could google it. But one possibility is that the government could look to break Google, or its parent company Alphabet, into smaller entities. Now, that's something that hasn't really been done in the United States in many years. But even if it does turn out to be the focus, it will likely be a long time before anything like that happens on a substantial level. The legal process is likely to stretch on for more years, and there's every reason to suspect that this case will eventually make its way up the appeals process, possibly all the way to the US Supreme Court at some point. So this story is far from over. Now, for one thing, there's going to be a ripple of massive consequences following this initial Google ruling. One massive deal that is likely at risk is Google's agreement with the aforementioned Apple. Apple has Google as its default search engine on iOS devices, and in return, Google pays Apple a cool twenty billion dollars every year.
That's more than a third of what Google makes as a result of having it set as the default search for all of these devices. So Apple could see a drop of up to six percent in its annual profits, because that twenty billion wouldn't be coming in anymore. That's not likely to make Apple investors very happy. According to Reuters, analysts expect Apple to fast-track adopting AI-powered search tools, and it was likely already doing this before the decision was even handed down. One thing that isn't likely to happen is Apple switching to Microsoft Bing. According to Eddy Cue, Apple's senior VP of Services, quote, "I don't believe there's a price in the world that Microsoft could offer us. They offered to give us Bing for free. They could give us the whole company," end quote. So I think that suggests Apple's likely going to stick with Google even without the twenty billion dollar deal, at least until the company's ready to switch to some AI-powered search solution.

Speaker 1: Over at Ars Technica, there's an article titled "Google and Meta ignored their own rules in secret teen-targeting ad deals."
The piece, in turn, cites the Financial Times. In fact, I think the article originally appeared in the Financial Times. It reports on documents showing that Meta and Google partnered on a project in which Google helped Meta design a marketing campaign aimed at teens on YouTube. The campaign promoted Meta's Instagram app and features. The documents apparently lay out something that violates Google's own rules. Namely, the campaign was at least indirectly targeting users who were under the age of eighteen. Now, Google has pledged not to target such users with specific advertising. If you are under the age of eighteen, you are not supposed to receive personalized targeted ads on Google platforms. So this is a problem. Now, according to Ars Technica, the campaign began late last year. Meta was in a tight spot because research was showing that the company was having problems attracting young folks to its various platforms. Google was looking to make a lot more ad cash. So these were two great tastes that go great together, I guess. The Financial Times investigated this.
The two companies ran a pilot campaign in Canada and then later began to roll out a similar campaign in the United States. And when the Financial Times contacted Google about the story, Google said it would investigate the matter and then subsequently shut the project down, while simultaneously not actually admitting fault. So the workaround for Google was that this campaign was technically targeting "unknown" accounts, as in accounts where the age of the user wasn't actually recorded. That's how Google could say, we aren't targeting people under the age of eighteen, at least we're not doing it knowingly, because we have no idea how old these users are. But it seems like the users were actually skewing young. That is, yes, technically there was a field for a user's age and that field was blank, but in reality you could easily conclude, or at least reasonably conclude, how old the users were, and that they were skewing young, as in under the age of eighteen. So you could say maybe this was Google's argument, like, ah, yeah, you know, it's plausible deniability, right? We don't know how old they are, so it's a loophole.
Speaker 1: To learn more about the story, I would recommend checking out the article in Ars Technica or the Financial Times.

Speaker 1: Google's also facing some trouble in Russia, where Internet monitoring companies detected a mass outage of YouTube this week. According to one such company, called Sboi dot RF, many Russian citizens reported being unable to access YouTube unless they first made use of a VPN, or virtual private network. Google representatives have said that the interruptions in service have not come from the company, whereas a lot of Russian media says that it's YouTube's fault. The Russian government has become increasingly hostile toward YouTube for what it claims is a determined stance against Russian legislation. It wasn't long ago that an information policy representative in Russia warned that YouTube could face a reduction in transmission speeds of up to seventy percent because of the company's refusal to allow certain Russian state-backed channels to post to the site.
So I don't have proof that this is a case where the Russian government has purposefully throttled YouTube traffic in Russia, but it at least seems like that's a plausible reason for the outages. Not confirmed, just plausible.

Speaker 1: The Global Alliance for Responsible Media, or GARM, has shut down after being served an antitrust lawsuit from X, formerly known as Twitter. This group was spun off of the World Federation of Advertisers, and its purpose was to protect the interests of member advertising companies. The primary focus was to make certain that ads on platforms weren't going to be served next to objectionable material. You know, advertisers don't like it if their ad is showing up next to someone who's making hate comments, or something along those lines. X alleges that GARM told its members not to advertise on X due to the platform's policies, and it saw these policies as being increasingly hostile toward the platform. GARM staffers received an email that essentially said the organization was folding because it didn't have enough money to both carry out its goals and defend itself from this lawsuit.
Speaker 1: I think most folks assume the lawsuit doesn't actually have that much merit to it, because boycotting isn't against the law. That's not breaking the law. It's not against the law to build associations, either. These things are generally protected by constitutional amendments, you know, the right to free speech and the right to free assembly. But then, it could be that Musk is using the law not because he's seeking justice, but rather to persecute those he finds to be vexing, because lawsuits are expensive. At any rate, Musk has continued to make statements that I think most ad companies would at least find concerning, so I'm not expecting ad dollars to flood the corporate coffers now that GARM has disbanded. And GARM wasn't the only organization targeted by a Musk lawsuit in recent days. He also brought a case against OpenAI and its CEO, Sam Altman. This lawsuit argues that OpenAI has put profit above public good, which, you know, I actually don't disagree with. I think OpenAI has put profit above public good, but I don't know where the legal issue is in all of that.
Speaker 1: Musk seeks to nullify OpenAI's partnership with Microsoft. That's a partnership that has seen the giant Microsoft invest thirteen billion dollars into OpenAI. This could also be seen as an effort to just outright destroy OpenAI, which, according to some analysts, is on track to lose around five billion dollars this year, as revenues haven't come close to covering the costs of operation. Plus, we have to remember Musk has his own competing AI company that's in the process of ramping up, xAI, so there's that too. This lawsuit, by the way, mirrors one that Musk brought against OpenAI earlier this year. But in that case, Musk dropped the lawsuit one day before a judge was to hear a motion to dismiss that OpenAI had brought against it. I suspect we're going to get another motion to dismiss in this case, because I don't think the details have really changed since the last time this came around. And that includes a record of messages that Elon Musk himself sent back when he was still part of OpenAI that showed his own desire to focus on profitability.
Of course, in Musk's version of this plan, the profitability would have served the interests of his company, Tesla. Maybe that's the real sticking point here. Okay, we've got more tech news to cover. Before we get to all that, let's take a quick break.

Speaker 1: And you know, there are a lot of stories about antitrust issues and technology this week. Over in the UK, the concern there is about Amazon and the company's four billion dollar partnership with AI startup Anthropic. The Competition and Markets Authority, or CMA, is probing whether this partnership will potentially further entrench Amazon in the realm of cloud computing. So essentially, the fear is that Amazon dominates the cloud compute space, and that people who are developing AI models may feel they have no option other than to rely on Amazon for their computational power needs. So the CMA is looking into the possibility that this partnership between Amazon and Anthropic would really qualify more as a merger.
Maybe not a merger in the formal sense of two companies actually merging into a single entity, but a business process that ultimately results in an outcome that is similar to that of a merger. Now, that particular investigation is still ongoing, so this is not a foregone conclusion. The CMA says it expects to probe this matter until October fourth, at which point the CMA will announce whether there's enough concern to block this deal. Or maybe they'll say, no, no, there's nothing here that rises to the level of making it a merger situation. Or maybe it is a merger situation, but it doesn't have enough red flags to make us block it in the first place. So it's still entirely possible that the CMA will allow this deal to continue without impeding it in any way. However, the CMA can be pretty darn strict with tech companies, and I don't necessarily think that's always a bad thing, but I have no clue which way they're going to decide this matter.
Speaker 1: If you'd like to read up on it, I highly recommend Ashley Belanger's article in Ars Technica titled "Amazon defends four billion dollar Anthropic AI deal from UK monopoly concerns."

Speaker 1: Reuters reports that OpenAI co-founder John Schulman has left that company and has now joined the aforementioned Anthropic. He posted on X that, quote, "this choice stems from my desire to deepen my focus on AI alignment, and to start a new chapter of my career where I can return to hands-on technical work," end quote. Schulman is one of several influential key people at OpenAI who have either recently stepped away from that company or had their roles changed significantly in the recent past. Now, could this be an indicator that OpenAI is actually in some trouble? Is it possible the company expanded too quickly and cannot sustain itself? Or is this just coincidence, and just a sign that once you reach a certain level within an organization, you can pretty much take any gig you like? Beats me. It could be any of those, or something else entirely.
I know that with all the other OpenAI news, it becomes tempting to say, oh, OpenAI is starting the long process of circling the drain. I think it's way too early to say something like that, but I also, admittedly, am not an expert on those matters.

Speaker 1: The US Commodity Futures Trading Commission, or CFTC, has ruled that the bankrupt cryptocurrency exchange FTX must pay twelve point seven billion dollars in relief to former customers. Now, that might actually sound like old news to you, because earlier this year, FTX's CEO slash fire-sale guru, John J. Ray the Third, had already assured various stakeholders that he was able to claw back all the money that had been lost in the FTX blowout, I guess. But this new development should really make former FTX customers feel a little better, because it represents a US agency guaranteeing that those who were directly affected by FTX's spectacular implosion will be compensated before any of the money goes to pay off anything else. There is also an ongoing lawsuit against FTX from the US government.
This is a good thing for those folks, because the lawsuit could potentially mean that FTX would have to dip into its assets to pay off fines and whatnot. But the decision from the CFTC says that they can't touch any of the money that's meant to go back to the FTX customers. That money is for those customers, and they're going to receive their compensation. There are still some hurdles to clear, of course. There are some folks who feel they should actually be paid based on the current value of crypto, as opposed to where that value was in November twenty twenty two, when everything went pear-shaped. So yeah, not over yet, but still good news for folks who thought they might have lost everything when FTX went belly up.

Speaker 1: Now let's wrap this up with some space news, and it's really bad news for Boeing. Anyway, that company is already fighting a public perception war on multiple fronts at this point. I mean, you've obviously got the seven thirty seven issues, but you've also got the Starliner.
We're going to get to that in just a moment, but for right now, I actually want to mention a report from NASA that calls into question Boeing's quality control capabilities, particularly with regard to the Exploration Upper Stage section of NASA's SLS launch vehicle. So this is a piece of a much bigger project that includes NASA's planned mission to return astronauts to the Moon. But according to NASA's report, the agency found quote, "an array of issues that could hinder SLS Block 1B's readiness for Artemis four, including Boeing's inadequate quality management system, escalating costs and schedules, and inadequate visibility into the Block 1B's projected costs," end quote. So, in other words, NASA says Boeing is not doing a good job, they're going way over budget, and they're not very transparent about either of those things. Further, NASA alleges that Boeing's workforce has quote, "insufficient aerospace production experience," end quote. Again, according to this report, it is this lack of experience that NASA cites as the cause of the quality control issues. This is not good news.
To learn more about this, I recommend Eric Berger's piece, "A new report finds Boeing's rockets are built with an unqualified workforce." You can find that in Ars Technica. And of course, we're not done with Boeing woes just yet. So you may recall that astronauts Suni Williams and Butch Wilmore piloted a Boeing Starliner capsule to the International Space Station in early June of twenty twenty four, and were able to dock despite some technical issues. One of those technical issues was a helium leak, which, while concerning, wasn't a catastrophic event. Helium is not toxic, and it's not flammable. You can asphyxiate if that's all you have to breathe, but otherwise, it was seen as a concern but not a mission-critical issue. But another, bigger issue was that five of the twenty-eight thrusters on the space capsule became unresponsive on approach to the ISS, and NASA and the astronauts had to do some troubleshooting to get enough of them working again to proceed with docking, which they were able to do.
But since then, NASA has been working on ways to address the problems with the Starliner so that these two astronauts can return to Earth. They were meant to stay aboard the ISS for a little more than a week, but now it's been more than two months and they're still there. NASA has indicated that they might stay up there for the rest of twenty twenty four, and the agency hasn't ruled out the possibility that when they do come back, it will not be aboard the Starliner, but instead a SpaceX Crew Dragon capsule. Now, those decisions have yet to be made; nothing is official yet. Before the Starliner issue, plans called for another Crew Dragon capsule to launch on August eighteenth to go to the ISS, but that has now been pushed back to September twenty fourth at the earliest. So what would a change of plans look like if NASA says, you know what, we don't feel comfortable putting them aboard the Starliner? Well, if NASA does feel the Starliner poses too great a risk, the likely solution is to have the Starliner separate from the ISS and return to Earth with no crew aboard the capsule.
A Crew Dragon capsule with just two astronauts aboard would then join the ISS and relieve both the Starliner crew and two other ISS crew members, but that might not even happen until next year. So why would there be such a long delay? Well, a big part of that is that we still aren't really sure what went wrong with those thrusters. Engineers on Earth tested an identical thruster and they found some problems. But then subsequent tests on the Starliner that's in orbit showed that its thrusters that had previously been unresponsive were now working again, and potentially that means the issues that were seen on Earth are different from the ones that affected the Starliner in orbit. So there's a lot we don't know, and figuring it out remains a top priority.

Speaker 1: I mentioned a couple of articles worth reading this week, but one more is Rebecca Jennings' article on Vox.com titled "Those Olympics AI ads feel bad for a reason," and I highly recommend you check it out.
It really does a good job of talking about missing the mark when it comes to trying to promote AI, because it seems like it's promoting AI to do things like take on emotional loads on our behalf so that we can be, I don't know, soulless consumers. Doesn't seem that great to me. Highly recommend checking that out, and that's it for this week. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.