Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending Friday, June fourteenth, twenty twenty four.

Speaker 1: For all you gamers out there, this past weekend and the following week were an absolute smorgasbord of video game trailers, as well as commercials for gaming hardware, though no new consoles, I'm sad to say. So while E3 is really most sincerely dead, we had Geoff Keighley's Summer Game Fest, we had the Ubisoft Forward event, the PC Gaming Show, and the Xbox Showcase, and I'm probably missing a couple in there. Soon we're going to have a Nintendo Direct; that has not yet happened. In fact, last I checked, Nintendo had not even mentioned what date it's going to happen on, but most people are expecting it next week. Anyway, tons of games have been announced, though not very many in the category of triple-A titles, apart from some of the Ubisoft stuff, like an Assassin's Creed title, that kind of thing. It's been a rough few months for the games industry, largely because of layoffs and shutdowns. I'm specifically looking at you, Microsoft. But if you'd like to hear more about what was announced over the last few days, there are tons of resources. You could go to Polygon or IGN, just go to YouTube and do a search on these sorts of things, or you can listen to a show like The Besties podcast. And I don't have any connection with any of those, by the way; I don't work with them and I don't know them. I'm just a fan as well. So check those out. But now let's get to some tech news.

Speaker 1: Apple held its Worldwide Developers Conference, or WWDC, this week, and as I'm sure everybody anticipated, AI was foremost on the menu when it came to keynotes and presentations.
Speaker 1: CEO Tim Cook took the stage and showed off how an AI-boosted Siri could do more complex tasks with features like emails, messages, and even third-party apps. Not everyone is happy with Apple's recent decisions regarding AI; more on that in a second. But investors? They were happy as clams. Apple saw its stock price go up enough that the company bumped Microsoft off the top spot among the world's most valuable corporations. Now Apple once again stands triumphant as the most valuable company in the world, with a valuation of, good golly, like three point two nine trillion dollars.

Speaker 1: But back to the WWDC. Apple's version of AI stands for, of course, Apple Intelligence. Apple announced it was incorporating the GPT-4o large language model into its products so that the AI can assist with specific user needs. Now, this means that Apple has entered into a sort of partnership with OpenAI, which, you might recall, is also working very closely with Microsoft, because Microsoft initiated a ten billion dollar investment in OpenAI. According to Jess Weatherbed of The Verge, Apple isn't handing out bucketloads of cash the same way Microsoft is, but rather quote "believes the exposure it's giving ChatGPT is payment enough" end quote. That's kind of funny. I've always said, you know, you can't live off exposure, but you sure can die from it. Anyway, Weatherbed says that Apple is reportedly in talks with other companies that have AI models, and the plan appears to be that Apple users will actually have options when it comes to whichever AI overlord they want to use. So those who prefer Google's Gemini AI could use that instead, for example, assuming a deal between Apple and Google goes through. Apple set some expectations early in the AI talk, saying that these enhanced features will only be available on some hardware, mostly the newest and highest-end stuff, like the iPhone 15 Pro and iPhone 15 Pro Max models, or Macs and iPads that have at least an M1 or later chip in them.
Speaker 1: Also, it will only work if the language on the device is set to English. Apple's presentation largely focused on context of use, meaning that, say, you launch an app and then ask that some information sitting in a different app get incorporated into the one you're using; the AI would then handle what you want. So let me give you a hypothetical example. Let's say that I open up a messenger app and I want to send a link to a video that's in a different app. Like, I've got YouTube open in one of the other apps, and I want to send a link to the YouTube video I was just watching to a friend of mine in Messages. The AI would theoretically be able to suss all that out and save me the effort of figuring out how to do it myself, like going in through the share feature on YouTube or whatever to pull the link and post it to my friend. I could just do it very quickly by asking the assistant to do it.
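Apple didn't get into implementation details on stage, but conceptually a cross-app request like that looks like a language model planning calls against capabilities that individual apps expose to the system. Here's a minimal sketch in Python of that plumbing; to be clear, every name in it (the app hooks, the context store, the hard-coded planner) is invented for illustration, and none of it is an Apple API.

```python
# Hypothetical sketch only: imagined app hooks standing in for whatever
# context apps would expose to a system-level assistant. Not Apple APIs.

APP_CONTEXT = {
    "video_app": {"last_video_url": "https://youtu.be/example123"},
    "messenger": {"active_chat": "Alice"},
}

def get_last_watched_video():
    """Tool: fetch the most recently watched video from the video app."""
    return APP_CONTEXT["video_app"]["last_video_url"]

def send_message(recipient, body):
    """Tool: send a message through the messaging app."""
    return f"Sent to {recipient}: {body}"

TOOLS = {
    "get_last_watched_video": get_last_watched_video,
    "send_message": send_message,
}

def plan(request):
    """Stand-in for the on-device model: map a request to tool calls.
    A real assistant would have the LLM emit this plan; here it's hard-coded."""
    if "link" in request and "video" in request:
        return [
            ("get_last_watched_video", {}),
            ("send_message", {"recipient": APP_CONTEXT["messenger"]["active_chat"],
                              "body": "{step0}"}),
        ]
    return []

def run(request):
    results = []
    for name, args in plan(request):
        # Let a later step reference an earlier result via "{step0}".
        if results:
            args = {k: v.format(step0=results[0]) for k, v in args.items()}
        results.append(TOOLS[name](**args))
    return results[-1] if results else "Sorry, no plan for that."

print(run("send my friend a link to the video I was just watching"))
# -> Sent to Alice: https://youtu.be/example123
```

In a real assistant, the on-device model would generate the plan itself and presumably gate it behind user permission; the sketch only shows the shape of the routing.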
Speaker 1: Apple also showed off an AI image-generation feature called Genmoji, and as that name suggests, it will let users generate an emoji that otherwise does not exist, so that you can suitably express whatever it is you're feeling without, you know, having to use words. I mean, you do have to use words to generate the emoji in the first place, but then you won't be using words for whomever is on the receiving end of that custom-made emoji. I guess I don't actually understand any of this because I was an English major; I still think writing actual letters is cool. But anyway, Apple announced this and lots more at the WWDC this year, and it got developers and investors really excited.

Speaker 1: But it's not all good news for Apple. There's a growing antitrust lawsuit mounting against the company here in the United States. It's led by the US Department of Justice, and so far nineteen states as well as the District of Columbia have joined. Nevada, Massachusetts, Washington, and Indiana are the latest four states to get involved in this lawsuit. It pretty much follows an argument that we have heard in various forms for years now: Apple maintains a tight ecosystem, largely in an effort to maximize profits and to discourage competition. What's more, these lawsuits say that Apple pairs that strategy with a decision to sell iPhones at higher prices, which maximizes its profit margins, all the while forcing partners to pay hefty fees for the privilege of being allowed to play within Apple's sandbox. Apple has responded by saying, uh, no we don't, and plus, even if we did, that's not the definition of antitrust, because no court has ever said that. That could be a valid point, but, you know, the interesting thing about courts is that they can make new interpretations of old laws; that's kind of what they do. So we'll have to see how this develops further.

Speaker 1: Meanwhile, in Japan, Apple will have to allow third-party app stores and payment providers on iOS devices in order to comply with a new law passed by Japan's parliament. Google will also have to obey this law with its Android devices. And again, this follows a growing trend of regulators and other government officials cracking down on the walled-garden approach that both Apple and Google have used. The companies have long argued that their policies protect consumers by preventing malicious actors from tricking people into downloading malware or sharing personal information with the wrong party, a claim that I think is only partly valid, because both Google and Apple have a history of allowing third-party apps that later turned out to be malicious. But we're seeing more governments say: I'm not buying it. You do this so that you can maintain control over the ecosystem and then profit off of other people's work.
Speaker 1: By that, I mean Apple and Google both have a practice of collecting a percentage of every in-app transaction for most apps. So regulators are saying this is kind of like the mob skimming off the profits of businesses that the mob quote-unquote protects.

Speaker 1: Now, I mentioned earlier that one person in particular isn't happy about the Apple and OpenAI news, and that happens to be our favorite bonkers billionaire, Elon Musk. You know, some weeks I manage to do a whole news episode without talking about him at all, but today we've got a few stories that involve him, and I guess that's because I invoked his name when I did an episode earlier this week about the origins of Tesla. Anyway, Musk threatened to ban Apple computers and Apple devices from his various companies in the wake of the OpenAI announcement. If you recall, once upon a time Elon Musk was a co-founding member of the original incarnation of OpenAI, which at that time was a nonprofit dedicated to developing responsible and safe artificial intelligence. But then Musk stormed out of that organization, or perhaps he was encouraged to leave, and he says his decision was due to OpenAI turning to the dark side of the Force and embracing a for-profit approach. More on that in just a bit. Anyway, Musk has been in a pretty darn public tiff with OpenAI ever since, and he's also building out his own AI company, which of course is called xAI. Musk has said that if Apple devices incorporate OpenAI technology into them, which is what is now happening, then the Apple devices would pose quote "an unacceptable security violation" end quote. And yikes, that's got to be tough, because of course Microsoft also has tight integration with OpenAI now; I mean, the company is launching the Copilot+ PC line next week.
Speaker 1: So if Microsoft and Apple are both working closely with OpenAI, and OpenAI cannot be trusted, I guess that means everyone working for one of Musk's companies is going to have to switch to a Chromebook or something. I mean, there's also Linux and such, but everyone switching? I don't know. I would imagine the boffins at SpaceX might need something with a little more oomph than a Chromebook. Anyway, will Musk actually put a ban on Apple devices in place at Tesla and SpaceX and Neuralink and xAI and X? Perhaps. I've given up predicting what he will and will not do. I imagine that if he does put it in place, it won't apply to him; that seems to be the way rules work in his world. Those are things for other people. Okay, we've got lots more to cover, including some more Elon Musk news, but before I get to all that, let's take a quick break to thank our sponsors.

Speaker 1: So, related to Elon Musk: he had previously brought a lawsuit against OpenAI. He argued that OpenAI had violated the founding principles of the original nonprofit and that the company was purposefully withholding the most useful and advanced AI tools for high-paying customers. And that does seem to fly in the face of the "open" part of OpenAI, right? Like, if you're an open company, you're not withholding stuff for the folks who just pay you more. But OpenAI has now published a blog post that included a bunch of emails written by Elon Musk, and those emails indicated that back when Musk was part of OpenAI, he not only knew that OpenAI would need to generate an enormous amount of money to pay for AI research and development, but he also wanted to be the one to lead those efforts. The emails show that Musk agreed that OpenAI would not be able to just raise money from donations; it would have to create a means of generating actual revenue in order to survive.
Speaker 1: That's essentially what the for-profit arm of OpenAI was founded on, although lots of other folks have questioned whether that means the nonprofit arm is the prime beneficiary of all that revenue. The blog post seems to say: hey, Elon, this thing you're complaining about is the same thing you wanted to do. Whether that had any influence on the decision or not, I can't say, but Elon Musk's lawyers dropped the lawsuit not long after that blog post was published. OpenAI and Musk himself have remained fairly quiet about all of that.

Speaker 1: Now, we're not done with Musk just yet. Some former SpaceX workers have filed a lawsuit against him for sexual harassment and retaliation. Gaby Del Valle of The Verge reports that eight former engineers are part of this lawsuit, and they say that Musk quote "knowingly and purposefully created an unwelcome hostile work environment" based upon his conduct of interjecting into the workplace "vile sexual photographs, memes, and commentary that demeaned women and/or the LGBTQ+ community" end quote, which is a big old yuck. The lawsuit argues that Musk is responsible for creating and fostering a hostile work environment at SpaceX, one that encouraged a sexist culture within the company. The lawsuit even claims that a video featuring C-suite executives appeared to mock situations involving sexual misconduct, which is a big old yikes. I mean, y'all, one thing all iHeart employees have to do every year is go through a training course about appropriate conduct, and I cannot tell you how horrified I would be if, in the course of that training, we were to see our company's executives acting out inappropriate scenarios while making jokes about it. Of course, that does not happen here. It's just unthinkable to me that it happens anywhere, let alone at SpaceX.

Speaker 1: Moving away from Elon, let's talk a little bit more about Microsoft. The company has faced mounting criticism regarding an unreleased Windows 11 feature called Windows Recall. And if you recall, this feature essentially takes snapshots of what's going on on your PC and then keeps those snapshots as a record. So, hypothetically, if for some reason you needed to know more about something you once did on your PC in the past, you could search through those snapshots for answers.
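Microsoft hasn't published Recall's internals, but the loop described above (capture the screen locally, extract the text, index it on the device for later search) is straightforward to sketch. Here's a rough, generic illustration in Python using SQLite's built-in full-text search, assuming your SQLite build includes FTS5; the capture and OCR steps are stubbed out, and none of this is Microsoft's actual code.

```python
# Generic illustration of a Recall-style local snapshot index; this is
# not Microsoft's code. Capture and OCR are stubbed so the sketch runs.
import sqlite3
import time

# A local file on the device; per Microsoft's description, nothing is
# supposed to leave the machine.
db = sqlite3.connect("recall_sketch.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS snaps USING fts5(ts, text)")

def capture_screen_text():
    """Stub standing in for screenshot capture plus OCR of the screen."""
    return "Re: Q3 budget spreadsheet, draft shared in chat"

def take_snapshot():
    """Store one timestamped snapshot of the current screen text."""
    db.execute("INSERT INTO snaps VALUES (?, ?)",
               (time.strftime("%Y-%m-%d %H:%M:%S"), capture_screen_text()))
    db.commit()

def search(query):
    """Full-text search over everything ever captured; the breadth of
    this index is exactly why critics called it a juicy target."""
    return db.execute("SELECT ts, text FROM snaps WHERE snaps MATCH ?",
                      (query,)).fetchall()

take_snapshot()
print(search("budget"))  # -> [('2024-06-14 ...', 'Re: Q3 budget ...')]
```

The point of the sketch is also the point of the criticism: a single local database holding searchable text of everything you've ever had on screen is a very attractive thing to steal.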
Speaker 1: But critics worried that the feature would become a huge, juicy target for malicious actors. Microsoft stressed that all Windows Recall functions would take place on the native machine, meaning the information would not get shared to the cloud or anything like that. But this week the company also faced questioning from the US Congress regarding how Microsoft has handled, or rather failed to handle, various security crises. So I'm guessing all of that convinced Microsoft that maybe, just maybe, this would not be the best time to release Windows Recall, even if the tool is relatively secure. So next week, on June eighteenth, Microsoft still plans to launch its Copilot+ PC program, in which new Windows PCs will have the Copilot AI more tightly integrated into them, but the Windows Recall feature will be absent. The company does say that Windows Recall will get rolled out at some future date, but it did not give a timeline for when that will happen.

Speaker 1: Brandon Vigliarolo of The Register reports that Waymo is responding to an incident that happened with one of its driverless taxis back on May twenty-first in Phoenix, Arizona. Apparently the incident involved a Waymo vehicle driving into a wooden telephone pole at around eight miles per hour. Now, I'll remind you that wooden telephone poles are inanimate, stationary objects, and they're ones you would expect a driverless car to naturally avoid. Waymo claimed that the problem came from quote "a mapping and software issue" end quote. No one was harmed in the incident, so that's good at least.
Speaker 1: But even though this was a minor accident in the grand scheme of things, it comes at a time when cities are scrutinizing driverless programs a little more closely than they used to. Waymo is in the midst of a software recall on its automated driving system, so we'll see whether this leads to bigger headaches for driverless taxi efforts in general. I think there's a growing skepticism around those.

Speaker 1: Natalie Korach of TheWrap has an upsetting piece about The New York Times's art department. Recently the newspaper reduced its art production department by more than half, from sixteen positions down to seven. The New York Times Guild alleges that this move is an effort to offload work that would normally go to human artists onto AI instead. This is right in line with the concerns many have expressed for the creative industries as a whole: that companies will favor AI over human creatives. That's a huge problem for many reasons, not the least of which is that it puts creatives out of work. But also, you know, consider that the training methods for image-generating and image-altering AI often involve feeding countless samples of art and human-created work to the AI model. So it's entirely possible that the stuff the AI does will, at least in part, be based off the work of the very human beings who were laid off in the first place, which is kind of like being fired and then replaced by a really crappy copy of yourself. But a spokesperson for The Times has said that AI did not factor into this decision at all, and that in fact the employees were offered a buyout, and the whole decision to downsize the department reflects a push to create a more efficient process that doesn't need as many staff to do its work, though some have said that's because of the incorporation of AI tools. So the story is still unfolding.

Speaker 1: And then there's a story that makes me wonder how this actually works.
Speaker 1: We have a piece by Takeshi Narabe of The Asahi Shimbun about a company using AI in order to protect its customer service staff from angry customers. All right, so the story goes that SoftBank, which is a massive conglomerate that's had a bit of a roller coaster of a history in recent years, has rolled out an AI tool that acts as a go-between for staff and customers. The idea is that the AI receives the angry calls of customers seeking a solution and turns them into a calm and reasonable voice, or at least one that doesn't rise to an intimidating, angry tirade. So, you know, a customer could be screaming, "Listen, you idiot, I'm telling you that I need a solution," and the AI might turn that into, "Could you please help me resolve this issue?" Actually, that's not true. According to the article, the AI doesn't change the wording at all. So if someone calls a staffer an idiot, that "idiot" is still going to get through in the message. What it does do is change the pitch and the tone of the voice. So presumably, if the person on the other end of the line starts getting a bit heated, the AI might start increasing the pitch of their voice up to chipmunk levels, because, you know, it's hard to feel stressed out about a chipmunk being angry at you. Interestingly, the developers said one of the challenges was figuring out how much anger to leave in, because obviously, if a customer is getting angry, the staff member needs to know about it in order not to escalate matters further. And apparently SoftBank hopes to make this tool a product it can sell to other companies starting next year.
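The article doesn't spell out how SoftBank's tool works internally, but the core operation it describes (keep the words intact, shift the delivery toward a calmer register) maps onto ordinary audio processing. Here's a rough sketch in Python, assuming the librosa library is available; the anger score is a crude stand-in for whatever trained emotion model the real product uses.

```python
# Rough sketch: keep the caller's words intact, soften only the delivery.
# The anger score is a crude loudness proxy; SoftBank's real model is a
# trained emotion classifier and is not public. Requires librosa.
import numpy as np
import librosa

def estimate_anger(audio):
    """Placeholder 'anger' score in [0, 1] based on loudness alone."""
    rms = float(np.sqrt(np.mean(audio ** 2)))
    return min(1.0, rms * 10.0)

def soften(audio, sr):
    """Pitch-shift in proportion to detected anger, but only partway:
    the developers said some anger has to survive so staff can gauge
    the caller and avoid escalating. The wording is untouched."""
    anger = estimate_anger(audio)
    n_steps = 4.0 * anger * 0.7  # leave roughly 30 percent of the edge in
    return librosa.effects.pitch_shift(audio, sr=sr, n_steps=n_steps)

# Usage on a mono clip:
# y, sr = librosa.load("angry_call.wav", sr=None)
# calmer = soften(y, sr)
```

The interesting design choice, per the developers, is that dial: filter out too much anger and staff can't read the caller at all; leave in too much and the tool isn't protecting anyone.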
Speaker 1: Okay, some article recommendations for y'all. First up is a piece by Kyle Chayka in The New Yorker titled "Is Google SEO Gaslighting the Internet?" The article describes how some leaked internal Google documents paint a very different picture of how search works than what Google has said to the general public. It's well worth a read. Next up is Ron Amadeo's piece for Ars Technica titled "The Google Pay app is dead," which tells the story of how Google put to rest yet another Google feature in favor of another one that does pretty much the same thing. And this is like the fourth time that's happened. And finally, Ashley Belanger has another piece on Ars Technica. This one's titled "Microsoft in damage-control mode, says it will prioritize security over AI." Now, I alluded to this earlier in the episode: Microsoft has had to face questioning from Congress about some recent security breaches that have affected thousands of government staff, and the piece raises serious questions about whether the US government has put too much faith and responsibility on the shoulders of Microsoft, and whether Microsoft has behaved responsibly in the wake of cybersecurity breaches. According to one whistleblower, Microsoft made the conscious decision not to respond and not to reveal certain attacks in an effort to avoid putting its lucrative government contracts in jeopardy, which is a heck of an accusation. So, well worth a read.

Speaker 1: That's it. I hope you had a great week. I hope you have an even better weekend, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.