Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, April twenty fifth, twenty twenty three.

Speaker 1: Last week, I talked about how the SpaceX Starship launch was a success despite the fact that the ding dang darn thing sploded all over the Gulf of Mexico. As a reminder, this was a test launch of the world's most powerful launch vehicle ever, and it had no crew aboard, thank goodness, and the purpose was just to get the launch vehicle off the ground without destroying the launch tower in the process. While there was a hope that the Starship would actually be able to achieve orbit, that was a secondary goal. So when there was an engine failure and a failure for the first stage to separate from the second, and then the ship began to spin in the air, SpaceX chose to toggle a self-destruct switch so that the Starship would detonate safely over the Gulf and not go flying off somewhere. Since then, we've learned that there was extensive damage to the launch pad itself during the launch, while the tower remained standing. There are craters and debris across the launch area, and it may be weeks or maybe months before the launch pad is ready for another launch. It will take some extensive repairs, and in the meantime, Elon Musk has indicated that SpaceX may make some changes to the launch pad to give it a better chance of holding up to the incredible heat and force generated by the thirty three rocket engines of the Starship's first stage. This actually isn't dissimilar to issues NASA has encountered with its Space Launch System, which has about half the thrust power of the Starship. The SLS has also damaged launch facilities during operations. It turns out when you play with really big rockets, you can sometimes end up with a real big hole in the ground.
Speaker 1: Now, let's talk about Twitter. So last week the platform pulled the trigger on a promise it had been making for a while. Folks who had a verified check mark but who were not subscribed to Twitter Blue saw those check marks go bye-bye, which includes yours truly; my check mark went away. And Elon Musk, ever the mature businessman, comped certain celebrities and gave them their check marks. These were mainly celebrities who had been criticizing the move to turn the verification system into a paid perk as opposed to, you know, a means of actually verifying a user's identity. And the blue check mark implies, or actually, heck, it outright says, if you were to hover your mouse over the check mark, that the person who has the check mark had paid for that privilege, even though in the case of folks like the famous author Stephen King, that wasn't the case. He had not paid for the Twitter subscription. The check mark returned because Elon Musk, I guess, wanted to rub it in Stephen King's face, because King had been criticizing the whole paid-for check mark thing in the first place, and now it makes it seem like Stephen King had paid for it, makes him seem like he's a hypocrite, when in fact he didn't pay for this.

Speaker 1: So yeah, there's been a lot of trash talking about Twitter turning the check mark into a revenue-generating scheme, and, you know, now these critics have been branded with that check mark, as opposed to the check mark being a verification. And Musk then expanded his generosity, because Twitter accounts with more than a million followers began to see their check marks returned, even though they hadn't asked for it. I remember seeing a tweet from author Neil Gaiman, who expressed curiosity about this. He went so far as to point out that he had not subscribed to Twitter Blue, and yet his check mark had returned. And now that check mark is sort of viewed as a badge of dishonor.
Speaker 1: The implication is that people who have a check are desperate for recognition and have paid for it. So even people who did subscribe to Twitter Blue, some of them are now trying to dump it in order to get rid of that check mark, because they don't want to be associated with a perception that they're paying to be recognized. It's actually a pretty fascinating situation. Something that was once viewed as a kind of status symbol and something to be desired is now looked at as almost being taboo. And I think there's a possible sociology dissertation in there somewhere, and you're welcome, grad students. By the way, this gets even more confusing because Twitter accounts that belong to people who are no longer with us, people who have died over the past few years, have seen their check marks return, and again, if you hover your mouse over the check mark, it says that the person who owns that account is paying for a Twitter Blue subscription, which, you know, raises questions as to how dead people were able to do that. So again, not exactly the smoothest transition. A real shock when you think about how Twitter has laid off seven thousand of its employees over the last several months.

Speaker 1: On a related note, some folks on Twitter are asking if perhaps these check marks represent false advertising in a way. Twitter user dril brought this question up, asking if maybe this falls under the umbrella of a false endorsement. So the argument kind of goes like this. The check mark claims that the person whose name is in front of that check has paid for this service. But as we've seen, many, maybe even most, of the celebrities who currently have a check mark didn't actually pay for it. So that could imply that this celebrity is approving of the service, right? Because the claim is that they paid for it; even if they haven't, Twitter is saying they did.
Speaker 1: And so you see a celebrity's name, you see that check mark after it, you see the check mark claiming that the person with the check mark has paid for a subscription. That seems like the celebrity is endorsing the service, like they're saying, this is cool, that's why I bought it. And that could be seen as an endorsement, even though the celebrities have not done that. They haven't made any kind of deal with Twitter. There's no endorsing going on. It's against their will, right? Even if they don't really care one way or another, they didn't seek it out. So when you view it that way, it does sound like Twitter could be violating some rules here, with various high-profile accounts pulled in by association. However, it's not a cut-and-dried case. To be seen as actually being illegal, this matter would have to be framed as a type of advertisement, like it's an advertisement for Twitter Blue subscriptions. That's what the laws cover: fair practices in advertising. But if you could frame it as not being an advertising issue, if it's in a different context, well, you don't have that same legal foundation to support your argument. So it may never go any further than this. You know, we may not see anyone ever bring legal action against Twitter for forcing a check mark on people who don't want it, but it raises some really weird questions that I didn't think I would ever be asking.

Speaker 1: And one last Twitter piece for today before we move on to other news. Keith Burghardt, a computer scientist at the Information Sciences Institute, has shared results from a research project that looks at hate speech on Twitter. So Burghardt and his research team created a methodology to identify and quantify hate speech on Twitter. Essentially, the team identified about four dozen keywords closely associated with hate speech. I didn't dare look at the list because I have a feeling it would just make me sick. But they used these keywords as the starting point.
Speaker 1: They started by searching for tweets that contained these keywords, but then they fed those tweets to a tool called Perspective API. This tool tries to identify examples of hate speech, and it differentiates between hate speech and something that's not hate speech. So, for example, you might have a tweet that is discussing hate speech but is not actually an example of hate speech. You would want to separate that out. Or you might have a tweet that's meant perhaps in a sexual context, but not a hateful one. You'd want to separate that out too. So they used this tool, they filtered through the results, and then they compared the amount of hate speech present on Twitter before Musk took control of the company late last year with how things are going now that he's in charge. And the findings are likely confirming your guesses. Incidents increased after Musk took control, both in the number of hateful keywords appearing in individual tweets, so more of these keywords are cropping up now, and also in how frequently hateful users are tweeting out hate speech. So the number of times that people are tweeting out these things has increased. In fact, it was close to twice as much as it was before Musk took over.

Speaker 1: Now, I should also add that hate speech has always been a problem on Twitter. This is not unique to Musk taking over the company. It was present before he ever started to express any interest in buying Twitter. And also, we haven't even had a full year's worth of data since Musk took over, right? A year has not passed, so we have a pretty small sample size of data. We should keep that in mind too. This might not be a trend. It might be a spike, and it could be that over time the spike levels out. Again, we don't know yet, so I don't want anyone to jump to conclusions and just say, all right, well, Twitter is now a haven for hate and it will be like that forever. We don't know that. We do know that the incidents have definitely increased since Musk took over, but that's as far as we can go.
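For listeners curious how a two-stage pipeline like the one Burghardt's team describes might be wired together, here is a minimal Python sketch of the general idea: a cheap keyword filter followed by a call to Google's Perspective API. To be clear, this is an illustration, not the team's actual code; the keyword list, the IDENTITY_ATTACK attribute, the 0.8 threshold, and the API key below are all placeholder assumptions of my own.

    import requests

    # Placeholder: you would request Perspective API access and supply your own key.
    PERSPECTIVE_API_KEY = "YOUR_API_KEY"
    PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

    # Illustrative stand-ins for the researchers' list of roughly four dozen keywords.
    HATE_KEYWORDS = {"keyword_one", "keyword_two", "keyword_three"}


    def mentions_keyword(tweet_text: str) -> bool:
        """Cheap first pass: does the tweet contain any flagged keyword?"""
        text = tweet_text.lower()
        return any(keyword in text for keyword in HATE_KEYWORDS)


    def hate_score(tweet_text: str) -> float:
        """Second pass: ask Perspective API how likely the text is an identity-based
        attack, used here as a rough proxy for hate speech (an assumption, not
        necessarily the attribute the researchers used)."""
        body = {
            "comment": {"text": tweet_text},
            "languages": ["en"],
            "requestedAttributes": {"IDENTITY_ATTACK": {}},
        }
        response = requests.post(
            PERSPECTIVE_URL,
            params={"key": PERSPECTIVE_API_KEY},
            json=body,
            timeout=10,
        )
        response.raise_for_status()
        scores = response.json()["attributeScores"]
        return scores["IDENTITY_ATTACK"]["summaryScore"]["value"]


    def count_hate_tweets(tweets: list[str], threshold: float = 0.8) -> int:
        """Keyword filter first, classifier second; the cutoff is an assumed value."""
        return sum(1 for t in tweets if mentions_keyword(t) and hate_score(t) >= threshold)

A before-and-after comparison would then just mean running count_hate_tweets over samples of tweets collected before and after the change in ownership and comparing the rates.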
Speaker 1: News like this is not good for Twitter, obviously. It's still trying to attract advertisers back to the platform, and when you see a report like this, that's not a big selling point. Okay, we've got a lot more news to cover. Before we get into any more, let's take a quick break.

Speaker 1: We're back. Snapchat launched its My AI feature last week to users around the world. Before that, the feature was only available to Snapchat users who had enrolled in a subscription service, so it was a subscriber-only feature until last week. Now, the reception of the feature has been pretty mixed, with the most vocal users going the old review-bombing route and giving the Snapchat app a terrible review. According to TechCrunch, users hit the Snapchat app with an average of one point sixty seven stars out of five. Seventy five percent of all reviews last week were one star. That is brutal. The chief reason cited for these low reviews was the incorporation of the AI chatbot. As it stands, the AI feature is pinned to the top of the app's chat feature. Users who are critical of this inclusion wanted a way to turn it off or opt out of the feature, and currently there's not a way to do that. Criticisms ranged from "this is lame" to "this is creepy." One alleged exchange between the AI and a user named Brittany made the rounds on Twitter. Some screenshots showed that Brittany asked the AI tool to point her in the direction of the closest gas station, so the AI suggested a gas station that was about five miles away from her, and then Brittany followed that up by asking how the AI knew where she was. The AI essentially said, I don't know where you are, but Snapchat may have access to your location.
Speaker 1: Well, Brittany then said, well, how is that possible? Because I chose to have Snapchat hidden from maps, it shouldn't know my location. Then the AI said, oh, I don't have access to your location, I can't tell you where the closest gas station is, actually. And Brittany said, but you kind of already did that. Now, I cannot say for sure that this exchange was legitimate. The screenshots could be faked, they could be photoshopped or something. But I have seen similar issues with other AI chatbots out there, chatbots that seem to unwittingly reveal that their creators know way more about us than they let on, including things where you ask a chatbot, hey, where am I right now? And the chatbot says, oh, I don't know where you are. So what's my location? I don't have that information. Okay, what's the closest gas station? And then it tells you, which obviously points out that somehow it did know your location, or at least your general location. It raises a lot of questions. Anyway, while I can't say for sure that Brittany's exchange was real, though to be clear, I have no reason to doubt it either, I can say for sure that a lot of users have been vocal in their dislike of this new feature. That doesn't necessarily mean everyone hates it. Of course, it is possible that the vast majority of Snapchat users actually love the AI and they're just being quiet about it, and the only people who are making a fuss are the ones who hate it. That's a possibility.

Speaker 1: While some music studios are attempting to use the law to stave off a wave of AI-generated deepfake songs featuring recognizable voices signed to their labels, one musical artist is welcoming our robot overlords. That artist is Grimes. She has told her fans that she wants to see what AI collaborations can come up with. She said that she asks for a fifty percent split in royalties on quote any successful AI generated song that uses my voice end quote, that being her voice, not my voice.
Speaker 1: She said she is putting no boundaries on this. There are no legal bindings. She is not represented by a label, so there's no danger of a giant company swooping in and saying, no, no, no, actually we own her voice, you can't have it. Grimes says she wants to see what can come of this, that she's willing to serve as a guinea pig, and if she can create a model where an artist receives compensation when someone else uses an AI-generated version of that person's voice, it could really shake things up in the art world. I'm not saying that AI-generated voices will totally replace the real McCoy, but I could also see some artists experimenting with this, particularly if it means you can sit back and collect the checks while the robots do all the work. I am being a little facetious here, because clearly this only works because Grimes has already done all the work. She has created art that people want to emulate in the first place, and without that, none of this works.

Speaker 1: An appeals court has issued a ruling regarding the ongoing struggles between Apple and Epic Games. The ruling upholds an earlier court's decision, which mostly finds in favor of Apple, with one big exception. So just as a reminder, this whole thing started when Epic Games, the publisher of Fortnite, encouraged players on iOS and Android devices to use some workarounds for in-game transactions that would go outside the in-app transaction services. That way, Epic Games would keep the full amount of the transaction rather than Apple, or in the case of Android devices, Google, getting a thirty percent cut of every one of them. Apple brought down the meteor hammer on Epic Games, and a big legal fight has been going on ever since. Well, while this particular part of the legal battle has mostly gone in favor of Apple, the one bit that helps developers like Epic Games out is that Apple has to allow links and buttons that go to third-party in-app payment options if that's what the developer wants.
Speaker 1: So Apple doesn't have to include these options by default, but if a developer wants to give users the chance to use some other transaction method, Apple isn't allowed to block that. Apple has said it disagrees with this ruling, no big surprise there, and the company has fourteen days to file an appeal to this decision. If they do appeal, the case would then go either to a larger group of judges on the Ninth Circuit Court of Appeals, which is where this last case was heard, or straight to the US Supreme Court.

Speaker 1: Microsoft has backed down in the EU in an effort to sidestep a potential antitrust probe, kind of thematically tagging onto the back of the Apple story here. So in this case, at issue is Microsoft's practice of bundling Microsoft Teams with its Office suite of productivity software. Competitors like Slack have argued that Microsoft forcing Teams into Office bundles is an anticompetitive practice and that Microsoft is trying to muscle out competition from any other video conferencing or messaging services. This really isn't that different from when Microsoft faced similar complaints after it bundled its web browser, which was then Internet Explorer, with the Windows operating system, though that case was further complicated by how deeply integrated Internet Explorer was with Windows. It has long been a key element in Microsoft's strategy to bundle programs together in an effort to dominate a market. While users might prefer to go a la carte when it comes to picking and choosing productivity tools, Microsoft's approach has mostly been an all-or-nothing affair. Now, Microsoft says it will offer options for EU customers to purchase Office with or without Microsoft Teams, though how that's actually going to manifest remains to be seen. Also, that might not be enough to prevent an antitrust probe anyway. We'll just have to see what the regulators say.
Speaker 1: In twenty twenty one, Alphabet, Google's parent company, began work on a truly massive project, building out a mega campus in San Jose, California. Do you know the way to San Jose? And when I say a mega campus, I mean this thing was to be huge, like eighty acres in size. According to Ars Technica, the finished campus was to include offices, housing units, parks, hotels, restaurants, shops, theaters, museums, and more. It sounded like a city within a city, just one that happens to be, you know, owned and operated by Google. But now the company has hit pause on those plans, or perhaps even stopped them, while construction crews have already performed a wave of demolition work that now looks like it's just going to be in limbo. According to CNBC, that demolition work destroyed some landmarks in San Jose and forced others to relocate to other parts of the city. And now Google has nothing to go in those places. It's just bulldozed territory. And there's a real fear that these spots in San Jose will just remain unfinished for the foreseeable future.

Speaker 1: Google, like most big tech companies, has been making some serious cost-cutting moves recently, so it's not exactly a surprise that the company would back off on such an aggressive plan. In fact, one wonders what convinced Google executives to even pursue this idea back in twenty twenty one. That was a year after the pandemic had begun, and the world had really kind of adjusted to a remote work situation, which makes it seem like a pretty weird time to say, hey, no one's in the office, so let's build a whole lot more office space, like a truly enormous amount of office space. And following rounds of layoffs at Google and measures like desk sharing in its offices, it really seems like office space isn't high up on the company's priority list. So the question remains: what happens to all that property in San Jose?
Speaker 1: And I don't know the answer to that, except to say, yuck. Okay, I've got a handful more stories to go through. Before we get to those, let's take another quick break.

Speaker 1: Okay, California's Senate Judiciary Committee is set to deliberate on a proposed bill that, if passed, would let Californians opt to delete their personal information from a company's database if they wanted to. It's nicknamed the Delete Act, and it's likely to face some pretty stiff opposition from the tech space. As I'm sure you all know, a lot of money in tech comes from data. There are companies that exist solely to collect and sell data to other companies. And while the main focus is on advertising, there are a lot of other issues here, even if you don't have a problem with targeted ads themselves. Namely, law enforcement agencies have been known to use this data when it's inconvenient to, you know, secure a warrant and pursue such information through proper channels. Why do that when you can just buy it? Or when someone has a personal axe to grind and they use this data to track down, like, a former boss or an ex romantic partner. We've seen instances of that as well. So the Act is proposing a solution in which companies are obligated to allow Californians the option to just remove themselves from that data picture. This is not the first time that progressive lawmakers in California have tried something like this, but as I said, resistance in the tech sector to such legislation is high, and I imagine the journey to get this bill passed will be a lot like a walk into Mordor. It ain't simple.

Speaker 1: Reuters reports that GM and Samsung are working together to build an electric vehicle battery manufacturing plant here in the United States. Now, you may know that one of the challenges the world faces, as more countries pass mandates to switch from internal combustion engine vehicles to electric vehicles, is that right now we do not produce enough batteries to meet global demand.
Speaker 1: Well, it's going to take a few years for car companies to completely change over to manufacturing EVs, and in the meantime, projects like this one aim to reduce some of the bottlenecks in the production line. Now, there will be other bottlenecks, such as just getting the raw materials necessary to make batteries, but GM and Samsung are taking steps to build out the infrastructure that's going to be necessary if this conversion to EVs is to succeed. Also, it's interesting to see a partnership between a South Korean tech company and a Detroit-based auto manufacturer.

Speaker 1: Speaking of GM, the company is also phasing out its Chevy Bolt EV and EUV models this year. This is a blow to the affordable EV market, as the Bolt was one of the least expensive electric vehicles available to consumers here in the United States. But the Bolt is built on top of technology that no longer competes with more recent innovations, and it doesn't look like Chevy has a replacement in mind for the Bolt, at least nothing that's meant to take over that spot in the market, the affordable EV market. Rather, the company is focusing more on vehicles like electric trucks and electric SUVs, because that seems to be what customers want. So fare thee well, Chevy Bolt.

Speaker 1: Chinese company NetEase, which had served as the Chinese distributor for Blizzard's games until early this year, is now suing Blizzard for around forty four million dollars. So here's the rundown. NetEase and Blizzard were in negotiations to renew the licensing deal between the two of them, but the companies failed to come to agreeable terms, so Blizzard opted to just leave NetEase behind, and in the process, Blizzard shut down their servers in China, which meant players in China could no longer access their Blizzard games. NetEase says that Blizzard failed to make good on several contractual obligations, including a failure to bring certain announced games to market or to issue refunds to Chinese players.
Speaker 1: Blizzard, for its part, says the company has yet to be served with any legal papers regarding a lawsuit. So they're saying, NetEase says they're suing us, but we haven't received a lawsuit. Also, Blizzard still is not running games in China, because the company hasn't found a replacement for NetEase, and Chinese law mandates that game companies from outside the country have to work through a Chinese company in order to sell games and services to citizens within China.

Speaker 1: Once upon a time, scalpers looking to sell PlayStation five consoles at insane markups were sitting pretty, because there was a shortage of inventory. That meant that someone who wanted a PS five had very few options open to them. If they missed out on a chance to scoop up one of just a few units that became available on any given day, they were back to waiting for the next one, or they would have to give in to the temptation to buy one from a scalper at a crazy inflated price. But now Sony has produced more consoles, plus a lot of households that really wanted one now have one. And that means that today's scalpers are starting to find out they're just plain out of luck, and at times some of them have been reduced to selling consoles for less than retail price. And I know, it is a sad day when predatory jerks are hoisted by their own petard. And it only took a couple of years to get there.

Speaker 1: Finally, it's an historic time for the Palo Alto Research Center, aka PARC. Back in nineteen seventy, Xerox founded PARC as an R and D lab, similar to famous tech labs like Bell Laboratories. The researchers at PARC explored and invented new technologies, some of which would emerge to change the world. For example, the GUI, or graphical user interface, was really first implemented at Xerox PARC.
Speaker 1: Then one day an enterprising fellow named Steve Jobs went for a tour of PARC's facilities and saw a demonstration of the GUI, and also of the computer mouse, which wasn't invented at PARC but was heavily used there, and Steve Jobs got this bright idea to offer up something similar in a personal computer, and then we get the story of the Mac. Well, Xerox is now saying farewell to PARC. And to be clear, PARC spun off from Xerox to become an independent subsidiary in two thousand and two, but PARC's keys have now been handed over to SRI International. That organization actually began in nineteen forty six as part of Stanford University, though the organization that would evolve into SRI International spun off from Stanford in nineteen seventy, the same year that PARC was founded. Today, SRI International is a nonprofit scientific research institute, so it sounds like a pretty great match with PARC. It'll be an organization dedicated to doing cutting-edge research in science and technology, potentially creating new products along the way. Xerox is going to keep a preferred research agreement with the newly merged organization, kind of like a first-rights sort of situation. It also will continue to hold many of the patents that were secured by PARC prior to this handoff. But yeah, interesting times. I should probably do a full episode about Xerox PARC now that this has happened.

Speaker 1: All right, that wraps up the news for today, Tuesday, April twenty fifth, twenty twenty three. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.