Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for the tech news for Tuesday, August first, twenty twenty three.

Speaker 1: So let's start out with the news that kind of happened and then unhappened since the last time I did one of these episodes, and that means starting with X, you know, the service formerly known as Twitter. I had mentioned that the San Francisco police had paid a little visit to X HQ when a construction crew blocked off a couple of lanes of traffic on a city street in San Francisco while removing letters from the Twitter sign that hung on the headquarters building. Ultimately, the police did not charge anyone with a crime, but it did mean that the removal process was interrupted, and for a while the "er" stayed up there. They had removed the "twit," but the "er" was still up. And since a lot of us typically now will look at the former Twitter and say "er," I think it was appropriate that the "er" was still there.

Speaker 1: But anyway, after all that, Musk's financial sinkhole of a company erected a big ol' X sign, an illuminated X sign, on top of their HQ. So it lit up a lot, and it was bright enough, with strobe effects and stuff, that it prompted several people in the neighborhood to lodge complaints, saying that it was garish and disruptive and making it real difficult to get any sleep if the X was flashing lights through bedroom windows and stuff. This prompted the city to send a building inspector around to make sure that the new sign was up to code, but the inspector says that they were denied roof access multiple times. So the city took issue with this, as well as with Musk erecting a quote unquote "illuminated structure" without first securing a permit to do it.
Speaker 1: There were also concerns that the sign itself was not properly secured to the roof, which could be a truly dangerous situation in the event of something like high winds, for example. Now the company is likely to face some fines from the city. I haven't seen how much those fines might be, but then, if San Francisco wants to get money out of X, it's probably gonna have to get in line behind the various landlords, vendors, and former employees who are also awaiting payment. So I suppose one of the big secrets to being rich is that, you know, you just don't pay your bills; then you get to keep your money. It's genius.

Speaker 1: According to Casey Newton's excellent tech newsletter, today marks a big day inside X. So back when Musk bought Twitter, he famously did so at fifty-four dollars and twenty cents per share. That's why it ended up being more than forty billion dollars overall. And a lot of Twitter employees had guaranteed stock grants, meaning that they had vested shares with the company, or shares that were in the process of vesting, and that they would then receive a payout for those shares as part of their compensation. They would get fifty-four dollars and twenty cents per share vested with the company. According to Newton, today marks the day when the remaining employees who were still receiving stock grants will be paid out. And if that happens, it might mean we could see another wave of resignations out of the company, because it's possible that some folks at X stayed on really just so that they could reach this moment to get that final payout, and then potentially they could go skip off to find greener, or at least less chaotic, pastures. Though there is still a concern that a lot of people would not be able to find a job that pays, you know, at the same level as what they're getting over at X.
Speaker 1: And as we've already said, Musk has racked up a pretty big history of refusing to pay bills, including to former employees. So there is a fear that those who should be receiving this compensation may be left waiting around indefinitely, and that, worst case scenario, the payout might not ever come. So maybe by the end of the day we'll have more news about that, but that's how it stands as I record the episode.

Speaker 1: In other awful news, the Center for Countering Digital Hate, or CCDH, is now in Elon Musk's crosshairs for daring to, you know, chronicle the rise of hate speech on X. Now, Musk has claimed that hate speech impressions are actually on the decline on X, but the CCDH argues that the opposite is true, that there's been an increase across the board in hate speech as well as in disinformation. And if you've been following the news, you know that Musk has reinstated the Twitter accounts of people who previously had received bans for spreading things like hate speech on the platform. So at least, you know, from that perspective, you could say, well, it seems to be logical that you would find an increase in hate speech. They are re-platforming people who were banned for spreading hate speech. So anyway, the CCDH says that they found a two hundred and two percent increase in messages on the platform that contain slurs. They also said that Twitter Blue subscribers, which maybe they're called X Blue subscribers now, the X makes everything really confusing because you think that it means "former" as opposed to subscribed to X, anyway, people who have subscribed to the platform and who have subsequently posted messages that violate the platform's policies seem to suffer no consequences. Their messages don't appear to be removed or anything like that.
Speaker 1: The company's own Trust and Safety Council resigned ages ago, right? The people who were actually in charge of formulating those policies and making sure that the platform did well to adhere to them, they left, and the department as a whole is reportedly in a bit of a shambles. And now Musk is sending legal threats and is arguing that the CCDH has an agenda, that the CCDH's intent is to hurt X by spreading falsehoods in an attempt to scare off advertisers. The CCDH refutes those claims and points out that the organization is neither government funded, so it doesn't receive money from the government, nor is it connected to any other competing social network, so it's not like it's, you know, conspiring with Meta to take down X. In fact, the CCDH has also reported on issues like hate speech and disinformation on other platforms besides X. The CCDH released a statement indicating that should X and Musk actually pursue a legal case against the organization, because so far it's just been threats about doing that, the organization will fight back. So I'm sure we will have more on this as it continues.

Speaker 1: And on a less dramatic and less important side of the whole X/Twitter story, one of the few remaining holdovers from the Twitter days has actually changed. So it used to be that if you wanted to post something on Twitter, you would hit a little, you know, button that says "tweet" on there after you've written your message, right? You would tweet and that would post it. And now that icon no longer says "tweet." It now says "post." It's changed back and forth a couple of times over the last two days, but as I record this, at least in my own instance on the web-based version of Twitter, I checked and it says "post" now, not "tweet." I guess we should be thankful it's not "xeet" or, you know, something like that, because X is the name of the company.
Speaker 1: And Musk said that tweets would now be called "xeets" or "zeets" or "skeets," or, if you're pronouncing the X like it were, you know, an anglicized Chinese word, then it would mean we'd be using the "sh" sound, so it'd be "sheets" then. But I'm guessing Musk doesn't like that, because if you're using the X sound as "sh," then Twitter would become... well, I won't say it.

Speaker 1: Over in the Netherlands, a court has ruled that Meta Ireland must reveal the identity of an anonymous user as part of a defamation lawsuit. This one gets really ugly, y'all. So the plaintiff in this case is a man who says he has been defamed by this anonymous user, who has posted in a couple of different Facebook groups that are dedicated to people talking about their dating experiences, and kind of like horror stories with dating experiences in general. This anonymous user has said that this man has done stuff like secretly recording the women he was dating, which would definitely be a red flag if that's true. But the man denies these charges, and he wants to sue this anonymous user for defamation. He says his reputation has been harmed as a result of these posts. But here's the rub: you can't sue someone if you don't know who they are. So the man has brought this lawsuit to the court in The Hague in the Netherlands, and the court considered the case and has now sent an order to Meta Ireland to cough up the identity of the anonymous poster who was behind these messages in these Facebook groups, and Meta Ireland is going to comply because it's a legal court order. And you might think, huh, that sounds like a precedent that could potentially set the stage for a lot of abuse and discourage people from coming forward as whistleblowers and such. In this case, the argument is that the posts could be illegal if in fact they are defamatory and untrue. But the court also went beyond this.
Speaker 1: The court said, quote, "According to settled case law and under certain circumstances, Meta has an obligation to provide identifying data even if the content of the relevant messages is not unmistakably unlawful." So, in other words, the government in the EU, in the form of the court system, has the right to force platforms to strip away anonymity even if the posts at the center of the matter do not contain any overtly illegal content within them, which is a big ol' yikes, right? Like, that is stripping away security and privacy. Now, on the flip side, you could say, yeah, but if this is a real case of defamation, if these claims are untrue and this person has had their reputation suffer as a result, through no fault of his own because he didn't do the things that were claimed against him, then there needs to be recourse. So you can see that this is a complicated issue. On the flip side, if the claims are absolutely true, then ripping the anonymity away from the person who came forward is a real potential threat to their safety. So it's a very complex situation, and I can't pretend like I have the answer.

Speaker 1: Okay, we're gonna take a quick break. When we come back, we've got some more news stories to talk about.

Speaker 1: Okay, we're back, and now let's get into the AI part of this news episode. So Meta is reportedly preparing to launch its AI-powered chatbots pretty soon, perhaps as early as September, according to the Financial Times. The chatbots will have different personalities, some based off famous historical figures, others based off more stereotypes, kind of like a surfer dude. According to the Financial Times, the push is in part meant to address the rise and then fall of participation on Threads. So when Threads first launched, it took off like a rocket, right? It hit one hundred million users in less than a week. But since then, more than half of those users haven't really been back on Threads.
Speaker 1: They dropped off super quickly, and it suggested that Threads was really more of a flash-in-the-pan moment and not the Twitter substitute that a lot of people were kind of hoping for. Apparently, Meta is going to lean on these chatbots to help drive engagement. I guess I can see the logic behind that, because if you're on a social platform and you're posting stuff but no one ever engages with anything you post, then really you just end up keeping a journal. And that might be enough for some folks, there's nothing wrong with that, but I think a lot of people feel that they want others to treat the stuff they say as if it matters, right? The reason you're on social is to stay in touch with other people and to get some validation that the stuff you're posting is interesting or funny or relevant, you know, that people care about what you have to say. And if you're not getting that, you're not likely to be very satisfied with your experience on the social platform. Maybe an artificial reaction provided by a chatbot that's imitating Abraham Lincoln is the answer. Though I should point out that the official explanation for these chatbots is that they're going to serve as a way to answer things like search queries and to give recommendations for stuff, and not necessarily to stand in as some sort of surrogate follower or friend. So we'll see. I don't know if this will actually be a useful tool, or if you can count on Abraham Lincoln to be your plastic pal who's fun to be with. Shout out if you've got that reference.

Speaker 1: Anyway, Meta isn't the only company putting AI to a new use. According to The Verge, YouTube is testing AI for the purposes of summarizing video content, so, in other words, telling you what a video actually is about before you click in on the video. The tests only cover a, quote, "limited number of English language videos and will only be viewable by a limited number of users," end quote.
Speaker 1: That's according to John Porter of The Verge. Porter also writes that while the intent is to create a quick summary that ideally helps users decide which videos to watch, those summaries are not going to replace the human-written video descriptions. Personally, I'm having a bit of trouble imagining what this actually looks like in practice when you're on YouTube. I mean, do the videos have two separate descriptions, one of which was written by AI? But anyway, it shows how companies are looking at this crazy tool called artificial intelligence, and now they're scrambling to find quote unquote "problems" that this tool can quote unquote "fix." Obviously, anything that leads to greater engagement on one of Alphabet's many platforms is going to be counted as a win, so I can understand the whole throw-noodles-at-the-wall-and-see-what-sticks approach that they're taking here. I'm curious to see one of these summaries myself. I want to see what it looks like, but as far as I can tell, I haven't come across anything like that yet. But it is in a limited test run, so it is far more likely that I am just not in the pool of folks who are included in that test. I hold out hope that we'll get at least a few hilariously weird video summaries out of this that were clearly written by a robot. So I'll just have to wait and see.

Speaker 1: News Corp, the giant media company, has been using artificial intelligence to generate news articles that it's publishing across seventy-five hyperlocal publications in Australia. So, according to The Guardian, the AI, which is kind of like ChatGPT, is writing around three thousand pieces a week, and the topics range from stuff like weather reports to updates on fuel prices in specific cities and that kind of thing. The company says that the articles are written by AI but overseen by journalists, which kind of reminds me of what HowStuffWorks.com said about its recent embrace of artificial intelligence-generated articles.
Speaker 1: As a reminder, the editorial staff was let go at HowStuffWorks.com, and that means that folks that I used to work with found themselves out of a job. So I am definitely biased on this topic. I'm saying that so that you understand that I have a very particular point of view on this that is pretty darn negative, but I completely admit this is my own bias. Anyway, I think you can make an argument that the types of pieces the AI is said to be tackling are ones that human writers actually wouldn't find interesting or rewarding to write in the first place. They are tedious exercises at best, and goodness knows, I would occasionally get assignments at HowStuffWorks that definitely fell into the tedious category. You still had to do your best on those articles, but you questioned the value of writing them in the first place. You're like, this is a garbage article; even if I do my best work on it, the article itself is just not very interesting.

Speaker 1: There is still a need for a human person to oversee the work that the AI is doing, because, as we know, AI can sometimes get real loosey-goosey with reality. And it makes me wonder how much the companies and staff are actually saving on time and effort, because if there's a frequent enough need to do revisions and rewrites, really all you're doing is just making more work for fewer people. But it's hard to say. I can't imagine there's a human writer out there who really wants to essentially rewrite the same article every day for the rest of their career, because if it's something like "what are the current fuel prices in your city," well, that's something you would have to research and write every single day. That's probably not a lot of fun to do. So maybe that is a good use for AI, right? If the AI is reasonably accurate and reliable, then it could mean that human writer, ideally, could get a different assignment that's more interesting and rewarding to work on.
Speaker 1: So I can see the use for AI for specific kinds of writing, for stuff that people genuinely need to know but that is genuinely tedious to research and write. The problem is, I worry about companies overstepping that and just saying, oh wow, AI is way cheaper than having humans, right? Let's just have it write everything. And then we get an even bigger decline in journalistic integrity and quality.

Speaker 1: California, here in the United States, has a new department of privacy regulators who are actually authorized with power now, and this week they are hearing their first case. They are looking at how auto manufacturers are incorporating data-gathering technology within the vehicles that they make and sell, and how those companies then collect, use, and protect that information, because there are really no rules in place to serve as parameters for that. So the agency is looking at the types of information that are collected, which can include everything from geolocation data to camera images, depending on the vehicle. There's a concern that data collection has become an important component in cars, but there's such a lack of regulations and rules on how companies can use that information that it could lead to real problems. Now, ideally, manufacturers use this info to keep eyes on things like vehicle performance and maybe even detect issues before they become huge problems, and that could lead to a much more effective method of dealing with stuff like recalls. For example, maybe a company sees, oh, this isn't a critical problem yet, but it's going to be if we don't do anything about it, so let's do the recall earlier and save ourselves a lot of green in the long run. That's a legitimate use for that kind of data collection. But if you go a step outside of the actual vehicle and you think about what the data says about the driver who's inside that vehicle, that's where concerns start to pop up.
Speaker 1: I mean, that vehicle data could be used to do a lot of things, like you might be able to draw conclusions about the actual driver's life. So imagine that you are having to seek regular medical treatments for some condition you have, and that the car is essentially gathering information about the fact that you're going to a medical facility on a regular basis. That's information you probably wouldn't just freely share with a car company for no reason, right? I mean, that's private healthcare information. And so this is really this regulatory agency's first step to get a full understanding of the scope and depth of data collection in the auto industry. And I hope it's an indication of a sea change in how the United States in particular approaches data collection and use, because for too long it's just been open season for information out there.

Speaker 1: Okay, I've got a few more stories to cover. Before we get to those, let's take another quick break.

Speaker 1: All right, we're back, and hey, let's head back down to Australia. I've got some bad news. So Disney announced that after Guardians of the Galaxy Volume Three hits store shelves in DVD and Blu-ray formats this month, the company is done producing physical media for the Australian market. So moving forward, fans in Australia will have to use stuff like streaming services or cable or satellite or whatever in order to watch future Disney properties. So there will be no more Marvel, Star Wars, or Disney DVDs or Blu-rays. So why is that? Well, Disney says it's because the home media market in Australia has slowed to a point where it's just no longer profitable to produce physical copies of stuff. And I get that a lot of people have moved away from physical media, but personally, as someone who has recently gotten back into that, I am bummed by this news.
Speaker 1: For one thing, we are now all aware that just because something is currently available on a streaming service today, it doesn't mean it's going to still be there tomorrow. We have seen dozens, hundreds of titles disappear off of streaming services, including Disney Plus, and then become unavailable. A lot of streaming-exclusive content never even makes it to physical home media at all, so once it goes, it just becomes inaccessible. It still exists, but there's no way for you to watch it. Well, with Disney making this move, it means for Australians everything Disney makes is potentially in that category of stuff that one day could just disappear. And there's an understandable concern that other studios are going to follow Disney's lead. In fact, I'd be shocked if that doesn't happen. But from a business perspective, it's hard to fault the decision. You don't get into the business to not make money. From a fan perspective or from an archives perspective, this is a huge blow, and I wouldn't be surprised to see similar stories play out in other regions as more people migrate away from physical media. Personally, I'm going to be buying Blu-ray and DVD copies of the stuff I love while I still can, because I've had plenty of experiences where something I really enjoyed was on a platform, and then an agreement expires and it's gone, and there's no legal way for me to access it anymore. And I'm not the type of person to lean on illegal means to get access to content if there are legal alternatives. But when there are no legal alternatives, well, really, I mean, you either do without or you break the law. Those are really your only options. I guess the responsible thing is to do without, but it just gets so frustrating when you think, I want to see this, I am happy to pay for the ability to see it, there just isn't that option.
Speaker 1: Video Game Chronicle reports that Nintendo is preparing to launch its next console sometime next year, likely in the fall, in time for the holiday season. Nintendo launched the Switch back in twenty seventeen, and while titles like Tears of the Kingdom show that this portable system, or at least a system capable of going into portable mode, can still pack a surprising amount of punch, it is safe to say that the hardware is now kind of pushing against its upper limits. VGC says that Nintendo has already started to ship development kits to various partner studios in anticipation of this launch next year. Details are understandably scarce, but it sounds like the next console will again be capable of being used in a portable mode, you know, like a handheld system, similar to what the Switch can do, and that the company is choosing to use LCD screens instead of OLED screens, likely in an effort to keep manufacturing costs down. Hopefully that will also help keep consumer prices down, so they won't be too expensive either. VGC also says that the system will include a cartridge slot for physical media, so we're not going all digital with this one. Not a big surprise; Nintendo has a long history of supporting physical media and of kind of dragging its feet on things like connected features for its consoles. But that's all that's known right now, and here's hoping that this console is another big success story like the Switch and not another misfire like the Wii U.

Speaker 1: Speaking of video games, Call of Duty recently held a culling. The game targeted players who were found to have been relying on cheats and hacks in an effort to get advantages over other players. According to the game's X feed, or Twitter, again, this is so confusing, X feed, just... whatever. Anyway, according to that feed, the publisher has banned more than fourteen thousand accounts for cheating and hacking within just twenty-four hours.
Speaker 1: Now, personally, I don't play Call of Duty, but there are a couple of British content creators I watch regularly who do play it. I hope this means that they will encounter fewer instances of people using cheats and hacks. It is frustrating enough just as a viewer to watch someone who is really good at the game they play, but then they encounter cheaters and it becomes this unfair exchange. Like, when they're up against genuinely skilled players, that's exciting; when they're against cheaters, it's just frustrating. But I imagine it's way worse to actually experience it firsthand. I have encountered cheaters in a few games I've played. I remember a game of PUBG where the person who took me out, it turned out, was using a cheat where they were just getting headshots, like they weren't even pointing at people and they were getting headshots, and it was such an obvious cheating mechanism. It was really frustrating. It convinced me to stop playing PUBG. So that's fun. So really, game developers have an incentive to crack down on this kind of thing, because otherwise they do run the risk of players becoming frustrated with the title as a whole and just abandoning it.

Speaker 1: And finally, a weather satellite called Aeolus came crashing down to Earth this week, but this was planned. The European Space Agency launched Aeolus about five years ago. It carried a laser Doppler tool called ALADIN, or Aladdin if you prefer, it just has one D, so I call it "a-LAY-din," and scientists were using this to help monitor and study wind speed and direction at various elevations through the atmosphere. So this was part of a broader study on weather and climate, and just gathering a lot of scientific information. But earlier this year, the ESA made plans to deorbit the satellite. It was reaching the end of its mission. It was also reaching the end of its fuel, and as it turned out, the satellite did not have enough fuel left for the ESA to do a fully controlled deorbit.
Speaker 1: Instead, the ESA used the remaining fuel to carry out what they called an assisted reentry, which is kind of in between a controlled deorbit, where you are using thrust to target a specific location for touchdown, or crash-down I guess is a better word for it, and an uncontrolled deorbit, where nature just takes its course and you have no idea where that satellite's going to end up going. The ESA couldn't guide the satellite the entire time, but it could ensure that the spacecraft was able to use its fuel and thrusters to maneuver itself above the Atlantic Ocean in an effort to minimize any terrestrial issues, you know, like having the satellite fall on someone's house or something. Deorbiting the satellite also means that Aeolus would not become another piece of space junk serving as a potential hazard in low orbit. So rest in pieces, Aeolus. You did good work out there.

Speaker 1: That's it for today's news. Hope you are all well, and I will talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.