Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Tuesday, August 3rd, 2021, and we've got a ton of it, so this might be a slightly longer news episode than I usually do. And we're gonna start off, and I'm not burying the lede, I've titled this Tesla: The Good, the Bad, and the On Fire. We're gonna get right to it with a trio of Tesla stories, and we're gonna start off with the first one being a positive story involving Tesla Autopilot. Now, we typically hear about Autopilot when something goes wrong, sometimes tragically wrong, but this is different. In Norway, a Tesla owner was driving their car, apparently well under the influence of alcohol. Now that bit is awful. It never should have happened. Obviously, no one should ever drive under the influence, but that's what was going on in this case, apparently, and the driver allegedly passed out behind the wheel of their car.
Speaker 1: And obviously this could have led to tragedy, but the Tesla system, the Autopilot system, detected that the driver was unresponsive, and it thus went into what it was supposed to do. So at that point, the system kept the car traveling smoothly within the lanes of the road until it could come to a safe spot on the side of the road and come to a stop. The driver was later found unconscious behind the wheel of the vehicle and received medical treatment. That driver also denied to authorities that they had been operating the vehicle while intoxicated, but that story was somewhat undermined by the fact that another motorist had apparently captured video of this driver unconscious behind the wheel of the car. The authorities subsequently put a temporary suspension on that driver's license and filed a criminal complaint against them. So this is a case of Tesla's Autopilot, which, I should remind you, is not technically a fully self-driving system but rather a driver-assist system, working just as it was intended.
Autopilot checks for driver responsiveness to make sure that you are attentive and that you are prepared to take over driving, and in cases where that isn't happening, it will prompt the driver to take control of the vehicle, essentially alerting you so that, like, if you dozed off, you wake up and you can put your hands on the wheel. That way you can't just, you know, engage the Autopilot system and read a book or go to sleep or watch the next episode of Ted Lasso or whatever. You have to be an active driver, and if you ignore the alerts, then the car does what we saw in this case: it will come to a stop on the side of the road. So, thanks to Autopilot, a possibly catastrophic car wreck did not happen. Though again, we do need to remember we're still a long way away from a fully autonomous consumer vehicle, and if we need another reminder, we can move on to our next story about Tesla. This one's also about Autopilot, but it's less good. It's not terrible, it's just not good.
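The escalation behavior described here, warn the driver, and if the warnings go unanswered, stay in lane and pull over, can be sketched as a simple decision function. This is a minimal illustration in Python; the thresholds and names are invented for the example, not Tesla's actual implementation:

```python
# Hypothetical sketch of a driver-attention escalation loop like the one
# described above: warn an unresponsive driver, and if the warnings are
# ignored long enough, keep the lane and bring the car to a safe stop.

def respond_to_driver_state(seconds_unresponsive: float) -> str:
    """Map how long the driver has been unresponsive to an action.

    All thresholds are made up for illustration.
    """
    if seconds_unresponsive < 10:
        return "normal_driving"      # driver appears attentive
    elif seconds_unresponsive < 30:
        return "visual_alert"        # flash a warning on the display
    elif seconds_unresponsive < 60:
        return "audible_alert"       # escalate to chimes
    else:
        # Driver never responded: stay in lane, slow down, stop safely.
        return "pull_over_and_stop"
```

In the Norway incident, the driver presumably blew through every alert stage, which is why the car ended up in the final branch.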
Speaker 1: And this story actually comes from last week, but I only just saw it today, so I thought I would include it in our Tesla trio of news stories. A Twitter user named Jordan Nelson posted a video that showed that the Tesla they were in mistakenly identified the moon as a yellow traffic light, and thus the Autopilot system that was guiding the car told the car to slow down in anticipation of a red light: there's going to be a red light, so let's slow down. This can happen on any stretch of road on which the car is facing a low yellow moon, even if you happen to be, say, on a highway. You could see how that could be something of an issue if you were engaging Autopilot at that time. And in this case, the Autopilot mode was actually Tesla's Full Self-Driving feature, FSD. That's what Tesla calls it, which I would argue is another somewhat confusing name. I've often said I don't like the name Autopilot. I think it gives the wrong implication as to what that system actually does.
Well, I feel the same about the Full Self-Driving feature, because Tesla has fully admitted that none of its vehicles qualifies as a truly autonomous vehicle, and so even in FSD mode, the vehicle needs to operate under human supervision. So yeah, that name doesn't really work. But then I guess calling it Almost Full Self-Driving doesn't have quite the same ring to it. Now, that moon issue that Jordan Nelson ran into is not likely to happen all the time. Obviously, it would only happen if the moon were particularly yellowish that night and low enough in the sky that it would look kind of like the height of a traffic light. The car would obviously have to be facing that direction as well. But this does illustrate how the challenges of developing autonomous cars can include stuff that we humans just wouldn't take into account. Right? Humans wouldn't mistake a yellow moon for a yellow traffic light.
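To see how a vision system could make that mistake, here's a toy heuristic classifier, entirely my own invention and far simpler than anything Tesla actually runs, that flags any small, round, yellow blob near traffic-light height in the camera frame. A low yellow moon satisfies every one of those checks just as well as a real signal does:

```python
# Toy illustration (NOT Tesla's perception stack): a naive detector that
# labels any small round yellow blob in the upper part of the frame as a
# yellow traffic light. A low, yellow moon passes every check.

def looks_like_yellow_light(hue_deg: float, radius_px: float,
                            vertical_pos: float) -> bool:
    """vertical_pos: 0.0 = top of the image, 1.0 = bottom."""
    is_yellow = 40 <= hue_deg <= 65        # yellow-ish hue range
    is_small_disc = 3 <= radius_px <= 25   # roughly signal-lamp sized
    is_light_height = vertical_pos < 0.5   # hangs in the upper half of view
    return is_yellow and is_small_disc and is_light_height

# A low yellow moon: yellow hue, small apparent disc, upper half of frame.
print(looks_like_yellow_light(hue_deg=52, radius_px=12, vertical_pos=0.3))  # prints True
```

Real perception stacks use learned models rather than hand-written rules, but the failure mode is the same in spirit: anything that shares the learned visual signature of a yellow light can trigger the same response.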
Of course, in the future we might have some infrastructure, like traffic lights that contain a little transmitter that can communicate directly with compatible vehicles, and thus cars on the road would, quote unquote, "know" if a light were about to change, because they'd be in communication with the infrastructure. However, if that is ever to happen, it's pretty darn far off, because here's the thing: while that technology isn't necessarily that complicated or futuristic, it's something that we could probably do today, the scale of rolling it out practically, of actually installing it, would be enormous even for just a single city, let alone the entire world. But a feller can dream, can't he? And now our third Tesla story, rounding them out. Last Friday, Tesla had a true disaster on its figurative hands. A Tesla Megapack battery, which is a thirteen-ton lithium battery, caught fire. Now, the Megapack is meant to store electricity for utility-scale projects. Like, imagine you've got some sort of huge work site and you need to supply electricity there.
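That vehicle-to-infrastructure idea already has a standardized shape: V2X standards define signal phase and timing (SPaT) broadcasts from intersections. As a rough sketch of what a car could do with such a broadcast (the field and function names below are simplified stand-ins I made up, not the actual message format):

```python
# Sketch of the "traffic light tells the car what it will do next" idea.
# Real V2X systems use standardized signal phase and timing (SPaT) messages;
# these fields are simplified stand-ins, not the actual wire format.

from dataclasses import dataclass

@dataclass
class SignalPhaseMessage:
    intersection_id: int
    current_phase: str        # "green", "yellow", or "red"
    seconds_to_change: float  # time until the phase changes

def plan_approach(msg: SignalPhaseMessage,
                  seconds_to_reach_light: float) -> str:
    """Decide whether to proceed or coast, given what the light broadcast."""
    if msg.current_phase == "green" and msg.seconds_to_change > seconds_to_reach_light:
        return "proceed"     # we'll clear the intersection before it changes
    return "slow_down"       # the light will change before (or as) we arrive

msg = SignalPhaseMessage(intersection_id=42, current_phase="green",
                         seconds_to_change=3.0)
print(plan_approach(msg, seconds_to_reach_light=8.0))  # prints slow_down
```

No camera, no moon confusion: the car doesn't have to guess what the light looks like, because the light told it. The hard part, as noted above, is installing transmitters at every intersection.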
Or maybe you've got, you know, some form of electrical generation facility and you need a place to store excess electricity. That's sort of the kind of thing we're talking about. This isn't a car battery or anything like that; it's orders of magnitude larger than that. Anyway, one of these things caught fire on Friday in Australia, and it took four days, thirty fire trucks, and one hundred fifty firefighters to extinguish that fire. Now, if you remember back when Samsung had issues with the Samsung Galaxy Note 7 battery, which exploded in several different instances, you remember how that issue was incredibly serious. Right? The reaction is violent, it can be deadly, and it also can release a lot of toxic fumes. And that threat was serious enough to prompt airlines to prevent passengers from carrying a Note 7 on board the aircraft. They were essentially told they had to hand in their phones, which would not be allowed on the plane, or they themselves would not be allowed to get on the flight.
Well, this is kind of like that, only instead of being a battery that fits inside a handheld smartphone, it's a thirteen-ton monstrosity. The battery itself was inside a shipping container, which was likely a good thing, because it probably helped prevent the fire from spreading to other Megapack batteries at that site, because if there's anything worse than one thirteen-ton battery catching on fire, it's a bunch of them doing it. Australian authorities say the fire was brought under control on Monday, so that's a good thing. But yeah, it's a scary story and a reminder that batteries with volatile components are a bit dangerous. If the battery should be damaged in a way that a path opens up for the electrons to travel from the negative terminal to the positive terminal, you know, without having to be connected to some sort of circuit, well, then the battery is going to discharge rapidly. Right? The electrons are just going to rush from one part of the battery to another.
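Where that rapid internal discharge leads can be shown with a toy thermal model: current through the short heats the cell, and past a threshold temperature, exothermic side reactions add heat faster than the cell can shed it. All the constants here are invented for illustration; this is the shape of the feedback loop, not real cell physics:

```python
# Toy model (invented numbers) of the feedback loop behind battery fires:
# a short circuit dumps current through the cell, resistive heating raises
# the temperature, and above a threshold the cell generates heat faster
# than it can shed it -- thermal runaway.

def simulate_short_circuit(steps: int = 60, dt: float = 1.0) -> float:
    temp_c = 25.0                       # cell temperature, Celsius
    for _ in range(steps):
        joule_heating = 5.0             # heating from the short (made up)
        # Above ~80 C, exothermic side reactions kick in and add more heat.
        runaway_heating = 20.0 if temp_c > 80.0 else 0.0
        cooling = 0.02 * (temp_c - 25.0)  # losses to the surroundings
        temp_c += dt * (joule_heating + runaway_heating - cooling)
    return temp_c

print(round(simulate_short_circuit()))  # ends up far above any safe temperature
```

The key feature is the conditional: once the temperature crosses the reaction threshold, the heating term jumps while cooling grows only linearly, so the temperature runs away instead of settling.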
Speaker 1: This will end up causing the battery to start to heat and then overheat, and that can lead to issues like a fire or explosion. Now, I'm not certain if that's what led to the fire in this particular case, but just based on how batteries work, I figure it's a decent bet. Right? If you create a shortcut, or a short circuit, between terminals, then this is the sort of thing that can happen. Moving on from Tesla: a cybersecurity firm called ThreatFabric reports that the company's researchers have found instances of Android apps downloaded from the Google Play Store that are attempting to steal banking login information. This malware, which the firm has called Vultur, which is like "vulture" but without the E at the end, essentially takes a screenshot when someone uses a bank app to log in. So you launch your bank application and the malware notices this.
Speaker 1: It detects the data entry form, and it uses the screenshot, coupled with a keylogger, so, a program that records keystrokes, in order to get the username and password for whatever that application is. And as I said, it's mainly banking applications. Then the malware uses the phone itself to send that data off to whichever computer the hackers are using to gather login information. The researchers found that most instances of this did seem to target banking information, but some were also apparently logging stuff like social media account logins, and so far it looks as if this malware is mainly concentrated in Europe and Australia. The cybersecurity company suggests that Android users install antivirus apps to try and detect if you happen to have any malware on your phone, because it can be a little tricky to figure out if it's on there just by yourself until it's too late.
Speaker 1: I would also recommend having a really good password vault that gives you a secure place to store passwords and can automatically insert the password into whichever, you know, login form you need, which means there are no keystrokes for the malware to log. Right? If it's an automated process where you need a master password to log into your account, or if it uses biometrics like an eye scan or something, that can be more secure, assuming, obviously, that they're not keylogging your password vault account, which could be devastating. If they had access to all your passwords, that's even worse. So I think that a password vault and an antivirus app, those are two good things to have on an Android phone. Keeping in mind malware can also affect iOS devices; it's just that with Google it has been more of an issue more frequently, partly because the Android operating system is on way more phones around the world than iOS is, so it's a target-rich environment. We have lots more stories to go, but before we get to any of those, let's take a quick break.
I've got a question for you, dear listener, as we get back into the news, and my question to you is: how much are your biometrics worth to you? How much would a company need to pay you in order to, say, register your fingerprints or your retinal scan or maybe a palm print? If your answer is ten bucks, well, I have some great news for you, at least if you happen to live somewhere close to one of Amazon's checkout-free stores. So the company is offering a ten-dollar credit to customers who allow their palms to be scanned by the company and then linked to their Amazon accounts, the idea being that then the customers could walk into one of these Amazon stores, pick up whatever it is they want, and as they're leaving, they just scan their palm, that links to their Amazon account, and payment is automatically transferred to Amazon. And that's it. That's the payment. You just, you know, Jedi-style, wave your palm and say, "These are the eight apples I'm looking for," or whatever, and you can go on your merry little way.
As you might imagine, this move has prompted a rather divided response. There are some people who just shrug their shoulders and say this is just tech creating a convenient method for transactions, and that's kind of cool: "I like the idea of being able to do this without having to fumble for a credit card or put in some sort of code for a transaction. This makes it even more convenient and futuristic. Isn't the future awesome?" Then there are some people who take kind of a middle-ground approach, and they say, well, yeah, it's more convenient, but it's also another way for Amazon to tie purchases to a specific person. I would argue that the company could pretty much do that anyway, because presumably all purchases would be linked to an Amazon account in the first place. Though I guess you could have a household with a shared Amazon account, and thus you can't really target the specific person in that case. So anyway, by linking purchases to an individual, Amazon could potentially target ads more precisely to that person.
And then it gets to be a little kind of Big Brother creepy, invasive. And then you have the more extreme privacy advocates, and I use "extreme" not to, you know, downplay them, but rather just to say they take privacy far more seriously than most people do. They are obviously deeply concerned, as Amazon has had a pretty shaky record with biometrics, including facial recognition technology, and that ten-dollar credit makes it sound like the company is paying people to, in the words of Albert Fox Cahn, a privacy advocate, "sell their bodies." That doesn't sound great. Cahn also pointed out that with many aspects of your life, you can make changes if you need to get, like, a fresh start. Let's say something terrible happens. You know, maybe you made a terrible mistake and then you had to go to jail for it. Maybe you were, you know, mistakenly identified for something. Maybe you just need to start over because of your life experiences.
So you could move across the country, you could change your name legally, you could try and start out fresh. But biometrics stick with you and could potentially link you back to a time that you would rather leave behind. And I figure that I actually fall somewhere in that middle camp, where I'm not sold on the convenience-over-privacy trade, but I'm also not quite so dystopian in my mindset as some privacy advocates are. And I want to just point out again that that's just me, and that could be a failing on my part. I might be giving this far too much credit, and maybe I do need to be as paranoid, or as passionate (that's probably a better word), as some of these privacy advocates are. So I'm not saying they're wrong. I'm just saying that's not where I'm at at the moment. Moving on: the United States Air Force, specifically the Special Operations Command, commissioned a study from the RAND Corporation to look into the issue of disinformation campaigns online.
As Popular Science reports, disinformation is a deliberate attempt to mislead a target, like a target audience, by feeding that target a false narrative of some sort, so it is purposeful. Misinformation is something that could be passed on by accident. You know, maybe you omit certain facts, and not necessarily on purpose, or maybe you just get something wrong in the communication of it and it becomes misinformation, whereas disinformation is done on purpose with the intent to mislead. Well, the report gives a pretty dismal assessment of the U.S. government's response to disinformation campaigns, calling it dubiously effective, fractured, and uncoordinated. So the takeaway is that the entities behind these disinformation campaigns, such as various Russian operatives and Russian troll farms, tend to operate in a way that makes them seem highly organized and accurate in their targeting, whereas the response to these attacks is slipshod and inconsistent.
The report also found something that I think a lot of people are probably at least, you know, unconsciously aware of, which is that these disinformation campaigns don't necessarily invent issues out of whole cloth. It's not like they create a story and push it out there and just hope it takes hold. Instead, they usually leverage existing gaps in philosophies within the United States, and then they exacerbate those gaps. So, for example, before the pandemic even happened, there was already a community of people who were distrustful of vaccines: the anti-vaxxers. And they're not just in the United States, obviously; there are anti-vaxxers everywhere. And much of that initial distrust here in the United States hinged upon published works in respected journals. But later those works got retracted, because people discovered that the actual results that were reported had been falsified, that the published papers were themselves disinformation.
And yet that distrust remained, you know, the distrust towards vaccinations remained within that community, even though the supposed evidence that supported that narrative had been proven to be false. Like, they had already bought into the idea, and it didn't matter that the foundation for that idea turned out to be false. Well, disinformation campaigns pounced on that existing undercurrent of anti-vax fear and fed into it, creating doubts about the COVID-19 vaccines, spreading fear and uncertainty and doubt, the good old FUD. RAND evaluated the Air Force's very young 14F information operations division, something that has really just recently come into its own. In fact, the first trainees of that division graduated very late last year. And according to RAND, that division is a good idea; it's a nice step. However, the people in it still lack the training to really recognize and react to modern disinformation campaigns, and they also lack the resources needed to make any sort of real difference.
In fact, they were saying the division might identify a disinformation campaign, but it doesn't have the resources to actually, you know, act on that and to make all the other divisions aware so they can counteract those disinformation campaigns. So you would just have a division saying, "Yeah, we know about it, but we can't really do anything about it." That's not great. And while that assessment was specifically aimed at the Air Force and disinformation campaigns that could impact, say, military operations, I'm pretty confident to say the same is true across the board for disinformation campaigns in general. We are, as a rule, sorely underprepared and under-provisioned to deal with them, and that's why they can be incredibly effective: because we don't have the resources to deal with them, and people who buy into these disinformation campaigns can become evangelists, and effectively they become unwitting agents of the disinformation campaign itself and spread it further.
Disinformation really in itself is like 330 00:21:01,160 --> 00:21:04,359 Speaker 1: a virus, and there is no real vaccine for it 331 00:21:04,400 --> 00:21:09,280 Speaker 1: apart from critical thinking skills, so as always I advocate 332 00:21:09,320 --> 00:21:15,880 Speaker 1: that you exercise critical thinking and compassion. Moving on, 333 00:21:16,119 --> 00:21:19,280 Speaker 1: Ars Technica has a great article titled Big Tech Companies 334 00:21:19,320 --> 00:21:22,399 Speaker 1: Are at War with Employees over Remote Work, and it 335 00:21:22,440 --> 00:21:24,600 Speaker 1: touches on things that we've talked about in other tech 336 00:21:24,680 --> 00:21:29,560 Speaker 1: news episodes, like how Apple's CEO Tim Cook would really 337 00:21:29,720 --> 00:21:32,680 Speaker 1: very much like to get all Apple employees back into 338 00:21:32,720 --> 00:21:36,120 Speaker 1: Apple offices for at least three days a week soon, 339 00:21:36,560 --> 00:21:39,840 Speaker 1: like in September, so that, you know, that super high 340 00:21:39,920 --> 00:21:44,000 Speaker 1: tech and incredibly expensive Apple campus isn't left empty all 341 00:21:44,040 --> 00:21:48,320 Speaker 1: the time. Cook's memo to employees feels a bit out 342 00:21:48,359 --> 00:21:52,000 Speaker 1: of touch to me, and honestly, not just to me. 343 00:21:52,160 --> 00:21:55,679 Speaker 1: The Verge reported that an employee survey revealed that the 344 00:21:55,800 --> 00:21:59,000 Speaker 1: vast majority of Apple employees felt the memo was out 345 00:21:59,040 --> 00:22:03,080 Speaker 1: of touch. But Cook isn't the only tech leader pushing 346 00:22:03,119 --> 00:22:06,719 Speaker 1: to get folks back into offices. Google had plans to 347 00:22:06,760 --> 00:22:10,360 Speaker 1: do the same, as did Lyft and other big tech companies. 
348 00:22:10,760 --> 00:22:14,600 Speaker 1: The Delta variant of COVID has changed things, delaying those 349 00:22:14,640 --> 00:22:17,880 Speaker 1: plans in several cases. But as the Ars Technica piece 350 00:22:17,920 --> 00:22:21,080 Speaker 1: points out, the pandemic may have simply rushed what was 351 00:22:21,240 --> 00:22:24,440 Speaker 1: an already growing movement within the corporate world in general 352 00:22:24,640 --> 00:22:28,719 Speaker 1: and the tech world in particular: that for at least several 353 00:22:28,800 --> 00:22:32,600 Speaker 1: types of jobs, remote work is totally feasible and just 354 00:22:32,800 --> 00:22:36,560 Speaker 1: as effective, sometimes more effective, than going to the office. 355 00:22:36,960 --> 00:22:39,639 Speaker 1: And for people working for companies in the Bay Area, 356 00:22:39,960 --> 00:22:42,880 Speaker 1: it opened up the possibility of relocating to a part 357 00:22:42,880 --> 00:22:46,880 Speaker 1: of the country that isn't as insanely expensive to live 358 00:22:46,920 --> 00:22:50,080 Speaker 1: in as the Bay Area. So the idea of getting 359 00:22:50,359 --> 00:22:53,680 Speaker 1: a Bay Area tech salary but living in a place 360 00:22:53,720 --> 00:22:56,720 Speaker 1: with a lower cost of living is really attractive to 361 00:22:56,760 --> 00:22:59,359 Speaker 1: a lot of people. Plus, you take out the 362 00:22:59,400 --> 00:23:03,000 Speaker 1: commuting and the other aspects of living in an expensive city, 363 00:23:03,280 --> 00:23:05,560 Speaker 1: and you start to see a lot of pros and 364 00:23:05,680 --> 00:23:09,960 Speaker 1: only a very few cons. Heck, there's also this general 365 00:23:10,040 --> 00:23:12,680 Speaker 1: move in the corporate world to the open floor plan. 
366 00:23:13,680 --> 00:23:15,360 Speaker 1: I don't know about any of y'all who have worked 367 00:23:15,359 --> 00:23:17,520 Speaker 1: in offices with open plans, but I can tell you 368 00:23:17,880 --> 00:23:20,920 Speaker 1: that in my office it is not a popular thing, 369 00:23:21,280 --> 00:23:24,280 Speaker 1: at least not among the editorial team. It turns out 370 00:23:24,400 --> 00:23:27,280 Speaker 1: editors and writers like having a bit of space that 371 00:23:27,359 --> 00:23:29,119 Speaker 1: kind of shuts out the world around them so they 372 00:23:29,119 --> 00:23:31,520 Speaker 1: can focus on their jobs. I suspect a lot of 373 00:23:31,520 --> 00:23:34,720 Speaker 1: other people feel the same way about their jobs. So 374 00:23:34,920 --> 00:23:39,440 Speaker 1: there's this escalating tension between companies pushing to get back 375 00:23:39,440 --> 00:23:43,800 Speaker 1: to this corporate norm that feels a bit oppressive and 376 00:23:43,880 --> 00:23:46,640 Speaker 1: a bit, you know, Big Brother-ish, like 377 00:23:47,440 --> 00:23:50,840 Speaker 1: they only feel comfortable if they can see what everyone's 378 00:23:50,880 --> 00:23:53,639 Speaker 1: doing all the time. And then you have the folks 379 00:23:53,640 --> 00:23:55,960 Speaker 1: who would really rather work remotely if it's all the 380 00:23:55,960 --> 00:23:58,920 Speaker 1: same to you. And for some companies that does seem 381 00:23:58,920 --> 00:24:02,000 Speaker 1: to have sunk in. At Twitter, the remote work policy 382 00:24:02,040 --> 00:24:06,159 Speaker 1: has effectively become at least semi-permanent, setting Twitter apart 383 00:24:06,200 --> 00:24:09,000 Speaker 1: from some of the other big tech companies. 
It's also 384 00:24:09,080 --> 00:24:11,440 Speaker 1: kind of funny to see these tech companies that base 385 00:24:11,600 --> 00:24:15,320 Speaker 1: part of their reputations on innovation being beholden to 386 00:24:15,400 --> 00:24:19,640 Speaker 1: this older idea of what work should be. Anyway, as 387 00:24:19,720 --> 00:24:23,239 Speaker 1: Samuel Axon of Ars Technica points out, we're likely at 388 00:24:23,240 --> 00:24:26,360 Speaker 1: the beginning of a pretty big shift in work cultures 389 00:24:26,359 --> 00:24:30,240 Speaker 1: in general and in the tech sector in particular. I've 390 00:24:30,280 --> 00:24:32,560 Speaker 1: got a few more tech stories to come up, but 391 00:24:32,640 --> 00:24:42,600 Speaker 1: before we get into that, let's take another quick break. Now, 392 00:24:42,760 --> 00:24:45,240 Speaker 1: if you've been listening to my tech news episodes, you 393 00:24:45,320 --> 00:24:49,120 Speaker 1: might remember one that I talked about last month, I think, 394 00:24:49,200 --> 00:24:52,080 Speaker 1: in which the video game publisher Electronic Arts was hit 395 00:24:52,160 --> 00:24:55,080 Speaker 1: with a hacker intrusion in which the hackers were able 396 00:24:55,119 --> 00:24:59,240 Speaker 1: to steal a good amount of digital information. By 397 00:24:59,240 --> 00:25:02,320 Speaker 1: a good amount, I mean nearly a terabyte of data, 398 00:25:02,440 --> 00:25:06,920 Speaker 1: like more than seven hundred fifty gigabytes of information, and that included, 399 00:25:06,960 --> 00:25:09,920 Speaker 1: as it turns out, the source code for the company's 400 00:25:10,080 --> 00:25:13,600 Speaker 1: FIFA twenty one soccer game, or football if you prefer. 401 00:25:13,960 --> 00:25:17,439 Speaker 1: FIFA is the most popular sports video game franchise in 402 00:25:17,480 --> 00:25:21,239 Speaker 1: the world, so it represents a really important cornerstone for 403 00:25:21,320 --> 00:25:24,639 Speaker 1: EA. 
Now, the hackers said that they would sell 404 00:25:24,800 --> 00:25:27,919 Speaker 1: back this information to EA to the tune of 405 00:25:27,960 --> 00:25:30,679 Speaker 1: twenty eight million dollars. But EA did what I 406 00:25:30,760 --> 00:25:34,040 Speaker 1: frequently say is the best course of action: it did 407 00:25:34,040 --> 00:25:37,520 Speaker 1: not pay up. EA did not play ball with 408 00:25:37,640 --> 00:25:40,920 Speaker 1: the FIFA thieves, and when that happened, the hackers ended 409 00:25:40,960 --> 00:25:44,159 Speaker 1: up dumping their stolen data on the dark web. It 410 00:25:44,200 --> 00:25:46,800 Speaker 1: sounds like they were first trying to sell that off 411 00:25:46,840 --> 00:25:50,359 Speaker 1: to any buyers, but they couldn't find any, because that 412 00:25:50,440 --> 00:25:54,920 Speaker 1: source code actually isn't that much use unless you want 413 00:25:54,960 --> 00:25:58,440 Speaker 1: to try and run your own illegal FIFA servers. Potentially 414 00:25:58,920 --> 00:26:01,760 Speaker 1: that has pretty limited appeal. There's not 415 00:26:01,800 --> 00:26:04,560 Speaker 1: like a lot of profit in it. EA 416 00:26:04,680 --> 00:26:07,600 Speaker 1: has said that while that source code was grabbed, the 417 00:26:07,640 --> 00:26:12,160 Speaker 1: hackers actually didn't access any databases containing customer information, which, 418 00:26:12,560 --> 00:26:16,320 Speaker 1: you know, would have potentially been far more valuable 419 00:26:16,560 --> 00:26:19,240 Speaker 1: to a thief than the source code would be. 
420 00:26:19,480 --> 00:26:22,280 Speaker 1: So player information was still safe, and the company also 421 00:26:22,359 --> 00:26:25,280 Speaker 1: updated its security systems and practices in the wake of 422 00:26:25,280 --> 00:26:29,639 Speaker 1: the attack, and that attack involved both some techno wizardry 423 00:26:29,960 --> 00:26:33,080 Speaker 1: as well as some good old fashioned social engineering. That's 424 00:26:33,119 --> 00:26:36,320 Speaker 1: when you trick someone into helping you infiltrate a system, 425 00:26:36,320 --> 00:26:40,240 Speaker 1: something that can often, surprisingly, be pretty easy to do. 426 00:26:40,960 --> 00:26:43,600 Speaker 1: Now I may have to do another episode about Electronic Arts, 427 00:26:43,640 --> 00:26:47,200 Speaker 1: because I mean that company has had a rough history. 428 00:26:47,280 --> 00:26:50,119 Speaker 1: In fact, there have been years where that company was 429 00:26:50,240 --> 00:26:54,280 Speaker 1: voted as the worst company to work for. But this 430 00:26:54,359 --> 00:26:56,479 Speaker 1: is a case where I would say that the company 431 00:26:56,520 --> 00:27:00,480 Speaker 1: did the right thing. Sticking with games, let's talk for 432 00:27:00,520 --> 00:27:03,440 Speaker 1: a moment about what's happening with Tencent. That's a 433 00:27:03,640 --> 00:27:07,360 Speaker 1: Chinese holding company that owns a bunch of stuff, including 434 00:27:07,359 --> 00:27:10,680 Speaker 1: a major stake in several video game companies and studios 435 00:27:10,720 --> 00:27:14,520 Speaker 1: around the world, not just in China. 
Well, back in China, 436 00:27:14,920 --> 00:27:18,560 Speaker 1: the Chinese media referred to video games in general as 437 00:27:18,600 --> 00:27:23,920 Speaker 1: sort of a spiritual opium and suggested that obsessively 438 00:27:24,000 --> 00:27:28,320 Speaker 1: playing games was ruining the children of China, with 439 00:27:28,359 --> 00:27:32,959 Speaker 1: Tencent's game Honor of Kings getting special attention in the article. 440 00:27:33,320 --> 00:27:37,320 Speaker 1: Tencent then saw its stock price take a beating, 441 00:27:37,840 --> 00:27:40,800 Speaker 1: plunging as a result of this scathing attack. In fact, 442 00:27:41,080 --> 00:27:45,560 Speaker 1: the company lost around sixty billion dollars in valuation because 443 00:27:45,680 --> 00:27:49,000 Speaker 1: of the drop in stock price. The company says it 444 00:27:49,040 --> 00:27:53,480 Speaker 1: will now limit minors' access to its video games, and 445 00:27:54,160 --> 00:27:57,880 Speaker 1: there's no doubt that video games, especially mobile games, set 446 00:27:57,960 --> 00:28:02,040 Speaker 1: up a challenge and reward cycle that encourages continued play. 447 00:28:02,080 --> 00:28:05,560 Speaker 1: I mean, that's like a basic element of game design 448 00:28:05,640 --> 00:28:08,879 Speaker 1: for these kinds of games. Games that monetize through in 449 00:28:09,000 --> 00:28:13,480 Speaker 1: game purchases only succeed if players want to spend more 450 00:28:13,520 --> 00:28:15,679 Speaker 1: time in that game and get access to more in 451 00:28:15,800 --> 00:28:19,480 Speaker 1: game features or content, and so spend money to do so. 452 00:28:19,480 --> 00:28:22,919 Speaker 1: So there definitely is something to the idea that games 453 00:28:22,960 --> 00:28:25,879 Speaker 1: have an addictive quality to them, so the report is 454 00:28:25,920 --> 00:28:29,480 Speaker 1: not completely off base. 
However, it's also interesting to me, 455 00:28:30,160 --> 00:28:33,400 Speaker 1: because seeing the Chinese media take aim at a Chinese 456 00:28:33,440 --> 00:28:37,919 Speaker 1: business is sometimes a little unusual, because both the media 457 00:28:38,120 --> 00:28:42,160 Speaker 1: and the businesses in China have strong links back to 458 00:28:42,200 --> 00:28:47,000 Speaker 1: the Chinese Communist government. The Communist government is essentially integrated 459 00:28:47,080 --> 00:28:51,280 Speaker 1: into every major system in China. Then again, this could 460 00:28:51,320 --> 00:28:53,880 Speaker 1: just be a case in which the reporting is sincere, 461 00:28:54,400 --> 00:28:58,240 Speaker 1: just sincere reporting on a real social problem, 462 00:28:58,400 --> 00:29:01,440 Speaker 1: or it could also be an issue in which the 463 00:29:01,560 --> 00:29:04,040 Speaker 1: Chinese government is a little bit concerned that some of 464 00:29:04,040 --> 00:29:07,200 Speaker 1: these huge companies are getting powerful enough to have their 465 00:29:07,200 --> 00:29:10,120 Speaker 1: own authority that in some ways can rival that of 466 00:29:10,160 --> 00:29:13,920 Speaker 1: the Chinese government itself. Maybe it's a combination of a 467 00:29:13,920 --> 00:29:17,280 Speaker 1: lot of these things, but it's kind of fascinating to me. 468 00:29:17,400 --> 00:29:20,040 Speaker 1: And what's also interesting is that the kerfuffle meant that 469 00:29:20,440 --> 00:29:24,480 Speaker 1: for a brief period, Tencent actually found itself knocked off 470 00:29:24,520 --> 00:29:28,120 Speaker 1: its perch as the most valuable company in Asia, though it did regain 471 00:29:28,200 --> 00:29:33,720 Speaker 1: that throne. Meanwhile, in space, last week there was an 472 00:29:33,720 --> 00:29:38,280 Speaker 1: accident that affected the International Space Station. 
A Russian module 473 00:29:38,320 --> 00:29:42,080 Speaker 1: called Nauka docked with the station last week, and a 474 00:29:42,120 --> 00:29:45,720 Speaker 1: few hours after it had docked with the station, it 475 00:29:45,800 --> 00:29:50,680 Speaker 1: started to fire its thrusters accidentally. It was unplanned, and 476 00:29:50,760 --> 00:29:53,360 Speaker 1: that caused the space station to start to move out 477 00:29:53,360 --> 00:29:56,960 Speaker 1: of alignment and led to a loss of quote unquote 478 00:29:57,080 --> 00:30:02,160 Speaker 1: attitude control. Now, in my household, a loss of attitude 479 00:30:02,160 --> 00:30:05,520 Speaker 1: control means that I get really sassy, but in space this 480 00:30:05,600 --> 00:30:08,920 Speaker 1: is way more serious than that. Apparently, the space station 481 00:30:09,000 --> 00:30:13,120 Speaker 1: ended up spinning one and a half times, five hundred forty 482 00:30:13,160 --> 00:30:17,680 Speaker 1: degrees, and it stopped rotating upside down. Now, of course, 483 00:30:17,680 --> 00:30:20,760 Speaker 1: in a microgravity environment, up and down are meaningless without 484 00:30:20,760 --> 00:30:23,840 Speaker 1: a frame of reference. So the astronauts aboard the space 485 00:30:23,880 --> 00:30:28,200 Speaker 1: station were initially unaware of this issue because the movement 486 00:30:28,240 --> 00:30:30,440 Speaker 1: was actually really gradual. It's not like they were just 487 00:30:30,480 --> 00:30:35,240 Speaker 1: spinning around like in that film Gravity. NASA reports that 488 00:30:35,320 --> 00:30:38,920 Speaker 1: the station is now back in its correct alignment, having 489 00:30:39,000 --> 00:30:43,080 Speaker 1: undergone a one eighty degree forward flip. 
And you thought 490 00:30:43,120 --> 00:30:45,720 Speaker 1: the Olympics were just limited to here on Earth. Now, 491 00:30:45,800 --> 00:30:49,600 Speaker 1: all kidding aside, this whole thing sounds pretty intense and terrifying, 492 00:30:49,760 --> 00:30:52,520 Speaker 1: but the astronauts aboard the station and the people here 493 00:30:52,520 --> 00:30:55,760 Speaker 1: on Earth were able to handle the emergency and return 494 00:30:55,840 --> 00:30:59,239 Speaker 1: things to their normal operating parameters. And I find that 495 00:30:59,320 --> 00:31:02,400 Speaker 1: just really inspirational, like really phenomenal stuff, to show 496 00:31:02,440 --> 00:31:05,640 Speaker 1: what can happen when people band together in the face 497 00:31:05,920 --> 00:31:10,480 Speaker 1: of an emergency and apply human ingenuity to fix the problem. 498 00:31:10,560 --> 00:31:13,000 Speaker 1: In fact, reading up on the full account, it gets 499 00:31:13,080 --> 00:31:16,239 Speaker 1: really crazy. The module fired thrusters as if it were 500 00:31:16,280 --> 00:31:19,120 Speaker 1: attempting to maneuver away from the station, even though it 501 00:31:19,200 --> 00:31:22,400 Speaker 1: was physically docked to the station. And again this was 502 00:31:22,560 --> 00:31:27,440 Speaker 1: apparently some sort of automatic error, and only Russia's mission 503 00:31:27,480 --> 00:31:30,800 Speaker 1: control was able to actually send a command to stop 504 00:31:30,880 --> 00:31:34,240 Speaker 1: the thrusters. But the space station wasn't in range of 505 00:31:34,320 --> 00:31:37,600 Speaker 1: Russian mission control. You know, the station orbits the Earth, 506 00:31:38,080 --> 00:31:41,400 Speaker 1: and at that point in the orbit, it wasn't close 507 00:31:41,520 --> 00:31:44,000 Speaker 1: enough to Russia for a message to go out; it 508 00:31:44,040 --> 00:31:46,600 Speaker 1: wasn't in line of sight. 
It would actually take another 509 00:31:46,680 --> 00:31:50,280 Speaker 1: hour before the station would potentially be reachable by Russian 510 00:31:50,280 --> 00:31:54,560 Speaker 1: mission control. So the crew ended up using a second 511 00:31:54,680 --> 00:31:58,400 Speaker 1: Russian module as well as a cargo ship, and fired 512 00:31:58,520 --> 00:32:02,560 Speaker 1: the thrusters aboard both of those elements in order 513 00:32:02,560 --> 00:32:07,240 Speaker 1: to try and counter the effect of Nauka's malfunctioning thrusters. 514 00:32:07,640 --> 00:32:11,400 Speaker 1: Fifteen minutes after it started, the Nauka module settled the 515 00:32:11,480 --> 00:32:14,640 Speaker 1: heck down. At the time I'm recording this, no one 516 00:32:14,680 --> 00:32:17,200 Speaker 1: seems to know what caused it to start or what 517 00:32:17,360 --> 00:32:20,640 Speaker 1: made it stop. And I'm sure that had to be 518 00:32:20,680 --> 00:32:25,480 Speaker 1: a pretty tense day aboard the space station. And finally, 519 00:32:25,880 --> 00:32:28,640 Speaker 1: I'll end this episode by talking about something that is 520 00:32:28,680 --> 00:32:32,320 Speaker 1: not nearly as traumatic, and that's the news we have 521 00:32:32,560 --> 00:32:36,320 Speaker 1: about the next flagship phones from Google. These are the 522 00:32:36,360 --> 00:32:39,680 Speaker 1: Pixel six and the Pixel six Pro, both of which 523 00:32:39,720 --> 00:32:45,200 Speaker 1: Google previewed yesterday, August 2nd, with a limited preview. 524 00:32:45,480 --> 00:32:47,800 Speaker 1: The company did not give a full rundown of all 525 00:32:47,840 --> 00:32:51,320 Speaker 1: the specs on both phones, but it did spend a good amount 526 00:32:51,320 --> 00:32:54,160 Speaker 1: of time talking about the new system on a chip, 527 00:32:54,680 --> 00:32:58,160 Speaker 1: or SoC, called the Tensor SoC. 
This 528 00:32:58,240 --> 00:33:02,520 Speaker 1: chip includes multiple elements; it's not just a CPU. That's 529 00:33:02,520 --> 00:33:06,120 Speaker 1: why you get the system on a chip nomenclature, which 530 00:33:06,240 --> 00:33:10,560 Speaker 1: essentially means that it has all the necessary components to 531 00:33:10,840 --> 00:33:13,640 Speaker 1: allow it to act as a computer. It's all present 532 00:33:13,720 --> 00:33:17,200 Speaker 1: on a single chip, so like CPU, GPU, memory, all 533 00:33:17,200 --> 00:33:21,320 Speaker 1: that stuff. According to The Verge, the new phones definitely 534 00:33:21,400 --> 00:33:24,880 Speaker 1: have a higher tier feel to them than previous Pixel phones. 535 00:33:25,400 --> 00:33:28,800 Speaker 1: That probably means they will be mucho expensive when they 536 00:33:28,800 --> 00:33:31,920 Speaker 1: come out later this year, but we do not know 537 00:33:32,280 --> 00:33:35,720 Speaker 1: the pricing as of yet. As The Verge piece points out, 538 00:33:35,760 --> 00:33:38,840 Speaker 1: it will probably be north of a thousand dollars for 539 00:33:39,000 --> 00:33:43,600 Speaker 1: just the base model. The Pixel six Pro is slightly 540 00:33:43,720 --> 00:33:47,280 Speaker 1: larger than the Pixel six. The former, the Pro, has 541 00:33:47,320 --> 00:33:50,920 Speaker 1: a screen that measures six point seven inches, and the 542 00:33:51,000 --> 00:33:54,040 Speaker 1: regular Pixel six has a screen that measures six point 543 00:33:54,240 --> 00:33:57,680 Speaker 1: four inches. The Pro also has a higher refresh rate 544 00:33:57,720 --> 00:34:01,280 Speaker 1: for its screen. It has a one hundred twenty hertz refresh rate, which means 545 00:34:01,680 --> 00:34:06,480 Speaker 1: the screen refreshes one hundred twenty times a second. The regular old Pixel 546 00:34:06,600 --> 00:34:10,000 Speaker 1: six has a refresh rate of ninety hertz, so it 547 00:34:10,040 --> 00:34:14,160 Speaker 1: doesn't have quite the same rate. 
The Pro also has 548 00:34:14,320 --> 00:34:18,080 Speaker 1: three cameras. The Pixel six has just two, and the 549 00:34:18,120 --> 00:34:21,360 Speaker 1: Pro's extra camera is a telephoto camera, so you can 550 00:34:21,440 --> 00:34:24,719 Speaker 1: get those long distance shots. The new phones will also 551 00:34:24,760 --> 00:34:28,200 Speaker 1: incorporate AI features in some ways that I don't fully 552 00:34:28,280 --> 00:34:31,800 Speaker 1: grok yet. I know that some of it relates to photos, 553 00:34:31,880 --> 00:34:34,160 Speaker 1: which is good. Like, if you have a telephoto lens 554 00:34:34,200 --> 00:34:38,000 Speaker 1: on a phone, having AI to help counter stuff like 555 00:34:38,120 --> 00:34:41,560 Speaker 1: jitter is really good, because otherwise you would get 556 00:34:41,600 --> 00:34:44,560 Speaker 1: a lot of very fuzzy images if you were trying 557 00:34:44,560 --> 00:34:47,840 Speaker 1: to take long distance photos and you could not 558 00:34:48,000 --> 00:34:51,080 Speaker 1: correct for just the general motion that most of us 559 00:34:51,120 --> 00:34:55,080 Speaker 1: have when we're holding stuff. Anyway, as I said, this 560 00:34:55,160 --> 00:34:57,239 Speaker 1: was more of a preview than anything else. We're likely 561 00:34:57,360 --> 00:34:59,080 Speaker 1: to hear a lot more about these phones as we 562 00:34:59,080 --> 00:35:02,319 Speaker 1: get closer to October, and I'm debating whether I'll 563 00:35:02,360 --> 00:35:06,000 Speaker 1: actually stick with the Pixel. I have an earlier Pixel phone. 564 00:35:06,080 --> 00:35:09,560 Speaker 1: I think it's a Pixel four. Or whether I change 565 00:35:09,600 --> 00:35:12,680 Speaker 1: teams, because I am due to upgrade my phone. 566 00:35:12,719 --> 00:35:15,719 Speaker 1: It's a four, and so I skipped the five. 
But 567 00:35:16,280 --> 00:35:18,480 Speaker 1: I want to make sure that the Pixel features are 568 00:35:18,480 --> 00:35:21,200 Speaker 1: actually gonna be worth whatever that premium price turns out 569 00:35:21,239 --> 00:35:23,799 Speaker 1: to be. I don't want to be beholden to a 570 00:35:23,840 --> 00:35:27,160 Speaker 1: flagship phone just because it's a flagship phone if I 571 00:35:27,239 --> 00:35:29,960 Speaker 1: don't feel like I'm getting value for whatever the price is, 572 00:35:30,000 --> 00:35:32,239 Speaker 1: so I'm gonna wait a little bit longer to learn 573 00:35:32,280 --> 00:35:35,200 Speaker 1: more details before I make a decision on that. And 574 00:35:35,320 --> 00:35:40,360 Speaker 1: that is the news for Tuesday, August 3rd, 2021. If you have 575 00:35:40,480 --> 00:35:44,040 Speaker 1: suggestions for topics I should cover on TechStuff, reach 576 00:35:44,080 --> 00:35:45,920 Speaker 1: out to me. The best place to do that is 577 00:35:45,960 --> 00:35:48,840 Speaker 1: on Twitter, and the handle we use is TechStuff 578 00:35:49,120 --> 00:35:53,399 Speaker 1: HSW, and I'll talk to you again really soon. 579 00:35:58,360 --> 00:36:01,080 Speaker 1: TechStuff is an iHeartRadio production. For 580 00:36:01,200 --> 00:36:04,160 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio 581 00:36:04,239 --> 00:36:07,399 Speaker 1: app, Apple Podcasts, or wherever you listen to your 582 00:36:07,440 --> 00:36:12,680 Speaker 1: favorite shows.