Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm Jonathan Strickland. I'm an executive producer here at iHeartRadio, and I love all things tech. It is time for the tech news for Tuesday, October 12, 2021. Let's get into it.

Last week was a tough one for Facebook, to put it lightly, with a whistleblower appearing before Congress and testifying that her former employer knowingly engaged in practices that were harmful to its users, and to entire countries, as it would turn out. While Facebook went into damage control last week, some of the measures that were taken by PR spokespeople in an attempt to kind of deflect or to defend themselves ended up rubbing other current and former employees at Facebook the wrong way. One person who had already been critical of the company is Sophie Zhang. She used to work as a data scientist for Facebook. Well,
she first came forward with concerns about Facebook last year, and she even published a full blog post that contained her concerns. She wrote this very long farewell message to Facebook employees, and then she also posted it to her own website. Facebook then subsequently sent a takedown notice to the administrator for the website itself, so it got taken down, which is pretty bad. Anyway, Zhang has now said she's willing to testify in front of Congress as well. She has said that she sent potentially incriminating documents regarding Facebook practices to a US law enforcement agency. Other people have also shown support for investigations into Facebook, as well as condemning the company's tactics of trying to dismiss criticisms by questioning the credentials of the whistleblowers. My guess is that it's likely going to be another tough week for Facebook.

Sticking with Facebook, I want to talk about Louis Barclay.
Barclay is an app and software developer from the UK, and he created a browser extension specifically to use on Facebook called Unfollow Everything, and you know, it does exactly what it says on the tin, as they say across the pond. A user activating Unfollow Everything would see that they had unfollowed every single friend and every single page that they had previously followed on Facebook. So they would unfollow everything. You could still, you know, connect with people on Facebook, but having unfollowed them, you would have to use other methods. And it is interesting, because it would essentially mean that your news feed would be blank except for stuff that maybe you had posted. I'm assuming; I don't know. I haven't actually seen pictures of what this looked like when it was in action, but it confuses me a little, only in the sense that I don't know what it would look like. Would it just be your posts? Would it become the true narcissist's Facebook? I don't know.
His extension, however, got the attention of researchers in Switzerland, who wanted to see what impact there might be from this kind of thing. Would there be an impact on happiness for users who were still on Facebook but no longer had a real news feed? They also wanted to see what kind of impact it might have on the amount of time that users typically would spend on the platform. And I honestly wonder what that experience is like. Maybe Facebook would become essentially your photo album, or maybe a journaling exercise, if you were posting and you were the only one posting there. Anyway, you can imagine Facebook was not happy about any of this. So the company booted Barclay off Facebook as well as Instagram and banned him from ever having an account with either of those. Now, the article I read on Business Insider didn't indicate whether Facebook has also banned him from WhatsApp, the messaging app, but Barclay did say that he had been on the platform for about fifteen years and he used it a lot to stay in touch with friends and family, so this has had an impact on him.
Facebook says that Barclay's browser extension violates Facebook's terms of service, which in part say that users and developers are not to do anything that would interfere with the intended operation of Facebook. And you know, Barclay said the tool had only been downloaded a few thousand times. The whole thing, to me, is really interesting. I wonder how many people would actually use a tool like this, and whether they would continue to use it or if they would just, you know, turn it off after a bit. I get why Facebook is kind of going nuclear, because the entire revenue model for Facebook depends upon ads being served up in the news feed and keeping folks on the platform for as long as possible to sell as many things as possible, right, to serve up as many ads as possible and rack up the big bucks. But if there's no news feed, then people would likely spend way less time on the platform, and Facebook would make way less money. Barclay is kind of without options here.
He lives in the UK, and that means that if he took Facebook to court over the matter and he lost, he would be liable to pay Facebook's court costs. And you know, Facebook's a company with billions of dollars. Even if Facebook fought this in a way where they weren't going to win, they could really drain Barclay's resources. And if they did win, and they really drew this out so it was a very long court battle, it could bankrupt him. So he's kind of resigned himself to being Facebook-free. He sounds like he's both a little sad about it but also kind of relieved, because he had said he'd been trying to cut back for a while now, and now he has no choice but to go cold turkey.

We usually hear about successful hacker attacks. It is fairly rare that we hear about a hacker attack that was unsuccessful, that had been defeated, but we have a report like that now. Microsoft says that its Azure cloud service was able to mitigate a massive distributed denial of service, or DDoS, attack in late August of this year. Now, let's explain what that means.
And we're gonna start first with a denial of service, or DoS, attack, because that's our basic unit here. That's an attack where someone tries to overwhelm a specific targeted server or system on the Internet, typically by sending tons of messages to that system. And I often use a specific analogy here. So imagine that you live in a house. The house has a front door with a doorbell, and part of the condition of you living in this house is that if someone rings that doorbell, you have to answer the door. It doesn't matter what else is going on; if the doorbell rings, you've got to go answer it. And so someone starts ringing your doorbell and then running away. And even though you know that there is a very good chance that you're going to go to the door and open it and no one's gonna be there, you still have to answer it. You're not allowed to ignore it. It would be really hard to get anything else done if someone was continuously ringing your doorbell and running away. Well, a denial of service attack is kind of similar.
The attacker is directing messages to a targeted server, and the server is trying to answer these messages. But those messages are coming in thick and fast, as you might say, and soon the server can't do anything at all. It's just overwhelmed by these incoming messages. It might even crash the server. In fact, that's often the goal: if it's not to just gum it up, it's to bring it down entirely. Well, a distributed denial of service attack is the same thing, but on a much bigger scale. Typically, hackers will first use malware to infect as many computers as they possibly can to get sort of backdoor access to these computers and have them join what's called a botnet, or sometimes a zombie network. So the infected computers are the zombie army, and the hacker can use this army to direct messages to the targeted server. Now, instead of one machine just trying to overwhelm a target, you could be talking tens of thousands; seventy thousand computers is not unheard of for this sort of thing.
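The doorbell idea can be sketched as a toy simulation. This is just an illustration of the imbalance, not a real attack or defense tool, and every number in it (requests per tick, queue size, bot count) is invented for the example:

```python
from collections import deque

class ToyServer:
    """Answers at most `capacity` requests per tick; the rest pile up in a queue.

    If the backlog ever exceeds `max_queue`, the server is considered
    overwhelmed -- the "denial of service" state.
    """
    def __init__(self, capacity, max_queue):
        self.capacity = capacity    # requests it can answer per tick
        self.max_queue = max_queue  # backlog it can tolerate before falling over
        self.queue = deque()
        self.crashed = False

    def receive(self, n_requests):
        if self.crashed:
            return
        self.queue.extend(range(n_requests))
        if len(self.queue) > self.max_queue:
            self.crashed = True     # overwhelmed by incoming messages

    def tick(self):
        """Answer as many queued requests as capacity allows this tick."""
        if self.crashed:
            return 0
        served = min(self.capacity, len(self.queue))
        for _ in range(served):
            self.queue.popleft()
        return served

# One attacker alone can't outpace the server: 80 requests per tick
# against a capacity of 100 never builds a backlog.
server = ToyServer(capacity=100, max_queue=1000)
for _ in range(10):
    server.receive(80)
    server.tick()
print(server.crashed)  # False

# But a botnet of 70,000 machines sending one request each in a single
# tick buries the same server instantly.
server = ToyServer(capacity=100, max_queue=1000)
server.receive(70_000)
print(server.crashed)  # True
```

The point of the sketch is that each individual bot looks harmless; it's only the aggregate arrival rate versus the server's capacity that matters, which is why defenders measure these attacks by total traffic volume.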
For this reason, we often describe the magnitude of DDoS attacks in terms of how much data is being sent toward a target. In this case, it was a whopping 2.4 terabits per second. That's the largest DDoS attack on record by volume. A terabit is a trillion bits; a gigabit is a billion bits. This was 2.4 terabits every second. Microsoft says Azure detected the flood of traffic and was able to mitigate it, to hold it back so that the targeted server didn't get overwhelmed, which is pretty impressive. Microsoft did not identify the target, but only said it was located in Europe.

Amazon has become the latest company to back off of an official return-to-the-office plan for its corporate employees. Like several other big tech companies, Amazon originally hoped to be back in the office this year, but had pushed back to a plan that would see corporate employees return to the office at least three days a week by January 2022.
Now, the company has sent out another message saying employees will not have to return to the office in the new year. Rather, team leaders will make decisions on a team-by-team basis as to whether people need to return or not. Some people will be able to work remotely indefinitely. In addition, employees will still be able to work for up to four weeks in any location within their home country, but they should spend the rest of the year somewhere that would allow them to attend, say, an in-person meeting, given a day's notice.

In the ongoing battle between Apple and Epic Games, Apple is now seeking an appeal of the court ruling that requires Apple to allow app developers an alternative payment option and not just go through Apple's own in-app system. While seeking the appeal, Apple has asked for a stay on the injunction that would otherwise require it to follow the court order. So essentially, they're saying, until we've actually settled this matter for good, we don't want to have to change stuff.
And from that perspective, it makes a kind of sense, right? If Apple were to change things and then subsequently win their court appeal, reversing that change after implementing it would be a messy thing at best. Apple claims that allowing developers to use their own payment systems could lead to the incorporation of other stuff, like external links, and potentially bring risks to Apple users. Think of the children. So the argument here is that this could potentially lead to security risks, and it's hard to argue against that. But honestly, I don't think that's what Apple's main concern is. I think their main concern is that they take a fifteen to thirty percent cut out of in-app purchases, and if you allow for alternative payment systems, you have to say goodbye to that kind of revenue. But then Apple has lots of security concerns to think about, too, like how companies like NSO Group have targeted iMessage to create zero-click attacks that turned phones into spy devices. Apple did subsequently patch out certain vulnerabilities.
My point being that, you know, there are security concerns that have nothing to do with the apps being developed by third parties. When you really dig into this case, no one looks super awesome in the full light. I think I'm more on the developer side on this one. I feel that Apple and Google both have a real stranglehold that can hurt developers. At the same time, there are some legit arguments to make for a streamlined, trustworthy payment system. We've got a few more stories to go through, but before we get to those, let's take a quick break.

So before the break, we were talking about Apple and Epic, and a large part of that court argument is about anti-competitiveness and whether companies like Apple and Google are being anti-competitive by demanding developers use their in-app purchasing. Well, speaking of anti-competitive issues, we've got a different story that falls into that category. Nvidia has run into a speed bump in its quest to acquire the chip design company ARM. Now, if you listened to my episodes about ARM, that's A-R-M, you know that's a company that's based in the UK.
Nvidia has a fifty-four-billion-dollar acquisition deal to bring ARM into the company fold. But this move has raised some concerns, and some of those originate in the UK itself. ARM is an important company in the UK, and there have even been some questions raised as to whether an acquisition by a foreign company, because Nvidia is based out of the United States, could constitute a matter of national security. Then there are the questions about competitiveness. ARM creates chip designs and then issues licenses to manufacturing companies. So a chip manufacturing company makes a deal with ARM, gets the design for a specific type of processor, and then goes and makes it. Now, the fear is that if Nvidia, which also makes chips, acquires ARM, maybe Nvidia would say no more of doing that, right? Maybe Nvidia would prevent ARM from licensing designs to Nvidia's competitors. The EU plans an antitrust investigation to look into the matter further. This does not mean that the acquisition is off the table, but it could take a bit longer than Nvidia was hoping for.
Last year, many school systems in the United States shifted to a distance learning model due to the pandemic. However, that meant that, you know, kids needed access to computers in order to participate in class, and not everyone comes from a family where that's a possibility. There is a real digital divide in the United States, so numerous programs popped up to provide hardware to students. Now, a research paper from the Center for Democracy and Technology shows that a lot of these computers contained monitoring software that logged student activities. Essentially, many computers came loaded with spyware, and it gives schools the ability to see what students are doing on those computers. This opens up a pretty difficult series of topics. The argument to support this initiative says that students are vulnerable, and monitoring could allow school systems to detect a problem early on and potentially address it before it gets worse.
For example, a student writing about, you know, self-harm might need counseling to help them, or someone who's engaging in some form of online bullying might similarly need someone to step in and intervene and help that person before things get worse. On the other hand, the argument against this points out that this becomes a surveillance state, and it places an unhealthy burden of responsibility on teachers, who end up having to serve multiple roles in addition to making lesson plans and educating children. Plus, it disproportionately harms the less fortunate, like the less economically fortunate, because those are people who likely don't have any other computer to use, or smartphone or anything like that, so they rely even more on the school-supplied devices than other students do, and thus they are disproportionately affected by this monitoring. The Guardian has a great write-up on this. It's called "US schools gave kids laptops during the pandemic. Then they spied on them." It was written by Jessa Crispin.
I think it's a good read if you want to learn more about what happened, why it happened, and whether or not it's a good idea. And spoiler alert: Jessa thinks it's a bad idea, and I am inclined to agree.

One news item from last week that I did not get to is that NASA has pushed back the launch of a test flight for the Boeing Starliner spacecraft. This would have been the second test flight for the Boeing Starliner, and the plan was to conduct it within this year, but now it's gonna happen next year at the earliest. The reason for the delay is that a component on the spacecraft, an oxidizer isolation valve, isn't functioning correctly, and this component is not exactly in an easy-to-access spot in the spacecraft, so it requires a lot of planning just to figure out how to address this and fix the issue without causing more damage along the way. Boeing's Starliner did have an earlier uncrewed test flight, but that one encountered problems with its onboard software, which prevented it from docking with the International Space Station.
So for right now, the only US spacecraft that can make the journey to the ISS is SpaceX's Crew Dragon.

Finally, Magic Leap, which was a company I thought might be headed toward insolvency, has turned things around, at least for the time being. Now, if you're not familiar with this company, it started creating buzz several years ago as a pretty secretive business aimed at creating an incredible augmented reality headset, presumably meant for the consumer market. They released some promotional material that truly made it look magical, so the name seemed fitting. The company raised billions of dollars in investments, like three billion dollars. But over the years things changed, and I'm guessing that some of the engineering challenges were far more significant than they had first anticipated, and this in turn drove costs up. Ultimately, Magic Leap released a professional-level headset called the Magic Leap One Creator. A name like that suggests that this headset was intended for developers, so that they could build useful or compelling augmented reality experiences that would rely on this headset.
But it definitely wasn't priced at or targeted for a consumer user base. It was professional only. Magic Leap also laid off about half its workforce last year. One of the co-founders, Rony Abovitz, left the company in July last year. But now the company has raised an additional half billion dollars, which is pretty phenomenal considering they were able to do that after all these issues. And they plan to release a new AR headset, called the Magic Leap 2, next year. Some early headsets are reportedly already in circulation, and this new headset is lighter and has a larger field of view than the previous model. It is, however, still being targeted at enterprise-level customers. So this is not something you or I would purchase in order to, like, turn our homes into a Willy Wonka wonderland or something.
Nope, these are meant for companies that might use it for design departments, or maybe they might use it for really technically complicated installations. Like, the AR headset could show you exactly where you might need to solder something, or where you need to connect cables or lay certain electronics in a very compact space. I'm glad to see that Magic Leap hasn't, you know, leapt off a cliff, but I'm left wondering how long it will take before we eventually do see a good consumer-targeted AR headset.

And that's it for the news for Tuesday, October 12, 2021. We'll be back with more news later this week. If you have suggestions for things I should cover on TechStuff, reach out to me on Twitter. The handle is TechStuff HSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.