Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. It is time for the tech news for Tuesday, September twenty-eighth, twenty twenty one. And first, let's follow up on one of the recent Facebook stories, specifically the one about how Facebook researchers conducted an internal study that linked Instagram use with potential negative effects on mental health, particularly for teenage girls. Now, the head of Facebook's research department has posted on Facebook Newsroom that the conclusions drawn by the Wall Street Journal, which reported on all this, are inaccurate and do not reflect a proper interpretation of Facebook data. But really, the post mostly reads as an indictment of her own department's research practices, because she stated that the study in question only had forty participants. That means the sample size was way too small to allow for any kind of broad generalizations. Now, I actually agree with that, because forty is an incredibly small sample size for any kind of study. One might even argue that it really doesn't make any sense to conduct a study with that small a sample size, particularly if you are a company with a platform that has more than a billion users, more than two billion users. Now, that does not mean that the findings are necessarily inaccurate, right? I mean, those findings might be accurate. But rather, all you can say is that you can't be sure, because the sample size is way too small. It may be that you just happened to get outliers, and that if you used a larger sample, any effect would disappear. And you know, you just can't really say for sure to what degree, if any, Instagram contributes to negative mental health outcomes based on a study like that.
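To put rough numbers on that sample-size worry, here's a minimal sketch of the margin of error for a surveyed proportion. The thirty percent figure is purely illustrative, not anything from Facebook's study; the point is how much the uncertainty shrinks as the sample grows.

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Half-width of an approximate 95% confidence interval for a proportion."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Illustrative only: suppose 30% of participants reported feeling worse.
p_hat = 0.30
for n in (40, 400, 4000):
    moe = margin_of_error(p_hat, n)
    print(f"n={n:5d}: {p_hat:.0%} +/- {moe * 100:.1f} percentage points")
```

With forty participants, a "thirty percent" finding really means somewhere between roughly sixteen and forty-four percent, which is exactly why nobody should generalize broadly from a study that small.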
Speaker 1: But I will also say that her post feels a lot like Facebook is doing kind of a tightrope routine: saying the data doesn't show that, but also saying the study isn't sufficient to draw conclusions from. Like, unless there's other data that Facebook has, and there may well be, but they haven't shared it, then it's impossible to say. And because Facebook hasn't shared any extra information, there's no way to validate whether or not, you know, the arguments that this is inaccurate are true. This Thursday, Antigone Davis, the head of safety at Facebook, will appear before the Senate Commerce Subcommittee to answer questions about the company and its research on Instagram's effects. And I am sure that there will be some senators with some pretty pointed questions. We will have to see if Davis answers those in a straightforward way, or if we get more of what we've seen in the past, where Facebook kind of dances around answering things when it comes to accountability.

Speaker 1: In related news, Facebook has announced that it has put a temporary hold on its plans to launch an Instagram app aimed at kids under the age of thirteen. According to the head of Instagram, Adam Mosseri, the decision was in response to the public backlash that the company has faced, particularly in light of those Wall Street Journal stories I mentioned. He also took the time to lament that, you know, we all just have it all wrong. This new Instagram app isn't meant for little kids, he explained. It's meant for kids that are between the ages of ten and twelve. Now, I'm going to spare you my personal reaction to that particular response, because I think you could probably guess what it is. But he also tweeted out that kids are getting smartphones at younger ages, and then they go around and download apps and misrepresent their ages in order to use those apps. So clearly it would be better to make apps for the kids.
Speaker 1: I would say this doesn't explain how that stops kids from just continuing to download adult apps and misrepresent their ages. Like, that doesn't seem to solve that problem. It's really just saying, hey, they're already on there, so what harm could we do by introducing more of what they're already on? If anything, this is really a call both to companies and to parents to take more steps to protect kids. And I think parents need to have a lot of that responsibility. But, you know, we can't give companies a get-out-of-jail-free card just because they have a token age verification system that doesn't actually verify ages. In my mind, that's just a type of security theater, right? You've got the appearance of security, but it's not actually making anything more secure. Now, that's my own personal opinion. I could be way off base. Mosseri argues that the app that was planned for younger users would give parents more oversight into what their kids are doing. But I don't know. I mean, with every story that I cover in this vein, I am tempted to just go move into the woods and become a hermit, except I know me: I would die of exposure within like two days. I'm a soft city boy. But the urge to retreat is definitely there.

Speaker 1: The Chinese government is taking some pretty extreme steps to curb energy consumption and carbon dioxide emissions within the country. More than half of all China's electricity comes from coal power plants, something like sixty-three percent of it, and while the country has committed to never building another coal power plant, that doesn't change the fact that right now China largely runs on coal. The price of coal in China has been on the rise lately, and so energy costs have subsequently been going up, and of course we have the carbon dioxide emissions issue.
Speaker 1: So to deal with these problems, China has started to cut off power to some major manufacturing centers in certain regions of the country, with the plan to potentially have a schedule in place for when those regions will be allowed to power these manufacturing centers and when those centers are going to go without power. Now, the hope is that doing all this will allow energy companies to get hold of more commodities, to offset production and stabilize prices, though there are some analysts who say that at best that would be a temporary band-aid. But in the meantime, it means that it's not always going to be business as usual in China's manufacturing centers, and since a lot of the world's leading electronics companies depend upon Chinese factories, this could mean that some companies experience production delays. Coupled with the ongoing semiconductor shortage, this could mean that we're in for some tough times when it comes to consumer electronics. However, I should add that the semiconductor industry in particular will be allowed to continue to operate throughout this process, so China's not going to shut off power to facilities that are making semiconductor chips. Also, a lot of companies have facilities in different regions of China, and some regions are not affected by this. Some regions will carry on and not be cutting off power to manufacturing centers, so those companies might just shift operations around as much as they can to offset any delays they would face due to the downtime.

Speaker 1: This next story needs a little bit of a lead-in. All right, so have you ever tried to log into a service, but you couldn't remember which password you used for it? So you try a few of your old standbys, and after three or so attempts, you get shut out and told that you can't try to log in again for several minutes.
Speaker 1: Well, that kind of system is in place to protect against a type of password attack called brute force, and with that name you probably have a pretty good idea of how this works. Someone trying to access a system that, you know, they don't have authorization for will end up submitting guess after guess in the password field, perhaps using a dictionary of common passwords to start off, and then moving beyond that to other guesses should the dictionary fail to score any hits. This is all done automatically, by the way; there are computer programs that are just meant to do this kind of attack. This is not a quick way to gain access to a system, but with a sufficiently powerful computer system behind it, you can get it done. It just takes time. So these protection systems are in place in order to prevent that, right? The brute force attack would take a lot longer to pull off, because the attacker would regularly get shut out after submitting a few incorrect passwords. Another way that you can protect systems is to require two-factor authentication. So a password is one factor; it represents something you know. A two-factor authentication process would require either that you also submit something that you own, like your phone. So this is like when you try to log into something and a system sends you a text message with an access code that you need to put in in addition to your password. Or maybe it requires something that you are, like your biometric data, such as a fingerprint scan, used in connection with the password. These systems also protect against brute force, because the attacker needs more than just the password in order to access the system. All right, now that we've got all that out of the way, we can get to the actual story. Microsoft has a product called Azure Active Directory, or Azure AD, and apparently it has neither of those protections in place.
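To make that concrete, here's a minimal sketch of the lockout protection being described: track recent failures per username and refuse further guesses once there have been too many. The three-attempt limit and five-minute window are illustrative assumptions, not any particular vendor's policy, and the password check is a hypothetical stand-in.

```python
import time
from collections import defaultdict

MAX_ATTEMPTS = 3       # illustrative threshold
LOCKOUT_SECONDS = 300  # illustrative five-minute lockout window

failed = defaultdict(list)  # username -> timestamps of recent failures

def check_password(username: str, password: str) -> bool:
    # Hypothetical stand-in for a real credential check.
    return password == "correct horse battery staple"

def try_login(username: str, password: str) -> str:
    now = time.time()
    # Keep only the failures that are still inside the lockout window.
    failed[username] = [t for t in failed[username] if now - t < LOCKOUT_SECONDS]
    if len(failed[username]) >= MAX_ATTEMPTS:
        return "locked out: try again later"
    if check_password(username, password):
        failed[username].clear()
        return "ok"
    failed[username].append(now)
    return "wrong password"
```

Three wrong guesses and every call for the next five minutes comes back locked out, which is exactly what turns a fast automated dictionary attack into an impractically slow one.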
Speaker 1: It is a single-factor authentication system, so you just need a password, and you can submit passwords to your heart's content. And apparently, in at least some versions, the system doesn't log the password attempts, so there's no record kept that someone is trying and failing to submit a password. Now, consider for a moment that Azure Active Directory is a way for corporate users to sign into a corporate account and then connect to all integrated corporate systems and devices. It's a one-login solution, in other words. So you might use it to log into your corporate email, but then it also logs you into the corporate HR system, or maybe a project management system, all of these different things. Because of the one-login approach, you already have authorization, so you're not frustrated by having to authenticate every single time you try to access any company system. It's meant to make things more streamlined, right? As long as you were able to authenticate at that one gateway, you can access everything. So that means this is a potentially huge security vulnerability, right? If a hacker targets an Azure AD login, and they have a username and they're just submitting passwords, and those failed passwords aren't getting logged, then no system administrators are aware of it, because there's no, you know, notification popping up saying, hey, so-and-so has submitted 5,783 incorrect guesses for their password, maybe you need to look into this. You could just keep on attacking until you got a hit and managed to get into the system. If you would like to learn more about what scenarios this would work in and what you need to be on the lookout for, I really recommend reading Ax Sharma's post on Ars Technica. It is titled "New Azure Active Directory password brute-forcing flaw has no fix." That headline kind of says it all.
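The logging gap is the part worth dwelling on, because even without a lockout, simply recording failures gives administrators something to notice. Here's a minimal sketch of that kind of audit trail; the alert threshold is a made-up number, just to show the idea.

```python
import logging
from collections import Counter

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

ALERT_THRESHOLD = 100  # made-up count of failures that should worry an admin
failures = Counter()

def record_failed_login(username: str) -> None:
    """Log every failed attempt so brute forcing leaves a visible trail."""
    failures[username] += 1
    logging.info("failed login for %s (%d recent failures)",
                 username, failures[username])
    if failures[username] == ALERT_THRESHOLD:
        logging.warning("possible brute-force attack against %s", username)
```

Nothing here stops the attack by itself; it just means thousands of wrong guesses leave a trail instead of happening silently.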
Speaker 1: All right, we've got some more news stories to cover, but before we get to that, let's take a quick break.

Speaker 1: If you are in the United States, you might remember that a couple of years ago, the Chinese telecommunications company Huawei fell under a lot of scrutiny here in America. In part this was because then-President Trump had engaged in a trade war with China, so at least some of the motivation for the pressure on Huawei was political trade pressure. But there was also a growing concern that a Chinese company, one that presumably has important and tight connections with China's communist government, might not be the best fit when it comes to building out telecommunications infrastructure. So, in other words, if you're worried about potential Chinese spies, maybe it's better not to hire a Chinese communications company to install critical infrastructure components within your own communications network. It's like opening the door for potential spies, in other words. So the US moved to push American communications companies to scuttle Huawei systems and replace them with other systems. Now, the Federal Communications Commission, or FCC, says it has created a nearly two-billion-dollar program to reimburse telecom carriers that are going through the process of removing and replacing Huawei network hardware in their systems. These telecom companies are largely in rural areas of the United States, and I'm sure that relief is a literal relief to them. These are not necessarily your gigantic coast-to-coast companies, in other words.

Speaker 1: Video game company Activision Blizzard has entered into a settlement agreement with the US Equal Employment Opportunity Commission. This was in response to a lawsuit that the EEOC brought against the company following multiple allegations of issues ranging from a hostile work environment to sexual harassment to pay disparity and discrimination. As part of the settlement, Activision Blizzard will create a fund that will compensate employees who claim damages.
Speaker 1: So employees will submit a claim, it will be evaluated, and then it will be determined whether or not that employee merits getting money from this fund. The total amount in that fund is in the neighborhood of eighteen million dollars. Any unclaimed funds after a certain period of time end up going to nonprofit organizations dedicated to attracting more women to the video game development industry. Bobby Kotick, the CEO of the company, says that he and the executive team are dedicated to putting an end to the toxic work environment, which I really hope is a sincere statement. I mean, the cynical part of me says, well, of course you want to bring an end to that, because it's costing you money; as long as it wasn't costing you money, there was no real incentive. But the hopeful part of me says, we're trying to get better, and people genuinely want to make workplaces a more positive environment. So the optimist will continue to hope, and the cynic will continue to mistrust.

Speaker 1: A researcher at the Ethereum Foundation named Virgil Griffith has pled guilty to charges that he helped the country of North Korea get around US sanctions that aimed to prevent North Korea from using blockchain technology. And that needs some explanation. So, first of all, Ethereum is a type of cryptocurrency, and when we talk about cryptocurrency, a lot of people just think Bitcoin, or maybe, if they're in it for the memes, they might think Dogecoin. Ethereum is another big, popular cryptocurrency that is currently trying to switch from a proof-of-work approach, which is what Bitcoin uses. That's where you're using very fast computers to try to solve a very hard computational problem before anyone else can, and it's the reason why these systems end up consuming so much energy and, as a result, contribute to things like carbon emissions and energy spikes and all that kind of stuff.
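If you've never seen that "very hard computer problem" spelled out, here's a toy sketch of the proof-of-work idea: hunt for a nonce that makes a hash come out with enough leading zeros. The difficulty here is tiny and purely illustrative; real networks set it astronomically higher, which is where all that energy goes.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce so SHA-256(block_data + nonce) starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Toy difficulty: four hex zeros takes around 65,000 guesses on average.
print("found nonce:", mine("toy block", 4))
```

Every guess is a full hash computation, and there's no shortcut other than guessing faster; miners do this at enormous scale, which is the energy cost being described here.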
Speaker 1: Ethereum is trying to move to a proof-of-stake approach, which does not require that kind of computational processing power in order to mine new coins. However, it does mean that you have to have a sufficient stake in Ethereum in order to earn more Ethereum as a kind of interest. And so some people point out that that approach means you already have to be wealthy in order to even enjoy that potential payout. So, upsides and downsides. Anyway, like Bitcoin, Ethereum uses a blockchain to track transactions and to prevent people from spending the same Ethereum unit twice. You know, if you have something that's digital, then arguably you could just copy it a billion times, so now instead of having one dollar bill, you have a billion dollar bills. Blockchain prevents that kind of stuff from happening. Well, back in twenty nineteen, Virgil Griffith attended a blockchain conference in Pyongyang, and the US government alleges that Griffith's presentation at the conference was, in effect, an instruction manual for how North Korea could make use of blockchain technologies despite US sanctions meant to prevent that very thing from happening. Griffith was arrested upon returning from the conference, and his trial was set to begin next week, but he decided to plead guilty to the charges, which means he could face up to twenty years in prison. We'll have to follow up on this as we learn more, you know, once we get to sentencing.

Speaker 1: Tesla has started to open up its Full Self-Driving, or FSD, program to a larger number of Tesla drivers, prompting them with a request feature that appears on the dashboard touchscreen. So you select it, and then you can put in your request to be part of the program. So now, if you have a Tesla that's capable of supporting FSD, you can ask to have that feature enabled on your Tesla. Except there is a catch. The company will run a safety check on each driver, checking their driving against five criteria to be certain that the drivers are responsible and safe.
Speaker 1: Those criteria include instances in which the driver prompted a forced Autopilot disengagement. So Autopilot is a driver-assist feature that some Tesla owners have famously abused by treating it more like a fully autonomous vehicle mode. This particular feature requires drivers to keep their hands on the wheel and maintain their attention on the road, and if a driver does not do this, then the mode is supposed to alert the driver and disengage, and thus force the driver to take over manual control of the car. So if that has happened, that would be a strike against you; it would knock points off your safety score. Other criteria include stuff like how frequently the car had to engage features like a forward collision warning, which might indicate that you're following too closely or not paying enough attention, or how frequently the driver had to use hard braking. Again, maybe you were traveling too fast, or you brake too late when you're coming up to stops. So you have to accrue a sufficient safety score before you will be given access to subscribe to FSD, but Tesla is not actually saying what that score threshold is, only that drivers will be judged out of a total possible one hundred points and that most folks will land somewhere around eighty points. The FSD product requires a subscription of one hundred ninety-nine dollars a month, which is a princely sum. Tesla also offered a version where you could just buy it outright for the life of the car for ten thousand dollars. And I've got a lot of thoughts about this, and one of those is that the name of it, Full Self-Driving, is just as misleading as the name Autopilot is, because it's really just more features that augment Autopilot. So it can do stuff like, in at least some cases, have a car navigate out of a parking space on its own, so you can have it pull out of a space and then you get into the car.
Speaker 1: You don't have to squeeze by, and that kind of stuff. But it doesn't always work in every situation. It can also do things like obey traffic signals and stop signs, so it can travel on surface streets in this mode, not just on highways, and it can navigate from one highway to another. But it is not true self-driving, or at least it's not truly fully self-driving. It can't autonomously operate the car in all situations and scenarios, and many critics, including myself, have argued that the name doesn't reflect what the product actually does. Also, I find it somewhat telling that the company refers to the system as Full Self-Driving but is requiring drivers who apply for it to pass a safety test before they get access to the features. Because if it were really a full self-driving feature, you would want bad drivers to get it, right? I mean, if the vehicle is capable of driving itself, which I argue Full Self-Driving at least heavily implies, presumably it will do so safely, without the risk of an accident, and it makes more sense to give bad drivers that service and thus remove their human error from the road. But the fact that Tesla requires drivers to meet a minimum safety requirement tells me that that's not really what FSD does, right? Like, if it really did that, then you wouldn't need the safety check. If you need a safety check, you need to make sure that the person who's driving the car is going to be responsible, which tells you that FSD is not really FSD. It's an augmentation system, not a truly autonomous system.
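As an aside, if you want to picture how a points-based safety score like that might work, here's a toy sketch. The event types echo the criteria described above, but the penalty weights are pure invention for illustration; Tesla hasn't published its threshold, and this is not its actual formula.

```python
# Toy illustration of a points-based driver safety score.
# The penalty weights below are invented, NOT Tesla's real formula.
PENALTIES = {
    "forced_autopilot_disengagement": 15.0,
    "forward_collision_warning": 2.0,
    "hard_braking": 1.0,
}

def safety_score(events: dict) -> float:
    """Start from a perfect 100 and subtract a penalty for each logged event."""
    score = 100.0
    for event, count in events.items():
        score -= PENALTIES.get(event, 0.0) * count
    return max(score, 0.0)

# A driver with a dozen hard-braking events and a few collision warnings:
print(safety_score({"hard_braking": 12, "forward_collision_warning": 3}))  # 82.0
```

Under these made-up weights, that hypothetical driver lands at eighty-two points, right around where most folks reportedly end up.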
Speaker 1: And I've got a lot of really critical thoughts about Tesla doing this kind of stuff, because I feel that they set an unrealistic expectation in their customer base, and then people develop an overreliance on technology that is not able to measure up to what they're asking of it. Like, they're putting way too much responsibility on the tech, and the tech just isn't up to the challenge, and part of that is fueled by the way Tesla markets its technology. So yeah, I think it's reprehensible, if I'm being, you know, blunt. All right, I'm gonna take a break and find a way down off this high horse, but we'll be back with some more news in just a second.

Speaker 1: Okay, we're back. Now, by the time you hear this episode, Amazon will have held a hardware event and revealed some new products, potentially a bunch of new ones. The event was invitation only. I did not get an invite. It's hurtful. So I have no idea what it is that they revealed. In fact, as I record this, the event has not yet happened, but it will within like twenty-five minutes of me saying this sentence. The Verge has made some guesses as to what could be revealed, so I thought I would share with you what the Verge is guessing, and then you can compare and see if the Verge got it right. And I hope they did, because I actually really like that site a lot. Anyway, the Verge predicts that Amazon will likely have a wall-mounted Echo device, so you know, another smart speaker-and-screen device, one that you could actually mount on your wall. They also are predicting that there's probably going to be a soundbar system with Alexa integration in it. I mean, we're already seeing soundbars that have Alexa integration incorporated into them, but this would be an official, like, Echo soundbar type of thing. There's also the possibility that the company will have a dash cam for cars that has Alexa integration built into it. That seems to be, you know, a pretty safe bet.
Speaker 1: But one thing that the Verge says we probably will not see is more information about a robot that Amazon has had in development for several years. It's a home assistant robot that was called Vesta, but apparently there have been some concerns within the company that there might not be sufficient demand for Vesta, and that if Amazon released it as a product, it could just be a total flop and would end up costing the company more, because not enough people would buy it. So it's possible that the company is actually backing away from that project. We will have to wait and see what it is they say. Or rather, you won't have to wait and see, because, like I said, by the time you hear this, they've already had that event.

Speaker 1: Anyway, Amazon is also facing some opposition in the state of California, and it's not the only one. Governor Newsom signed a bill into law last week that will require companies that employ more than one thousand warehouse workers to disclose how they judge worker productivity, including how they set productivity quotas. Amazon and other companies that meet that criterion will then have thirty days after the bill becomes an actual law, which will happen on January first, twenty twenty two. At that point, they will have to disclose how they measure productivity and how they collect that information. And this law gives employees the right to sue their employers over unsafe quotas. So if a company is, like, working people beyond reason, if the quotas are so stringent and so restrictive that people can't go to the bathroom, or they can't take a reasonable number of breaks, or they run the risk of injuring themselves because they have to work so hard in order to meet very high quotas, well, now those employees could potentially sue their employers, and the employers would be held accountable for that in a court of law, at least in California.
Speaker 1: Now, it will take quite a bit of effort on the part of workers in this process. It's not like a worker can just step forward and say, I don't like working here and I'm going to sue the company. The law will demand that workers who assert that the company they work for has unsafe quotas provide ninety days, so three months, worth of documentation on the productivity quotas they have to meet in order to be considered, you know, successful or failing at their job. In addition, California regulators will also be authorized to investigate work sites that have an injury rate that is one and a half times the industry average or greater. If you remember from a previous tech news episode, I talked about how Amazon delivery centers have an unusually high injury rate compared to other Amazon facilities and to others within the delivery and warehouse industries. That might be a case where a state regulator would have the authorization to go in and conduct a full investigation, to get to the bottom of why that is happening and to hold the company accountable for it.

Speaker 1: Down Unda in Australia. And I apologize for that; I know I can never do an Australian accent. Australian and Scottish are two that I will never, ever, ever be able to do. Anyway, down in Australia, citizens can rest assured that nature, which as I understand it is eighty percent more deadly in Australia, is prepared to fight the robot uprising. Now, I say that because the drone company Wing, which is part of Google's parent company Alphabet, so this is, like Google and Waymo, part of the Alphabet family, has put its delivery service, which uses drones to deliver packages, on pause following a few cases of bird attacks on their drones.
Speaker 1: Google has been conducting tests of home delivery via drone in Canberra, Australia, which has been particularly useful during the pandemic, when a lot of Australia is on lockdown and there are very stiff restrictions on when, if at all, people are allowed to leave their homes. But ravens have taken to attacking the drones, presumably out of concern that the drones are a predator. It is nesting season, and so there's a fear from these ravens, apparently, that the drones are predators. Not Predator drones; those are different. That's kind of a pun. So the ravens are just sort of protecting their nests, in other words. At least one of the attacks has downed a drone. So I'm very glad to hear this news, because I would worry that the drones could potentially cause harm to the birds. And obviously I also worry that the birds could cause some packages to go undelivered, and if those packages are, like, critical, like maybe it's medication or something, that could be a really bad thing. And also, I bet it's weird to fill out a report saying you never got your package because birds were roughing up the delivery person. Anyway, Wing is studying ways to work around this issue, including learning more about bird behavior and any measures the company could take to make certain their drones cause no environmental harm.

Speaker 1: And finally, on Monday, TikTok said it had passed the one billion monthly user mark. That would be active users; the installed base is actually quite a bit larger than that. Of course, some people have TikTok installed on more than one of their own devices, so it's not an apples-to-apples thing. TikTok launched in August twenty eighteen, so it took a little more than three years to reach one billion monthly users. Compare that to Facebook: the king of social network platforms reached one billion users eight years after the company launched. TikTok did it in less than half that time. TikTok is really a true beast of a player in the social networking space.
Speaker 1: Its parent company, ByteDance, reported its revenue doubled from twenty nineteen to twenty twenty, thanks largely to TikTok. Now, I'm behind the times on TikTok. I am thankful that the app reminded me of that great Mika song "Grace Kelly." You know, it's the one where everyone's going, I could be brown, I could be blue, I could be violet sky. Great song, great album. I actually went out and bought that album on vinyl after being reminded of that song that I hadn't thought about in years. So thank you, TikTok. I appreciate it. As for myself, I have only ever done one TikTok video. It is terrible. That's all I have to say about that. But you know, I'm also old, so there are other old people who are way better at TikTok than I am. I just don't think I'm ever gonna get there. Maybe I'll give it another try at some point. Anyway, that's it. That's the tech news that I have for you on Tuesday, September twenty-eighth, twenty twenty one. I hope you are all well. If you have anything you would like to share with me, maybe a topic you would like me to cover on TechStuff, then reach out to me on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.