Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for August 22, and today we've got a lot of news items about hacking, because Def Con, the hacker convention, was just this past weekend. But first, let's touch on some other news items. The advocacy group Global Witness has leveled some pretty nasty accusations against Facebook. And I say accusations, but really it sounds like in this case the group has a pretty open-and-shut argument. Now, this all has to do with some upcoming elections in Brazil, and I'm sure we're all aware of how Facebook has been under scrutiny for a few years now for its role in carrying, and in some cases promoting, misinformation, namely through its recommendation algorithm in English-speaking countries. But in countries where the dominant language isn't English, this can be even more of a problem. It can go unresolved.
Speaker 1: So how does Global Witness know that Facebook has failed to live up to its promise that the platform is, quote, "deeply committed to protecting election integrity," end quote, when it comes to Brazil? Well, Global Witness created political ads, and they inserted outright misinformation into those political ads. Then Global Witness submitted those ads to Facebook, and Facebook accepted the ads without raising any objections to the content inside them. The advocacy group included ads that listed the wrong date for the elections. These were ads that were targeted toward specific populations in Brazil. So the concept here is that you would be trying to mislead a specific target demographic so that they would not participate in voting, and you would give them the wrong date. There were other ads that called for people to use voting methods that aren't sanctioned by Brazil; an example might be to vote by mail when there isn't a vote-by-mail option.
Speaker 1: Some of them even called into question the validity of the election, and the election hasn't even happened yet. And Facebook accepted all of them, according to Global Witness senior advisor Jon Lloyd. Now, what should have happened was that Facebook should have identified the misinformation in the ads and then denied the submitted advertisements. Instead, Global Witness received notification after notification that the fake ads had passed muster, which the group rightfully points out brings into question Facebook's entire content moderation strategy. Brazil's upcoming election is likely to be a rough one, with current president Bolsonaro seeking another term in office. He's also been accused of spreading disinformation, including calling into question the validity of results from certain voting machines. Global Witness points out that its experiment wasn't just an exercise to see if Facebook is living up to its promise. It's really a critical demonstration of Facebook's failure in a high-stakes, real-world scenario, something that is actually playing out right now. So yeah, not a good look. Now, let's switch gears for a little bit.
Speaker 1: So, misinformation and disinformation are clearly bad, right? Misinformation you want to avoid, and disinformation, that says there's a motive behind it, where you are specifically seeking to mislead people. So we should really seek to eliminate misinformation and disinformation from platforms. However, we also have to take into account who is making the claim that a particular something is in fact misinformation in the first place. That requires critical thinking, and sometimes it gets really hard to know what is true. Right? Like, if someone says "that's fake news," and they offer no evidence to show that the claim is fake, and the actual claim has evidence to show that it's valid, then you can't just say, oh, well, it's fake because this person said it was fake. Right? And I say all this because our next story is about how the Hong Kong Police Force has opened a public relations wing that has identified the elimination of online, quote unquote, "smearing" of police work as a high priority.
Speaker 1: Now, the police force is claiming that misinformation campaigns are portraying Hong Kong police in a negative way, and that that has led to the deterioration of the relationship between the public and the police force, whereas some critics would say the police using increased force against citizens is the deteriorating relationship that we should be concerned about. So you could argue that the Hong Kong Police Force is looking to censor voices that criticize the police or bring attention to situations that at least seem to indicate an abuse of power. In other words, you could argue the police are using the label of misinformation to silence activists and critics in an effort to control the narrative. So dealing with misinformation is a tricky thing. I don't mean to suggest that it's super easy. You always have to examine the validity of the claim that something is misinformation to start with. Now, in the case of Global Witness, that was obvious, right? Because the group outright inserted falsehoods into political ads. There's no denying that they were testing it and saw that Facebook failed.
Speaker 1: But in the case of Hong Kong, it looks like it's more of an authoritarian move in order to try and limit dissent. So again, while we should be aiming to eliminate misinformation, we always have to be cognizant of where the claims of misinformation are coming from, and to weigh them carefully to make sure that this isn't just an attempt to silence a critic.

Speaker 1: According to Bloomberg, Apple is looking to insert more ads into the iPhone experience, namely within certain first-party Apple apps, the big one being Apple Maps. Now, you might also encounter ads within Apple's podcast app, and if that's the case, then that means you're going to hear and see ads that are not just inside a podcast, but also on the podcasting platform itself. That's interesting, because the podcast business is largely dependent upon ads. I don't know if you've noticed, but we've had a couple in our show. Well, Apple doesn't get revenue from the ads that are in our show. Apple is a way for people to access podcasts, but unless a show is actually coming from Apple, then Apple doesn't really get revenue from the podcasts that are running, you know, that it allows people to access. So putting ads into the podcast app itself is a way for Apple to monetize the podcasting phenomenon.

Speaker 1: Also, a CNBC article points out that Apple now has a huge advantage over third-party apps on iOS. You might remember that Apple introduced the App Tracking Transparency feature last year. This was a little notification that would pop up and let users decide whether or not they wanted to opt in to targeted tracking from apps. That's the sort of stuff that lets companies like Meta capitalize on user activity. Now, a lot of folks opted out when they got that choice. They chose not to have their data shared with these third parties, and that's one of the major hits to Meta's revenue in the recent past. In fact, the loss of that customer data was a huge blow to Meta, because Meta was heavily dependent upon using that data in order to market targeted ads and really make the most out of its platforms. But without that piece, without your information, then a lot of the value of that service is gone. They can't target with such precision.
Speaker 1: Thus they can't demand the same sort of prices from advertisers that they had in the past, and the whole thing starts to kind of fall apart. Now, Apple still has all that information, right? Like, they've collected that info. It's just that they're not sharing it with third parties. So you could imagine a future in which Apple builds out its own advertising business, for example, and leverages the data that other parties aren't allowed to access, at least not without users opting into the experience. And that's all speculation on my part, of course. It doesn't mean that that's what Apple is going to do. If Apple does design an advertising business, it doesn't mean that Apple is going to rely on that information without giving users the same opportunity to opt out with Apple that they had with other apps. None of that is known one way or the other. But if it did happen, it wouldn't surprise me, although it would bring more anticompetitive scrutiny onto Apple, which the company is already dealing with across the world in various venues.
Speaker 1: So we'll have to see if that, in fact, is where we're headed. Bloomberg also reports that Apple has once again set a deadline for corporate employees to spend at least three days of the work week in the office. Reportedly, employees are expected to come in on Tuesdays and Thursdays, plus another day that is set by their team leaders. So that new deadline, at least according to one reporter, is September 5. As far as I know, Apple has not confirmed that or officially announced it, but that's what a reporter says is the plan: that by September 5, all Apple corporate employees will be expected to come into the office at least three days a week. Now, Apple has moved this goalpost multiple times during the pandemic for lots of reasons, mostly spikes in COVID transmission rates, and there have been more than a few reports of Apple corporate employees protesting this move. In fact, the former head of Apple's machine learning department, Ian Goodfellow, left Apple reportedly because of this mandate that employees would have to return to the office.
Speaker 1: Of course, we're also in a time where a lot of companies are looking to downsize, so you could argue that Apple's continued insistence that employees return to HQ might be a way to kind of put the squeeze on Apple employees and maybe shake out a few folks and slim down without actually having to hold layoffs. Of course, that's just a possibility. I'm not saying that Apple is doing that. I'm saying that other companies are doing that; I just don't know if Apple is. And maybe the reason for it has nothing to do with that. Maybe there is no desire to convince Apple employees, or at least some of them, to leave the company. Maybe a big part of it is that, I don't know, Apple spent billions of dollars to build out this campus and barely got any use out of it before the pandemic hit, and by gum, Tim Cook wants those folks in that expensive building. I'm sure the truth of the matter is far more subtle, far more nuanced and complicated, but it sure is fun to kind of boil these things down to an absurd level.
Speaker 1: All right, well, that's enough absurdity for now. Let's take a quick break. When we come back, we'll have some more news.

Speaker 1: Before the break, I was talking about Apple asking, or telling, employees that they need to come back to the office. Let's talk about what's going on over at AT&T. There's a similar battle brewing between employees and management. Now, some of AT&T's employees have representation with the Communications Workers of America, and the CWA has negotiated an extension for work-from-home operations until the end of March 2023. However, some AT&T employees are saying that there are already groups within the company that management has forced to come back to the office now. So they're saying it just depends upon which department you work in and what team you work for. Now, this has become an issue for employee morale, because those who have been forced to go back into the office are kind of envying the departments that continue to work from home.
Speaker 1: A lot of folks have said that working from home had no negative impact on productivity or performance, so there was no downside to the company for them to work from home, and moreover, workers were saving money on stuff like transportation and other expenses, which led to a boost in quality of life for the workers. So the narrative that we're seeing develop across the entire industry is that workers feel the real reason employers want them in the office isn't so much about contributing to the bottom line. It's not so much about performance and productivity and creativity and collaboration. It's more about surveillance and control. If the employees are not in the office, they can't be watched and they can't be controlled. And whether that's a realistic narrative or not, I don't know. I'm certain that most bosses aren't thinking of it in those terms, but I know that that's a narrative that is growing across the industry. Right? You're seeing more and more people essentially boil the matter down into those kinds of terms.
Speaker 1: So, I know that any company out there that really wants to bring its employees back into offices needs to be able to address that concern in a satisfying way, or that narrative will continue to grow there as well. If your employees are convinced that the only reason they're being brought back into the office is so that the boss can keep an eye on them, and that's it, then that's not a good narrative, right? That's going to paint the company as a bad place to work, as unreasonable. So again, I don't think that that's necessarily where most companies are. I can't imagine most bosses actually thinking in that way, but they need to get ahead of it if they want people to come back into the office and not have it be a catastrophic blow to the morale of the employees.

Speaker 1: Signal, the encrypted messaging service, reported that a phishing attack on Twilio Incorporated could mean that up to 1,900 Signal users had their phone numbers revealed to the attackers. So here's the breakdown.
Speaker 1: Twilio is a verification services provider, and Signal uses Twilio's verification services. So Signal was not compromised. The hackers did not target and compromise Signal. Instead, they compromised a company that Signal partners with for verification services. So the good news is that the attackers seem to only have been able to get the phone numbers, not even the names of the people that the phone numbers correspond to, just the numbers that represent devices that have Signal installed on them, and it's just 1,900 of them. That really limits what the attackers can do with that information. They could conceivably attempt to re-register a device's number, but that's about it. They didn't have access to message history or profile information or anything like that, so it could have been much worse.
Speaker 1: But it does bring into focus the interconnected nature of tech companies and how challenging it is to create a secure process, because if you're relying on another party to provide services that make your company's, you know, services possible, you might not have total control over your own security procedures, because some of them are dependent upon another company entirely. So yeah, it's complicated.

Speaker 1: Not long ago, a security researcher named Lennert Wouters, and my apologies for butchering the name, revealed that he was able to use a homemade device to hack into Starlink, the satellite internet service provider arm of SpaceX. Now, Starlink has since congratulated him on his find, as it, you know, uncovered a vulnerability that the company needed to address. And you know, he wasn't being a malicious hacker. He was testing the security of Starlink's technology and found an inroad.
Speaker 1: And the way I think about this is that if researchers did do this and didn't reveal their findings, there would be a danger that these vulnerabilities would go unpatched, and then someone with bad intentions would eventually discover those vulnerabilities and do nasty, evil things with that information. To that end, Starlink and SpaceX have announced a bug bounty program inviting security researchers to poke and prod and look for vulnerabilities, and if they discover and report one, they get rewarded up to twenty-five thousand bucks, depending upon the nature of the vulnerability. Which is a responsible way to try and keep systems safe, because if you're not supporting some sort of bug bounty, you're inviting hackers to make money from those same vulnerabilities in more nefarious ways.

Speaker 1: And finally, this past weekend, as I said, was Def Con, a hacking convention and a place where you absolutely want to make sure you're only carrying a burner phone that has none of your personal information on it. One of the hackers attending goes by the handle Sick Codes, and among Sick Codes's accomplishments is jailbreaking John Deere farm equipment. Now, I've talked about this on past episodes, but John Deere installs computers and software on its equipment, like tractors, that not only provide extra functionality and features, but also really restrict how farmers can maintain and repair their stuff. In fact, you could argue that really the main reason for this technology is to create a closed-off ecosystem where farmers have no option but to take their equipment to authorized John Deere associates to have it serviced and repaired. And that's why this company comes up a lot in conversations around the right to repair, which has this radical notion that once you purchase something, you should be able to maintain it, repair it, and customize it without limitations, and take it to whomever you want, not just an authorized dealer, you know, because you bought the ding-dang-darn thing. But companies like John Deere have these systems in place to make that difficult or impossible.
Speaker 1: Well, Sick Codes had already demonstrated that he could compromise a John Deere piece of equipment, that he could hack in and get control, root control, of the computer system underneath. But this year at Def Con, he partnered with a Doom modder who goes by the handle Skelegant, and she created a version of Doom that you could play on a John Deere tractor. It had you mowing down demons in a corn field. Great sense of humor, Skelegant; I really dig that. And it just showed that, yeah, ultimately this is just a computer system, and all you have to do is get around the various gates that John Deere has put up blocking you from having root access, which Sick Codes has already demonstrated he can do. Interesting stuff.

Speaker 1: Well, that's it for this episode of Tech Stuff. I hope you're having a great week, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.