1 00:00:04,400 --> 00:00:07,760 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:12,080 --> 00:00:14,680 Speaker 1: Hey there, and welcome to TechStuff. I'm your host, 3 00:00:14,760 --> 00:00:18,239 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio. 4 00:00:18,280 --> 00:00:20,840 Speaker 1: And how the tech are you? It's time for the 5 00:00:20,880 --> 00:00:25,120 Speaker 1: tech news for Tuesday, April twelve, twenty twenty-two. Let's 6 00:00:25,200 --> 00:00:28,360 Speaker 1: get to it. And what a difference just a few 7 00:00:28,440 --> 00:00:32,479 Speaker 1: days make. So last week I talked about how Elon 8 00:00:32,560 --> 00:00:37,040 Speaker 1: Musk had purchased a significant stake in Twitter and had 9 00:00:37,080 --> 00:00:41,519 Speaker 1: subsequently been invited to join Twitter's board of directors. This 10 00:00:41,640 --> 00:00:44,760 Speaker 1: was after Musk had started posting polls about stuff like 11 00:00:44,800 --> 00:00:48,360 Speaker 1: Twitter's approach to freedom of speech, since the platform tends 12 00:00:48,360 --> 00:00:52,280 Speaker 1: to push back on stuff like misinformation campaigns, something that 13 00:00:52,360 --> 00:00:56,440 Speaker 1: Musk himself has been, you know, kind of accused of 14 00:00:56,480 --> 00:01:00,960 Speaker 1: spreading in the past. And Musk also posted a poll 15 00:01:01,200 --> 00:01:05,560 Speaker 1: about including an edit button in Twitter, and Twitter 16 00:01:05,959 --> 00:01:08,880 Speaker 1: then said it was soon going to announce a button 17 00:01:09,160 --> 00:01:12,199 Speaker 1: and unveil it, which also suggests that Twitter 18 00:01:12,319 --> 00:01:14,000 Speaker 1: had been working on that for a while, that it 19 00:01:14,160 --> 00:01:19,479 Speaker 1: was not, you know, prompted necessarily by Musk himself.
Then 20 00:01:20,040 --> 00:01:24,800 Speaker 1: Twitter was arranging an Ask Me Anything session with Elon 21 00:01:24,920 --> 00:01:29,119 Speaker 1: Musk so that Twitter employees could get more information about 22 00:01:29,360 --> 00:01:31,080 Speaker 1: where things were going next. There were a lot of 23 00:01:31,120 --> 00:01:34,759 Speaker 1: people who had some pretty massive concerns about Elon Musk 24 00:01:34,840 --> 00:01:37,920 Speaker 1: potentially joining the board of directors. But today things look 25 00:01:37,959 --> 00:01:41,480 Speaker 1: a lot different. Musk is no longer going to be 26 00:01:41,600 --> 00:01:43,600 Speaker 1: part of the board of directors. He has turned that down, 27 00:01:43,760 --> 00:01:47,640 Speaker 1: according to reports on Bloomberg, and Twitter has subsequently canceled 28 00:01:47,680 --> 00:01:51,080 Speaker 1: the Ask Me Anything session because it's moot. So instead, 29 00:01:51,160 --> 00:01:54,080 Speaker 1: employees were given the day off, which is probably a 30 00:01:54,120 --> 00:01:56,720 Speaker 1: pretty nice gesture, because the impression I get is that 31 00:01:56,760 --> 00:01:59,880 Speaker 1: a significant number of Twitter employees were feeling uneasy about 32 00:02:00,040 --> 00:02:06,000 Speaker 1: Musk's involvement. So why would Musk decline the invitation to 33 00:02:06,240 --> 00:02:08,560 Speaker 1: join the board of directors? Well, there's a lot of 34 00:02:08,639 --> 00:02:12,600 Speaker 1: speculation on that. One possibility, though again I have to say, 35 00:02:13,040 --> 00:02:15,520 Speaker 1: this isn't something that Musk himself has confirmed, at least 36 00:02:15,560 --> 00:02:18,720 Speaker 1: as I'm recording this episode.
But one possibility is that 37 00:02:18,840 --> 00:02:21,680 Speaker 1: Musk declined the offer because to become a board member 38 00:02:21,720 --> 00:02:24,040 Speaker 1: would mean that he would have to agree to and then 39 00:02:24,080 --> 00:02:29,040 Speaker 1: adhere to Twitter's code of conduct, which doesn't seem very 40 00:02:29,160 --> 00:02:32,079 Speaker 1: musky to me. So that might have played a part. 41 00:02:33,040 --> 00:02:37,600 Speaker 1: But another potentially huge factor could be that the members 42 00:02:37,639 --> 00:02:40,560 Speaker 1: of the board of directors are restricted from purchasing more 43 00:02:40,639 --> 00:02:45,079 Speaker 1: than fourteen point nine percent of Twitter stock. Musk currently 44 00:02:45,120 --> 00:02:48,000 Speaker 1: owns a little more than nine percent. Now, it could 45 00:02:48,080 --> 00:02:51,040 Speaker 1: be that Musk just doesn't want to be confined to 46 00:02:51,240 --> 00:02:54,440 Speaker 1: less than fifteen percent ownership in the long run, 47 00:02:55,280 --> 00:02:57,440 Speaker 1: and it might not be that much of a long run. 48 00:02:57,560 --> 00:03:01,959 Speaker 1: Aron Solomon, who is Esquire Digital's chief legal analyst, 49 00:03:02,520 --> 00:03:06,480 Speaker 1: has suggested that perhaps Musk is actually positioning himself to 50 00:03:06,600 --> 00:03:10,160 Speaker 1: mount a hostile takeover of Twitter. Now, for those of 51 00:03:10,240 --> 00:03:13,480 Speaker 1: y'all who didn't grow up in the eighties when hostile 52 00:03:13,520 --> 00:03:16,919 Speaker 1: takeovers were all the rage, the idea behind a hostile 53 00:03:16,960 --> 00:03:19,440 Speaker 1: takeover goes like this. So typically, if you wanted to 54 00:03:19,520 --> 00:03:23,320 Speaker 1: acquire a company, you would deal with that company's board 55 00:03:23,320 --> 00:03:25,359 Speaker 1: of directors and set it all up.
You would do 56 00:03:25,400 --> 00:03:28,560 Speaker 1: all your negotiations with them, and they would set up 57 00:03:28,639 --> 00:03:34,400 Speaker 1: the plan, potentially putting it to a vote among stakeholders 58 00:03:34,880 --> 00:03:37,320 Speaker 1: in order to have it actually become a thing. 59 00:03:38,280 --> 00:03:42,760 Speaker 1: But if the board isn't, you know, in the mood 60 00:03:42,920 --> 00:03:46,080 Speaker 1: to sell, you have to take another pathway. So one 61 00:03:46,120 --> 00:03:48,800 Speaker 1: way to do that is you buy up a lot 62 00:03:48,840 --> 00:03:51,640 Speaker 1: of shares in the company. If you buy up 63 00:03:51,760 --> 00:03:54,960 Speaker 1: enough shares, like more than half of the company, then 64 00:03:55,080 --> 00:03:58,680 Speaker 1: your votes matter more than anyone else's, right? If 65 00:03:58,720 --> 00:04:01,080 Speaker 1: you have fifty-one percent of the vote, no one 66 00:04:01,160 --> 00:04:04,360 Speaker 1: can outvote you on anything, and then you can vote 67 00:04:04,680 --> 00:04:09,000 Speaker 1: to sell the company to, say, Elon Musk. Another way 68 00:04:09,040 --> 00:04:11,920 Speaker 1: of doing it is to convince other big shareholders to 69 00:04:12,240 --> 00:04:17,479 Speaker 1: join with you in an effort to pressure 70 00:04:17,920 --> 00:04:20,040 Speaker 1: the sale or to force the sale. Like, it's not 71 00:04:20,080 --> 00:04:25,360 Speaker 1: even pressure if you have enough votes. Now, the board 72 00:04:25,400 --> 00:04:27,680 Speaker 1: of directors, even if this were to happen, would have 73 00:04:27,880 --> 00:04:32,240 Speaker 1: some potential options to follow, some of which are kind 74 00:04:32,279 --> 00:04:35,440 Speaker 1: of the nuclear extreme. There's a thing called a poison 75 00:04:35,560 --> 00:04:39,679 Speaker 1: pill that could take place.
All of that's just pure speculation, however, 76 00:04:39,920 --> 00:04:42,360 Speaker 1: right? Like, we're nowhere close to that yet, at least 77 00:04:42,400 --> 00:04:44,360 Speaker 1: not as I'm recording this episode. Maybe by the time 78 00:04:44,360 --> 00:04:46,640 Speaker 1: it goes out it will be, but right now it's not. 79 00:04:46,920 --> 00:04:51,240 Speaker 1: That's not the case. So maybe by the next news 80 00:04:51,320 --> 00:04:54,280 Speaker 1: episode we'll have a little bit more firm news and 81 00:04:54,440 --> 00:04:58,400 Speaker 1: less speculation. I don't know if this is like 82 00:04:58,560 --> 00:05:01,880 Speaker 1: an accurate read on what's going on or not. It's hard 83 00:05:01,920 --> 00:05:03,800 Speaker 1: to say. A lot of people are playing kind of 84 00:05:03,920 --> 00:05:06,760 Speaker 1: armchair psychologist and saying, well, of course Elon Musk wants 85 00:05:06,800 --> 00:05:12,160 Speaker 1: to have access to his own social platform, and it's 86 00:05:12,200 --> 00:05:15,279 Speaker 1: one that's already very well established, so it makes sense 87 00:05:15,560 --> 00:05:18,600 Speaker 1: from that perspective. I don't know if it's that simple. 88 00:05:18,960 --> 00:05:21,480 Speaker 1: It probably is not. It probably is more complicated than that. 89 00:05:22,040 --> 00:05:24,720 Speaker 1: But we'll have to follow up on this story as 90 00:05:24,800 --> 00:05:29,080 Speaker 1: it develops. Now, let's get some Meta slash Facebook news 91 00:05:29,080 --> 00:05:31,920 Speaker 1: out of the way. One bit of good news is 92 00:05:32,000 --> 00:05:35,240 Speaker 1: that the company announced it had removed a disinformation network 93 00:05:35,360 --> 00:05:38,880 Speaker 1: located out of Brazil that focused on climate change denial.
94 00:05:39,560 --> 00:05:44,360 Speaker 1: Reuters reports that Meta removed fourteen Facebook accounts, thirty-nine 95 00:05:44,480 --> 00:05:48,880 Speaker 1: Instagram accounts, and nine Facebook pages all related to this 96 00:05:49,160 --> 00:05:53,800 Speaker 1: disinformation campaign. Now, apparently these groups were in some way 97 00:05:53,920 --> 00:05:58,920 Speaker 1: connected to Brazil's military, although whether or not it was 98 00:05:59,000 --> 00:06:03,320 Speaker 1: a military-sanctioned campaign remains to be seen. Like, 99 00:06:03,600 --> 00:06:07,640 Speaker 1: that's not firm as of right now. 100 00:06:07,800 --> 00:06:09,960 Speaker 1: It may just be that it's a coincidence that these 101 00:06:10,000 --> 00:06:14,120 Speaker 1: are connected. And you might wonder how big a deal 102 00:06:14,440 --> 00:06:19,680 Speaker 1: this was. Well, based on Facebook's adversarial threat report, this particular 103 00:06:19,760 --> 00:06:24,080 Speaker 1: group hasn't made a huge impact on the platform. Apparently 104 00:06:24,120 --> 00:06:29,120 Speaker 1: the campaign had a fairly restricted reach, and much, maybe 105 00:06:29,160 --> 00:06:31,240 Speaker 1: not all, but, you know, a significant amount of 106 00:06:31,279 --> 00:06:35,120 Speaker 1: the engagement on the various profiles appeared to be essentially 107 00:06:35,160 --> 00:06:38,480 Speaker 1: manufactured by the group itself, right? It was boosting itself 108 00:06:38,839 --> 00:06:41,960 Speaker 1: in order to get more visibility. So it could have 109 00:06:42,040 --> 00:06:44,280 Speaker 1: been that this was a campaign that was kind of 110 00:06:44,360 --> 00:06:49,400 Speaker 1: in fairly early development. It sounds like not that many 111 00:06:49,440 --> 00:06:52,240 Speaker 1: people outside the campaign itself encountered the material. I mean, 112 00:06:52,279 --> 00:06:56,640 Speaker 1: there were some.
It wasn't like it was completely internal, 113 00:06:57,640 --> 00:06:59,760 Speaker 1: but it wasn't as widespread as it could have been. 114 00:07:00,480 --> 00:07:02,680 Speaker 1: As for the kind of stuff that the group was posting, 115 00:07:03,000 --> 00:07:06,400 Speaker 1: it was messaging that related to stuff like deforestation 116 00:07:06,800 --> 00:07:11,040 Speaker 1: and how, according to this disinformation campaign, deforestation really 117 00:07:11,160 --> 00:07:14,840 Speaker 1: isn't all that bad. It goes without saying that deforestation 118 00:07:14,920 --> 00:07:17,720 Speaker 1: is pretty darn bad for lots of different reasons, including 119 00:07:17,720 --> 00:07:21,520 Speaker 1: the reduction of habitat and the destruction of biodiversity in 120 00:07:21,640 --> 00:07:27,440 Speaker 1: the Amazon region. On another Facebook note, Meta CEO Mark Zuckerberg 121 00:07:27,520 --> 00:07:31,480 Speaker 1: spent quite a few zuckbucks in twenty twenty-one on flights and security, 122 00:07:32,160 --> 00:07:36,880 Speaker 1: or rather, the company spent nearly twenty-seven million dollars 123 00:07:37,080 --> 00:07:40,360 Speaker 1: to pay for Zuckerberg's travel as well as a security 124 00:07:40,440 --> 00:07:44,120 Speaker 1: detail for Zuckerberg and his family. Now, in case you're 125 00:07:44,160 --> 00:07:47,760 Speaker 1: wondering if this is an outlier, like if twenty twenty-one was 126 00:07:47,800 --> 00:07:52,640 Speaker 1: really unusual, not really. Back in twenty nineteen, the company 127 00:07:52,720 --> 00:07:58,240 Speaker 1: spent twenty-three million dollars, and in twenty twenty, when 128 00:07:58,320 --> 00:08:00,440 Speaker 1: you would think you would see a reduction in that 129 00:08:00,640 --> 00:08:04,840 Speaker 1: amount because it includes travel, right, and most of us 130 00:08:04,880 --> 00:08:07,440 Speaker 1: were sitting at home for much of the year and 131 00:08:07,520 --> 00:08:11,000 Speaker 1: not going anywhere.
But the cost went from twenty-three 132 00:08:11,080 --> 00:08:14,400 Speaker 1: million to twenty-five million, and then in twenty twenty-one went up 133 00:08:14,440 --> 00:08:19,280 Speaker 1: to twenty-seven million. Even during a pandemic, when you'd 134 00:08:19,320 --> 00:08:23,560 Speaker 1: suspect there'd be less travel, the amount spent increased. 135 00:08:24,520 --> 00:08:27,840 Speaker 1: Meta's SEC filings reveal the company spent more than fifteen 136 00:08:27,880 --> 00:08:31,720 Speaker 1: million dollars on Zuckerberg's personal security. So of that twenty- 137 00:08:31,760 --> 00:08:36,040 Speaker 1: seven million, fifteen of it was security for Zuckerberg. That 138 00:08:36,120 --> 00:08:40,840 Speaker 1: includes stuff like seven bodyguards and an office with bulletproof glass, 139 00:08:40,840 --> 00:08:43,520 Speaker 1: among other things. And on the one hand, that might 140 00:08:43,600 --> 00:08:46,600 Speaker 1: sound like it's excessive, but then again, Zuck, as they 141 00:08:46,640 --> 00:08:50,400 Speaker 1: would say in the movies, has made an awful lot of enemies. 142 00:08:50,600 --> 00:08:53,120 Speaker 1: There's no shortage of folks out there and organizations out 143 00:08:53,120 --> 00:08:56,480 Speaker 1: there and governments out there that aren't too keen on 144 00:08:57,080 --> 00:09:00,720 Speaker 1: Zuckerberg and his company. Fifteen million dollars is a lot 145 00:09:00,760 --> 00:09:03,360 Speaker 1: of money, and you might actually question whether you really 146 00:09:03,480 --> 00:09:07,080 Speaker 1: need to spend fifteen million dollars to keep Zuckerberg alive 147 00:09:07,160 --> 00:09:09,600 Speaker 1: and safe. But really, there's only one way to find 148 00:09:09,679 --> 00:09:12,079 Speaker 1: out if it isn't enough, and that way is, you know, 149 00:09:13,000 --> 00:09:17,360 Speaker 1: not appropriate.
Like, I don't like Zuckerberg, but I 150 00:09:17,400 --> 00:09:22,080 Speaker 1: would never wish him physical harm. The online marketplace Etsy 151 00:09:22,160 --> 00:09:23,920 Speaker 1: is hitting a bit of a rough patch. A 152 00:09:24,000 --> 00:09:27,160 Speaker 1: group of merchants on the platform have decided to hold 153 00:09:27,200 --> 00:09:30,679 Speaker 1: a week-long quote unquote strike. Now, strike is a 154 00:09:30,760 --> 00:09:34,000 Speaker 1: weird word, because the people who are using Etsy as 155 00:09:34,040 --> 00:09:37,240 Speaker 1: a storefront, they aren't employed by Etsy. 156 00:09:37,320 --> 00:09:39,599 Speaker 1: They're actually customers of Etsy, because they're making use of 157 00:09:39,640 --> 00:09:42,559 Speaker 1: Etsy's service. But at the heart of their complaint is 158 00:09:42,640 --> 00:09:46,280 Speaker 1: Etsy's recent decision to increase transaction fees by thirty percent. 159 00:09:47,280 --> 00:09:52,240 Speaker 1: So transaction fees are really the way that Etsy makes revenue. 160 00:09:52,440 --> 00:09:55,480 Speaker 1: It takes a cut of each transaction made through the 161 00:09:55,559 --> 00:10:00,240 Speaker 1: online store. But an increase in transaction fees means 162 00:10:00,320 --> 00:10:04,120 Speaker 1: that those costs have to get passed on somewhere else. Right?
163 00:10:04,640 --> 00:10:06,839 Speaker 1: Either the artists and creators have to find ways to 164 00:10:06,960 --> 00:10:09,920 Speaker 1: save more, maybe by buying in bulk and thus in 165 00:10:10,000 --> 00:10:12,520 Speaker 1: the long run saving more money, and then they don't have 166 00:10:12,720 --> 00:10:16,199 Speaker 1: to affect their prices at all and their revenue pretty 167 00:10:16,280 --> 00:10:19,120 Speaker 1: much stays the same; or they have to pay out 168 00:10:19,160 --> 00:10:21,839 Speaker 1: of pocket, you know, whatever the difference is; or they have 169 00:10:21,960 --> 00:10:25,120 Speaker 1: to increase the prices on their goods, which might mean 170 00:10:25,200 --> 00:10:28,280 Speaker 1: it discourages people from buying them. So it's got a 171 00:10:28,360 --> 00:10:31,640 Speaker 1: lot of people upset. Not everybody, not everyone on Etsy 172 00:10:31,800 --> 00:10:34,679 Speaker 1: thinks this is a bad thing or the end 173 00:10:34,679 --> 00:10:36,679 Speaker 1: of the world, but there are quite a few who 174 00:10:36,720 --> 00:10:40,800 Speaker 1: have banded together in order to have this quote unquote strike. 175 00:10:41,640 --> 00:10:45,079 Speaker 1: The company also promotes shop listings through ads on other sites, 176 00:10:45,240 --> 00:10:48,400 Speaker 1: so offsite ads, and in those cases they charge 177 00:10:48,600 --> 00:10:52,719 Speaker 1: the merchants anywhere between twelve to fifteen percent in fees, 178 00:10:52,880 --> 00:10:54,520 Speaker 1: and some of the merchants say, like, we aren't even 179 00:10:54,520 --> 00:10:58,000 Speaker 1: given the opportunity to kind of sidestep that. There are 180 00:10:58,040 --> 00:11:01,240 Speaker 1: other factors that have encouraged sellers to go into vacation 181 00:11:01,280 --> 00:11:03,280 Speaker 1: mode for a week.
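[Editor's note: the fee math described above can be made concrete with a quick sketch. The specific rates below are an assumption for illustration; the episode only says the transaction fee rose by thirty percent, so a hypothetical jump from 5.0% to 6.5% of each sale is used, since that is a 30% increase in the rate.]

```python
# Hypothetical rates: a transaction fee rising from 5.0% to 6.5% of each
# sale is a 30% increase in the fee rate. These numbers are assumptions
# for illustration, not figures quoted in the episode.
OLD_FEE = 0.050
NEW_FEE = 0.065

def seller_take(price, fee_rate):
    """What the seller keeps after the platform takes its cut."""
    return price * (1 - fee_rate)

price = 20.00  # a hypothetical $20 listing
print(f"fee-rate increase: {(NEW_FEE - OLD_FEE) / OLD_FEE:.0%}")
print(f"take on a $20 sale before: ${seller_take(price, OLD_FEE):.2f}")
print(f"take on a $20 sale after:  ${seller_take(price, NEW_FEE):.2f}")

# To keep the same take-home pay, the seller has to raise the listing
# price, which is exactly the pass-through dilemma described above.
breakeven = seller_take(price, OLD_FEE) / (1 - NEW_FEE)
print(f"price needed to stay whole: ${breakeven:.2f}")
```

The point of the sketch: the seller either absorbs the difference on every sale or raises prices past the break-even point, with the risk of discouraging buyers either way.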
Another big one is that there's 182 00:11:03,320 --> 00:11:06,240 Speaker 1: a problem on Etsy with people copying other artists' work 183 00:11:06,920 --> 00:11:11,840 Speaker 1: and then selling those copies, or reselling original works under 184 00:11:11,880 --> 00:11:15,719 Speaker 1: their own name on Etsy, and there's been a 185 00:11:15,800 --> 00:11:17,880 Speaker 1: lot of people pushing Etsy to take more action 186 00:11:17,960 --> 00:11:20,559 Speaker 1: against that. Now, for its part, Etsy says that the 187 00:11:20,600 --> 00:11:23,000 Speaker 1: fee increases will actually provide the money for the company 188 00:11:23,080 --> 00:11:26,760 Speaker 1: to market more effectively and to take steps against these 189 00:11:26,920 --> 00:11:30,559 Speaker 1: same resellers and copycats. But if you happen to head 190 00:11:30,600 --> 00:11:32,280 Speaker 1: over to Etsy this week and you see that the 191 00:11:32,360 --> 00:11:35,000 Speaker 1: store you wanted to order from is offline, it could 192 00:11:35,040 --> 00:11:37,360 Speaker 1: be that that person has become part of this quote 193 00:11:37,400 --> 00:11:41,280 Speaker 1: unquote strike. We've got some more news stories to cover, 194 00:11:41,400 --> 00:11:43,280 Speaker 1: but before we get to that, let's take a quick break. 195 00:11:51,000 --> 00:11:52,920 Speaker 1: We're back, and it's been a little bit since we 196 00:11:52,960 --> 00:11:55,079 Speaker 1: talked about the NSO Group. If you do 197 00:11:55,280 --> 00:11:58,280 Speaker 1: not remember NSO, that's the Israeli company that 198 00:11:58,440 --> 00:12:04,199 Speaker 1: creates infiltration tools that compromise devices, notably Apple devices. 199 00:12:04,440 --> 00:12:07,400 Speaker 1: It is, I would think, best known for its product 200 00:12:07,480 --> 00:12:12,520 Speaker 1: Pegasus, which exploited a zero-click vulnerability in Apple's iMessage.
201 00:12:13,120 --> 00:12:15,800 Speaker 1: It would be a vulnerability that Apple would later patch 202 00:12:15,960 --> 00:12:19,559 Speaker 1: out of iOS. NSO Group has long maintained that its 203 00:12:19,640 --> 00:12:22,800 Speaker 1: customers are only supposed to use these tools in connection with 204 00:12:22,880 --> 00:12:26,400 Speaker 1: things like law enforcement to fight crime and terrorism. But 205 00:12:26,520 --> 00:12:29,800 Speaker 1: a lot of people have said that those same customers 206 00:12:29,880 --> 00:12:35,439 Speaker 1: have been using these tools to target innocent people, including journalists, politicians, activists, 207 00:12:35,600 --> 00:12:39,600 Speaker 1: and more. And since this exploit would turn an iPhone 208 00:12:39,720 --> 00:12:43,319 Speaker 1: into a surveillance device (I mean, it gives the perpetrators 209 00:12:43,360 --> 00:12:47,720 Speaker 1: practically unlimited access to the target's phone, including the phone's camera 210 00:12:48,000 --> 00:12:50,800 Speaker 1: and its microphones, so you can turn the phone into 211 00:12:50,920 --> 00:12:55,600 Speaker 1: a spy device for yourself), that is a huge deal. Well, now 212 00:12:55,720 --> 00:12:58,480 Speaker 1: Reuters is reporting that at least five officials in the 213 00:12:58,559 --> 00:13:01,480 Speaker 1: EU were hit by similar malware attacks on their 214 00:13:01,559 --> 00:13:07,040 Speaker 1: devices in late twenty twenty-one. This also includes the EU's European 215 00:13:07,200 --> 00:13:10,959 Speaker 1: Justice Commissioner, so we're talking like high-ranking officials in 216 00:13:11,080 --> 00:13:14,680 Speaker 1: the European Union. And while there is no smoking gun 217 00:13:14,880 --> 00:13:18,280 Speaker 1: that links these attacks specifically to NSO Group's products, 218 00:13:19,040 --> 00:13:21,760 Speaker 1: that appears to be at least a working hypothesis.
Now, 219 00:13:21,920 --> 00:13:24,920 Speaker 1: NSO Group denies any involvement of the company or 220 00:13:24,960 --> 00:13:27,959 Speaker 1: its products in these attacks. And to be clear, 221 00:13:28,120 --> 00:13:30,920 Speaker 1: NSO Group is not the only company that's creating 222 00:13:31,040 --> 00:13:34,240 Speaker 1: tools like this that exploit vulnerabilities. In fact, it's not 223 00:13:34,360 --> 00:13:37,280 Speaker 1: even the only Israeli company doing that. There are a 224 00:13:37,360 --> 00:13:42,480 Speaker 1: couple of very secretive surveillance companies in Israel, some of 225 00:13:42,559 --> 00:13:44,559 Speaker 1: which, you know, like you've heard of NSO 226 00:13:44,600 --> 00:13:46,480 Speaker 1: Group if you listen to this podcast, but there are some 227 00:13:46,800 --> 00:13:50,200 Speaker 1: that get mentioned maybe once or twice, and they don't 228 00:13:50,200 --> 00:13:52,360 Speaker 1: get nearly the same notoriety even though they're in the 229 00:13:52,440 --> 00:13:55,760 Speaker 1: same business. So it is entirely possible that the EU 230 00:13:55,880 --> 00:13:58,720 Speaker 1: officials fell victim to a hack from a competing company, 231 00:13:58,800 --> 00:14:01,800 Speaker 1: that it wasn't from an NSO Group product. 232 00:14:02,120 --> 00:14:05,200 Speaker 1: But the really scary thing is that these attacks frequently 233 00:14:05,240 --> 00:14:07,520 Speaker 1: take advantage of something that the end user has no 234 00:14:07,640 --> 00:14:11,200 Speaker 1: control over. Right? We often talk about how we're all 235 00:14:11,320 --> 00:14:15,760 Speaker 1: responsible for our Internet safety, and in many ways that 236 00:14:15,960 --> 00:14:19,480 Speaker 1: is true, right, that our behaviors can make us safer 237 00:14:19,640 --> 00:14:22,080 Speaker 1: or put us more at risk.
But in this case, 238 00:14:23,360 --> 00:14:25,960 Speaker 1: the only way to be safe would be if you didn't 239 00:14:26,000 --> 00:14:28,760 Speaker 1: have an iPhone at all, because the way Pegasus worked, 240 00:14:28,760 --> 00:14:32,560 Speaker 1: at least initially, was that you could send an attack 241 00:14:32,680 --> 00:14:35,360 Speaker 1: message through iMessage. As long as you had the 242 00:14:35,480 --> 00:14:39,360 Speaker 1: target phone's phone number, you could do this, and just 243 00:14:39,480 --> 00:14:42,280 Speaker 1: by receiving the message, that would be enough to be 244 00:14:42,360 --> 00:14:45,880 Speaker 1: able to exploit this vulnerability and install the malware on 245 00:14:45,920 --> 00:14:48,160 Speaker 1: the phone. So the user didn't have to do anything. 246 00:14:48,200 --> 00:14:50,040 Speaker 1: They didn't have to open the message, they didn't have 247 00:14:50,120 --> 00:14:53,440 Speaker 1: to download something or activate something. It just would happen. 248 00:14:54,200 --> 00:14:57,760 Speaker 1: Like, in those cases, there's no user 249 00:14:57,840 --> 00:15:00,360 Speaker 1: behavior change you can make to avoid it, again, unless 250 00:15:00,360 --> 00:15:04,720 Speaker 1: you're talking about just getting rid of tech. So, yeah, 251 00:15:04,800 --> 00:15:08,760 Speaker 1: scary world. Next year, South Korea will lean on SpaceX 252 00:15:08,840 --> 00:15:13,560 Speaker 1: to launch the country's first-ever spy satellite. Further, South 253 00:15:13,640 --> 00:15:16,840 Speaker 1: Korea announced it intends to have five such satellites in 254 00:15:16,920 --> 00:15:21,440 Speaker 1: service by the end of twenty twenty-five. Traditionally, South Korea has had 255 00:15:21,520 --> 00:15:24,040 Speaker 1: to rely upon allies like the United States to provide 256 00:15:24,080 --> 00:15:28,200 Speaker 1: satellite intelligence to South Korean officials.
The country is also 257 00:15:28,320 --> 00:15:32,320 Speaker 1: pursuing the development of its own launch vehicle system. A 258 00:15:32,600 --> 00:15:35,920 Speaker 1: test launch late last year saw the country's solid-fuel 259 00:15:36,000 --> 00:15:39,800 Speaker 1: rocket design lift off successfully. So it had a successful takeoff, 260 00:15:40,120 --> 00:15:43,400 Speaker 1: but it failed to get into low orbit. It wasn't 261 00:15:43,400 --> 00:15:47,440 Speaker 1: able to put its payload into low orbit and fell 262 00:15:47,560 --> 00:15:50,240 Speaker 1: back to Earth. The payload in that case was a 263 00:15:50,360 --> 00:15:53,200 Speaker 1: dummy satellite. It was not an actual working satellite, which 264 00:15:53,240 --> 00:15:54,840 Speaker 1: is a good thing, you know, because the mission did 265 00:15:54,920 --> 00:15:57,880 Speaker 1: fall short. But the country has its sights set on 266 00:15:57,960 --> 00:16:02,760 Speaker 1: being another space power on Earth and could potentially end 267 00:16:02,880 --> 00:16:08,240 Speaker 1: up supplying very useful intelligence to its allies in that 268 00:16:08,360 --> 00:16:11,480 Speaker 1: particular part of the world. Like, when you think about 269 00:16:11,520 --> 00:16:16,000 Speaker 1: North Korea, China, Russia, you know, South Korea having 270 00:16:16,040 --> 00:16:21,040 Speaker 1: those capabilities would be a huge strategic benefit to its allies. 271 00:16:21,640 --> 00:16:25,120 Speaker 1: Comedian and Last Week Tonight host John Oliver sent a 272 00:16:25,160 --> 00:16:27,760 Speaker 1: message to US Congress, namely that the government needs to 273 00:16:27,800 --> 00:16:30,680 Speaker 1: step up when it comes to protecting citizens against predatory 274 00:16:30,800 --> 00:16:35,040 Speaker 1: data practices.
Oliver pointed out that senior citizens are frequently 275 00:16:35,120 --> 00:16:39,280 Speaker 1: the target of insidious data-tracking campaigns and that those, 276 00:16:39,400 --> 00:16:43,640 Speaker 1: in turn, can lead to even greater exploitation. Right? Like, 277 00:16:43,840 --> 00:16:46,640 Speaker 1: step one, get all the information you can about someone. 278 00:16:46,920 --> 00:16:50,720 Speaker 1: Step two, exploit that information and, in turn, that person. 279 00:16:51,400 --> 00:16:54,960 Speaker 1: It happens all the time. And so Oliver, presumably in an 280 00:16:55,000 --> 00:16:57,800 Speaker 1: effort to really drive home how sensitive the issue can be, 281 00:16:58,400 --> 00:17:02,400 Speaker 1: hired some data brokers to scrape information in and around 282 00:17:02,440 --> 00:17:05,520 Speaker 1: the Washington, D.C. area, and he gave the data 283 00:17:05,560 --> 00:17:09,879 Speaker 1: brokers criteria that would most likely lump in politicians. So, 284 00:17:09,960 --> 00:17:13,399 Speaker 1: in other words, like, he specifically crafted his request to 285 00:17:13,600 --> 00:17:19,919 Speaker 1: target people who would potentially also be politicians. So instead 286 00:17:19,920 --> 00:17:21,879 Speaker 1: of saying, like, can you target politicians, it would be, like, 287 00:17:22,359 --> 00:17:25,680 Speaker 1: can you target people within this particular age range and 288 00:17:25,760 --> 00:17:29,920 Speaker 1: these other demographics, knowing that that happens to cover 289 00:17:30,440 --> 00:17:34,200 Speaker 1: most of Congress.
He has indicated that the data 290 00:17:34,359 --> 00:17:38,480 Speaker 1: that the companies provided him were interesting, that they gave 291 00:17:38,560 --> 00:17:43,080 Speaker 1: insights into how various politicians are using the Internet, as 292 00:17:43,119 --> 00:17:47,080 Speaker 1: well as their search history, which, you know, certainly sounds 293 00:17:47,080 --> 00:17:50,040 Speaker 1: like there could be something salacious going on there. And 294 00:17:50,440 --> 00:17:53,040 Speaker 1: his message to the government is that they might want 295 00:17:53,040 --> 00:17:56,400 Speaker 1: to take some steps to protect people in general from, 296 00:17:57,560 --> 00:18:01,080 Speaker 1: you know, having their data leveraged and exploited in 297 00:18:01,160 --> 00:18:06,359 Speaker 1: this way, and preventing people from just gobbling up information 298 00:18:06,440 --> 00:18:09,800 Speaker 1: and then making use of that information in various ways. 299 00:18:10,320 --> 00:18:12,960 Speaker 1: The implication here is that if the government does not, 300 00:18:13,720 --> 00:18:16,720 Speaker 1: you know, swing into action on this and address the 301 00:18:16,960 --> 00:18:21,080 Speaker 1: very long-standing problem, that Oliver might do something with 302 00:18:21,200 --> 00:18:23,240 Speaker 1: that data. Who's to say what he could do with it? 303 00:18:23,800 --> 00:18:26,479 Speaker 1: And his whole point is that he could totally use 304 00:18:26,520 --> 00:18:30,000 Speaker 1: the data because there's just this lack of laws protecting US 305 00:18:30,119 --> 00:18:32,840 Speaker 1: citizens from just that. That's his whole point. He's saying 306 00:18:33,240 --> 00:18:35,720 Speaker 1: it's not that he wants to use the data to 307 00:18:35,960 --> 00:18:41,199 Speaker 1: embarrass someone or compromise someone, but rather that he could, because 308 00:18:41,680 --> 00:18:45,200 Speaker 1: there aren't any laws preventing that from happening.
And it's 309 00:18:45,200 --> 00:18:47,600 Speaker 1: not like Congress could just say don't you dare, because, 310 00:18:47,680 --> 00:18:51,080 Speaker 1: you know, without legislation, there's nothing to enforce. And so 311 00:18:51,200 --> 00:18:53,720 Speaker 1: Oliver is saying, hey, this is something people have needed for 312 00:18:53,800 --> 00:18:56,600 Speaker 1: a really long time. Now you're one of those people 313 00:18:56,600 --> 00:19:00,760 Speaker 1: who need it, so can we get it, please? It's 314 00:19:00,840 --> 00:19:03,639 Speaker 1: really an interesting move. Whether it amounts to anything 315 00:19:03,760 --> 00:19:06,120 Speaker 1: remains to be seen. I tend to get a little 316 00:19:06,280 --> 00:19:11,080 Speaker 1: cynical when it comes to political action and technology. We 317 00:19:11,359 --> 00:19:14,520 Speaker 1: often see that politics moves at a pace that is 318 00:19:14,640 --> 00:19:17,679 Speaker 1: much slower than tech, and this brings us into problems 319 00:19:17,760 --> 00:19:20,920 Speaker 1: over and over again. It's why we will frequently have 320 00:19:21,080 --> 00:19:24,840 Speaker 1: these giant bills that try to address tons and tons 321 00:19:24,880 --> 00:19:27,640 Speaker 1: of stuff that piled up over the years. And even 322 00:19:27,760 --> 00:19:31,480 Speaker 1: while we're debating that and getting that all hashed out, 323 00:19:31,840 --> 00:19:34,840 Speaker 1: technology continues to change. So even by the time you 324 00:19:34,880 --> 00:19:37,840 Speaker 1: pass that legislation, there are at least some points that are 325 00:19:37,880 --> 00:19:41,000 Speaker 1: no longer relevant. This is not news; I mean, 326 00:19:41,040 --> 00:19:43,359 Speaker 1: it happens all the time. But yeah, very interesting that 327 00:19:43,400 --> 00:19:47,440 Speaker 1: Oliver made this point.
Obviously, he has a history of doing 328 00:19:47,840 --> 00:19:51,040 Speaker 1: some very biting social commentary on all sorts of different topics, 329 00:19:51,600 --> 00:19:54,880 Speaker 1: and this one is about protecting senior citizens. You know, it's 330 00:19:55,080 --> 00:19:58,760 Speaker 1: pretty easy to get behind him, because there is 331 00:19:58,800 --> 00:20:03,000 Speaker 1: no question that there are plenty of organizations and companies 332 00:20:03,040 --> 00:20:07,080 Speaker 1: out there that prey upon senior citizens in particular and 333 00:20:07,640 --> 00:20:12,800 Speaker 1: leverage these kinds of, uh, data collection agencies in order 334 00:20:12,880 --> 00:20:18,960 Speaker 1: to be as effective as possible. Not great. So, again, 335 00:20:19,040 --> 00:20:22,639 Speaker 1: I'm skeptical that we'll see anything kind of really actionable 336 00:20:22,720 --> 00:20:26,560 Speaker 1: happen due to this, but it would be nice to see. 337 00:20:27,080 --> 00:20:31,000 Speaker 1: Not because I think that Congress 338 00:20:31,080 --> 00:20:32,639 Speaker 1: is going to be scared of it, but my hope 339 00:20:32,720 --> 00:20:36,960 Speaker 1: is that it's a way of really just hammering home 340 00:20:37,359 --> 00:20:41,960 Speaker 1: this idea that millions of people lack that basic protection 341 00:20:42,040 --> 00:20:44,159 Speaker 1: and they could really use it. All right, we've got 342 00:20:44,240 --> 00:20:47,040 Speaker 1: a few more stories to go. Before we get to those, 343 00:20:47,240 --> 00:20:58,000 Speaker 1: let's take another quick break. Hackers have targeted thousands of 344 00:20:58,080 --> 00:21:02,120 Speaker 1: sites running on the WordPress and Joomla hosting services. 345 00:21:02,280 --> 00:21:05,200 Speaker 1: I want to be clear here, I don't think that 346 00:21:05,359 --> 00:21:09,880 Speaker 1: this is necessarily, uh, the fault of WordPress or Joomla.
347 00:21:10,160 --> 00:21:13,000 Speaker 1: I don't want to give that impression. Uh, while 348 00:21:13,119 --> 00:21:15,200 Speaker 1: these sites all seem to have that in common, that 349 00:21:15,359 --> 00:21:21,360 Speaker 1: does not mean that WordPress and/or Joomla is inherently bad. Rather, 350 00:21:21,600 --> 00:21:23,720 Speaker 1: it just looks like the hackers were able to compromise 351 00:21:23,840 --> 00:21:27,000 Speaker 1: numerous sites on these platforms, and that the one thing 352 00:21:27,080 --> 00:21:28,919 Speaker 1: that these had in common, besides the fact that they 353 00:21:28,920 --> 00:21:33,320 Speaker 1: were on these two big publication platforms, is that whoever 354 00:21:33,480 --> 00:21:37,440 Speaker 1: was making those sites was using really bad passwords, 355 00:21:38,280 --> 00:21:42,719 Speaker 1: just very weak passwords. And, um, I'm not sure if 356 00:21:42,760 --> 00:21:45,840 Speaker 1: the hackers used a classic dictionary attack; it sounds to 357 00:21:45,920 --> 00:21:49,119 Speaker 1: me like that's possibly what they did. Uh, in a 358 00:21:49,200 --> 00:21:52,879 Speaker 1: dictionary attack, hackers use essentially a digital document that just 359 00:21:53,000 --> 00:21:56,760 Speaker 1: contains tens of thousands of common passwords. So when we 360 00:21:56,800 --> 00:21:59,320 Speaker 1: say dictionary, you might think, oh, like an 361 00:21:59,359 --> 00:22:02,560 Speaker 1: actual dictionary where it's just every word in the English language.
Well, 362 00:22:02,600 --> 00:22:05,399 Speaker 1: it can be that, but it's usually also lots of 363 00:22:05,520 --> 00:22:08,080 Speaker 1: common passwords that people have used, like, there are 364 00:22:08,080 --> 00:22:10,600 Speaker 1: a bunch that people just use all the time, 365 00:22:10,680 --> 00:22:13,080 Speaker 1: like one two three four five or password one two 366 00:22:13,160 --> 00:22:17,399 Speaker 1: three four, that kind of stuff. So these dictionaries typically 367 00:22:17,440 --> 00:22:20,960 Speaker 1: have those kinds of phrases and words in them, and 368 00:22:21,040 --> 00:22:23,359 Speaker 1: then they use that to brute force their way into 369 00:22:23,680 --> 00:22:28,280 Speaker 1: various accounts, and that seems to be, at least to 370 00:22:28,359 --> 00:22:33,520 Speaker 1: me, what has happened in this case, although, um, again 371 00:22:33,560 --> 00:22:35,680 Speaker 1: I didn't get confirmation on that, so I could be wrong. 372 00:22:36,560 --> 00:22:39,879 Speaker 1: Once the hackers did get access to these sites, they 373 00:22:40,040 --> 00:22:43,920 Speaker 1: began to insert malware into the sites in an attempt 374 00:22:44,680 --> 00:22:48,680 Speaker 1: to lure visitors into installing that malware on their own devices. 375 00:22:48,960 --> 00:22:52,040 Speaker 1: So the malware would send a message to users that 376 00:22:52,119 --> 00:22:55,560 Speaker 1: would claim that they would need to, you know, update their browser, 377 00:22:55,800 --> 00:22:58,440 Speaker 1: for example, in order to view the content on the 378 00:22:58,520 --> 00:23:02,040 Speaker 1: page due to a recent change in the site. But 379 00:23:02,160 --> 00:23:05,520 Speaker 1: of course it wasn't a browser update at all.
It 380 00:23:05,680 --> 00:23:08,600 Speaker 1: was rather a way to convince people to install malicious 381 00:23:08,720 --> 00:23:12,040 Speaker 1: code on their devices so that they would download this 382 00:23:12,560 --> 00:23:18,119 Speaker 1: trojan malware that would then infect their computer. So with 383 00:23:18,240 --> 00:23:19,959 Speaker 1: that in mind, I want to give a double message 384 00:23:19,960 --> 00:23:22,679 Speaker 1: out there for all of you listening to this show. First, 385 00:23:23,560 --> 00:23:28,000 Speaker 1: always be supremely suspicious about any request to upgrade or 386 00:23:28,160 --> 00:23:31,600 Speaker 1: update or download something to your device. If it's not 387 00:23:31,840 --> 00:23:38,280 Speaker 1: coming from your typical, you know, um, OS company or 388 00:23:38,359 --> 00:23:41,320 Speaker 1: whatever it may be, you need to be triple sure 389 00:23:42,000 --> 00:23:44,800 Speaker 1: that those requests are coming from a trustworthy source. Like, 390 00:23:44,880 --> 00:23:47,920 Speaker 1: it is important to keep things like operating systems and 391 00:23:48,000 --> 00:23:54,240 Speaker 1: browsers up to date, because updates can often address vulnerabilities 392 00:23:54,840 --> 00:23:58,840 Speaker 1: and close off pathways for hackers to get access to systems. 393 00:23:59,240 --> 00:24:01,520 Speaker 1: So you are supposed to keep updating, but you 394 00:24:01,560 --> 00:24:05,000 Speaker 1: want to make sure that those messages are coming from 395 00:24:05,160 --> 00:24:09,320 Speaker 1: the actual official source, not through some third party, because 396 00:24:09,400 --> 00:24:11,920 Speaker 1: that's a red flag that something hinky is going on.
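(Transcript aside: the dictionary attack described above can be sketched in a few lines of Python. This is purely illustrative and not from the episode; the stored hash and the tiny wordlist here are invented, and real attack wordlists contain tens of thousands of entries.)

```python
import hashlib

# Hypothetical target: the SHA-256 hash of a weak password, as a site
# database might store it (real sites should also salt their hashes).
stored_hash = hashlib.sha256(b"password123").hexdigest()

# A toy "dictionary" of common passwords.
wordlist = ["123456", "qwerty", "letmein", "iloveyou", "password123"]

def dictionary_attack(target_hash, candidates):
    """Hash each candidate password and compare it to the target hash."""
    for candidate in candidates:
        if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
            return candidate  # a common password matched the hash
    return None  # nothing in the wordlist matched

print(dictionary_attack(stored_hash, wordlist))  # prints: password123
```

The point of the example is the one made on the show: if your password appears in a common-password list, this loop finds it almost instantly, which is why a long, unique password defeats the whole approach.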
397 00:24:13,000 --> 00:24:16,000 Speaker 1: And then for all the web administrators out there who 398 00:24:16,119 --> 00:24:19,840 Speaker 1: use platforms like WordPress and Joomla, for goodness' sake, 399 00:24:20,200 --> 00:24:26,160 Speaker 1: use strong passwords, a unique, strong password to protect your 400 00:24:26,760 --> 00:24:29,960 Speaker 1: resource. I mean, for a lot of these people, 401 00:24:30,040 --> 00:24:34,520 Speaker 1: I'm assuming that those websites are critical to some part 402 00:24:34,600 --> 00:24:37,040 Speaker 1: of their mission, whether it's a company thing or a 403 00:24:37,080 --> 00:24:41,560 Speaker 1: personal thing. So use a strong, unique password, and whenever possible, 404 00:24:41,880 --> 00:24:46,440 Speaker 1: use two-factor authentication. These practices will eliminate the vast 405 00:24:46,600 --> 00:24:50,440 Speaker 1: majority of attempts to compromise your site. And you know, 406 00:24:50,520 --> 00:24:52,680 Speaker 1: we've got to keep in mind that it's premature to call 407 00:24:52,880 --> 00:24:55,760 Speaker 1: any system bulletproof. But yeah, if you take 408 00:24:55,840 --> 00:24:59,600 Speaker 1: some basic precautions, then you remove yourself from nearly every 409 00:24:59,680 --> 00:25:03,200 Speaker 1: target pool out there, because hackers are going to 410 00:25:03,280 --> 00:25:05,720 Speaker 1: go for the low-hanging fruit. If you make sure 411 00:25:06,200 --> 00:25:10,760 Speaker 1: that you even have just moderate security practices 412 00:25:10,840 --> 00:25:17,080 Speaker 1: in place, you eliminate, like, the vast majority of attacks just by doing that. 413 00:25:17,600 --> 00:25:20,639 Speaker 1: So do it. A driverless car in San Francisco went 414 00:25:20,720 --> 00:25:24,280 Speaker 1: on the run from the cops. Sort of, not really, 415 00:25:24,600 --> 00:25:27,359 Speaker 1: but it's fun to say so.
A company called Cruise 416 00:25:27,440 --> 00:25:31,280 Speaker 1: has got these driverless vehicles in San Francisco, and one 417 00:25:31,359 --> 00:25:34,360 Speaker 1: of its vehicles, which had no one in it at all, 418 00:25:35,280 --> 00:25:37,960 Speaker 1: was roaming the streets of San Francisco when a cop 419 00:25:38,080 --> 00:25:40,639 Speaker 1: car pulled up behind it to pull it over. And 420 00:25:40,720 --> 00:25:43,639 Speaker 1: it turns out this autonomous vehicle didn't have its lights on. 421 00:25:44,000 --> 00:25:46,760 Speaker 1: It was after dark and the car's lights weren't on. 422 00:25:48,000 --> 00:25:49,960 Speaker 1: So the car came to a stop at an intersection, 423 00:25:50,040 --> 00:25:51,680 Speaker 1: at which point a cop got out in order to 424 00:25:51,720 --> 00:25:53,840 Speaker 1: talk to the driver of the vehicle. But there was 425 00:25:53,960 --> 00:25:56,600 Speaker 1: no driver of the vehicle. There was no one in 426 00:25:56,680 --> 00:25:59,879 Speaker 1: the car. And there's actually a video of this. A 427 00:26:00,040 --> 00:26:03,440 Speaker 1: bystander took video of it. The cop tried the door, 428 00:26:03,600 --> 00:26:05,800 Speaker 1: but the door must have been locked, and then the 429 00:26:05,840 --> 00:26:08,400 Speaker 1: cop turns back to their own squad car, and from 430 00:26:08,480 --> 00:26:11,920 Speaker 1: the perspective of the bystander, uh, the light at the 431 00:26:11,960 --> 00:26:14,840 Speaker 1: intersection changed and the autonomous car just kind of drove away. 432 00:26:15,440 --> 00:26:17,960 Speaker 1: So, like, the feeling I got, and I might have 433 00:26:18,080 --> 00:26:20,960 Speaker 1: just seen a short clip of this video, but the 434 00:26:21,119 --> 00:26:24,800 Speaker 1: video I saw made it look like the car was like, see ya, and 435 00:26:24,920 --> 00:26:27,560 Speaker 1: just ditches the cops. That is not what happened.
And 436 00:26:27,760 --> 00:26:32,080 Speaker 1: it turns out that the Cruise vehicle cleared the intersection 437 00:26:32,480 --> 00:26:34,960 Speaker 1: because that's what it's supposed to do in order to make 438 00:26:35,080 --> 00:26:37,920 Speaker 1: sure that traffic is safe, then pulled over out of 439 00:26:37,960 --> 00:26:40,879 Speaker 1: the way to allow the cops to come up to 440 00:26:41,080 --> 00:26:45,399 Speaker 1: the car, and the cops contacted the company and explained what 441 00:26:45,520 --> 00:26:48,800 Speaker 1: had happened, that the car didn't have its lights on. Uh, 442 00:26:49,119 --> 00:26:52,600 Speaker 1: the company rep explained that this was due to human error, 443 00:26:52,720 --> 00:26:55,320 Speaker 1: it was not the car's fault. They addressed it. They 444 00:26:55,480 --> 00:26:57,800 Speaker 1: essentially turned on the lights, and the cops did not 445 00:26:58,160 --> 00:27:01,879 Speaker 1: choose to issue a citation in the matter. So 446 00:27:02,080 --> 00:27:03,639 Speaker 1: it is kind of funny. I mean, a lot 447 00:27:03,680 --> 00:27:05,920 Speaker 1: of people have made jokes about it. It's made the 448 00:27:06,000 --> 00:27:09,600 Speaker 1: rounds on Twitter. But it also reminds us that there 449 00:27:09,640 --> 00:27:13,160 Speaker 1: are still several questions that we haven't really officially answered 450 00:27:13,240 --> 00:27:18,080 Speaker 1: yet when it comes to accountability and autonomous systems. So, yeah, 451 00:27:18,240 --> 00:27:21,200 Speaker 1: this particular case is kind of, like, quirky and fun, 452 00:27:21,840 --> 00:27:24,640 Speaker 1: but it does.
It does really drive home, no pun 453 00:27:24,720 --> 00:27:28,040 Speaker 1: intended that we need to think about these things and 454 00:27:28,160 --> 00:27:32,320 Speaker 1: make sure that there are sufficient laws in place so 455 00:27:32,480 --> 00:27:37,160 Speaker 1: that when something happens where you know, you would typically 456 00:27:37,960 --> 00:27:41,680 Speaker 1: cite a driver, for example, for failing to do something, 457 00:27:42,560 --> 00:27:47,040 Speaker 1: that there's an actual legal process to follow to make 458 00:27:47,119 --> 00:27:49,880 Speaker 1: sure that accountability still is maintained and that we don't 459 00:27:49,920 --> 00:27:55,280 Speaker 1: just have you know, untouchable driverless cars out there that 460 00:27:55,359 --> 00:27:58,240 Speaker 1: are above the law. That's that's just that's just one 461 00:27:58,359 --> 00:28:01,720 Speaker 1: pathway to the eventual robot uprising. We all know that. 462 00:28:03,480 --> 00:28:06,600 Speaker 1: And finally, I'm sure the parents among you out there 463 00:28:06,760 --> 00:28:09,520 Speaker 1: know the kind of struggle there is to trying to 464 00:28:09,640 --> 00:28:13,199 Speaker 1: limit kids screen time because, on the one hand, sometimes 465 00:28:13,280 --> 00:28:16,399 Speaker 1: you just need to get stuff done or just a 466 00:28:16,520 --> 00:28:19,959 Speaker 1: moment's peace, and a distraction is exactly how you need 467 00:28:20,080 --> 00:28:23,800 Speaker 1: to do it, you know, giving giving the kid a 468 00:28:23,920 --> 00:28:27,680 Speaker 1: tablet with a video on it or a simple game 469 00:28:27,840 --> 00:28:31,919 Speaker 1: that just might give you a chance to recapture your sanity. 
Now, 470 00:28:32,000 --> 00:28:34,160 Speaker 1: on the other hand, obviously you don't want your kids 471 00:28:34,240 --> 00:28:37,240 Speaker 1: to become glued to a screen and develop a lifelong 472 00:28:37,359 --> 00:28:40,200 Speaker 1: habit of doing so, which could end up being really 473 00:28:40,960 --> 00:28:45,400 Speaker 1: destructive for a child. So it turns out that human 474 00:28:45,480 --> 00:28:48,360 Speaker 1: parents aren't the only ones worried about this, um, or 475 00:28:48,400 --> 00:28:50,920 Speaker 1: at least parents of human children aren't the only ones 476 00:28:51,440 --> 00:28:54,840 Speaker 1: worried about this, because in the Lincoln Park Zoo in Chicago, 477 00:28:55,760 --> 00:28:58,760 Speaker 1: zookeepers are going through a similar struggle with Amare, 478 00:28:58,840 --> 00:29:03,560 Speaker 1: a gorilla. This gorilla is a teenager, and 479 00:29:03,800 --> 00:29:08,040 Speaker 1: this teenage gorilla has become fascinated with smartphones. Now, Amare 480 00:29:08,240 --> 00:29:12,480 Speaker 1: is not allowed to have, uh, his own smartphone, 481 00:29:12,800 --> 00:29:16,040 Speaker 1: but the gorilla does eagerly check out smartphones when visitors 482 00:29:16,200 --> 00:29:19,240 Speaker 1: show them off through the glass window of his enclosure. 483 00:29:19,280 --> 00:29:23,200 Speaker 1: He's particularly fond of hanging out next to the window. 484 00:29:23,320 --> 00:29:26,200 Speaker 1: That's like his favorite spot, and people would hold up 485 00:29:26,280 --> 00:29:27,800 Speaker 1: pictures, like they might take a picture of him and 486 00:29:27,840 --> 00:29:29,920 Speaker 1: then hold it up, and he was really interested.
It 487 00:29:30,080 --> 00:29:32,280 Speaker 1: was shiny and interesting to him, and so he would 488 00:29:32,320 --> 00:29:34,520 Speaker 1: really stare at it, and that would encourage other people 489 00:29:35,080 --> 00:29:37,120 Speaker 1: to show their phones to him, and he would pay 490 00:29:37,120 --> 00:29:40,760 Speaker 1: attention to those, and ultimately he started to get really fixated, 491 00:29:41,200 --> 00:29:44,080 Speaker 1: kind of, on these screens, to the point where apparently 492 00:29:44,120 --> 00:29:47,600 Speaker 1: he didn't even notice when another young gorilla practiced certain 493 00:29:47,680 --> 00:29:51,120 Speaker 1: social behaviors, such as a demonstration of aggression, which is 494 00:29:51,160 --> 00:29:53,440 Speaker 1: a way that, you know, gorillas use to kind of 495 00:29:53,520 --> 00:29:57,200 Speaker 1: figure out their pecking order within a social group. Um, 496 00:29:58,080 --> 00:30:00,640 Speaker 1: and because he wasn't noticing, he wasn't engaging in this behavior, 497 00:30:00,760 --> 00:30:05,240 Speaker 1: which could potentially, with enough time, lead to problems down 498 00:30:05,320 --> 00:30:07,480 Speaker 1: the road. Now, we're not at that phase yet, but 499 00:30:07,640 --> 00:30:11,520 Speaker 1: zookeepers were concerned. They wanted to make sure that everyone 500 00:30:11,680 --> 00:30:14,120 Speaker 1: was able to enjoy being able to see Amare, 501 00:30:15,200 --> 00:30:19,320 Speaker 1: that Amare was still having a positive experience in his enclosure, 502 00:30:19,760 --> 00:30:23,720 Speaker 1: but also to limit this behavior of showing off 503 00:30:23,840 --> 00:30:28,640 Speaker 1: these smartphones to the gorilla.
So now they have put 504 00:30:28,760 --> 00:30:31,880 Speaker 1: up a rope that's a small distance away from the 505 00:30:31,960 --> 00:30:34,760 Speaker 1: window in an effort to discourage people from holding up 506 00:30:34,760 --> 00:30:37,360 Speaker 1: their phones and distracting the gorilla, and, you know, just 507 00:30:37,480 --> 00:30:41,760 Speaker 1: kind of letting people know, hey, let's 508 00:30:41,880 --> 00:30:45,840 Speaker 1: enjoy this, but let's not create distractions, because 509 00:30:45,920 --> 00:30:49,800 Speaker 1: potentially that could be harmful to the animal. And, um, 510 00:30:50,040 --> 00:30:52,960 Speaker 1: hopefully they will take that into consideration and take it 511 00:30:53,040 --> 00:30:57,800 Speaker 1: to heart, because certainly we wouldn't want to see 512 00:31:00,200 --> 00:31:02,400 Speaker 1: an animal come to harm because of this. I'm actually 513 00:31:02,480 --> 00:31:06,720 Speaker 1: reminded of a gorilla that lived at the Atlanta Zoo. 514 00:31:06,920 --> 00:31:09,479 Speaker 1: So, you know, sit down, I'm gonna tell you all 515 00:31:09,560 --> 00:31:15,520 Speaker 1: a story. Many, many years ago, um, there was a 516 00:31:15,920 --> 00:31:19,760 Speaker 1: gorilla at the Atlanta Zoo named Willie B. This gorilla 517 00:31:19,880 --> 00:31:24,880 Speaker 1: was named after former Atlanta Mayor William B. Hartsfield, also 518 00:31:25,000 --> 00:31:31,440 Speaker 1: known for giving the Atlanta airport its name, Hartsfield International Airport. So, 519 00:31:32,120 --> 00:31:35,480 Speaker 1: Willie B. the gorilla, back when I was a kid, 520 00:31:35,960 --> 00:31:40,280 Speaker 1: lived in a small, like, concrete enclosure.
It was, you know, 521 00:31:40,440 --> 00:31:45,640 Speaker 1: the old-school, nineteen seventies era zoo enclosure, back before 522 00:31:46,640 --> 00:31:50,040 Speaker 1: a lot of zoos had really moved beyond that and 523 00:31:50,200 --> 00:31:56,080 Speaker 1: created large enclosures for animals to wander around. So Willie 524 00:31:56,120 --> 00:31:58,680 Speaker 1: B. was in a fairly small, like, a cell essentially, 525 00:31:58,920 --> 00:32:03,200 Speaker 1: and had a tire swing and an old TV in 526 00:32:03,320 --> 00:32:06,800 Speaker 1: there and would just watch television. And it was really sad. 527 00:32:07,000 --> 00:32:12,680 Speaker 1: And eventually the Atlanta Zoo expanded dramatically and created a 528 00:32:12,880 --> 00:32:17,560 Speaker 1: much better outdoor enclosure for Willie B. It took him a 529 00:32:17,640 --> 00:32:20,160 Speaker 1: while to get used to it. He was so used 530 00:32:20,200 --> 00:32:22,360 Speaker 1: to being in his concrete enclosure that it took him 531 00:32:22,360 --> 00:32:25,120 Speaker 1: a little while to get used to being outside. But 532 00:32:25,200 --> 00:32:29,120 Speaker 1: once he did, uh, he was able to acclimate to it. 533 00:32:29,280 --> 00:32:33,520 Speaker 1: He eventually ended up being introduced to other gorillas. He 534 00:32:33,640 --> 00:32:38,600 Speaker 1: became a father. He was a beloved representative 535 00:32:38,760 --> 00:32:42,760 Speaker 1: of the zoo. He passed away back in two thousand. 536 00:32:42,840 --> 00:32:44,800 Speaker 1: He was forty-two years old when he passed away. 537 00:32:45,320 --> 00:32:49,320 Speaker 1: So, uh, when I think of the story about him, Amare, 538 00:32:49,360 --> 00:32:51,480 Speaker 1: I think back to Willie B., and I 539 00:32:51,520 --> 00:32:54,920 Speaker 1: think back to how Willie B. was.
You know, he 540 00:32:55,080 --> 00:32:58,600 Speaker 1: was in an indoor environment till he was almost thirty, 541 00:32:58,680 --> 00:33:03,000 Speaker 1: I want to say, and just seeing clearly 542 00:33:03,080 --> 00:33:07,719 Speaker 1: his quality of life increase so much once they got 543 00:33:07,800 --> 00:33:10,640 Speaker 1: him away from that. Uh, it's what kind of 544 00:33:10,720 --> 00:33:12,560 Speaker 1: drives it home for me with this story. So 545 00:33:12,640 --> 00:33:14,520 Speaker 1: again it becomes one of those that's a little cute 546 00:33:14,600 --> 00:33:17,880 Speaker 1: on the surface, but, um, I feel pretty strongly about 547 00:33:17,880 --> 00:33:20,280 Speaker 1: it when you really get down to it. Anyway, that 548 00:33:20,480 --> 00:33:24,600 Speaker 1: is the news for Tuesday, April twelve, two thousand twenty-two. 549 00:33:25,000 --> 00:33:27,040 Speaker 1: If you have suggestions for topics I should cover on 550 00:33:27,120 --> 00:33:29,280 Speaker 1: future episodes of Tech Stuff, please reach out to me 551 00:33:29,360 --> 00:33:31,440 Speaker 1: and let me know. The handle for the show on 552 00:33:31,480 --> 00:33:35,160 Speaker 1: Twitter is TechStuff HSW, and I'll talk to 553 00:33:35,240 --> 00:33:44,080 Speaker 1: you again really soon. Tech Stuff is an iHeartRadio 554 00:33:44,200 --> 00:33:47,880 Speaker 1: production. For more podcasts from iHeartRadio, visit 555 00:33:47,960 --> 00:33:50,960 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 556 00:33:51,120 --> 00:33:52,440 Speaker 1: listen to your favorite shows.