Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. This is the tech news for September 7, 2021. It's a Tuesday, and we've got a ton of stuff to cover, and a lot of it is pretty heavy, so let's get to it. Tech Times reports that the Russian government has cracked down on six virtual private network services as the country nears a parliamentary election. Virtual private networks, a.k.a. VPNs, have lots of different uses. A lot of companies use VPNs for employees to log into before they use corporate systems. One big reason to use a VPN is that it provides a shield when browsing stuff on the Internet. The way it works, at a very high level, is that you log into a VPN, essentially a server, and then the VPN does all the info fetching for you. So to someone snooping from the outside, looking at your connection, all they would see is that you were connected to a VPN.
Speaker 1: They wouldn't be able to see what you were doing beyond that. So say you were researching stuff, maybe stuff that the government would disapprove of. Let's say you're in Russia and you wanted to research something that perhaps the government wasn't in favor of. Well, using a VPN, it would be very difficult for them to figure that out. Now, this is, of course, assuming that the VPN in question is encrypting all communications that are taking place and also taking appropriate steps to protect users. If that's all true, then yeah, it stands as a way for people to access stuff without outsiders necessarily being privy to it. Anyway, if the government feels like this is a bad thing, then really the easiest course of action is to block VPN services in that country, and that's what Russia has done. Those services include big names. If you've been listening to podcasts or watching videos, you've probably heard of some of these, because they often advertise on those. It includes services like NordVPN, ExpressVPN, IPVanish VPN, Hola VPN, KeepSolid VPN, and Speedify VPN.
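To make the "all they would see is the VPN" point concrete, here is a minimal toy simulation. It's purely illustrative: the class names and the example IP addresses are invented for this sketch, and no real networking or encryption is involved.

```python
# Toy model of why a snooper on your connection only sees the VPN hop.
# Everything here is simulated; no real traffic or crypto is involved.

class Website:
    def __init__(self, name):
        self.name = name
        self.seen_ips = []            # IPs this site observes making requests

    def fetch(self, requester_ip):
        self.seen_ips.append(requester_ip)
        return f"content of {self.name}"

class VPNServer:
    def __init__(self, ip):
        self.ip = ip

    def fetch_for(self, site):
        # The VPN does the info fetching, so the site sees the VPN's IP.
        return site.fetch(self.ip)

class Snooper:
    def __init__(self):
        self.observed = []            # (src, dst) pairs seen on the client's link

    def observe(self, src_ip, dst_ip):
        self.observed.append((src_ip, dst_ip))

client_ip = "203.0.113.5"             # example address (documentation range)
vpn = VPNServer(ip="198.51.100.9")    # example address (documentation range)
site = Website("example.com")
snooper = Snooper()

# The client connects to the VPN; the snooper sees only that single hop.
snooper.observe(client_ip, vpn.ip)
content = vpn.fetch_for(site)

print(site.seen_ips)       # only the VPN's address, never the client's
print(snooper.observed)    # only the client-to-VPN connection
```

The two prints are the whole story: the destination site logs the VPN's address, and the eavesdropper on your link logs only a connection to the VPN.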
Speaker 1: Now, according to Russian representatives, this whole move has nothing to do with the elections. It's to block services in order to curb illegal activity, such as the spread of child pornography, which is frequently the kind of messaging you will hear whenever any government starts to crack down on stuff like this. Russian representatives have also threatened companies like Google and Apple for allowing an app that was created by Alexei Navalny's organization. He's a critic of Vladimir Putin, and so the fact that that's also happening in line with the shutdown of VPNs starts to point toward reasons other than curbing illegal activity. In fact, the Russian government has said that Google and Apple are essentially interfering with Russian elections, which, you know, if I can take a moment to add some opinion, is pretty freaking rich coming from Russia, a country that has interfered with elections around the world, including US elections a few times, and will no doubt continue to do so.
Speaker 1: Anyway, it's a good idea to pay attention to these sorts of things, because while it's happening in Russia right now, that country is only one of many with political leaders trying to get a firmer grip on the flow of information. Some of those other countries are pretty darn close to home. Some of them might actually be home. A couple of weeks ago, I talked about the NSO Group. That's a company out of Israel that's responsible for designing malware called Pegasus, which allows an external user to access infected devices remotely. In other words, if you infect a device with this malware, you can then access that device in various ways, including things like accessing a smartphone's microphone and camera. Essentially, NSO Group creates software that turns various devices, like computers and smartphones, into spies. That's what it really gets down to: it allows for surveillance. NSO Group markets this technology to various governments and does so under the protection of the Israeli government.
Speaker 1: That means that technically NSO Group is not supposed to market this to any country that is on unfriendly terms with Israel. The company has shown a pretty blatant disregard for accountability. It has made claims that I argue are contradictory. For example, the company says it prohibits customers from misusing the malware to target, say, that customer's own citizens in an effort to exert authoritarian control. In other words, NSO Group says, no, this is in order for us to protect against things like acts of terrorism. But there have been numerous reports saying that NSO Group's customers include repressive and oppressive governments that are using that technology against journalists, activists, and their own citizens. However, the company also says, at the same time, that it has no way of monitoring how its customers actually use its products. So that ends up being a contradiction right there: okay, we're only going to sell it to people who are using it properly; also, there's no way for us to know how they're using it.
Speaker 1: Well, that's like NSO Group saying, all right, here you go, but don't go using this in ways we don't approve of, because there's no way for us to track you. Right? That's the message there. Anyway, an organization called Citizen Lab recently released a report. In fact, they've released a few, but the most recent report shows that it had identified nine activists in Bahrain who had their devices compromised by NSO Group's Pegasus malware between June of 2020 and February of this year. One thing that makes this malware particularly effective is that it can infect iOS devices through what is called a zero-click attack, essentially a message in iMessage, which means the target does not have to click on a malicious link or anything like that to activate the malware. That means the whole advice of "don't click on any links you don't trust" doesn't even apply here, because it doesn't require a link.
Speaker 1: Like, literally the morning that I'm recording this, I received a message on a different messaging service that was a link attack; it was trying to get me to click on a link and enter my credentials. Fortunately, even early in the morning, before my coffee, I recognized it as an attack. Anyway, more human rights organizations are calling on NSO Group to be held accountable for facilitating authoritarian acts and furthering the capabilities of repressive regimes. They point out that entities like the Bahraini government could use these tools to seek out, detain, and even torture targets that the government determines are, you know, a problem. And by "a problem," I mean people who just happen to disagree with the government, not necessarily someone who's, like, a terrorist or anything like that. So, like I said, a lot of journalists and diplomats have been targets of NSO Group's Pegasus malware from various customers of NSO Group, and it's an ongoing problem that just seems to be getting worse. All right, let's stay on theme, because this next story ties in with both of the previous ones.
Speaker 1: ProPublica has published a piece (a lot of alliteration there) about WhatsApp and a contradiction in WhatsApp's brand identity as a secure messaging platform. All right, so just in case you're not familiar, the elevator pitch for WhatsApp is that it is an end-to-end encrypted messaging service. That means that if you and a buddy are messaging using WhatsApp, every message you send to your buddy is encrypted and can only be decrypted on the other end of that communication channel. Your buddy is the only person who has the correct decryption key to read those messages. And every message your buddy sends to you is also encrypted, and only you hold the decryption key for those messages, so no one else can read them. The same is true for any images, videos, links, or anything else sent on WhatsApp between the two of you. And WhatsApp is an incredibly popular messaging service, not as much in the United States, but very much so in other parts of the world.
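The end-to-end idea, where only the two endpoints can ever decrypt, can be sketched with a toy Diffie-Hellman key exchange. To be clear, this is not how WhatsApp actually does it (WhatsApp is built on the Signal protocol), and the tiny prime and home-rolled XOR keystream below are assumptions for illustration only, not secure cryptography.

```python
# Toy end-to-end encryption sketch: the two endpoints derive a shared
# secret via Diffie-Hellman, so only they can decrypt what travels
# between them. Illustrative only; NOT secure cryptography.
import hashlib

P = 2**127 - 1   # a Mersenne prime; far too small for real-world use
G = 5

def keystream(shared_bytes, n):
    """Stretch the shared secret into n pseudo-random bytes."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(shared_bytes + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(shared_bytes, data):
    # XOR the data against the keystream derived from the shared secret.
    return bytes(a ^ b for a, b in zip(data, keystream(shared_bytes, len(data))))

decrypt = encrypt  # XOR with the same keystream reverses itself

# Each side keeps a private key and publishes only G**priv % P.
alice_priv, bob_priv = 0xA11CE, 0xB0B          # toy values; real keys are random
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both sides compute the identical secret without ever transmitting it.
alice_shared = pow(bob_pub, alice_priv, P).to_bytes(16, "big")
bob_shared = pow(alice_pub, bob_priv, P).to_bytes(16, "big")

ct = encrypt(alice_shared, b"meet at noon")    # what travels over the wire
print(ct != b"meet at noon")                   # True: the wire sees only gibberish
print(decrypt(bob_shared, ct))                 # b'meet at noon'
```

The key property, which real protocols share with this toy, is that the server relaying `ct` never holds anything that can turn it back into the plaintext.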
Speaker 1: So, according to the company, no one outside of you and your buddy would be able to understand what your messages are, and Facebook, which is the owner of WhatsApp, would not be able to see those messages. WhatsApp would not have any idea of what it was you were sending between the two of you, because it would be encrypted, and so it's private and secure. However, the company also has hundreds of contract workers whose job it is to moderate, or police, content sent across WhatsApp. And you might think, wait a minute, how is it possible to moderate or police content if no one can see the unencrypted stuff except for the users involved? Well, there is one really big exception to this. If someone were to tag an incoming message as being against WhatsApp's policies, like you tag a message and say that it's spam or that it's abusive, or something along those lines, then that sends an alert to WhatsApp to review the communication.
Speaker 1: So some contract worker somewhere, maybe in Austin, Texas, because there are a lot of them there, reviews it. And this is a person who technically does not work for WhatsApp or for Facebook; they work for another company that serves as a kind of employment center for these contract workers, who in turn contract with WhatsApp. That person would be sent the unencrypted offending message from the receiver, as well as the four messages that preceded the one that allegedly was against the policy. So, in other words, if you receive a message, your side decrypts that message. From that point forward, that message is in plain text. You can read it. So if someone were to get access to your phone, they could read all that content. Right? If they tried to intercept the message in transit, it would be encrypted, but once it's on your device, they can read it. Well, that's what this system does. If you flag something in WhatsApp, it sends the unencrypted messages on for review.
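The mechanism ProPublica describes, flagging forwards the offending message plus the four messages before it, in plaintext, can be sketched like this. The class and method names are hypothetical; the point it illustrates is that the receiver's copies are already decrypted, so no encryption has to be broken to forward them.

```python
# Receiver-side sketch: once delivered, messages sit in plaintext, so a
# "flag" can forward the flagged message plus the four preceding ones.
# Class/method names are invented for illustration.

class Conversation:
    def __init__(self):
        self.messages = []            # plaintext after local decryption

    def receive(self, text):
        self.messages.append(text)

    def flag(self, index):
        # Forward the flagged message and up to four messages before it.
        start = max(0, index - 4)
        return self.messages[start:index + 1]

convo = Conversation()
for i in range(1, 8):
    convo.receive(f"message {i}")

report = convo.flag(6)                # flag "message 7" (index 6)
print(report)                         # five plaintext messages go out for review
```

Nothing here touches the encrypted channel at all, which is exactly why end-to-end encryption and this kind of moderation can coexist.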
Speaker 1: Now, I point all this out because this is a sort of back door through the security measures, and you can kind of understand why it exists, in that any platform wants to reassure its users that they're going to be protected from harassment and other abuse. But on the other hand, back doors through security systems are almost always a bad idea, and that's because of the possibility that someone somewhere is going to abuse that exception to the secure system and turn it to nefarious purposes. Moreover, since WhatsApp's brand is so heavily geared toward, you know, privacy and security in communication, this really undermines the brand's position. It undermines the brand's actual identity. ProPublica also points out that Facebook analyzes tons of metadata related to WhatsApp communications. Metadata, in case you're not familiar with the term, is information about information. Really, it's all the stuff about a communication except for the content of the communication itself. Metadata can include who was involved in the communication, like which parties were talking with each other.
Speaker 1: What time were messages being sent back and forth? Where were people when they were actually sending information? You know, you pair it with, like, geolocation data, that kind of thing, and you can determine a lot of stuff through metadata analysis. You wouldn't know what the actual content of the messages was, but you might be able to hazard a guess just by examining enough metadata to figure out, you know, what's going on. And that, too, is a huge threat to privacy and security. WhatsApp users were understandably really upset when Facebook first announced that some data collected from WhatsApp would be shared with Facebook proper, and the company itself got into hot water, because reps from Facebook had previously sworn to EU officials that incorporating that kind of capability between WhatsApp and Facebook was technically impossible. They said, yeah, there's no way for us to do it, even if we wanted to. We don't want to, but even if we wanted to, we couldn't do it. Then they turned around and said, hey, guess what we're doing.
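Here is a small sketch of the kind of inference metadata alone allows. The records and names are fabricated for illustration; the point is that contact patterns and timing fall out without ever touching message content.

```python
# Metadata-only analysis sketch: who talked to whom, and when.
# All records below are fabricated for illustration; note there is
# no message content anywhere in the data.
from collections import Counter
from datetime import datetime

metadata = [
    {"from": "alice", "to": "bob",   "at": datetime(2021, 9, 6, 23, 55)},
    {"from": "bob",   "to": "alice", "at": datetime(2021, 9, 6, 23, 57)},
    {"from": "alice", "to": "bob",   "at": datetime(2021, 9, 7, 0, 2)},
    {"from": "alice", "to": "carol", "at": datetime(2021, 9, 7, 9, 0)},
]

# Which pair of people talk the most? (frozenset ignores direction)
pairs = Counter(frozenset((m["from"], m["to"])) for m in metadata)
most_common_pair, count = pairs.most_common(1)[0]
print(sorted(most_common_pair), count)

# When do they talk? Late-night bursts alone suggest a close relationship.
late_night = [m for m in metadata if m["at"].hour >= 22 or m["at"].hour < 5]
print(len(late_night))
```

Four records are enough to surface one frequent pair and a late-night pattern; at Facebook scale, with billions of records, this kind of analysis gets far more revealing.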
Speaker 1: So that was not, you know, consistent. But yeah, it's just something to keep in mind, and it helps remind us that most of Facebook's revenue depends upon the company collecting and then leveraging, or if you prefer, exploiting, information about users. So with that in mind, it's a really safe bet to assume that any product coming out of Facebook contributes to that pursuit. Even if, on the face of it, it isn't all about extracting your data in order to sell more stuff to you somewhere down the line, that is what's going on, because that's Facebook's business. That's how Facebook makes money. So yeah, important to know. And it also explains why Facebook was willing to shell out billions and billions of dollars to acquire WhatsApp in the first place. All the things people were worried about when Facebook did that are kind of manifesting, or becoming evident. So, important to remember. We're going to have some more stories that relate back to some of the themes we've already talked about. I know it gets kind of dark and oppressive, but it's important to know about them.
Speaker 1: But hey, I don't know about you, but I could use a quick break.

Speaker 1: All right, we're back, and it's time to go down undah in Australia. Sorry, that was terrible. I had to introduce a little levity with my terrible, terrible... not even an Australian accent, you can't even give it that much credit. But yeah, my terrible attempt. Because there's a new bill introducing some troubling powers for law enforcement with regard to online information in Australia. There's this piece of legislation called the Surveillance Legislation Amendment (Identify and Disrupt) Bill, and it introduces three new powers for law enforcement. The first is that law enforcement will be able to obtain a quote, data disruption warrant, end quote, and that in turn gives law enforcement the right to copy, delete, or modify data in order to disrupt it. Now, that alone is horrifying, and I'll get to why I find it horrifying in just a bit.
Speaker 1: The second power is that law enforcement will be able to apply for and obtain a quote, network activity warrant, end quote, which gives them the power to collect data from networks and devices that are believed to be used, or that may be used, in relation to illegal activity. The third power is that they can obtain a quote, account takeover warrant, end quote, and that would give them the authority to do exactly what it sounds like: take over an account entirely in order to gather information for an investigation. So, for example, if the Australian police think that, you know, Jimmy the Crook is involved in a criminal operation, which, I mean, seems logical because he's called Jimmy the Crook, then they could get a warrant to take over Jimmy the Crook's Facebook profile and post as Jimmy the Crook in an effort to gather information about Jimmy the Crook's accomplices, and potentially run, like, a full sting operation. But yeah, as you get down to it, this is all terrible. And again, I totally get that for law enforcement, online communications stand as a huge challenge.
Speaker 1: Criminals can and do use all sorts of platforms to communicate and coordinate, and it can be a challenge to disrupt that, and it sure would be valuable to be able to go in there and manipulate stuff, perhaps setting up a sting operation to catch criminals in the act and bring them to justice. But these provisions give way too much power to law enforcement. So imagine that there's a legal process for the police to get a warrant. They do have to at least go through the steps of getting a warrant, but let's assume it's not too difficult to do that. So they get a warrant, and then they can go in and manipulate stuff that you have posted online. They can go in and change the things you have posted. Arguably, they could manipulate the stuff you post in such a way that you appear to have posted incriminating material when in fact you did not do any such thing. So an authoritarian government could literally manufacture evidence against you, and then you could be held accountable for something you never did.
Speaker 1: This is the equivalent of the trope you see, usually in a comedy, where a cop drops a brown paper bag next to someone, then picks it up and says, "Well, what do we have here? Illicit drugs?" It's kind of like that, but on the data side of things. Now, that's just one scenario in which this bill becomes a nightmare. But again, it wouldn't even take an authoritarian government for this to go wrong. Building in systems that would allow law enforcement to do these sorts of things means that you have to create vulnerabilities so that the police can go in and do it. Right? You have to have a way for police to access those platforms and those accounts in order to do this data manipulation. However, if you do that, if you create those vulnerabilities, it gives other people the chance to potentially exploit them. Now, I've used this analogy many times before, but here we go.
Speaker 1: It's like you've got a gold reserve and you store tons of gold bars in that reserve, and you've got this huge vault, and you have a solid vault door with, like, time locks and all this complicated stuff, and it's practically impossible to break through. However, you also include a little back door in the vault that's protected by a cheap padlock, because you need to occasionally get into the vault and you don't want to go through the trouble of unlocking the big door. Well, anytime you create exceptions to a secure system, you essentially nullify the security. So no matter how you shake it, this bill is bad news. It creates far too much surveillance power on the law enforcement side, and it introduces the possibility for bad actors to take advantage of the system for their own purposes. Heck, it also means people can end up with a plausible alibi for all online communications, because there would exist a way to go in and change someone else's stuff online, perhaps even to the point of fabricating messages on behalf of another person.
Well that 327 00:20:13,880 --> 00:20:16,200 Speaker 1: what the waters are then moneyed right when it comes 328 00:20:16,240 --> 00:20:19,639 Speaker 1: to holding someone accountable for their online actions, because you 329 00:20:19,680 --> 00:20:24,560 Speaker 1: could conceivably argue that you never posted something incriminating because 330 00:20:24,600 --> 00:20:28,080 Speaker 1: there is a system that exists that allows others to 331 00:20:28,200 --> 00:20:31,879 Speaker 1: change what you post without your consent. So, in other words, 332 00:20:32,160 --> 00:20:36,640 Speaker 1: it can actually weaken legal cases rather than strengthen them. Now, 333 00:20:36,640 --> 00:20:38,960 Speaker 1: there's no word yet on how big I sps and 334 00:20:39,000 --> 00:20:43,560 Speaker 1: platforms are going to work within this legislation in Australia. 335 00:20:43,640 --> 00:20:46,000 Speaker 1: But this is the kind of stuff that really worries me, 336 00:20:46,880 --> 00:20:50,480 Speaker 1: and the hits just keep on coming. So maybe you've 337 00:20:50,520 --> 00:20:53,760 Speaker 1: heard of a service called proton mail. Maybe you use it. 338 00:20:54,080 --> 00:20:57,240 Speaker 1: This email service uses end to end encryption to keep 339 00:20:57,240 --> 00:21:01,680 Speaker 1: communications private between users, and just like WhatsApp, the service's 340 00:21:01,760 --> 00:21:06,360 Speaker 1: reputation depends heavily upon that fact, the idea that your 341 00:21:06,400 --> 00:21:10,720 Speaker 1: communications are secure through proton mail, and the bottom line 342 00:21:10,760 --> 00:21:13,719 Speaker 1: is that you use proton mail to protect your communication 343 00:21:13,760 --> 00:21:17,159 Speaker 1: from folks snooping on you. 
Well, now that services kind 344 00:21:17,200 --> 00:21:19,920 Speaker 1: of on the hot seat in the public eye because 345 00:21:19,920 --> 00:21:23,280 Speaker 1: proton Mail recently responded to a demand from police in 346 00:21:23,359 --> 00:21:26,960 Speaker 1: Switzerland to give up the IP address belonging to a 347 00:21:27,040 --> 00:21:32,880 Speaker 1: climate activist in France, which Proton Mail did. So why 348 00:21:32,920 --> 00:21:37,040 Speaker 1: were the Swiss concerned about a French climate activist, Well, 349 00:21:37,080 --> 00:21:40,040 Speaker 1: that's because the French authorities asked the Swiss for help. 350 00:21:40,480 --> 00:21:43,560 Speaker 1: It was ultimately the French authorities that wanted to track 351 00:21:43,600 --> 00:21:47,920 Speaker 1: down this activist who was responsible or you know, activists 352 00:21:48,160 --> 00:21:53,160 Speaker 1: who are responsible for organizing various protests in France, high 353 00:21:53,200 --> 00:21:56,879 Speaker 1: profile protests, and so these authorities called up their buddies 354 00:21:56,880 --> 00:22:01,639 Speaker 1: in Switzerland and they said, hey, love the chocolate, do 355 00:22:01,800 --> 00:22:04,320 Speaker 1: us a solid and lean on proton mail to give 356 00:22:04,400 --> 00:22:07,240 Speaker 1: us the IP address for this activist. 
Okay, and the 357 00:22:07,320 --> 00:22:11,440 Speaker 1: Swiss dead proton Mail, I should point out, is headquartered 358 00:22:11,640 --> 00:22:15,000 Speaker 1: in Switzerland, and thus when the Swiss authorities came knocking, 359 00:22:15,200 --> 00:22:17,360 Speaker 1: the company didn't really have a whole lot of options, 360 00:22:17,400 --> 00:22:20,359 Speaker 1: at least from a legal standpoint, and the CEO of 361 00:22:20,400 --> 00:22:23,720 Speaker 1: proton said as much that the company is legally bound 362 00:22:23,840 --> 00:22:27,119 Speaker 1: to follow local laws, and so there was really no 363 00:22:27,240 --> 00:22:30,320 Speaker 1: recourse other than to comply with the demand from the 364 00:22:30,359 --> 00:22:35,280 Speaker 1: Swiss authorities. The actual communication and the email service remains encrypted, 365 00:22:35,600 --> 00:22:39,120 Speaker 1: so the company legitimately cannot read any of the materials 366 00:22:39,119 --> 00:22:43,320 Speaker 1: sent through those addresses. They cannot reveal what communications were set, 367 00:22:43,840 --> 00:22:46,560 Speaker 1: Nor does proton mail have information on who owns a 368 00:22:46,600 --> 00:22:50,359 Speaker 1: particular email address. But they can give you an IP 369 00:22:50,560 --> 00:22:54,480 Speaker 1: address associated with a specific email address. They just can't 370 00:22:54,480 --> 00:22:58,920 Speaker 1: say definitively this email address belongs to you know, Joan 371 00:22:59,119 --> 00:23:02,840 Speaker 1: Smith or something. 
But as I mentioned earlier, metadata can 372 00:23:02,880 --> 00:23:05,480 Speaker 1: reveal an awful lot about a person, So it's quite 373 00:23:05,480 --> 00:23:09,680 Speaker 1: probable that French authorities are using the information to try 374 00:23:09,720 --> 00:23:12,640 Speaker 1: and track down activists who have arranged those really effective 375 00:23:12,680 --> 00:23:17,679 Speaker 1: protests in France around issues like affordable housing and climate concerns. 376 00:23:18,520 --> 00:23:22,040 Speaker 1: So pretty scary stuff. Sticking with tech and politics, but 377 00:23:22,119 --> 00:23:26,440 Speaker 1: now moving to the absurd. Business Insider reports that thirteen 378 00:23:26,520 --> 00:23:29,560 Speaker 1: House Republicans in the United States sent a letter to 379 00:23:29,680 --> 00:23:33,359 Speaker 1: Marissa Meyer, the quote president and chief executive officer of 380 00:23:33,440 --> 00:23:37,560 Speaker 1: Yahoo in quote uh and they were saying that she 381 00:23:37,600 --> 00:23:42,200 Speaker 1: should better well not hand over any personal information of 382 00:23:42,720 --> 00:23:45,959 Speaker 1: you know, any Republicans or anyone really over to an 383 00:23:46,000 --> 00:23:50,639 Speaker 1: investigative committee that's looking into the January six insurrection in Washington, 384 00:23:50,720 --> 00:23:55,960 Speaker 1: d C. Only okay, hey, House Republicans, you thirteen folks. 385 00:23:56,359 --> 00:24:00,080 Speaker 1: Marissa Meyer stepped down from Yahoo back in tween, so 386 00:24:00,200 --> 00:24:02,800 Speaker 1: she hasn't been in charge of that company for several years. 
387 00:24:03,040 --> 00:24:05,520 Speaker 1: She wasn't in charge of the company during the whole 388 00:24:05,560 --> 00:24:08,800 Speaker 1: lead up to the twenty election, let alone the insurrection 389 00:24:08,800 --> 00:24:12,320 Speaker 1: on January six, and the fact that no one decided 390 00:24:12,359 --> 00:24:15,280 Speaker 1: to fact check that or I don't know, do a 391 00:24:15,359 --> 00:24:17,880 Speaker 1: simple Internet search. I mean they could have even used 392 00:24:17,960 --> 00:24:21,719 Speaker 1: Yahoo to do it. That speaks volumes not just about 393 00:24:21,840 --> 00:24:27,160 Speaker 1: tech literacy, but arguably just playing literacy. It's an embarrassing 394 00:24:27,200 --> 00:24:30,480 Speaker 1: display of ignorance. Anyway. At the heart of the matter 395 00:24:30,800 --> 00:24:34,480 Speaker 1: is the issue of one party in government power using 396 00:24:34,480 --> 00:24:39,080 Speaker 1: its authority to demand information and records about another party, 397 00:24:39,200 --> 00:24:41,560 Speaker 1: and doing so with companies like I s p s 398 00:24:41,640 --> 00:24:44,760 Speaker 1: and social media platforms and companies that provide email and 399 00:24:45,080 --> 00:24:49,440 Speaker 1: telecommunications companies and more. And to be clear, I think 400 00:24:49,480 --> 00:24:52,119 Speaker 1: there is merit in that concern. I think we do 401 00:24:52,320 --> 00:24:57,000 Speaker 1: need a thorough investigation into the January six riots, including 402 00:24:57,000 --> 00:24:59,960 Speaker 1: an investigation into whether or not anyone in government facil 403 00:25:00,000 --> 00:25:03,639 Speaker 1: ialitated or encourage the insurrection, into what degree as to 404 00:25:03,680 --> 00:25:05,720 Speaker 1: whether or not it was a criminal degree. I think 405 00:25:05,760 --> 00:25:08,600 Speaker 1: all of that needs to happen. 
However, I also think 406 00:25:08,600 --> 00:25:12,359 Speaker 1: we have to be careful in how that investigation advances. 407 00:25:12,760 --> 00:25:16,159 Speaker 1: Otherwise you could imagine a future in which whichever party 408 00:25:16,240 --> 00:25:19,320 Speaker 1: happens to be in power can lean on companies in 409 00:25:19,400 --> 00:25:24,360 Speaker 1: order to repress political opponents. That's not good. That's authoritarian, 410 00:25:24,480 --> 00:25:28,920 Speaker 1: no matter which way you you subscribe politically. And anyone 411 00:25:28,960 --> 00:25:31,440 Speaker 1: who has listened to me for a little while probably 412 00:25:31,440 --> 00:25:34,400 Speaker 1: has a pretty good feel for where my personal politics are. 413 00:25:34,680 --> 00:25:36,960 Speaker 1: So you know, it's not easy for me to necessarily 414 00:25:37,040 --> 00:25:41,359 Speaker 1: side with conservatives about data surveillance, but some things I 415 00:25:41,400 --> 00:25:43,840 Speaker 1: think need to be off the table. However, I also 416 00:25:43,880 --> 00:25:47,400 Speaker 1: want to make clear those rules have to apply across 417 00:25:47,480 --> 00:25:51,639 Speaker 1: the board, not just two Democrats who are looking into this, 418 00:25:52,000 --> 00:25:56,879 Speaker 1: but period to everyone, for all time, because otherwise it 419 00:25:56,960 --> 00:25:59,719 Speaker 1: becomes a free for all, and it's a nightmare. Now. 420 00:25:59,760 --> 00:26:02,200 Speaker 1: I'm pretty sure that a few weeks ago I mentioned 421 00:26:02,200 --> 00:26:05,600 Speaker 1: that El Salvador representatives were announcing that the company would 422 00:26:05,600 --> 00:26:09,959 Speaker 1: make bitcoin its official currency that becomes effective today, and 423 00:26:10,000 --> 00:26:13,479 Speaker 1: in preparation for that, El Salvador purchase around four hundred bitcoin. 
424 00:26:13,520 --> 00:26:16,199 Speaker 1: Would plans to buy a lot more in order to 425 00:26:16,480 --> 00:26:20,679 Speaker 1: you know, make this transition. The four hundred bitcoin that 426 00:26:20,680 --> 00:26:24,200 Speaker 1: El Salvador purchased amounts to around twenty million dollars in value. 427 00:26:24,640 --> 00:26:28,560 Speaker 1: The country is aiming to make bitcoin legal tender alongside 428 00:26:28,560 --> 00:26:31,600 Speaker 1: the US dollar, which until today had served as the 429 00:26:31,680 --> 00:26:36,560 Speaker 1: nation's official currency. Citizens in El Salvador will at least 430 00:26:36,600 --> 00:26:40,080 Speaker 1: theoretically be able to use bitcoin for all financial transactions 431 00:26:40,080 --> 00:26:44,760 Speaker 1: within the country, including paying taxes. While cryptocurrency fans might 432 00:26:44,760 --> 00:26:48,320 Speaker 1: be excited, the same is not true for citizens inside 433 00:26:48,359 --> 00:26:52,520 Speaker 1: El Salvador. The majority of citizens have expressed concern about 434 00:26:52,520 --> 00:26:55,440 Speaker 1: this change. Some point to the fact that criminals often 435 00:26:55,480 --> 00:26:58,280 Speaker 1: rely on bitcoin in order to launder money from their 436 00:26:58,320 --> 00:27:02,320 Speaker 1: illegal exploits, and that El Salvador has a history of corruption, 437 00:27:02,560 --> 00:27:06,400 Speaker 1: including with the present government, and that might mean that 438 00:27:06,400 --> 00:27:08,840 Speaker 1: that corruption is about to get a whole lot worse 439 00:27:08,880 --> 00:27:12,000 Speaker 1: in that country. Other people in El Salvador are worried 440 00:27:12,359 --> 00:27:15,040 Speaker 1: that the change is going to have a negative impact 441 00:27:15,040 --> 00:27:19,640 Speaker 1: on their personal finances. 
The volatility of bitcoin is well established, 442 00:27:19,920 --> 00:27:23,199 Speaker 1: and so there's a legit fear that stuff like pensions 443 00:27:23,240 --> 00:27:25,520 Speaker 1: could get wiped out by another big dip in the 444 00:27:25,560 --> 00:27:29,320 Speaker 1: bitcoin market, just like the one we saw earlier this year. Now, 445 00:27:29,359 --> 00:27:33,040 Speaker 1: the currency has recovered quite a bit since it had 446 00:27:33,080 --> 00:27:35,440 Speaker 1: that dip, it hasn't gotten back to the all time 447 00:27:35,520 --> 00:27:39,600 Speaker 1: highs it was in in early spring, but it has 448 00:27:39,640 --> 00:27:42,359 Speaker 1: recovered a lot of the value that it lost, but 449 00:27:43,119 --> 00:27:46,400 Speaker 1: not all of it, and the value still is fluctuating 450 00:27:46,480 --> 00:27:49,399 Speaker 1: quite a lot. As I record this. The government in 451 00:27:49,480 --> 00:27:53,160 Speaker 1: El Salvador has had to respond to emerging issues during 452 00:27:53,200 --> 00:27:56,840 Speaker 1: the adoption process, including putting a hold on a digital 453 00:27:56,840 --> 00:27:59,800 Speaker 1: wallet app that's backed by the government. So it's the 454 00:28:00,119 --> 00:28:03,960 Speaker 1: Hill El Salvador Bitcoin app, and it's been but on 455 00:28:04,080 --> 00:28:07,239 Speaker 1: hold because of a high demand, a greater demand than 456 00:28:07,280 --> 00:28:11,840 Speaker 1: they expected, and that it was starting to tax the servers. 457 00:28:12,680 --> 00:28:15,560 Speaker 1: A significant population within El Salvador may not even be 458 00:28:15,640 --> 00:28:18,359 Speaker 1: able to access the cryptocurrency at all because there is 459 00:28:18,400 --> 00:28:23,160 Speaker 1: a significant digital divide in the country. So a lot 460 00:28:23,160 --> 00:28:25,520 Speaker 1: of people have no computer or smartphone. 
They would have 461 00:28:25,560 --> 00:28:28,399 Speaker 1: no way to access a digital wallet, which means they 462 00:28:28,440 --> 00:28:32,440 Speaker 1: can't even participate in that part of the economy. Now, 463 00:28:32,480 --> 00:28:36,240 Speaker 1: you might wonder why El Salvador is depending upon a 464 00:28:36,320 --> 00:28:40,040 Speaker 1: currency like bitcoin at all, or why it was dependent 465 00:28:40,080 --> 00:28:44,240 Speaker 1: upon the US dollar beforehand. Well, some countries like they're 466 00:28:44,280 --> 00:28:46,280 Speaker 1: at a point where they would not be able to 467 00:28:46,360 --> 00:28:50,360 Speaker 1: offer up a stable currency. It could be a currency 468 00:28:50,440 --> 00:28:54,680 Speaker 1: that is uh that fluctuates wildly in value, which is 469 00:28:55,200 --> 00:28:58,560 Speaker 1: not financially stable for all the people of that country. 470 00:28:58,640 --> 00:29:01,880 Speaker 1: So they in this case El Salvador was dependent upon 471 00:29:02,120 --> 00:29:07,400 Speaker 1: a larger, more stable uh country that of the United States. 472 00:29:07,480 --> 00:29:09,440 Speaker 1: At least when we get to the value of the dollar, 473 00:29:09,520 --> 00:29:13,040 Speaker 1: it's larger and more stable, uh. And that the switch 474 00:29:13,080 --> 00:29:17,880 Speaker 1: to bitcoin means that you're no longer dependent upon another 475 00:29:18,200 --> 00:29:22,600 Speaker 1: nation's currency. Bitcoin, of course is you know, it's it's 476 00:29:22,600 --> 00:29:26,760 Speaker 1: agnostic as far as nationality is concerned. So there's that 477 00:29:27,560 --> 00:29:30,320 Speaker 1: as well. Plus there was that whole corruption stuff that 478 00:29:30,480 --> 00:29:33,200 Speaker 1: was talking about. 
There are several reasons, none of them 479 00:29:33,200 --> 00:29:37,360 Speaker 1: are necessarily great for the majority of El Salvador's population, 480 00:29:37,960 --> 00:29:42,480 Speaker 1: and I am concerned about that. We've got a few 481 00:29:42,480 --> 00:29:45,120 Speaker 1: more news items to cover before we close out, But 482 00:29:45,200 --> 00:29:56,240 Speaker 1: before I get to that, let's take another quick break. Okay, 483 00:29:56,280 --> 00:29:59,880 Speaker 1: So The Verge has an article that's titled Automated hiring 484 00:30:00,000 --> 00:30:04,520 Speaker 1: software is mistakenly rejecting millions of viable job candidates. And 485 00:30:04,560 --> 00:30:07,680 Speaker 1: the piece cites a study from the Harvard Business School 486 00:30:08,040 --> 00:30:11,800 Speaker 1: that's looking into such software. And you've probably heard about 487 00:30:11,800 --> 00:30:14,640 Speaker 1: this type of software. Employers use it to come through 488 00:30:14,760 --> 00:30:17,800 Speaker 1: job applications, and they do it so that they can 489 00:30:17,840 --> 00:30:20,320 Speaker 1: weed out applicants who are just not suitable for a 490 00:30:20,320 --> 00:30:23,760 Speaker 1: particular position and then just focus on the ones who 491 00:30:23,800 --> 00:30:27,080 Speaker 1: are quote unquote the best fit. So the whole idea 492 00:30:27,120 --> 00:30:30,040 Speaker 1: is just, you know, not to eliminate folks who aren't 493 00:30:30,040 --> 00:30:32,440 Speaker 1: a good fit, but really define the right people to 494 00:30:32,560 --> 00:30:35,960 Speaker 1: match with the right job, which ideally is the best 495 00:30:36,000 --> 00:30:39,480 Speaker 1: solution for everyone because the job applicant lands a gig 496 00:30:39,520 --> 00:30:41,920 Speaker 1: for which they were best suited and the company ends 497 00:30:41,960 --> 00:30:45,240 Speaker 1: up getting the best candidate to fill a position. 
Only 498 00:30:45,320 --> 00:30:48,520 Speaker 1: that's not what's necessarily happening. According to the study. The 499 00:30:48,600 --> 00:30:51,680 Speaker 1: report says that software like what I just described is 500 00:30:51,720 --> 00:30:55,200 Speaker 1: actually rejecting millions of people who are qualified for the 501 00:30:55,200 --> 00:30:58,400 Speaker 1: positions that they apply for, and it contributes to an 502 00:30:58,480 --> 00:31:00,640 Speaker 1: issue in which there are people who are ready and 503 00:31:00,800 --> 00:31:03,600 Speaker 1: able and willing to work, but they're not able to 504 00:31:03,680 --> 00:31:08,160 Speaker 1: land a position because of these issues. That's bad news 505 00:31:08,200 --> 00:31:11,360 Speaker 1: for everyone in this system. It's really bad news for 506 00:31:11,400 --> 00:31:13,240 Speaker 1: all the people who are trying to land a job 507 00:31:13,320 --> 00:31:16,880 Speaker 1: because it means they consistently can't. But it's also bad 508 00:31:16,880 --> 00:31:19,760 Speaker 1: news for the employers because they are potentially missing out 509 00:31:19,760 --> 00:31:24,120 Speaker 1: on hiring the perfect candidate because some AI software mistakenly 510 00:31:24,200 --> 00:31:27,480 Speaker 1: put their application on the rejection pile. According to the 511 00:31:27,520 --> 00:31:33,720 Speaker 1: report of U S Employers and of Fortune five companies 512 00:31:34,200 --> 00:31:37,080 Speaker 1: use this kind of software to filter through job applicants. 513 00:31:37,480 --> 00:31:41,200 Speaker 1: As for the reasons behind the high rejection error, those 514 00:31:41,200 --> 00:31:44,040 Speaker 1: are many and they are varied. It really depends upon 515 00:31:44,200 --> 00:31:47,160 Speaker 1: which criteria the software is weighing as being the most 516 00:31:47,200 --> 00:31:51,000 Speaker 1: important in order to weed out unsuitable applications. 
So one 517 00:31:51,080 --> 00:31:54,600 Speaker 1: example that the Verge points out is that the software 518 00:31:54,640 --> 00:31:58,360 Speaker 1: could reject an application that includes an employment gap of 519 00:31:58,440 --> 00:32:01,920 Speaker 1: six months or more. Now, that would have meant that 520 00:32:02,000 --> 00:32:04,480 Speaker 1: I would have been eliminated once upon a time, because 521 00:32:04,560 --> 00:32:08,120 Speaker 1: I worked for a company for seven years steadily employed. 522 00:32:08,560 --> 00:32:12,959 Speaker 1: But then they eliminated my position. So I wasn't fired. 523 00:32:13,040 --> 00:32:16,280 Speaker 1: I just didn't have a job anymore. I realized the 524 00:32:16,360 --> 00:32:19,400 Speaker 1: distinction is a fine one, and I must admit it 525 00:32:19,520 --> 00:32:23,440 Speaker 1: was lost upon me at the time and arguably still is. 526 00:32:23,680 --> 00:32:26,720 Speaker 1: But anyway, you know, I was out of work for 527 00:32:26,720 --> 00:32:29,040 Speaker 1: six months while trying to get another job. And it 528 00:32:29,120 --> 00:32:32,240 Speaker 1: wasn't that I was unqualified. It wasn't that I wasn't 529 00:32:32,480 --> 00:32:34,800 Speaker 1: looking for work. I was doing both of those things. 530 00:32:34,800 --> 00:32:37,040 Speaker 1: It was just a really bad time to be out 531 00:32:37,040 --> 00:32:39,200 Speaker 1: of work when it happened to me, and I was 532 00:32:39,240 --> 00:32:42,680 Speaker 1: caught up in that it was just a bad job market. Well, 533 00:32:42,880 --> 00:32:44,800 Speaker 1: that sort of thing happens all the time, and this 534 00:32:44,840 --> 00:32:49,040 Speaker 1: type of criteria in software then exacerbates it. 
Right if 535 00:32:49,080 --> 00:32:51,160 Speaker 1: the software just says, oh, well, this person is not 536 00:32:51,240 --> 00:32:54,000 Speaker 1: reliable because they haven't had a job in six months, 537 00:32:54,440 --> 00:32:58,080 Speaker 1: that doesn't necessarily reflect poorly on the person. It could 538 00:32:58,080 --> 00:33:01,120 Speaker 1: reflect poorly on everything else, but then the person gets 539 00:33:01,160 --> 00:33:05,040 Speaker 1: the punishment for it. Meanwhile, you have this growing pool 540 00:33:05,200 --> 00:33:07,720 Speaker 1: of people who are seeking jobs who are getting rejected, 541 00:33:07,760 --> 00:33:09,960 Speaker 1: so they're back in the job hunt and they're looking 542 00:33:10,000 --> 00:33:12,480 Speaker 1: for more stuff, so they're more likely to apply to 543 00:33:12,600 --> 00:33:15,800 Speaker 1: lots of different positions for which they're qualified in the 544 00:33:15,800 --> 00:33:18,680 Speaker 1: hopes of landing something. But that means that the companies 545 00:33:18,720 --> 00:33:21,280 Speaker 1: that are posting those positions are going to see even 546 00:33:21,360 --> 00:33:25,320 Speaker 1: more applicants as more rejections join the job pool, and 547 00:33:25,400 --> 00:33:28,520 Speaker 1: that means that they get flooded with applicants, which means 548 00:33:28,520 --> 00:33:30,800 Speaker 1: that they feel like they have to rely upon this 549 00:33:30,920 --> 00:33:34,680 Speaker 1: kind of software in order to do that initial sorting. 550 00:33:34,760 --> 00:33:36,880 Speaker 1: And thus we see the problem get worse and it 551 00:33:37,000 --> 00:33:41,000 Speaker 1: feeds upon itself. Cycle after cycle, and as the software 552 00:33:41,040 --> 00:33:44,920 Speaker 1: becomes more entrenched in this hiring process, it gets harder 553 00:33:44,960 --> 00:33:47,560 Speaker 1: to shift to a different approach. 
Now, I wish I 554 00:33:47,560 --> 00:33:51,040 Speaker 1: had a happy ending to the story, but unless companies 555 00:33:51,120 --> 00:33:55,720 Speaker 1: across industries make a concerted effort to change, or unless 556 00:33:55,760 --> 00:33:59,040 Speaker 1: the organizations that offer these services work really hard to 557 00:33:59,080 --> 00:34:03,760 Speaker 1: eliminate those errors, this is going to keep going. And yes, 558 00:34:04,080 --> 00:34:06,760 Speaker 1: I recognize the irony that some of those jobs seeking 559 00:34:06,760 --> 00:34:11,320 Speaker 1: companies have advertised on this show. Now, to be clear, 560 00:34:11,719 --> 00:34:15,880 Speaker 1: I think their mission is admirable. The idea of let's 561 00:34:15,880 --> 00:34:18,040 Speaker 1: make it easier for the right people to find the 562 00:34:18,120 --> 00:34:21,160 Speaker 1: right jobs. I think that's great. That's a fantastic mission. 563 00:34:21,560 --> 00:34:24,480 Speaker 1: I also think that, at least in some cases, the 564 00:34:24,560 --> 00:34:29,960 Speaker 1: execution of that mission really needs some work. Analytics firm 565 00:34:30,080 --> 00:34:33,520 Speaker 1: app Any reports that Android users are spending more time 566 00:34:33,560 --> 00:34:37,840 Speaker 1: watching content on TikTok than on YouTube on a monthly basis, 567 00:34:38,040 --> 00:34:40,040 Speaker 1: and that this has been going on since August of 568 00:34:40,160 --> 00:34:42,759 Speaker 1: last year here in the United States, and over in 569 00:34:42,800 --> 00:34:45,640 Speaker 1: the UK it's been going on since May of last year. 570 00:34:45,960 --> 00:34:49,160 Speaker 1: So on average, users were spending around twenty two hours 571 00:34:49,160 --> 00:34:52,480 Speaker 1: and forty minutes per month watching content on YouTube, but 572 00:34:52,560 --> 00:34:56,480 Speaker 1: a full twenty four hours per month watching content on TikTok. 
573 00:34:57,480 --> 00:34:59,960 Speaker 1: That's a lot of TikTok because those videos tend to 574 00:35:00,040 --> 00:35:03,480 Speaker 1: be pretty short, and YouTube has been leaning harder toward 575 00:35:03,560 --> 00:35:08,520 Speaker 1: promoting long form content. So it's two totally different content strategies. 576 00:35:08,920 --> 00:35:13,200 Speaker 1: So for TikTok to overtake YouTube is pretty darn impressive 577 00:35:13,200 --> 00:35:17,000 Speaker 1: because that's so many videos. All I can say is 578 00:35:17,040 --> 00:35:20,320 Speaker 1: that I contributed more toward YouTube because I watched maybe 579 00:35:20,680 --> 00:35:23,120 Speaker 1: five or six TikTok videos a month because I'm old 580 00:35:23,280 --> 00:35:25,879 Speaker 1: and I'm out of touch. That's not even really a joke. 581 00:35:25,920 --> 00:35:29,120 Speaker 1: It's more of a true statement. But the report was 582 00:35:29,239 --> 00:35:32,120 Speaker 1: focused solely on Android devices. I do not have any 583 00:35:32,200 --> 00:35:35,319 Speaker 1: details about iOS devices, but I would be shocked if 584 00:35:35,360 --> 00:35:38,640 Speaker 1: it was a much different story. I figured it's probably 585 00:35:38,680 --> 00:35:42,000 Speaker 1: somewhere all on the same lines. Also, while users were 586 00:35:42,040 --> 00:35:45,040 Speaker 1: spending more time on TikTok than on YouTube, I should 587 00:35:45,080 --> 00:35:48,120 Speaker 1: point out YouTube still has way more active users than 588 00:35:48,160 --> 00:35:51,360 Speaker 1: TikTok does, so when it comes down to overall numbers, 589 00:35:51,680 --> 00:35:54,480 Speaker 1: YouTube is still well in the lead. But it was 590 00:35:54,560 --> 00:35:57,880 Speaker 1: interesting to see the viewing trends are favoring TikTok. 
And 591 00:35:57,960 --> 00:36:01,319 Speaker 1: our final story is about a man named Gilbert Michaels, 592 00:36:01,400 --> 00:36:04,920 Speaker 1: who recently received a prison sentence of four years for 593 00:36:05,000 --> 00:36:07,719 Speaker 1: the crime of selling printer toner for way too much 594 00:36:07,719 --> 00:36:12,359 Speaker 1: money to small businesses and nonprofit organizations. The seventy nine 595 00:36:12,440 --> 00:36:16,200 Speaker 1: year old had been selling Toner to these smaller entities 596 00:36:16,239 --> 00:36:19,920 Speaker 1: for decades at a huge markup, sometimes up to ten 597 00:36:20,160 --> 00:36:23,960 Speaker 1: times the actual market price for Toner, which if you've 598 00:36:23,960 --> 00:36:27,600 Speaker 1: ever shot for Toner you know is already wicked expensive. 599 00:36:28,120 --> 00:36:31,600 Speaker 1: And he is the good old telemarketer fraud approach to 600 00:36:31,680 --> 00:36:33,799 Speaker 1: do it, and it was effective. He was doing this 601 00:36:33,840 --> 00:36:36,919 Speaker 1: as far back as the nineteen seventies, and prosecutors said 602 00:36:36,960 --> 00:36:40,240 Speaker 1: that in one six year span he made one twenty 603 00:36:40,280 --> 00:36:44,680 Speaker 1: six million dollars defrauding clients and selling them Toner at 604 00:36:44,719 --> 00:36:48,880 Speaker 1: elevated prices. Michaels was found guilty on charges of mail fraud, 605 00:36:49,040 --> 00:36:52,920 Speaker 1: money laundering, and conspiracy back in two thousand nineteen, so 606 00:36:52,960 --> 00:36:55,359 Speaker 1: this was just his sentencing trial. He had already been 607 00:36:55,360 --> 00:36:58,920 Speaker 1: found guilty of the crimes and this story is pretty awful. 
608 00:36:58,960 --> 00:37:04,080 Speaker 1: So essentially Michael's and his conspirators would contact these various targets, 609 00:37:04,120 --> 00:37:08,640 Speaker 1: these small businesses and charities, often posing as representatives for 610 00:37:08,920 --> 00:37:13,520 Speaker 1: legit vendors, companies that had an established relationship with the 611 00:37:13,560 --> 00:37:17,080 Speaker 1: targets and had sold them toner in the past, so 612 00:37:17,760 --> 00:37:22,120 Speaker 1: they would essentially pose as these legitimate vendors, and then 613 00:37:22,160 --> 00:37:25,240 Speaker 1: they would say, oh, so the price of toner has increased, however, 614 00:37:26,000 --> 00:37:28,920 Speaker 1: we will continue to sell you toner at the lower price, 615 00:37:29,200 --> 00:37:31,239 Speaker 1: you know, to help you out. So go ahead and 616 00:37:31,239 --> 00:37:34,600 Speaker 1: agree to purchase the next box of toner, and then 617 00:37:34,800 --> 00:37:38,080 Speaker 1: they would get someone to sign over on that. They 618 00:37:38,080 --> 00:37:40,520 Speaker 1: would then send the toner, but they would send an 619 00:37:40,560 --> 00:37:45,240 Speaker 1: invoice with an h an enormously inflated price for the toner, 620 00:37:45,640 --> 00:37:47,759 Speaker 1: so instead of that lower price they had said on 621 00:37:47,800 --> 00:37:50,200 Speaker 1: the phone and ends up being a much much higher price. 622 00:37:50,560 --> 00:37:53,640 Speaker 1: And then they would threaten legal action and UH and 623 00:37:53,800 --> 00:37:58,839 Speaker 1: collections agencies and such if the target refused to pay up, 624 00:37:59,400 --> 00:38:03,160 Speaker 1: or they would charge exorbitant restocking fees if the clients 625 00:38:03,160 --> 00:38:07,240 Speaker 1: returned the toner. So it was definitely shady, illegal stuff. 626 00:38:07,600 --> 00:38:11,160 Speaker 1: So yeah, not all tech crime involves cryptocurrency and hacking. 
627 00:38:11,640 --> 00:38:14,360 Speaker 1: Sometimes it just involves good old fraud on the telephone 628 00:38:14,440 --> 00:38:19,640 Speaker 1: and some insanely expensive toner. And that's the tech news 629 00:38:19,719 --> 00:38:24,080 Speaker 1: for Tuesday, September seven, One if you have suggestions for 630 00:38:24,160 --> 00:38:27,560 Speaker 1: things I should cover on the show, any tech companies 631 00:38:27,640 --> 00:38:30,840 Speaker 1: or technologies that you're interested in, or trends in tech. 632 00:38:31,320 --> 00:38:33,640 Speaker 1: Reach out to me on Twitter. The handle is tech 633 00:38:33,719 --> 00:38:36,640 Speaker 1: Stuff H s W and I'll talk to you again 634 00:38:37,600 --> 00:38:45,719 Speaker 1: really soon. Tech Stuff is an I Heart Radio production. 635 00:38:45,960 --> 00:38:48,759 Speaker 1: For more podcasts from my Heart Radio, visit the i 636 00:38:48,880 --> 00:38:52,120 Speaker 1: Heart Radio app, Apple Podcasts, or wherever you listen to 637 00:38:52,160 --> 00:38:53,080 Speaker 1: your favorite shows.