Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and it is time for the news for Thursday, February eleventh, twenty twenty-one. Let's get into it.

SciTechDaily published an article describing how computer scientists were able to defeat deepfake-detecting computer algorithms and systems. So deepfakes, for those who aren't really that familiar with the term, are fabricated digital videos. They show what appears to be someone, but it's not really them; it's been digitally created. And the typical way these are made is by feeding in a lot of videos of your target person. It's easier if we use an example. Let's say Ronald McDonald. You've got videos of Ronald McDonald, and you feed all those videos of Ronald McDonald, one specific incarnation of Ronald, because obviously lots of different actors have played Ronald McDonald, but you feed commercial after commercial of a specific actor playing Ronald into your deepfake generator. The more videos you have, the better, because you are training the machine learning model on how to represent that person, and then the deepfake generator can superimpose your target's face, Ronald McDonald's, on top of another video. So you could take a video of a totally different person. Maybe it's someone who's doing a credible impression of the target; maybe it's not even that, and they're just going through various motions. Then you use the deepfake generator to overlay the digital face of Ronald McDonald on top of the other actor's face in the video, and it makes it seem as though the target, Ronald, was doing whatever it was the actor was doing the whole time. And detectors tend to look for little signs in videos that would indicate a deepfake, particularly around the eyes.
The eyes are tricky, but the scientists who worked on this found that by inserting something called an adversarial example, or actually a lot of them, into videos, they could fool these detectors. So generally the idea is to include tiny signs that force these detectors to make mistakes. Now, according to the scientists, it's even possible to introduce these sorts of adversarial examples into a deepfake video without any real knowledge of how the detectors actually process videos in order to detect deepfakes, but a little knowledge does go a long way. The team tested scenarios in which attackers might have complete access to the detector model, in other words, they know exactly what the detector is looking for, and with that, and introducing these adversarial examples, they could have a more than ninety-nine percent success rate in avoiding detection for videos that were uncompressed. They also tested scenarios in which hackers might only have very limited knowledge of the detector model, and even then, with uncompressed video, they saw an eighty-six percent success rate. Now, the approach used by the team preserves those adversarial examples even if a video does go through compression or resizing; typically those are things that would remove adversarial examples from images. And under both scenarios the method worked at a slightly lower success rate if the video had been compressed: it was nearly eighty-five percent successful if the team had complete knowledge of the detector model, and seventy-eight percent even if they didn't have that knowledge, so still incredibly reliable. They also chose not to publish the code that they were using for their adversarial examples, because, they said, if we did that, then bad guys could take this and make it their own, and thus use it for realsies. So they recommended that teams that are designing deepfake detectors incorporate adversarial training when they build out their models. I think most teams already do this. Essentially, this creates a sort of seesaw effect.
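To make that adversarial-example idea concrete, here is a minimal sketch of the best-known white-box technique, the fast gradient sign method. The researchers' actual attack is more elaborate (and deliberately unpublished), so treat this purely as an illustration of the general shape; the "detector" below is a stand-in model, not a real deepfake detector.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in "deepfake detector": any differentiable classifier mapping a
# video frame to logits over the classes [real, fake].
detector = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 2))

frame = torch.rand(1, 3, 64, 64, requires_grad=True)  # one frame, pixels in [0, 1]
truth = torch.tensor([1])                              # this frame really is a fake

# White-box step: gradient of the detector's loss with respect to the pixels.
loss = F.cross_entropy(detector(frame), truth)
loss.backward()

# Nudge every pixel a tiny amount in the direction that hurts the detector
# most; eps stays small so the change is invisible to a human viewer.
eps = 2 / 255
adversarial_frame = (frame + eps * frame.grad.sign()).clamp(0.0, 1.0).detach()
```

In the black-box setting the attacker cannot take that gradient directly, so attacks like the one described here typically estimate it by repeatedly querying the detector, which is why even limited knowledge went such a long way. Adversarial training, the defense the team recommends, folds examples like this back into the detector's training data, and that is where the seesaw comes in.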
So you've got one computer system whose job is just to try and create the most convincing fakes possible, and you have a different system that is attempting to tell the difference between real videos and fake videos. As the detectors get better, the faker system tries out new tricks to exploit potential weaknesses in the detector, and as the detectors fall behind, they begin to adjust their approach to get better at picking out the fakes. And we've seen this sort of process in a lot of other areas of artificial intelligence. It really plays a big role in how AI evolves over time.

Earlier this week, I talked about Tesla and bitcoin, and I did my normal angry-old-man rant about bitcoin being more of a commodity than it is a currency. Well, now I get to talk about another thing that irritates me about bitcoin, and that's how much juice bitcoin miners are using when they're going after those expensive digital coins. So let's get some quick explanations in here. You've got your digital currency, bitcoin, and there are two main ways you can get bitcoin. There are more than this, but these are the two big ones. You can purchase some, so you're exchanging some other form of currency for some amount of bitcoin, or you can try and mine bitcoin. Mining essentially involves using a computer to guess the correct answer to a very hard mathematical problem, and the Bitcoin system, that is, the overall network that's in charge of tracking bitcoin transactions and dispensing mined bitcoins, determines how hard that mathematical problem should be. Now, ideally it should take about ten minutes before some computer on the network is able to solve the problem. So as more computational power joins the network, the system has to make the problems more challenging; otherwise these screaming-fast computer systems would solve the problems far too quickly. So the goal is to aim for that average amount of time it takes to mine bitcoins, to keep it at around ten minutes.
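That guessing game is concrete enough to sketch in a few lines of code. This is a toy version rather than real Bitcoin: the actual network double-SHA-256-hashes an 80-byte block header and encodes its target differently, but the structure of the search is the same.

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Try nonces until sha256(data + nonce) falls below a target.

    Raising difficulty_bits shrinks the target, making valid hashes rarer,
    so the expected number of guesses goes up. That is the knob the network
    turns to hold solutions at roughly one every ten minutes.
    """
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Low difficulty so the demo finishes instantly on a laptop.
print(mine(b"example block", difficulty_bits=16))
```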
There's more to it than that, but that's a very basic explanation. And since a single bitcoin is worth tens of thousands of dollars, nearly fifty thousand dollars as I record this, and each successful mining of bitcoin pays out six point two five bitcoins per go, so every ten minutes another six and a quarter bitcoins get mined, that means there's a lot of money to be made. But it also means you need incredibly powerful computer systems if you're going to stand a chance of having the computer that guesses the correct answer to the mathematical problem. Otherwise someone else is going to beat you to the punch. And so people and groups have built out mining computers, really mining computer networks, that are just trying to be the first to get these math problems correct. And these machines require a lot of electricity. Cambridge University researchers have estimated that the power consumption is in the neighborhood of one hundred twenty-one point three six terawatt-hours per year. That means all the bitcoin miners in the world collectively are consuming as much electricity as the entire nation of Argentina, and they're just behind Norway. Now, the only reason for this is because of those huge payouts, right? Because if the amount you earned was a low amount, whether because you were only getting a fraction of a bitcoin every time you mined, or the value of bitcoin had plummeted, or both, then it would mean that running your computer systems would actually be more expensive than the money you would get from mining bitcoins, and that would be a losing proposition, so you would quit, right? You wouldn't want to keep spending money to make back a fraction of what you were spending. And maybe that will happen at some point. And if it does, and people drop out of the network because it costs too much money to continue mining, then the system will adjust the difficulty of those mathematical problems.
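Two quick back-of-the-envelope sketches of what was just described. The payout arithmetic uses the rough price at the time of this recording, and the retarget function captures the feedback loop in spirit only; Bitcoin's actual rule adjusts the target once every 2,016 blocks.

```python
# 1) Why mining attracts so much hardware: the prize on offer every ~10 minutes.
btc_price = 47_000      # dollars per bitcoin, roughly, as of recording (it moves)
block_reward = 6.25     # bitcoins paid out per mined block
print(block_reward * btc_price)  # ~$293,750 per block, ~144 blocks per day

# 2) The difficulty feedback loop: blocks arriving too fast means more hashing
#    power joined, so make the problem harder; too slow means miners left.
def retarget(difficulty: float, observed_minutes_per_block: float, goal: float = 10.0) -> float:
    return difficulty * goal / observed_minutes_per_block

print(retarget(100.0, 5.0))    # 200.0 -- miners piled in, double the difficulty
print(retarget(100.0, 20.0))   # 50.0  -- miners dropped out, ease off
```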
Again, if it detects that it's taking more than ten minutes to solve the problems, it makes the problems easier, the idea being that no matter what computing power is connected to the system, that ten-minute time to solution remains fairly steady. So it's kind of like if you took the top-performing students out of a math class but you still wanted to have the same, you know, average grade, then you'd make the tests a little easier.

But the energy demands of bitcoin are one of the aspects of the cryptocurrency that I find really irritating. Not only is it a quote unquote currency with a value that fluctuates so much that you can't really use it as a currency with confidence, it's also encouraging people to consume way more electricity than they would otherwise, and that has further consequences depending upon where that electricity is coming from. It might mean that bitcoin itself is contributing to climate change, because people are using so many computer systems that are drawing power off a power grid that could be powered by coal. And then there are other, admittedly more minor, consequences, such as not being able to find a decent graphics processing unit these days because cryptocurrency miners keep buying them all. Stupid bitcoin.

Cybersecurity researcher Alex Birsan recently gave us more evidence that supply chain attacks are something that companies really need to take seriously, which is a theme we've been seeing since the SolarWinds hack. So, in this context, a supply chain attack is when a hacker is able to inject malware into some sort of product from a trusted entity that distributes that product to its partners or customers or whatever. That in itself sounds a bit confusing, so let's use a grim Batman-style analogy. Let's say the Joker has just about had enough of Commissioner Gordon, and he knows that Commissioner Gordon likes pizza. So the Joker has one of his goons deliver a poisoned pizza to Gordon's house.
Now, Gordon might accept the pizza, he might decline it, he might accept it but not eat it. So in other words, this is a ploy that might work, but it might not. So let's say instead the Joker decides, ah, I'm going to infiltrate Gordon's favorite pizza parlor. And the Joker gets in there and poisons all the dough in that pizza parlor, and then he has one of his goons drop off some coupons for the pizza parlor over at Gordon's home. Now, the Joker is banking on the fact that Gordon knows this pizza parlor, he likes the pizza there, he just got some coupons for it, and he's probably going to order pizza from that pizza place and eagerly welcome the poisoned pizza into his home. That's kind of what happens with a supply chain attack: because the malware is hidden inside a product that's from a trusted partner, the victim is more likely to incorporate that malware into their own systems. Birsan was able to create an attack through squatting, which is grabbing a valid internal package name from code that happened to be distributed through GitHub. Essentially, he was leaning on an open-source approach to code distribution that a lot of companies use for various products, and it worked really well. His quote unquote malware really didn't do much but return some very basic information, because he didn't want to actually cause any problems; he didn't want to compromise these systems. He was doing this as a sort of penetration test, to see if the companies guarded against this kind of attack. He found that it was effective in at least thirty-five organizations, and these included really big ones. Apple was one of them, for example. He called it a quote dependency confusion bug end quote, and indicated that if a malicious hacker had taken that approach, they might have been able to do stuff like install backdoor access points into what would otherwise appear to be a secure system.
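The mechanics are easier to see in a toy model than in prose. Package installers that consult both a private index and a public one will often pick whichever copy of a name advertises the higher version, and that preference is essentially the whole bug. The package name and the "highest version wins" rule below are invented for illustration; real resolvers are more involved.

```python
# Toy model of dependency confusion: the same package name exists on a
# company's private index and on a public one the attacker can publish to.
internal_index = {"acme-billing": "1.2.0"}   # the real internal package
public_index = {"acme-billing": "99.0.0"}    # attacker squats the name publicly

def resolve(name: str) -> str:
    candidates = [idx[name] for idx in (internal_index, public_index) if name in idx]
    # Naive "highest version wins" rule -- the heart of the confusion.
    return max(candidates, key=lambda v: [int(part) for part in v.split(".")])

print(resolve("acme-billing"))  # 99.0.0 -- the attacker's code gets installed
```

The common mitigations fall straight out of the sketch: pin exact versions with hashes, or configure builds so internal names can never be satisfied from a public index.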
To get into full detail of what he did would require an understanding that goes beyond what I have. But it does show that this kind of attack can be devastatingly effective if used by malicious hackers, and it could mean that in the future we'll see companies quarantine a system in order to test out patched software before incorporating that patch into the full system overall, just in case someone has compromised a trusted partner. I guess you just can't trust anyone. That's going to be a theme that comes back later.

And in our last little bit for this segment, Twitter's chief financial officer Ned Segal said in an interview that the ban that Twitter has placed on former President Donald Trump's account is in fact permanent. He's really most sincerely banned, I guess you could say. Trump received the ban toward the end of his presidency after the riots at the US Capitol on January sixth, and that ban is going to be in place even if Trump is re-elected president sometime in the future. Yikes. So Segal said to CNBC, quote, the way our policies work, when you're removed from the platform, you're removed from the platform, whether you're a commentator, you're a CFO, or you are a former or current public official. Remember, our policies are designed to make sure that people are not inciting violence, and if anybody does that, we have to remove them from the service. And our policies don't allow people to come back, end quote. Earlier this week, Twitter held an earnings call to share information with shareholders about how the company is doing, and I'm sure there was some concern that banning Trump was going to have a negative impact on the business side of the service. According to CNBC, Twitter beat analyst projections for earnings and for revenue in the fourth quarter of twenty twenty, but it did fall behind when it came to growing the user base. Twitter CEO Jack Dorsey also revealed that eighty percent of Twitter's users are actually outside the United States.
Twitter is also still dealing with being in the crosshairs of some world events, such as the ongoing farmer protests in India that I reported on previously. Well, we have some more news stories to cover, but before we get to any of those, let's take a quick break.

On Tuesday's episode, I talked about how Apple's head of hardware had moved over to the company's virtual and augmented reality department, which I guess we can just call mixed reality; that's kind of the umbrella term for those technologies. Well, today we can follow up a little bit more on that. MacRumors published an article including designs by artist Antonio De Rosa, who has worked with Apple on several occasions, to create artist's renderings of different design concepts. Now, these renderings are meant more as a way to consider, you know, different design options. They may or may not resemble a final product from Apple. In fact, there's no guarantee that there will be a final product in some cases, although it seems pretty certain that this one is going forward. In De Rosa's pieces, these mixed reality headsets are given the name Apple View, but there is no indication that Apple has actually settled on any name as of yet. I'm still holding out for iEyes or something like that. So what would it look like? Well, it kind of looks like someone's got an iPhone strapped to their head. I mean, just imagine that you've got a headband and there's a visor that fits over the eyes. It is kind of like a horizontal, somewhat oval-shaped iPhone, and it's curved so that it can fit over the eyes of the user. Oh, and you better believe there's an Apple logo smack dab in the middle of that thing. According to the website The Information, the Apple design is said to incorporate more than twelve cameras in it, so this gadget could potentially be for both virtual and augmented reality.
Those cameras could be used to provide a live video feed of the user's surroundings, and the device could lay digital information on top of that. Or they might also be used so that the headset can just have a sense of a user's environment and their head position and orientation if they are using it as a virtual reality device, thus giving cues to users so that, you know, they don't do stuff like run into a wall or something. The general rumor right now is that it is going to be geared more as a VR device than an AR device. Maybe there will be some AR use cases, but they'll be more limited. That actually shocks me considering the number of cameras incorporated into this thing, though presumably those cameras will provide the headset with a lot of ability to track a user's head motions. Anyway, it also is supposed to have two 8K displays in there, one for each eye, and all of this helps explain why most of the websites I've seen talking about this thing are guessing it's going to cost around three thousand dollars when it does become available for purchase. It's supposed to be announced before the end of this year, and JP Morgan predicts that it will go on sale within the first quarter of twenty twenty-two, so just about a year out. That's kind of exciting.

And speaking of cameras, well, we all know that we live in a world peppered with cameras. Lots of us have one within reach at any given moment thanks to our phones, and sure, those cameras catch plenty of ridiculous and trivial stuff, but it also means that tons of people have powerful tools to document important events at a moment's notice, and that includes capturing video of police activity, something that some police have really tried to discourage in the past. But at least here in the United States, it is completely legal to record video of a police officer performing official duties in public, even if those police officers don't like it.
Boing Boing reports that now there's a growing trend of police officers trying to leverage copyright takedown notices in an effort to discourage people from posting and sharing videos of them. The idea is pretty simple: a police officer sees that someone is trying to capture them on video, so the police officer whips out his or her own cell phone and blasts some copyrighted music. The idea here is that the person capturing the video is getting that music on their microphone as well. If they upload the video to a social platform, the owner of the copyrighted music will issue a takedown notice for the unlicensed use of the music, particularly if it's all an automated system. Pretty shady stuff, right? And for activists this is really hard, because if they get enough complaints about videos that they're posting, they might receive a ban from their platforms. And many of these activists are trying to perform a public service. In the wake of the Black Lives Matter movement, we have seen how important it is to hold police accountable, to make sure that those who are supposed to be protecting and serving and enforcing the law are themselves also subject to the law.

The Drive has an interesting article about how farmers are turning to hackers in order to be able to repair their own equipment. All right, so this gets at a few different things that I've talked about on previous episodes of TechStuff. There's this concept called the right to repair, which is the idea that if you purchase some technology, whatever it may be, maybe it's a stereo system, maybe it's a harvester for a farm, you should be able to make reasonable repairs to that technology all on your own. The limits of your repair should really be up to your expertise and knowledge.
But a lot of companies have made repairs really hard, from using proprietary screws and other fasteners, to making it intentionally difficult to access components, to including terms of service that discourage people from opening up a piece of technology in the first place. Then you've got the black box problem. This is when it's difficult or impossible for someone to see how a particular piece of technology works, and sometimes you have firmware there that prevents you from even doing any work unless you first run certain diagnostic tests and processes with official equipment. Now, we've definitely seen this sort of thing in the automotive world, where it's much harder to work on modern vehicles compared to cars from just a few decades ago. While working on cars does require skill and knowledge, the barrier to entry was lower in the past. I mean, you still had to learn how to do it, but car companies weren't making it harder to do. However, these days, with computerized systems and specialized service codes, it can be really hard to figure out what's even going on with a vehicle, and even harder to conduct repairs. And the same is true with farm equipment. So farmers have become frustrated when trying to get hold of software and firmware from companies like John Deere in order to diagnose and repair problems with complicated equipment. Because of that, these farmers are starting to turn to pirated firmware for their equipment in order to be able to make those repairs themselves, or to have independent mechanics do it for them, and companies like John Deere don't want that to happen. These companies purposefully restrict access to stuff like firmware and software because it represents another potential revenue source, which is repair and maintenance. See, if you make a tractor, you can only sell that one tractor once.
But if you also offer repair and maintenance services, well, you can keep making revenue off of that initial sale for as long as possible, I mean, especially if you make it really hard for anyone else to do those kinds of repairs and maintenance. And that's the heart of the problem. Farmers, who understandably want to get the most out of their equipment, would like to be able to do that kind of maintenance and repair themselves, or on their own terms, and thus limit their costs. And this gets right into that right to repair issue, which is something we're seeing in different parts of the world as groups of consumers protest a business model that relies not just on sales of hardware, but on having a whole sales operation related to fixing and maintaining that hardware. And I should add that, according to The Drive, most of the farmers aren't actually happy about having to resort to pirated software and firmware. They would much prefer to go through official channels; they would prefer to purchase a license from John Deere directly. It's just that's not on the table. I see a lot of parallels with other industries, actually, including the music industry. Historically, if companies make some aspect of their products difficult to access, people will find workarounds, and so it's generally just a bad idea to even go that route in the first place.

On Wednesday, Facebook announced it would reduce the amount of political subject matter popping up in people's news feeds as part of an effort to reduce quote inflammatory content end quote. According to Tech Xplore, it's going to start small, aiming at a sliver of users in places like Canada, Indonesia, and Brazil, before trying it out in the United States. Aastha Gupta, a product management director at Facebook, said, quote, during these initial tests, we'll explore a variety of ways to rank political content in people's feeds using different signals, and then decide on the approaches we'll use going forward, end quote.
Now, apparently that means Facebook will no longer recommend politics-themed groups to users, and it will reduce the political-themed posts that appear in feeds through automated systems. So, in other words, the part where the algorithm suggests these posts will stop. Users will still be able to post about politics if they want to, and they'll still be able to join groups that have a political perspective and a political purpose if they want. The idea here is really that the algorithm is getting out of it: the algorithm that determines what you see, and where, in your Facebook feed will be pulling back on the political stuff. But if all of your friends are chatting about politics, you're likely to still see a lot of politics over there. I think this shows how Facebook is continuing to react to the issue of extremist groups that were forming and radicalizing others on the platform, and how Facebook's algorithm really kind of played a part in all of that. The algorithm helped promote those groups to people who were potentially, uh, able to be radicalized, and so we saw that trend get worse and worse. So I feel like this is largely a response to that.

And in other lousy news, Nicolo Laurent, the CEO of Riot Games, is now under investigation over sexual harassment and gender discrimination claims. Now, the company Riot Games is known for games like League of Legends and Valorant, and the board of directors will be overseeing this investigation, which is being conducted by a third-party law firm. Laurent's former executive assistant Sharon O'Donnell claims that she was wrongfully terminated this past summer after she filed a complaint with the HR department against Laurent, saying he had made sexual advances toward her. There are other allegations against Laurent as well, primarily about some alleged comments he made in the company of women in the past, and I won't repeat those things here because, you know, they're gross.
Riot Games has been under the microscope in the past for this kind of stuff, too. Kotaku actually ran a really great investigative piece in twenty eighteen about the sexist culture at Riot Games, including stories from a former employee with the company who described her experiences of trying to navigate the culture and seeing her ideas consistently shot down, not on the merits of the ideas themselves, but because of her gender. She actually tells a story about how she pitched an idea, got shot down, told a male peer to pitch the same idea a couple of weeks later, and when he did it, it went over like gangbusters. That's not great. And later, a couple of people sued Riot Games for gender discrimination. So to me, it sounds like this has been an issue from the top down, not that dissimilar to what we heard about Ubisoft last year.

Over in the UK, police have arrested eight people between the ages of eighteen and twenty-six who were allegedly part of a group trying to illegally access the phones of various prominent Americans like celebrities and sports stars, or rather, I should say, their phone numbers. See, they were doing this through SIM swapping. This is a process in which you port a phone number from one SIM card to another, and you might have to do this for totally legitimate reasons. Let's say that you're changing from one model of phone to a totally different model of phone, and they happen to use different styles of SIM cards. I had this happen when, you know, the mini-to-micro SIM card transition was really underway. You want to keep your phone number, so you want to be able to port your number from your old SIM card to your new one, and you also want to be able to access all the apps and data you have on your old phone as easily as possible.
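Under the hood, a port is basically the carrier updating which SIM card a phone number routes to, and that single mapping is what both the legitimate upgrade and the crime turn on. Here's a toy sketch; the phone number, SIM names, and code are all invented for illustration.

```python
# Toy model of a carrier's routing table: phone number -> active SIM card.
carrier = {"+15551234567": "sim-victim"}

def send_sms_otp(number: str, code: str) -> tuple[str, str]:
    # An SMS one-time passcode is delivered to whichever SIM the number
    # currently maps to -- the network has no idea who is holding it.
    return carrier[number], code

# A port is just an update to that mapping. Carrier support does it for
# phone upgrades every day; an attacker talks support into the same change.
carrier["+15551234567"] = "sim-attacker"

sim, code = send_sms_otp("+15551234567", "839201")
print(sim, code)  # sim-attacker 839201 -- reset codes now reach the thief
```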
Well, porting your account over to a new SIM card is part of that process. And these hackers were trying to move phone numbers over to SIM cards that they had in their possession, which would effectively hijack the phone numbers so that any calls going to that celebrity or sports star or whatever would instead go to this new phone. And it gave them the potential to access tons of stuff along with it, like accounts with companies like Amazon, for example. And you can see how this can be used for all sorts of illegal purposes, from theft to blackmail. So how did they make the swaps? Well, it seems like most of the time they were actually working with various phone providers, and they might fool the phone providers. You know, they could pose as the person who owns the number, or maybe an assistant to that celebrity or whatever, and then use a bit of social engineering to convince the technicians on the other end of the phone to go ahead and make that change to the new SIM card. Or they might actually work with someone at the companies directly, someone who might have been in cahoots with the thieves. And once they do port that number over, they can wait to see if the targets start to make changes to stuff like passwords. So you know how two-factor authentication systems can sometimes send, like, a passcode to your phone? Well, if you swap the phone number over to a different SIM card, then it's the hackers who are getting those, uh, those text messages with the temporary passwords, or the way to access certain accounts, and potentially they could switch the accounts over to themselves that way. UK law enforcement says the group faces charges of violating the Computer Misuse Act as well as fraud and money laundering charges, and that they could also face extradition to the United States for their crimes. Okay, we have a few more stories to wrap up today's news episode, but before we get to that, let's take another quick break.

Hey, I'm back.
Do you remember that earlier segment when I said there was a security researcher who had used a supply chain attack to test various big company systems and found them lacking? Well, something similar has happened to a lot of Google Android users. Now, at the heart of the problem is an app called Barcode Scanner. I bet you can't guess what it does. Anyway, for a long time that app was totally legit. It was a simple but effective barcode scanner. However, sometime in December of last year, that changed. The security firm Malwarebytes, which you might remember was also the target of a supply chain attack itself, has reported that customers were seeing some weird behavior on their Android devices. When they would open up their default browser, they were getting absolutely bombarded with pop-up ads. It was kind of like the bad old days of the World Wide Web all over again. Stupid pop-up ads. At first, the analysts found these reports kind of perplexing, because they would look at what the customers had on their phones and they'd say, gosh, you haven't installed any new apps in a while, so nothing really has changed here, and all the apps that you have installed came from the official Google app store, Google Play. These weren't, like, sideloaded apps from some skeezy app developer or anything like that. And it turned out that Barcode Scanner was in fact at fault, and more so, it appeared to be an intentional decision. In the words of security researcher Nathan Collier, quote, in the case of Barcode Scanner, malicious code had been added that was not in previous versions of the app. Furthermore, the added code used heavy obfuscation to avoid detection. To verify this is from the same app developer, we confirmed it had been signed by the same digital certificate as previous clean versions. Because of its malign intent, we jumped past our original detection category of adware straight to trojan. Yikes. End quote.
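That certificate check Collier describes is worth a quick sketch. App signing certificates can be compared by hashing them: two releases signed with the same developer key yield the same fingerprint. This is a minimal illustration using Python's cryptography library with made-up file names; in practice you would first extract the certificates from each APK's signing block.

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes

def fingerprint(cert_path: str) -> str:
    # SHA-256 fingerprint of a DER-encoded X.509 signing certificate.
    with open(cert_path, "rb") as f:
        cert = x509.load_der_x509_certificate(f.read())
    return cert.fingerprint(hashes.SHA256()).hex()

# Hypothetical files: certificates pulled from a known-clean release and
# from the update that started spewing ads.
known_good = fingerprint("barcode_scanner_v1.der")
suspect = fingerprint("barcode_scanner_update.der")
print("same signer" if suspect == known_good else "different signer")
```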
The yikes was mine, by the way. So yeah, a trojan is malware that poses as benign, legitimate software, and it carries with it some sort of payload that can execute malware of some kind on an infected system. Google has since removed the Barcode Scanner app from the Play Store, and if you happen to have that app installed on an Android device, you should uninstall it, remove it. Google is not removing it automatically, at least not as of the time I'm recording this episode, but, you know, better safe than sorry. Go ahead, check, make sure it's not on any of your Android devices; if it is, remove it. And here's an example of some malicious code making its way to targets because it came from a trusted source. The tech world is really turning into a Mission Impossible movie where you learn you cannot trust anyone, except Benji. He seems like a stand-up dude. That Ethan character, I don't know about him.

This next story is one that I'm sure you have all heard already, but I kind of had to cover it, particularly after Taiwo Adebamawo tweeted it to me late on Tuesday. I am of course talking about the virtual courtroom proceeding in which a lawyer appeared on screen as a kitty cat. The 394th Judicial District Court in Texas was in session, and county attorney Rod Ponton had a cute kitty cat picture on display instead of video from his own webcam. Judge Roy Ferguson helpfully pointed out that Mr. Ponton appeared to have a filter over his video chat settings, and Ponton said, quote, I'm here live, I'm not a cat, end quote. The judge then replied, quote, I can see that, end quote. The judge subsequently posted a tweet recommending that Zoom users check their video filter settings before they actually join a meeting. Now, in the grand scheme of Zoom failures, this one's pretty innocent, and the judge commended Mr. Ponton. He said, quote, the filtered lawyer showed incredible grace, end quote.
this 569 00:35:02,719 --> 00:35:06,280 Speaker 1: was a really refreshing and amusing take. No one got hurt, 570 00:35:06,680 --> 00:35:09,600 Speaker 1: everyone remained professional. We got a fun story out of it, 571 00:35:09,840 --> 00:35:14,320 Speaker 1: which is just nice. You know something else that is nice, 572 00:35:14,520 --> 00:35:17,000 Speaker 1: or at least I think it is, is that computers 573 00:35:17,000 --> 00:35:20,719 Speaker 1: are coming up with some really funky math, y'all. All right, 574 00:35:20,760 --> 00:35:23,440 Speaker 1: so I'm gonna do my best to explain this, but 575 00:35:23,520 --> 00:35:26,480 Speaker 1: I do so with full acknowledgement that I am in 576 00:35:26,840 --> 00:35:30,759 Speaker 1: way over my head. My math skills topped out at 577 00:35:30,760 --> 00:35:35,920 Speaker 1: trigonometry and algebra. But Vice has an article titled Machines 578 00:35:36,000 --> 00:35:39,520 Speaker 1: Are Inventing New Math We've Never Seen, and at the 579 00:35:39,560 --> 00:35:43,520 Speaker 1: heart of this are mathematical conjectures. Now, if you're like me, 580 00:35:43,960 --> 00:35:47,759 Speaker 1: that term might be new to you. Essentially, it's a 581 00:35:47,800 --> 00:35:52,240 Speaker 1: form of mathematical statement that as yet has not been 582 00:35:52,600 --> 00:35:56,920 Speaker 1: rigorously proven to be true. So let's say you detect 583 00:35:57,000 --> 00:36:00,359 Speaker 1: what appears to be a pattern in some data, and 584 00:36:00,800 --> 00:36:03,520 Speaker 1: as far as you can tell, there is actually a 585 00:36:03,560 --> 00:36:06,000 Speaker 1: pattern there, and after a bit you figure out a 586 00:36:06,000 --> 00:36:10,759 Speaker 1: way to describe this pattern through a mathematical statement, a formula, 587 00:36:10,880 --> 00:36:13,560 Speaker 1: if you will. Now, does this mean there really is 588 00:36:13,640 --> 00:36:17,000 Speaker 1: a pattern in that data? Well, maybe, but maybe not. 589 00:36:17,719 --> 00:36:21,319 Speaker 1: It might be only the appearance of a pattern, and 590 00:36:21,400 --> 00:36:23,759 Speaker 1: it takes a lot of testing to make sure that 591 00:36:23,760 --> 00:36:27,560 Speaker 1: conjecture holds up, and if it survives that testing and 592 00:36:27,840 --> 00:36:32,080 Speaker 1: someone eventually proves it rigorously, then it graduates and becomes a theorem. 593 00:36:32,120 --> 00:36:35,640 Speaker 1: This is sort of like the concepts of hypotheses and 594 00:36:35,840 --> 00:36:40,560 Speaker 1: theories in science. A hypothesis is a prediction which may 595 00:36:40,640 --> 00:36:43,960 Speaker 1: or may not come true. A theory is something that 596 00:36:44,000 --> 00:36:48,520 Speaker 1: has been tested multiple times in different ways, different approaches, 597 00:36:48,600 --> 00:36:52,440 Speaker 1: different people, and has held up to be true despite 598 00:36:52,480 --> 00:36:55,040 Speaker 1: all that testing, which is the same sort of thing, 599 00:36:55,120 --> 00:36:58,680 Speaker 1: really. Well, the Vice article explains that computer systems are 600 00:36:58,680 --> 00:37:03,760 Speaker 1: playing a bigger role in developing mathematical conjectures. One system 601 00:37:03,800 --> 00:37:07,799 Speaker 1: in particular, the Ramanujan Machine, developed in part by 602 00:37:07,840 --> 00:37:11,280 Speaker 1: researchers from Google, is doing this. The system is creating 603 00:37:11,280 --> 00:37:16,560 Speaker 1: conjectures aimed at calculating universal constants.
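To give a flavor of what one of those conjectured formulas can look like, here's a minimal Python sketch, my own illustration rather than anything from the Vice article: it numerically evaluates a continued fraction for pi proven by L. J. Lange, pi = 3 + 1^2/(6 + 3^2/(6 + 5^2/(6 + ...))), which has the same general shape as the formulas the Ramanujan Machine proposes, and checks the result against math.pi.

import math

def lange_pi(depth: int) -> float:
    # Evaluate pi = 3 + 1^2 / (6 + 3^2 / (6 + 5^2 / (6 + ...)))
    # bottom-up, truncating the continued fraction at the given depth.
    acc = 0.0
    for k in range(depth, 0, -1):
        odd = 2 * k - 1          # odd numbers 1, 3, 5, ... are squared in the numerators
        acc = odd * odd / (6.0 + acc)
    return 3.0 + acc

approx = lange_pi(100_000)
print(approx)                    # approximation of pi
print(abs(approx - math.pi))     # error against the known constant

Checking a candidate formula against the known digits of a constant like this is cheap; actually proving that the identity holds is the hard part, and that part still falls to human mathematicians.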
Pi is an example 604 00:37:16,880 --> 00:37:20,520 Speaker 1: of a universal constant. It's the ratio between the circumference 605 00:37:20,560 --> 00:37:24,280 Speaker 1: of a circle and that circle's diameter, and this ratio 606 00:37:24,400 --> 00:37:27,400 Speaker 1: is the same whether the circle is on the nanoscale 607 00:37:27,840 --> 00:37:30,600 Speaker 1: or if the circle is larger than the known universe, 608 00:37:30,800 --> 00:37:35,640 Speaker 1: because the ratio still remains three point one four, etcetera, etcetera, etcetera. 609 00:37:35,840 --> 00:37:40,080 Speaker 1: And so these systems are proposing different formulas to calculate 610 00:37:40,200 --> 00:37:44,040 Speaker 1: universal constants. This could lead to far more efficient and 611 00:37:44,080 --> 00:37:48,760 Speaker 1: accurate means of doing that. If those conjectures hold water, 612 00:37:49,120 --> 00:37:51,799 Speaker 1: it will be up to mathematicians to put the conjectures 613 00:37:51,840 --> 00:37:55,640 Speaker 1: to the test, and just let me say, better them 614 00:37:55,680 --> 00:37:59,560 Speaker 1: than me. Now, I included this next story because I 615 00:37:59,600 --> 00:38:03,080 Speaker 1: happen to love the music of the punk band The Pogues, 616 00:38:03,600 --> 00:38:06,560 Speaker 1: famous for such tunes as Fairytale of New York 617 00:38:07,040 --> 00:38:10,279 Speaker 1: and Turkish Song of the Damned and Sunny Side of 618 00:38:10,280 --> 00:38:14,240 Speaker 1: the Street. Well, Jem Finer, one of the original members 619 00:38:14,239 --> 00:38:16,400 Speaker 1: of the Pogues and also the co-writer on the 620 00:38:16,440 --> 00:38:20,200 Speaker 1: songs I just mentioned, has created the proposal for an 621 00:38:20,239 --> 00:38:24,919 Speaker 1: interesting artistic experiment. The goal, he says, was to create 622 00:38:24,920 --> 00:38:28,520 Speaker 1: a means of making music of quote, indeterminate length and 623 00:38:28,680 --> 00:38:33,200 Speaker 1: indeterminate score, end quote. He took inspiration from an ancient 624 00:38:33,239 --> 00:38:38,680 Speaker 1: practice in Japan, a type of meditation and musical instrument 625 00:38:38,800 --> 00:38:44,360 Speaker 1: called the suikinkutsu, which involves an upside down 626 00:38:44,640 --> 00:38:50,200 Speaker 1: jar and water drops. So imagine you've got a big jar, 627 00:38:50,840 --> 00:38:53,359 Speaker 1: it's open at the top, you turn it upside down, 628 00:38:53,520 --> 00:38:56,920 Speaker 1: you put it in a little pit, essentially, and 629 00:38:57,000 --> 00:39:00,280 Speaker 1: you bury it. You cut a hole in the bottom 630 00:39:00,320 --> 00:39:03,160 Speaker 1: of that jar, which is now effectively the top. You know, 631 00:39:03,200 --> 00:39:05,560 Speaker 1: it's the part that's poking out of the ground, and you 632 00:39:05,640 --> 00:39:09,960 Speaker 1: allow water to drip through that hole into the interior. 633 00:39:10,360 --> 00:39:13,719 Speaker 1: As the water accumulates, it forms a little pool at 634 00:39:13,800 --> 00:39:17,799 Speaker 1: the base of that jar that's buried underground, and the 635 00:39:17,920 --> 00:39:21,680 Speaker 1: vibrations as the drops splash down cause the jar to 636 00:39:21,880 --> 00:39:24,680 Speaker 1: reverberate and it chimes out kind of like a bell. 637 00:39:25,480 --> 00:39:27,960 Speaker 1: So Finer took this idea and he cranked it up 638 00:39:27,960 --> 00:39:31,319 Speaker 1: a notch.
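Before we get to his description, a rough back-of-the-envelope aside: his version pipes the sound up through a horn roughly seven meters long, and if you treat that horn as a simple pipe closed at one end, a big simplification for a flared horn, you can estimate its fundamental resonance. A minimal sketch, with those assumptions baked in:

# Rough closed-open pipe approximation for a tall horn's fundamental tone.
# The seven-meter figure comes from Finer's description below; treating a
# flared horn as a plain cylindrical pipe is a simplifying assumption.
SPEED_OF_SOUND = 343.0   # meters per second, in air at room temperature
HORN_LENGTH = 7.0        # meters

fundamental_hz = SPEED_OF_SOUND / (4.0 * HORN_LENGTH)
print(f"{fundamental_hz:.1f} Hz")   # about 12 Hz, just below typical human hearing

That estimate lands just under the usual 20 hertz floor of human hearing, though the odd harmonics a closed-open pipe favors, around 37 and 61 hertz, would be audible, which hints at why a structure like this would carry very deep, resonant tones.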
In his own words, his quote, Score for 639 00:39:31,400 --> 00:39:35,200 Speaker 1: a Hole in the Ground, end quote, was conceived as 640 00:39:35,239 --> 00:39:39,440 Speaker 1: a composition of indeterminate length and score. Water dripping into 641 00:39:39,480 --> 00:39:43,800 Speaker 1: a deep underground chamber strikes both tuned percussion and a 642 00:39:43,920 --> 00:39:47,440 Speaker 1: pool at the bottom. The sounds are piped above ground 643 00:39:47,520 --> 00:39:51,400 Speaker 1: through a giant horn that stands seven meters above the ground, 644 00:39:51,920 --> 00:39:55,840 Speaker 1: and by piped, he's literally talking about a hollow tube, 645 00:39:55,880 --> 00:39:59,880 Speaker 1: a horn that would use acoustics to amplify the sound 646 00:40:00,000 --> 00:40:02,400 Speaker 1: as it travels up the length of the horn to 647 00:40:02,880 --> 00:40:06,239 Speaker 1: the flared end above ground. He said, I like the 648 00:40:06,280 --> 00:40:10,480 Speaker 1: idea of coming across this in a landscape unexpectedly. It's 649 00:40:10,480 --> 00:40:12,840 Speaker 1: a piece that's made for people who know it's there, 650 00:40:13,239 --> 00:40:16,960 Speaker 1: but equally it's made for people to just come across. Now, 651 00:40:17,000 --> 00:40:20,000 Speaker 1: this reminds me of some other acoustic art pieces that 652 00:40:20,080 --> 00:40:23,840 Speaker 1: I have seen, including the classic Aeolian harp, which is 653 00:40:23,840 --> 00:40:27,680 Speaker 1: played by the wind. Nothing more punk rock than handing 654 00:40:27,680 --> 00:40:30,520 Speaker 1: the instruments over to Mother Nature for a wicked solo. 655 00:40:31,440 --> 00:40:36,240 Speaker 1: And finally, ten years ago, the world received an amazing gift. 656 00:40:36,760 --> 00:40:43,279 Speaker 1: I am, of course, talking about Rebecca Black's immortal classic Friday. 657 00:40:43,440 --> 00:40:46,200 Speaker 1: Whether you're kicking in the front seat or sitting in 658 00:40:46,239 --> 00:40:49,920 Speaker 1: the back seat. Well, now Rebecca Black has released a 659 00:40:50,000 --> 00:40:53,840 Speaker 1: remix of that song ten years later, and it features 660 00:40:53,880 --> 00:40:57,920 Speaker 1: not just an all-grown-up Rebecca sporting some neon punk looks, 661 00:40:58,360 --> 00:41:02,600 Speaker 1: but also 3OH!3, Big Freedia, who has one 662 00:41:02,640 --> 00:41:06,600 Speaker 1: of the best Christmas albums I've ever heard, not safe 663 00:41:06,600 --> 00:41:10,799 Speaker 1: for work, and Dorian Electra on the track as well. 664 00:41:10,840 --> 00:41:12,560 Speaker 1: And I listened to the remix and I have to 665 00:41:12,600 --> 00:41:17,320 Speaker 1: tell you that it's remarkable in that it is so 666 00:41:17,640 --> 00:41:22,640 Speaker 1: not my jam, but it is also not my jam 667 00:41:22,680 --> 00:41:26,160 Speaker 1: in a different way than the original was also so 668 00:41:26,320 --> 00:41:29,400 Speaker 1: not my jam. I think it is cool, however, that 669 00:41:29,600 --> 00:41:31,880 Speaker 1: Ms Black has been able to embrace her impact on 670 00:41:31,920 --> 00:41:34,759 Speaker 1: the web and also pursue a legit career as a 671 00:41:34,800 --> 00:41:37,840 Speaker 1: singer and songwriter. She never disappeared.
She's had a YouTube 672 00:41:37,920 --> 00:41:40,120 Speaker 1: channel and has been uploading to it for the last 673 00:41:40,200 --> 00:41:45,760 Speaker 1: nine years, so she's been engaging with the public since 674 00:41:45,960 --> 00:41:49,960 Speaker 1: her rise to viral fame. I have absolutely no doubt 675 00:41:49,960 --> 00:41:53,200 Speaker 1: that she was also on the receiving end of an 676 00:41:53,320 --> 00:41:57,160 Speaker 1: unbelievable amount of mockery for her original video, and I 677 00:41:57,200 --> 00:42:02,000 Speaker 1: think that's really a shame. I mean, was it an indulgent project? Sure, 678 00:42:02,600 --> 00:42:05,439 Speaker 1: but good grief, y'all. If YouTube had been around when 679 00:42:05,480 --> 00:42:08,040 Speaker 1: I was a kid, I am certain there would be 680 00:42:08,120 --> 00:42:11,879 Speaker 1: no shortage of indulgent videos from yours truly. In fact, 681 00:42:11,920 --> 00:42:15,160 Speaker 1: you could probably argue convincingly that there is no shortage 682 00:42:15,160 --> 00:42:18,120 Speaker 1: of indulgent material from me as it stands right now. 683 00:42:18,600 --> 00:42:22,080 Speaker 1: It's just that mine doesn't happen to go viral. Anyway, 684 00:42:22,280 --> 00:42:25,760 Speaker 1: if you have fond or funny memories of the song, 685 00:42:25,960 --> 00:42:28,840 Speaker 1: you might want to check out the bonkers music video 686 00:42:28,880 --> 00:42:31,560 Speaker 1: for the remix. Now, I'm not saying you're gonna like it, 687 00:42:31,920 --> 00:42:35,120 Speaker 1: but maybe you will, and you will certainly see something 688 00:42:35,480 --> 00:42:39,520 Speaker 1: that is unique. Well, that wraps up this news episode. 689 00:42:39,560 --> 00:42:42,000 Speaker 1: Hope you guys enjoyed it. If you have any suggestions 690 00:42:42,040 --> 00:42:45,880 Speaker 1: for topics I can tackle on the normal Tech Stuff episodes, 691 00:42:46,239 --> 00:42:48,319 Speaker 1: you can let me know on Twitter. The handle is 692 00:42:48,360 --> 00:42:52,000 Speaker 1: TechStuff HSW, and I will talk to you 693 00:42:52,040 --> 00:43:01,280 Speaker 1: again really soon. Tech Stuff is an I Heart Radio production. 694 00:43:01,520 --> 00:43:04,319 Speaker 1: For more podcasts from I Heart Radio, visit the I 695 00:43:04,440 --> 00:43:07,680 Speaker 1: Heart Radio app, Apple Podcasts, or wherever you listen to 696 00:43:07,719 --> 00:43:08,640 Speaker 1: your favorite shows.