Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Thursday, September 2, 2021.

Earlier this week, I talked about how thousands of subreddits on the site Reddit are calling on Reddit to crack down on COVID-19 misinformation and disinformation that is otherwise proliferating across the site. Reddit had quarantined a subreddit called r/NoNewNormal, meaning Reddit had effectively cut off access to the subreddit for new users but had not actually shut it down. But now that's changed. Reddit has subsequently banned the NoNewNormal subreddit completely, and has quarantined more than fifty other subreddits also found to have spread and promoted COVID misinformation. So it looks like Reddit is making a shift, at least when it comes to matters of public health. As I said earlier this week, the official stance of the company had been, let's just let the users kind of hash out what to believe, very similar to what Mark Zuckerberg said about content on Facebook. And you know, that can be fine if you're talking about different points of view that have validity to them. Right? If you have two people who disagree about something, but they have valid reasons for that disagreement, letting them hash it out makes sense, and so does not taking sides until it's settled. But when it comes to truth versus misinformation, it's a different matter entirely. Anyway, I'm not going to go down that rabbit hole again; you already heard me say all that earlier this week. Instead, I'll say that Reddit won't allow posts that promote, quote, "falsifiable health information that encourages or poses a significant risk of physical harm to the reader," end quote. I think that's the ethical choice here, as this is not, you know, a case of "the dress is blue and black," "no, you're an idiot, the dress is white and gold."
This is really about people spreading narratives that have potentially tragic consequences, with more people getting sick, hospitals becoming overwhelmed, and ultimately more people dying as a result. I think Reddit's response is a good one. I also think it was a necessary response, not just because it was the right thing to do, as I believe it was, but also because the subreddits that were protesting all this misinformation in the first place were starting to use some pretty guerrilla-warfare tactics to disrupt the subreddits that were spreading misinformation, going in there and posting not-safe-for-work content and such in order to force a change in those subreddits' status. And if Reddit had not stepped in, the whole platform could have descended into a chaotic mess.

Samsung has introduced a new sensor for mobile devices that has a resolution of two hundred megapixels. That's pretty incredible stuff, but let's break down what that means. Digital images, when you zoom way, way, way in, are made up of tiny little points of light called pixels, and collectively these pixels appear to us as, you know, photographs and videos and stuff when we're looking at them on screens. Generally speaking, the more pixels you can cram into a certain size frame, the more detail you can capture in that image. Now, once you get beyond a certain resolution, the human eye is going to have a lot of trouble picking out any extra detail, so if you're just looking at a full-size image, you're not likely to see the difference. So megapixels in cameras really only matter up to a point, unless you have a specific use case. For example, let's say that you're a photographer and you want to take photos of stuff that you later intend to enlarge so that you can make movie-poster-sized printouts. Well, in that case, you want a camera with a sensor that can capture a lot of megapixels, because otherwise, as you enlarge the image, you start to lose that detail.
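If you like seeing that math spelled out, here's a quick back-of-the-envelope sketch in Python. The poster size and print density are my own illustrative assumptions, not figures from the story:

```python
# Rough sketch: how many megapixels a sharp poster-sized print takes.
# Assumptions (mine, for illustration): a 27" x 40" movie one-sheet
# printed at 300 dots per inch, a common "photo quality" target.
width_in, height_in = 27, 40
dpi = 300

pixels = (width_in * dpi) * (height_in * dpi)
print(f"{pixels / 1e6:.1f} megapixels needed")  # ~97.2 megapixels
```

By that rough measure, a typical twelve-megapixel phone sensor falls well short, which is where these enormous pixel counts start to earn their keep.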
If you've got a very high-megapixel camera, you can enlarge an image to a pretty impressive degree and not lose any noticeable detail as a result. So this means that in the future, if you happen to buy one of these phones with a 200-megapixel image sensor, you could theoretically snap a photo with your phone and then later digitally zoom way, way, way in on the image to look at something specific. You could even crop the photo and end up with, you know, a full-size photo, as if you had been right up on something as opposed to really far back from it, and you don't lose any detail in the image. That's pretty cool, especially if you want to take photos of stuff you cannot get close to. Let's say you're taking a picture of a historic building and there's a barrier you're not supposed to cross, and you want to get really fine detail on a specific part. You can only zoom in so much with most digital cameras, especially if you don't have a telephoto lens. So this is something that can help you with that, where that pixel density can allow you to capture detail that you otherwise might lose.

Something else the sensor does that I think is really neat is that it adapts to take images in low-light situations. Camera sensors are all about capturing light when you get down to it, and some digital cameras, whether they're in smartphones or otherwise, have some trouble in dim light and produce grainy images as a result. The Samsung sensor will have blocks of sixteen pixels essentially teaming up to act like a single pixel. In that mode, the overall image gets reduced to 12.5 megapixels instead of two hundred, because each block of sixteen acts as one, but those sixteen pixels together can more effectively capture the light that's in the scene and thus produce better low-light photos.
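The arithmetic behind that trade-off is simple enough to sketch out. The four-by-four grouping below is an assumption on my part about how a sixteen-pixel block would be arranged; the megapixel counts are the ones from the story:

```python
# Sketch of pixel binning: 16 physical pixels (a 4x4 block) report
# as one larger "virtual" pixel, trading resolution for gathered light.
sensor_mp = 200      # full-resolution output of the sensor, in megapixels
bin_factor = 4 * 4   # physical pixels combined into each output pixel

binned_mp = sensor_mp / bin_factor
print(f"Binned resolution: {binned_mp} MP")            # 12.5 MP
print(f"Light per output pixel: ~{bin_factor}x more")
```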
So, yeah, there'll be fewer megapixels, and you won't be able to zoom in as far without losing detail, but you'll be able to get a better image out of a low-light situation. Now, there's no word yet on when we might expect to see phones that actually have these sensors in them. I imagine they'll be popping up after a generation or two of new devices, so we're probably looking at, you know, maybe not early next year, but mid to late next year before we start seeing devices that have these kinds of sensors in them. I think that's pretty neat.

You know, I've talked a lot this year about data breaches and hackers and ransomware. I wish I didn't have to talk about it so much, but it's been happening a lot. Well, we're going to get to some similar stuff right now. Bleeping Computer, it's a website, has a piece that says the ransomware hacker group LockBit 2.0 is actively trying to recruit folks who work inside large corporations, and the whole purpose is to help this ransomware group plant malware on corporate machines, to kind of act as an in-road for the hackers. See, when we think about hackers and malware, we often think about someone finding, like, you know, a technical back-door vulnerability: they're on their computer, they're typing quickly, you know, they type in a password a couple of times to get access to a system. Or maybe we think about someone using social engineering to convince a person who is authorized to access systems to hand over that access by tricking them.
This is the classic, "Hey, I'm from IT and I need to install some new software on your machine. It should only take a few minutes, if you want to go, you know, take a coffee break." That is more common than you might think. But sometimes the malware is coming from inside the house. Sometimes employees of these companies can be convinced to turn on their employers and help out a ransomware group.

Now, ransomware gangs typically demand some pretty high prices to return systems to their rightful owners. And in case you're not really familiar with what ransomware is, it's usually software that will encrypt data on a computer system or a full computer network so that the data becomes inaccessible unless you happen to have the decryption key, and then the ransomware group demands that a ransom be paid in order for them to turn over that decryption key. And so LockBit 2.0 is saying that insiders stand to earn, quote, "millions of dollars," end quote, if they go along with it. But let me say a couple of things before some of you all decide that you're going to stick it to the man and become the infection vector for malware to hit your company's systems. First, the general advice that companies receive whenever they're hit with ransomware is that they should not pay the ransom, because, one, there's no guarantee that they will have control returned to them, and two, every single payout is a message that goes out to the hacking community that says ransomware is a good way to make money. So, in other words, paying ransoms encourages future attacks. That could mean you go through the trouble of performing a criminal act on behalf of this ransomware group, putting yourself at risk and your company at risk, and your company might never pay out that ransom, so you end up making nothing for it anyway. Which means you've taken on a very high risk for no reward, in other words.
But another reason not to do it is this: let's say the group is ready to commit a crime and extort money out of a target. Well, then, what's to stop that same group from screwing over the person who gave them access in the first place? I mean, it's not like you could come forward and say, "Hey, those hackers that attacked the systems promised to give me money if I gave them access, and now they've stiffed me." You can't do that. It would be admitting that you were part of the attack. So there's no guarantee you would even get paid. You know, don't help out ransomware gangs. And you know what, even if they did pay you, even if you did get millions of dollars, you'd have to figure out how to hide all that cash, because, trust me, organizations like the IRS and law enforcement are really going to wonder how you got so flush with cash out of nowhere. So, long story short, don't pay ransoms. And companies, don't burn your employees and make them resent you, because that's really what makes these sorts of schemes tempting. In fact, we'll talk more along those lines in just a moment, but before we get to that, let's take a quick break.

Okay, before the break I was talking about how companies need to be careful, because if they burn their employees, they can create the sort of environment where those employees would say, "Yeah, why don't I screw over the company? Because they're screwing me over." And you don't want that. Well, let's talk about the story of Juliana Barile, who recently pled guilty to charges that she illegally accessed the computer systems of a New York credit union and subsequently deleted more than twenty gigabytes of data. Barile was once an employee of this credit union in New York, but she found her employment terminated this past May.
Now, someone in the IT department was supposed to revoke her access to the credit union's computer systems, which is pretty standard operating procedure when someone leaves a company, whether it's through termination or resignation or retirement or whatever it might be. It's just good infosec practice to revoke system access when someone is no longer employed by the company. But yeah, that didn't happen. And when Barile found out that she could still access the credit union's systems, she did, and then she started deleting stuff, and that included things like loan application folders. And as someone who not too long ago went through the process of applying for a loan, I can tell you that any hiccup along the way is stressful and frustrating, but having the whole thing wiped out by a disgruntled former credit union employee would make things way worse. According to court documents, Barile deleted some twenty thousand files and more than three thousand directories in about forty minutes. The credit union has been able to restore some files from backup and estimates that the cost of recovery has been more than ten thousand dollars. And here's an example of the sort of thing groups like LockBit 2.0, the one I talked about before the break, are on the lookout for: people who have access to a system and an axe to grind against their company. What Barile did was wrong, and she probably caused more harm to the customers of the credit union than to the credit union itself, like all those people who had loans in process. It's incredibly disruptive. Now, I don't have the details around her termination, like why she was let go, but whether the reasons for that termination were justified or not, we can all say that the credit union really should have revoked her access straight away. As soon as she was terminated, that access should have been shut down.
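Just to make the principle concrete, here's a minimal sketch of what revoke-on-termination looks like when access checks all flow through one source of truth. Everything here, the names included, is hypothetical:

```python
# Toy model of offboarding: every system consults a single account
# store, so removing someone there revokes their access everywhere.
# All names are hypothetical, for illustration only.
active_accounts = {"jdoe", "asmith", "former_employee"}

def has_access(username: str) -> bool:
    """Checked before granting entry to any internal system."""
    return username in active_accounts

def offboard(username: str) -> None:
    """Run the moment employment ends, whatever the reason."""
    active_accounts.discard(username)

offboard("former_employee")
print(has_access("former_employee"))  # False: the login just stops working
```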
So, she's at fault, certainly, because she's the one who committed the crime, but the credit union also bears some responsibility. I would not be surprised if some credit union customers sought action against the credit union for failing to protect their, you know, their information and their assets.

Speaking of cybersecurity, let's talk about cables for a second. Now, I think a lot of people don't consider cables as, you know, a cybersecurity element they need to worry about, particularly when you've got tons of people connecting to public WiFi without running a VPN or anything like that. But yeah, cables can trip you up, and I don't just mean literally, although they can do that too; happens to me pretty much once a day. But no, a cybersecurity researcher who uses the handle MG has been creating cables that hide sneaky hardware inside them for a while now, and he calls them O.MG Cables. Cute, right? And he's recently unveiled a new one that uses the USB-C form factor, which was something a lot of people thought wouldn't be possible, because if you look at one of those cables, it has a very small plug at the end of it, and you would think there's not a whole lot of space to hide any secret tech in that form factor. And for this to work, for these cables to be a security threat, you have to be able to hide specialized chips inside the cable, and you have to do it effectively, so that nothing looks out of place. So what is so special about these cables? What makes them a security threat? Well, for one thing, MG has incorporated a WiFi transceiver chip inside the cable itself. So if you connect one of these cables to a device, like your phone or a tablet or a computer, a remote hacker can see a WiFi hotspot, because the cable will have activated the hotspot it creates, drawing the power necessary to run it from the connected device.
The hacker can log in through that WiFi hotspot to gain access to the device the cable connects to, and they can start deploying payloads like a keystroke logger, which will keep track of everything you type on that device. So if you're typing out, like, passwords and stuff, they can collect those passwords. The implications of this are pretty scary, and it really drives home the fact that you should not trust any cables that are not your own. Just imagine someone, quote unquote, "testing" these cables by leaving them in heavily trafficked areas like airport charging stations or a coffee shop or whatever. The variations on these cables include USB-C to Lightning connectors, which means you're not safe on a PC, an Android device, or an iOS device if you're using one of these cables. And like I said, from the outside they look like just normal USB or Lightning cables. There's nothing about them that would set you off and make you think, oh, well, this is sus. But they really are an incredibly effective security penetration tool if the right person is making use of them. The security company Hak5 has partnered with MG to sell these cables, which are now in mass production, and the stated purpose, in fact what Hak5 always says, is that this is all to provide security researchers and penetration testers with the tools they need to do various kinds of testing and security work. And to be fair, if something is possible, even if you were to say, like, oh, but you're making cables that could cause an enormous amount of harm if they fell into the wrong hands, well, the fact that we know it's possible just means that someone, sooner or later, was going to make one for nefarious purposes.
So while you might feel weird about the fact that a security researcher has created something that's a real security threat, in another way you could say it's a good thing, because now we know this is possible, so we can be on the lookout for this stuff when we encounter it. Still, you know, just knowing these things are out there kind of gives me the heebie-jeebies. So maybe I'll get Shannon Morse back on the show to talk about this, because it's pretty incredible stuff. And again, don't trust anyone's cables but your own. Right? Unless you purchased it and you feel pretty good about it, don't use some other cable. You never know what kind of tech it might be hiding. It's also possible to have chips that could deploy malware directly to a device when you plug it in using that cable. So, word to the wise: be careful.

Now let's move over to India to talk about how that country continues to make moves that I think are fairly authoritarian when it comes to tech and digital information. I've already talked about how the country's government has pressured social networking platforms to either suppress messages that criticize the government or to step back from enforcing misinformation policies when government officials post stuff that appears to violate those policies. So, in other words, if a politician were to post something that fact-checkers thought was misleading or misinformation, then Twitter might tag that post with a label that says as much. And India has really objected to Twitter taking that step, in fact escalating to the point where officials threatened to shut down Twitter in India. However, on top of all those things, I have not yet talked about VPNs, or virtual private networks. VPNs are really legitimately useful. A good VPN protects you from folks snooping on your business, so you might use a VPN if you wanted to connect to stuff like, say, your bank account, or medical insurance, or any of a million things that are sensitive.
So, essentially, you log into a server that's acting as your virtual private network, and the server will then go on to fetch all the stuff that you want to look at online, and everything is encrypted. So let's say you wanted to look at your bank statement. You would send the command, and it would go to the VPN first, encrypted. Then the VPN would send the request on to your bank, and the response would pass back through the VPN before it came back to you. And from the outside, it would just look like all you were doing was communicating with this one VPN server. An observer wouldn't be able to see what was happening beyond that point; they wouldn't know what you were really looking at. As long as the VPN is on the up and up, which is an important point, and as long as the VPN follows, you know, good practices like purging user histories, then things are relatively secure and private. By the way, you should be using a VPN pretty much any time you're not on your own network. And if you don't want your ISP knowing what you're looking at, for example if you're shopping around for different ISPs, well, you might want to use a VPN even on your home network as well.

But India's Parliamentary Standing Committee on Home Affairs wants a countrywide block on VPNs. Why? Well, the committee claims that VPNs are facilitating piracy and illegal commerce, and that they provide a haven for hackers so that they can attack targets without fear of being tracked down, because all the attacks would seem to originate from the VPN, not the hacker. But India also recently liberalized VPNs. The government noted that VPNs were really important for people to be able to work remotely and log into company systems. A lot of companies run their own VPNs because it's a good security measure; it helps protect proprietary or trade-secret information from getting out into the world at large. So VPNs do play a valid and important role.
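If it helps to picture who can see what, here's a toy sketch of that indirection. The hostnames are made up, and the XOR step is a throwaway stand-in for real encryption, not something you'd ever actually use:

```python
# Toy model of the VPN hop: your ISP sees only encrypted bytes headed
# to the VPN server, never the site you're really talking to.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Stand-in for a real cipher, purely for illustration.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

shared_key = secrets.token_bytes(16)  # established when you join the VPN

request = b"GET bank.example/statement"
packet = {
    "dst": "vpn.example",                 # the only destination the ISP sees
    "payload": xor(request, shared_key),  # ciphertext: contents hidden
}

print(packet["dst"], packet["payload"][:8])         # the ISP's view
print(xor(packet["payload"], shared_key).decode())  # the VPN's view
```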
So we have kind of a disconnect here, right? We have another example of a government trying to get control of stuff that is inherently designed to resist that kind of intrusion, and we'll have to see how this one plays out. I've got a few more stories for us to cover, but before we get to those, let's take another quick break.

Okay, we're back. And over here in the United States, the Government Accountability Office recently released a report, in fact it was just last week, that revealed numerous federal agencies continue to use, and even plan on increasing their use of, facial recognition technology. This is disturbing because, as I have reported several times, even if you're okay with the idea of this level of surveillance in general, this technology is imperfect, and frequently there is an inherent bias within the technology itself, which leads to false positives and misidentifications, particularly among populations of non-white people. So, in other words, this technology can lead to increased discrimination against non-white groups. And there have already been several cases in which, you know, law enforcement has relied upon facial recognition technology that was later found to have misidentified people of interest. And obviously, if the authorities have tagged you as being someone they want to talk to in connection with a crime, that's pretty darn stressful and can be incredibly disruptive to your life. And when you have nothing to do with that crime, like you have no connection to it whatsoever, it's just that this technology has misidentified you, and then law enforcement is putting the burden on you to somehow prove you had nothing to do with the crime? Well, that's an injustice, and the potential for such injustices appears to be on the rise. According to this report, the GAO surveyed twenty-four different federal agencies and departments and found that eighteen of them currently rely on facial recognition technology in some capacity, and ten of them plan to use it even more in the future.
Now, this isn't just concerning to me. The problems caused by facial recognition tech have been bad enough that some major companies, including IBM, have stopped selling that technology to law enforcement and regulatory agencies. They have essentially noted that this technology, as it stands, can do more harm than good. Some states, like Maine, have passed laws that restrict the use of the tech, and there's a debate at the federal level about issuing essentially a nationwide ban on it. And here's hoping we see progress here and less reliance on a technology that, at least for some groups, has the capacity to cause a disproportionate amount of harm. Facial recognition tech, I think, is great if you're trying to unlock a phone, because if it doesn't work, you can just put in your PIN and you're fine. It's irritating when it doesn't work in those cases, and it does speak to a problem with bias, but that's an inconvenience. When we're talking about law enforcement using it as justification to disrupt the lives of people who may be completely innocent and have no connection to the matter at hand, that's another issue entirely. It starts to get close to some constitutional problems.

Okay, time for some cryptocurrency talk. So, one problem with cryptocurrency isn't really the tech's fault. And I know I come down on cryptocurrency pretty hard, pretty frequently, but one thing that, you know, I can't really fault the tech for is that the technology itself is so poorly understood that it gives scam artists the opportunity to pull a fast one and cheat people out of their money, or sometimes cheat entire businesses out of their money. The United States Securities and Exchange Commission, or SEC, has charged a man named Satish Kumbhani with violating various registration laws meant to protect investors from scams. Kumbhani had been the founder of BitConnect, a cryptocurrency exchange platform that no longer exists because it was, you know, a den of scum and villainy.
I guess at the heart of the matter is that Kumbhani fraudulently raised around two billion dollars' worth of investments from various retail investors in a cryptocurrency investment scam. And as part of the scam, Kumbhani showed potential investors fictitious returns on investment of around 3,700 percent annually. So, in other words, let's say this was a real thing and it totally worked, and on January 1 you invested one hundred dollars in the strategy. By January 1 of the following year, you would have around thirty-eight hundred dollars, thirty-eight times what you put in. And if you invested ten grand, or a hundred grand, or a million, well, you get the idea; it's an astronomical return. Further, Kumbhani claimed that BitConnect's own cryptocurrency, which was called BitConnect Coin, which isn't confusing at all, was a stable and safe cryptocurrency that no one was going to have to worry about failing. Only it totally wasn't stable and safe, as the cryptocurrency's value plummeted by more than ninety percent in January. And Kumbhani was running the old Ponzi scheme. That involves getting a round of investors to give you money, so you've got your initial flush of cash. Then, when it comes time to pay out those first investors, you go find a second round of investors to give you more money, and you use the money from the second round of investors to pay out your first round. But then you've got to go after a third round of investors to help pay off that second round, and so on. Ponzi schemes can bring in huge amounts of cash, and they don't have to be connected at all with cryptocurrency, but we have seen a lot of cryptocurrency scams that are essentially Ponzi schemes in recent history. And one common element of all of them is that they always, eventually, come crashing down, because sooner or later you run out of folks who are willing to invest, and it's time to pay the piper.
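Here's a toy simulation of why that collapse is baked in. The starting figures and the recruiting cap are illustrative assumptions of mine; the 3,700 percent rate is the one from the story. Each round's payout has to be funded entirely by the next round's deposits, so the required inflow grows geometrically until recruitment can't keep up:

```python
# Toy Ponzi dynamics: every round's payout is funded by the next
# round's deposits, so the required inflow grows geometrically.
promised_multiple = 38.0       # a 3,700% return means paying back 38x
deposits = 1_000.0             # round 1 takes in $1,000 (illustrative)
max_recruitable = 10_000_000   # cap on what recruiting can ever raise

round_num = 1
while True:
    owed = deposits * promised_multiple   # what this round must be paid
    if owed > max_recruitable:
        print(f"Round {round_num}: owe ${owed:,.0f} and can't "
              f"recruit that much. The scheme collapses.")
        break
    round_num += 1
    deposits = owed    # the next round has to cover the whole payout

# Round 1 owes $38,000, round 2 owes $1,444,000, and round 3 owes
# $54,872,000, so the whole thing falls over almost immediately.
```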
Now, maybe by then you've made enough to make your escape, but it ain't exactly the best get-rich plan.

Finally, Amazon and SpaceX are having a fight down here on Earth over stuff that's meant to float around in space. Specifically, the fight revolves around Starlink, SpaceX's satellite Internet communications system, which the company has had in beta testing for a while now. SpaceX's goal is to launch thousands of satellites into orbit to provide consistent satellite Internet coverage down here on Earth; we're talking, like, thirty thousand satellites or more. But SpaceX says that Amazon has been filing objections to various SpaceX proposals in an attempt to hinder SpaceX, because, as SpaceX puts it, Amazon's own satellite Internet solution, called Kuiper, is running behind schedule. So, according to SpaceX, Amazon has decided to try and slow down the competition while it tries to get its own system up and running. Amazon, however, filed complaints with the FCC saying that SpaceX was not following the proper rules when it comes to submitting proposals for putting more satellites in orbit. And like I said, we're talking about thousands of satellites here. Amazon's complaint is that SpaceX's proposal actually has two different configurations of satellites in it, and you can't have two different ones at the same time; obviously, you can only have one. And so Amazon says this is against the rules. The rules require proposals to have no internal inconsistencies; you can't say, "We might do it this way, or maybe we'll do it this other way instead." According to Amazon, anyway, the rules say you have to commit to a single approach in your proposal before the process can move forward. Meanwhile, SpaceX is like, "Nah, you just don't want us to be able to get to the public comment phase. And anyway, the public comment phase is where we could hash all this out. It would all be fine." Now, I am not on the side of either Amazon or SpaceX in this matter.
Personally, I actually worry about the thousands of satellites that will be needed for both the Starlink and the Amazon Kuiper systems to work. That is a lot of stuff whizzing around out there in low Earth orbit, which could potentially become space junk and thus interfere with other spacecraft. It can also interfere with astronomical observations here on Earth. So I'm not a super fan of either of these things right now. But anyway, that's not the point. The point is that Amazon and SpaceX are totes in a space fight, just here on Earth.

And that's it. Those are all the stories I have for you on Thursday, September 2, 2021. I hope you are well and safe, and for those of you in the United States, I hope you have a wonderful Labor Day weekend. And if you have any suggestions for topics I should cover on TechStuff, reach out to me. The best way to do that is to use Twitter, and I use the handle @TechStuffHSW. I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.