Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and this is the tech news for Tuesday, August. Yesterday, telecom carrier T-Mobile confirmed reports that the company had been hacked. The site Motherboard reported that hackers were offering up for sale private data of T-Mobile customers, including Social Security numbers and driver's license numbers, which prompted the company to conduct an internal investigation. While T-Mobile did say that the investigation showed someone had gained unauthorized access to T-Mobile's systems, the company has not yet determined that any personal data was compromised, and said that it quote "cannot confirm the reported number of records affected or the validity of statements made by others" end quote. According to Motherboard, the person who tipped them off to the fact that this breach happened, a seller of this data, said that their group compromised information relating to one hundred million people.
Speaker 1: Motherboard confirmed that the samples of data that they saw did in fact reflect accurate information about T-Mobile customers, and the sellers are peddling the info on the dark web, asking for bitcoin in exchange for millions of records of personal data. Motherboard also reported that the sellers had lost access to the back door entry to T-Mobile's servers, but they had already downloaded all the data, so, you know, that's not great. The damage was already done. This is why businesses that monitor stuff like the dark web and your information that may or may not be on it are getting more popular, because it's a way for you to find out if you have had your private data compromised. I may need to do a full episode about that kind of stuff to talk about, you know, what tools are out there, how useful are they, and what you should do if you find out that your information has been included in one of those breaches, because it certainly can be really overwhelming when it happens. All right, so hackers stole data from T-Mobile.
Speaker 1: But our next story is about how the Dallas, Texas Police Department lost an enormous amount of data all by themselves. And by enormous, I'm talking about twenty-two terabytes of information. And this relates to a data migration, which is a process where, I mean, it's what it sounds like: you're moving information from, you know, one storage receptacle to another. And it's always a huge pain in the protocus to do this kind of thing, particularly as the amount of information gets larger. Just think of every time that you've bought a new phone, and just the process of setting everything up on there and transferring all your information from your old phone to your new phone. Now, that has slowly become easier to do, but it's still kind of a hassle. You know, it's time consuming and you just want to be able to use your new thing. Well, multiply that times, I don't know, a bajillion, and you get the idea of what it's like to move the data of an entire organization from one resource to another, like if you're going from one cloud-based service to a different one or something like that.
Speaker 1: And in this case, during that process, the Dallas Police Department managed to delete that mountain of information. And of course we're talking about the police, so this can, and in this case did, include some incredibly important, crucial, and private information, including case files related to a murder trial. So obviously that's beyond disastrous. And the actual data deletion happened way back in April of twenty-one, but the police department did not report on it until this month, and I suppose part of the reason was that the department was going into emergency data recovery mode. And to their credit, they were able to restore around fourteen terabytes of data, but that still leaves eight terabytes of information that have been lost forever, and that includes some criminal case files that were created before July, though the police department says they don't actually know the precise number of case files that have been lost. And honestly, I am shocked that there wasn't a more comprehensive data backup strategy in place for this, because redundancy is a key feature in data management, and it's absolutely critical when stuff like this happens.
Speaker 1: That's why you have backup systems. It's why you should have a backup system, just in case something goes wrong with your data, so that you still have access to it if the worst should happen, whether that's a service that goes offline because it goes out of business, or you have a hard drive that fails. Moving on, a security researcher named Volodymyr "Bob" Diachenko, and I'm certain I've butchered the pronunciation of his name, and I apologize. Anyway, he uncovered an online list that most certainly should not have been uncoverable. It should have been hidden, and it was a terrorist watch list. Apparently the Terrorist Screening Center created the files and stored them online without turning off the index feature. So search engines have these things called spiders, and these are little bits of code that just crawl the web and return results back to search engines so that the search engines can continuously update their indices. That way, when you search for a particular topic, you have a chance of getting really recent results if those happen to be the best match to whatever your query is.
Speaker 1: But you might not want your website to enter a search engine's index. Let's say that you've got a website and you just want a certain number of people to have access to it. You don't need it to be accessible to the world at large. So you can actually go into the HTML and create a little tag so that web spiders just pass over your website. They don't index it, so it won't show up in search. You can do other things too; obviously, you could password protect the site as well. But that's one thing you can do: just not have it show up in search. Well, apparently someone neglected to take that step while uploading these files to a server, and search engines thus indexed these files, and Diachenko was able to read the information in the files through the search engines, like it wasn't hidden or password protected. And it included tons of personal information, you know, all the stuff you would expect, like the name of the person, their date of birth, their citizenship status.
Speaker 1: But it also included some other stuff that you don't typically see on these kinds of lists, like whether or not that person is also on a no-fly list. And Diachenko said, to make matters worse, the files had no password protection, so anyone could access them. And Diachenko reported his findings to the Department of Homeland Security. Now, according to The Verge, the files were first indexed on July nineteen of this year, and they were still up and available for search three weeks later before the DHS finally removed them. So here we have another example of human error leading to a massive security breach and privacy breach, with two million records affected. Pretty awful stuff. And when I say pretty awful stuff, I mean these are people who are on a terrorist watch list. Some of those people are innocent; like, not all of those people are guilty of being associated with or being involved in terrorist activities. And so you realize that this creates a truly devastating breach of their privacy. Although I guess it could also alert them that they were, you know, on a watch list.
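An aside for the technically curious: the "little tag" from a moment ago is the robots meta tag, which well-behaved search engine spiders check before indexing a page. Below is a minimal sketch, in Python, of the kind of check a polite crawler performs. The class name, helper function, and sample pages are my own invention for illustration; this is not code from any real search engine.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Looks for <meta name="robots" content="... noindex ..."> in a page."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag == "meta":
            attr = dict(attrs)
            name = (attr.get("name") or "").lower()
            content = (attr.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def spider_should_index(html):
    """Return True if a polite crawler may add this page to its index."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return not parser.noindex

hidden = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
public = '<html><head><title>Hello</title></head></html>'
print(spider_should_index(hidden))  # False: a compliant spider skips this page
print(spider_should_index(public))  # True: nothing tells the spider to stay away
```

The key caveat, and it's the one that bit the watch list, is that this is purely an honor system. The tag keeps a page out of search results, but it does nothing to stop anyone who already has the URL, which is why password protection still matters for genuinely sensitive files.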
Speaker 1: But yeah, terrible stuff. Moving on. Here's something I'm sure a lot of you already knew, but because I am very old, I only learned about it today. There is an online vigilante group that tends to focus on TikTok accounts that's been pretty busy lately. The group is called the Great Londini, L-O-N-D-I-N-I, and in many ways it reminds me of the anarchic activist group Anonymous. Just like that famous collective, the Great Londini relies on a masked visage to serve as its symbol and spokesperson. So for Anonymous, the symbol of choice is a Guy Fawkes mask, which references one of the conspirators who plotted to ignite barrels of gunpowder hidden underneath Parliament hundreds of years ago. That plan did not go well. Remember, remember. The Great Londini's mask of choice has a Joker-like smile and creepy hollow eyes, though the profile on YouTube for the Great Londini includes other people who are actually wearing Guy Fawkes masks as well, so they also appropriate that imagery.
149 00:09:52,240 --> 00:09:57,040 Speaker 1: They claim they are dedicated to quote exposing racist bullies, 150 00:09:57,200 --> 00:10:01,320 Speaker 1: scammers and trolls end quote h the racist singular is 151 00:10:01,480 --> 00:10:04,600 Speaker 1: in there, so I'm sure they meant racists. So the 152 00:10:04,600 --> 00:10:08,880 Speaker 1: group targets people that they identify as being bullies and racists, etcetera, 153 00:10:09,080 --> 00:10:12,280 Speaker 1: and then they publish those people's identities online, kind of 154 00:10:12,320 --> 00:10:16,640 Speaker 1: like dock sing. This includes informing people like the parents 155 00:10:16,760 --> 00:10:21,160 Speaker 1: of kids who are being trolls and displaying these kinds 156 00:10:21,200 --> 00:10:24,720 Speaker 1: of behaviors online, or employers in the case of you 157 00:10:24,760 --> 00:10:28,560 Speaker 1: know adults. They claim to consist of a collective of 158 00:10:28,600 --> 00:10:33,040 Speaker 1: people with cybersecurity and military experience, which I suspect is 159 00:10:33,559 --> 00:10:39,320 Speaker 1: at best and exaggeration. The website insider dot com reported 160 00:10:39,320 --> 00:10:41,720 Speaker 1: that the group says they can get to the real 161 00:10:41,800 --> 00:10:45,480 Speaker 1: identity of someone posting you know online on like TikTok 162 00:10:45,880 --> 00:10:48,960 Speaker 1: in eight or nine clicks, and it implies that they're 163 00:10:49,000 --> 00:10:51,640 Speaker 1: using some clever ways to figure out who is behind 164 00:10:51,640 --> 00:10:54,680 Speaker 1: any given account, and you know, digging down and finding 165 00:10:54,720 --> 00:10:59,280 Speaker 1: that information out. I suspect that there are some much 166 00:10:59,320 --> 00:11:02,320 Speaker 1: more simple methods that are being employed in many of 167 00:11:02,320 --> 00:11:07,000 Speaker 1: these cases. 
Speaker 1: For example, the targets, the bullies or whatever, they could be posting stuff that just contains identifiable information in it. So they could be posting TikTok videos where you're like, oh, I can totally tell who this person is. Or they might include links to their other social profiles, and that makes it really easy. You just cross-reference and you get all the information you need about who that person is. Like, if it's a Facebook account, it might include their location and everything, and their employer and, you know, spouse, all that kind of stuff. So, in other words, I'm not entirely convinced that this is a case where people are using any sort of sophisticated approach to get the information. And I'd frankly be surprised if they really did have extensive cybersecurity or military experience, because they don't need it to do what they're doing. TikTok has banned accounts that are associated with this group multiple times, and the group consistently keeps making new accounts.
Speaker 1: As for what I think about all this, well, I think vigilante justice is not great, but I also acknowledge that there's not really a good system in place to hold trolls accountable for their actions. I mean, typically it comes down to the individual platforms, and as a rule, platforms are not super responsive to this kind of stuff, usually because it's just operating at such a scale that it becomes impractical to react to these things, which just allows the problems to fester. And that's really an issue, because the actions these bullies take, like the things they do, the way they can harass people, those can sometimes lead to tragedy. I mean, there are cases where bullied people have gone down a very self-destructive pathway. So I understand the motivation of the group to see justice done and to convince bullies to stop being bullies, but I would much rather see a much more concerted effort to create a system of justice, like an official approach, as opposed to a self-appointed vigilante group.
Speaker 1: But I also realize that a lot of people have serious doubts that you could ever really make a just system and have it work. We have a couple more news stories, but before we get to those, let's take a quick break. We're back. OnlyFans, the website best known for hosting content creators who specialize in not-safe-for-work types of content, is trying to get some broader acceptance in the app store world. See, because OnlyFans is mostly associated with nudity and sexual content, the company has not been able to secure permission from Apple and Google's respective app stores to launch an app for users. So now OnlyFans is promoting a new-ish app called OFTV, which will allow OnlyFans creators to make content specifically for this app, as long as that content is in the safe-for-work category. So there's a strict policy against stuff like nudity and sexual content, and if creators want to make stuff for OFTV, they have to follow those rules. That allows OnlyFans to have an app in the app stores without violating those app stores' policies.
Speaker 1: Now, I did say it's a new-ish app, because it's actually been available since January of this year, but OnlyFans has only just recently started to, you know, tell people that this app even exists. Now, maybe that was an effort to allow interested content creators to make stuff that could live on this app, so that the app would have some content available as soon as folks downloaded it, rather than having a content app with no content on it, which isn't very interesting. I'm sure OnlyFans reached out to some of their top creators and invited them to participate in this. Now, OnlyFans is really trying to branch out of the adult content niche and become more of a broad content creator platform that allows people to monetize their work, similar to platforms like Patreon. That being said, the app seems more like an outreach effort, like a marketing effort, rather than a money maker. Tim Stokely, the CEO of OnlyFans, told Bloomberg that the content on OFTV is quote "not being monetized and there's no direct impact on creators' earnings" end quote.
Speaker 1: Which, if that's the case, then I question why content creators already on OnlyFans, those who are, you know, firmly established as creating adult content in particular, why they would ever bother to make stuff for OFTV if they aren't allowed to create the kind of content that actually brings them revenue. And if there's no revenue to be made by creating content for OFTV, at least not directly, then that seems like a pretty big investment of time and effort for little to no payoff, unless your goal is to lure people to go to the OnlyFans dot com site and then become a supporter there. So, in other words, to hint at the sort of content you would be able to get on the OnlyFans site, but you couldn't actually show it on OFTV. And it's possible also that OnlyFans commissioned content from creators, and, like, yeah, there's no monetization directly through the app, but OnlyFans may have paid creators to create stuff to live on OFTV. I wouldn't be surprised to learn that.
Speaker 1: I do not know, for the record, whether that happened or not, but it wouldn't shock me to learn that that's what was going on. Now, some people might just want to make fun stuff for free, and there's nothing wrong with that. You know, if you want to do that, you should totally do that. But as a content creator myself, no, certainly not an OnlyFans creator, no one, no one wants that. But I have to really be into something in order to do it for free, because my job has me creating content every ding dang day, and if there's no return on that, it feels like I might be taking away time that I could use to do other stuff. Anyway, it will be interesting to see how OFTV plays into the overall strategy, both for OnlyFans the company and for the content creators making content for that site. Next, Tesla is once again under the microscope for its Autopilot feature. This time, the National Highway Traffic Safety Administration, or NHTSA, is looking into the company due to a series of accidents in which Tesla vehicles that were in Autopilot mode collided with parked emergency vehicles.
Speaker 1: According to the NHTSA, since two thousand eighteen there have been eleven incidents in which Tesla vehicles in a driver-assist mode, whether that was Autopilot or another of the driver-assist system modes, collided with vehicles at a first responder emergency scene, and those crashes resulted in seventeen people being injured, and in one case, one person died. And we're starting to see an increasing amount of pressure on companies that are creating autonomous vehicle systems, like government pressure. Now, in an ideal implementation, autonomous vehicles would drastically reduce the number of accidents, and thus injuries and deaths, on the road. The vast majority of accidents on the road are due to human error, so you take human error out and you will eliminate a ton of accidents, assuming you have an ideal implementation of autonomous driving technology. But we are a far way away from having that ideal implementation. No one has created a system that we could call truly autonomous. Tesla maintains that Autopilot is a driver-assist feature.
Speaker 1: It is not a true autonomous mode. But it also promotes its vehicles as having quote "all the hardware needed in the future for full self-driving in almost all circumstances" end quote. So essentially it's saying, yeah, this isn't an autonomous vehicle, but it's got everything that can make it an autonomous vehicle, so it's really just, you know, when we throw the switch. That's kind of the messaging; that's what it sounds like. I'm not saying that's specifically what they're saying; that's how a lot of people interpret it, which kind of comes across as a bit of a mixed message. And I think ultimately we have to hold Tesla owners accountable for their choices, such as using a feature like Autopilot beyond its intended function, but still a good deal of responsibility also has to fall on the company itself. The US investigation into Tesla covers all models from two thousand fourteen to present day and could potentially lead to new rules about when features like Autopilot are allowed to be used and when they are not.
Speaker 1: Back in March, a Chinese military satellite called Yunhai 1-02 broke apart in orbit, and at the time no one was really sure what caused it to happen. So there was one hypothesis that perhaps the satellite's propulsion system had a critical failure that led to an explosion, something that the Chinese company behind Yunhai said was not likely. An astrophysicist named Jonathan McDowell has now determined that the true cause of Yunhai's demise was a high-speed space collision. He determined that a piece of debris measuring between four and twenty inches wide collided with Yunhai, and on further analysis he determined that this debris came from a Zenit-2 rocket that was used by Russia to launch a spy satellite into orbit all the way back in. This piece that was flying around in orbit is one of eight known pieces from that rocket that satellite trackers have kept tabs on over the years, saying, here's some more space debris that we have to be on the lookout for.
The collision with Yunhai generated even more 332 00:21:35,280 --> 00:21:38,440 Speaker 1: space debris, which of course means now there's even more 333 00:21:38,480 --> 00:21:41,880 Speaker 1: stuff whizzing around in orbit that could potentially create hazards 334 00:21:41,880 --> 00:21:45,880 Speaker 1: for various types of spacecraft, whether that has people aboard 335 00:21:46,000 --> 00:21:50,080 Speaker 1: or not. Now, I will remind you space is big, 336 00:21:50,640 --> 00:21:54,560 Speaker 1: or as Douglas Adams put it, space is big, really big. 337 00:21:54,920 --> 00:21:57,880 Speaker 1: But the orbits where we tend to put stuff are 338 00:21:57,880 --> 00:22:00,600 Speaker 1: starting to get a little crowded, and that means that 339 00:22:00,640 --> 00:22:04,440 Speaker 1: with each passing year there are more opportunities for collisions 340 00:22:04,480 --> 00:22:09,320 Speaker 1: to happen, and depending upon what collides with what, we 341 00:22:09,320 --> 00:22:15,560 Speaker 1: could see consequences here on Earth as various services get disrupted. Now, granted, 342 00:22:16,040 --> 00:22:18,760 Speaker 1: not every satellite is inhabiting the same orbit. We have 343 00:22:18,800 --> 00:22:22,960 Speaker 1: some satellites that are in really far out orbits around Earth, 344 00:22:23,400 --> 00:22:27,000 Speaker 1: so it's not like everything is crowded into the same space, 345 00:22:27,040 --> 00:22:32,880 Speaker 1: but it is getting a little more, you know, rough 346 00:22:32,920 --> 00:22:36,800 Speaker 1: out there. Also, the increase in space junk means it's 347 00:22:36,840 --> 00:22:40,840 Speaker 1: harder for astronomers here on Earth's surface to see celestial bodies, 348 00:22:40,920 --> 00:22:45,560 Speaker 1: like there are more things blocking our view beyond our 349 00:22:45,560 --> 00:22:49,320 Speaker 1: own orbit. And that's not great either.
Not that I 350 00:22:49,440 --> 00:22:52,560 Speaker 1: think that this is necessarily going to get better anytime soon, 351 00:22:53,240 --> 00:22:57,560 Speaker 1: but the problem is becoming increasingly evident. I have a 352 00:22:57,640 --> 00:23:00,800 Speaker 1: couple more stories to get to. But before we jump 353 00:23:00,840 --> 00:23:11,480 Speaker 1: on that, let's take another quick break. Yakety yak. 354 00:23:11,920 --> 00:23:16,840 Speaker 1: We're back, as is Yik Yak. I'm on track. Okay. So, 355 00:23:17,000 --> 00:23:20,439 Speaker 1: for those of you who have never heard of Yik Yak, 356 00:23:20,960 --> 00:23:23,640 Speaker 1: back in two thousand and fourteen, you could download this 357 00:23:23,720 --> 00:23:26,720 Speaker 1: app called Yik Yak and you could use it as 358 00:23:26,720 --> 00:23:31,159 Speaker 1: a kind of hyper local social networking platform. So the 359 00:23:31,240 --> 00:23:34,720 Speaker 1: idea was, at least in the original incarnation of Yik Yak, 360 00:23:35,320 --> 00:23:38,800 Speaker 1: that you could see what people were posting in your 361 00:23:38,960 --> 00:23:42,360 Speaker 1: immediate area. You could also post, and people in your 362 00:23:42,359 --> 00:23:45,680 Speaker 1: area could read what you wrote, but you would 363 00:23:45,720 --> 00:23:47,800 Speaker 1: only see things that had been posted within a five 364 00:23:47,880 --> 00:23:51,040 Speaker 1: mile radius of your current location. And on top of that, 365 00:23:52,119 --> 00:23:56,280 Speaker 1: the whole thing was anonymous, so people would anonymously post 366 00:23:56,320 --> 00:23:59,639 Speaker 1: these things. The idea was for people to share fun 367 00:23:59,640 --> 00:24:03,480 Speaker 1: information that was relevant to the specific location.
368 00:24:03,880 --> 00:24:07,720 Speaker 1: But if you're thinking, hmm, an app that lets you 369 00:24:07,760 --> 00:24:14,399 Speaker 1: post anonymously about, you know, anything within your hyper local region, 370 00:24:14,800 --> 00:24:17,560 Speaker 1: I bet people could use something like that to really 371 00:24:18,000 --> 00:24:21,520 Speaker 1: troll the heck out of other folks, well, you would 372 00:24:21,520 --> 00:24:27,639 Speaker 1: be right. Bullying, doxing, harassment, and more were rampant on 373 00:24:27,760 --> 00:24:32,199 Speaker 1: Yik Yak, which got particularly popular around college campuses. It 374 00:24:32,240 --> 00:24:35,439 Speaker 1: got to the point where the company had to geo 375 00:24:35,600 --> 00:24:39,040 Speaker 1: fence the app and cut it off for certain regions, 376 00:24:39,080 --> 00:24:43,280 Speaker 1: like around high schools and stuff, because people were using 377 00:24:43,280 --> 00:24:45,400 Speaker 1: it to do all sorts of things like spread rumors 378 00:24:45,440 --> 00:24:48,160 Speaker 1: about others, and because it's anonymous, you never really knew 379 00:24:48,160 --> 00:24:50,720 Speaker 1: who it was that was saying these things. And it 380 00:24:50,800 --> 00:24:54,520 Speaker 1: got bad enough, and the heat got intense enough, that 381 00:24:54,720 --> 00:24:58,240 Speaker 1: ultimately Yik Yak changed up its policy and it required 382 00:24:58,320 --> 00:25:01,600 Speaker 1: users to establish a handle. So it kind of 383 00:25:01,640 --> 00:25:05,160 Speaker 1: removed that anonymous feature, because now there was a handle 384 00:25:05,200 --> 00:25:09,480 Speaker 1: associated with a message; it wasn't just anonymous Yik Yak user.
385 00:25:10,320 --> 00:25:13,840 Speaker 1: The company was already kind of spiraling at that point, 386 00:25:14,280 --> 00:25:17,760 Speaker 1: and ultimately it went out of business, and now it 387 00:25:17,880 --> 00:25:20,879 Speaker 1: is under new management, as a new group came in 388 00:25:20,960 --> 00:25:24,840 Speaker 1: and purchased the intellectual property to Yik Yak earlier this year, 389 00:25:25,400 --> 00:25:27,960 Speaker 1: and they are launching a new version of the app, 390 00:25:28,600 --> 00:25:32,800 Speaker 1: and anonymity is back. That is a feature that has returned, 391 00:25:33,160 --> 00:25:36,919 Speaker 1: as is the hyper local focus of the app itself. However, 392 00:25:36,960 --> 00:25:40,440 Speaker 1: the new company says it has a zero tolerance policy 393 00:25:40,520 --> 00:25:43,800 Speaker 1: for harassment and bullying. It's got a list of rules 394 00:25:43,840 --> 00:25:46,960 Speaker 1: that users are supposed to follow or else they face 395 00:25:47,480 --> 00:25:50,199 Speaker 1: getting banned from the service, because while they might be 396 00:25:50,240 --> 00:25:53,520 Speaker 1: anonymous to all the other users, Yik Yak on the 397 00:25:53,560 --> 00:25:58,040 Speaker 1: server side can see which account posts what. There's also 398 00:25:58,080 --> 00:26:01,640 Speaker 1: a way, apparently, to upvote or downvote posts 399 00:26:01,720 --> 00:26:04,119 Speaker 1: on Yik Yak. I say apparently because I have not 400 00:26:04,280 --> 00:26:07,760 Speaker 1: downloaded it myself. Uh, and the company says that posts 401 00:26:07,800 --> 00:26:11,000 Speaker 1: that receive a minus five on that scoring system, meaning 402 00:26:11,080 --> 00:26:14,640 Speaker 1: enough people downvote it, those kinds of messages won't show 403 00:26:14,720 --> 00:26:18,359 Speaker 1: up anymore for other users.
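The reported rule is simple enough to sketch in a few lines of Python. This is purely an illustration of the scoring behavior as described on the show; the class and function names are hypothetical, not Yik Yak's actual code.

```python
# Sketch of the reported Yik Yak visibility rule: a post whose cumulative
# vote score falls to -5 or below stops being shown to other users.
# All names here are hypothetical illustrations.

HIDE_THRESHOLD = -5

class Post:
    def __init__(self, text):
        self.text = text
        self.score = 0  # starts neutral; upvotes add 1, downvotes subtract 1

    def vote(self, up):
        self.score += 1 if up else -1

def visible_feed(posts):
    """Return only the posts still above the hide threshold."""
    return [p for p in posts if p.score > HIDE_THRESHOLD]

post = Post("talking trash")
for _ in range(5):
    post.vote(up=False)      # five downvotes bring the score to -5
assert visible_feed([post]) == []   # hidden from other users
```

The post still exists server-side, which matches the detail that the company can see which account posted what; it just drops out of what other users are served.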
So if someone's been talking 404 00:26:18,440 --> 00:26:21,280 Speaker 1: trash and enough people in your area have voted it 405 00:26:21,320 --> 00:26:23,680 Speaker 1: down and you log into Yik Yak, you won't even 406 00:26:23,680 --> 00:26:26,960 Speaker 1: see it, because it will be below that threshold. Now, 407 00:26:27,000 --> 00:26:29,520 Speaker 1: whether all this is going to prevent abuse of the 408 00:26:29,560 --> 00:26:33,480 Speaker 1: platform remains to be seen. I'm somewhat wary of the 409 00:26:33,520 --> 00:26:37,000 Speaker 1: whole thing because I saw how ugly things got the 410 00:26:37,040 --> 00:26:39,520 Speaker 1: first time around when Yik Yak was a thing. And 411 00:26:39,560 --> 00:26:44,359 Speaker 1: if you ever want to hear about the most uncomfortable 412 00:26:44,440 --> 00:26:47,520 Speaker 1: I've ever been in my job, ask me about the 413 00:26:47,560 --> 00:26:50,400 Speaker 1: time I was told to moderate a South by Southwest 414 00:26:50,440 --> 00:26:55,440 Speaker 1: panel about Yik Yak, because I still have stress dreams 415 00:26:55,840 --> 00:27:00,520 Speaker 1: about that day. Finally, sometimes hackers find an exploit in 416 00:27:00,560 --> 00:27:04,480 Speaker 1: a system and then, you know, they exploit it, or 417 00:27:04,520 --> 00:27:08,360 Speaker 1: they might sell that knowledge to someone else so that 418 00:27:08,359 --> 00:27:13,000 Speaker 1: that someone else can go and exploit the vulnerability, potentially 419 00:27:13,040 --> 00:27:16,320 Speaker 1: getting caught in the process, while the hacker just pockets 420 00:27:16,760 --> 00:27:20,040 Speaker 1: whatever money was spent in order to get that information.
421 00:27:20,400 --> 00:27:23,760 Speaker 1: But sometimes, and actually this happens more frequently than it 422 00:27:23,800 --> 00:27:27,639 Speaker 1: tends to be reported, hackers will actually reach out to 423 00:27:28,280 --> 00:27:31,840 Speaker 1: a company and alert them of an exploit that they found, 424 00:27:32,400 --> 00:27:35,359 Speaker 1: and then they might get a reward for their good 425 00:27:35,400 --> 00:27:38,360 Speaker 1: deed, because the company will say, oh my gosh, thank 426 00:27:38,400 --> 00:27:41,600 Speaker 1: you for telling us about this. They patch up the vulnerability, 427 00:27:41,640 --> 00:27:45,119 Speaker 1: and they reward the hacker. That's what happened recently with 428 00:27:45,160 --> 00:27:49,800 Speaker 1: a hacker who reported an issue with Valve's payment process 429 00:27:49,880 --> 00:27:53,480 Speaker 1: on Steam. Now, in case you're not familiar, Steam is 430 00:27:53,520 --> 00:27:57,320 Speaker 1: an online storefront for computer games, and it also has 431 00:27:57,359 --> 00:28:00,159 Speaker 1: some social networking features. But I think most people just 432 00:28:00,240 --> 00:28:02,359 Speaker 1: view it as a way to purchase and organize a 433 00:28:02,400 --> 00:28:05,960 Speaker 1: computer game library. Maybe that's just my own experience, because 434 00:28:05,960 --> 00:28:08,760 Speaker 1: that's all I use it for. Anyway. This is all 435 00:28:08,800 --> 00:28:12,399 Speaker 1: digital download stuff, right? You go on the online store, 436 00:28:12,720 --> 00:28:15,560 Speaker 1: you buy a game, you download it to your computer. 437 00:28:16,320 --> 00:28:19,199 Speaker 1: It's just the online version of what we used to 438 00:28:19,200 --> 00:28:21,720 Speaker 1: do by going into old brick and mortar stores.
And 439 00:28:21,800 --> 00:28:25,600 Speaker 1: Steam's success has meant that Valve really hasn't needed 440 00:28:25,640 --> 00:28:27,840 Speaker 1: to spend, you know, nearly as much time doing what 441 00:28:27,880 --> 00:28:30,040 Speaker 1: it used to do, which was that it used to 442 00:28:30,080 --> 00:28:33,480 Speaker 1: make a lot of computer games, and now it doesn't. Anyway, 443 00:28:33,840 --> 00:28:37,359 Speaker 1: this hacker found out that if he created an email 444 00:28:37,400 --> 00:28:41,720 Speaker 1: address with the term "amount five thousand" in the email 445 00:28:41,720 --> 00:28:46,520 Speaker 1: address, and then registered that email with Steam, and then 446 00:28:46,800 --> 00:28:50,760 Speaker 1: went through the process of allowing Steam to put a 447 00:28:50,800 --> 00:28:54,480 Speaker 1: one dollar charge on their payment system, which is a 448 00:28:54,480 --> 00:28:57,800 Speaker 1: way for Steam to verify that whatever payment method you 449 00:28:57,840 --> 00:29:03,200 Speaker 1: put in is legit, then he would receive five thousand 450 00:29:03,240 --> 00:29:07,720 Speaker 1: dollars in his Steam wallet just by going through this approach. 451 00:29:07,760 --> 00:29:10,120 Speaker 1: So the email just had to have that term in 452 00:29:10,200 --> 00:29:14,080 Speaker 1: it and Steam would award that amount of money to 453 00:29:14,240 --> 00:29:16,600 Speaker 1: his wallet. So he could have just kept on making 454 00:29:16,600 --> 00:29:19,760 Speaker 1: accounts like that and using these fake Steam dollars to 455 00:29:19,800 --> 00:29:23,360 Speaker 1: buy tons of games. But instead the hacker let Valve 456 00:29:23,440 --> 00:29:27,120 Speaker 1: know about the exploit, and the company patched the problem 457 00:29:27,400 --> 00:29:29,800 Speaker 1: and then rewarded the hacker with a bug bounty of 458 00:29:29,920 --> 00:29:33,400 Speaker 1: seven thousand, five hundred dollars.
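To make the failure mode concrete: as described, the credited amount was derived from attacker-influenced input, a token embedded in the email address, instead of from the amount actually charged. Here is a toy Python sketch of that class of bug; the function names, email, and parsing logic are hypothetical illustrations, not Valve's actual code.

```python
import re

# Toy illustration of the bug class described above: crediting the wallet
# based on an attacker-controlled string rather than the real charge.
# Everything here is a hypothetical sketch, not Steam's actual logic.

def buggy_wallet_credit(email, charged_cents):
    # BUG: trusts an "amount<N>" token found inside the email address.
    m = re.search(r"amount(\d+)", email)
    if m:
        return int(m.group(1)) * 100  # credits N dollars, as cents
    return charged_cents

def fixed_wallet_credit(email, charged_cents):
    # FIX: credit exactly what the payment processor actually charged,
    # ignoring anything the user controls.
    return charged_cents

# A one-dollar verification charge should credit $1, not $5,000.
assert buggy_wallet_credit("hacker.amount5000@example.com", 100) == 500000
assert fixed_wallet_credit("hacker.amount5000@example.com", 100) == 100
```

The general lesson is the one the story illustrates: any value that determines money movement has to come from the server side of the transaction, never from a field the user can set when registering.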
And I think that's just 459 00:29:33,440 --> 00:29:37,320 Speaker 1: a nice story all around, the idea of someone saying, hey, 460 00:29:37,480 --> 00:29:40,800 Speaker 1: you should know about this, because people, if they found 461 00:29:40,800 --> 00:29:43,360 Speaker 1: out about it, would totally exploit it, and you would 462 00:29:43,400 --> 00:29:45,880 Speaker 1: be out a huge amount of money, because you'd have 463 00:29:46,440 --> 00:29:49,120 Speaker 1: all this money you would owe developers. It would be 464 00:29:49,280 --> 00:29:52,720 Speaker 1: just manufactured cash; there'd be nothing flowing into Valve and 465 00:29:52,760 --> 00:29:57,920 Speaker 1: everything flowing out. So it was a good thing to report. Um, yeah, 466 00:29:58,240 --> 00:30:01,720 Speaker 1: I like stories like this. Hackers do this. I should 467 00:30:01,760 --> 00:30:06,400 Speaker 1: also add that there are cases, plenty of cases, where 468 00:30:06,760 --> 00:30:09,840 Speaker 1: you will learn about hackers who find a vulnerability. 469 00:30:09,960 --> 00:30:14,880 Speaker 1: They'll report it to whichever entity, you know, is responsible 470 00:30:14,880 --> 00:30:19,120 Speaker 1: for that software, and nothing gets done. That is incredibly 471 00:30:19,400 --> 00:30:23,680 Speaker 1: irresponsible and something that I hate to see. And there's 472 00:30:23,720 --> 00:30:27,000 Speaker 1: so many stories about it where you're like, hacker X 473 00:30:27,360 --> 00:30:31,320 Speaker 1: finds out that company Y has some major vulnerability in 474 00:30:31,320 --> 00:30:34,800 Speaker 1: its software, reports it to the company. The company doesn't respond, 475 00:30:35,120 --> 00:30:39,320 Speaker 1: the company doesn't change anything or patch the vulnerability.
Then 476 00:30:39,400 --> 00:30:42,520 Speaker 1: the hacker goes public with the information, 477 00:30:42,560 --> 00:30:45,760 Speaker 1: saying hey, there's this massive vulnerability in the software package, 478 00:30:45,880 --> 00:30:47,520 Speaker 1: and then the company is saying, why are you telling 479 00:30:47,520 --> 00:30:50,960 Speaker 1: everybody this? Well, if the hacker has gone through the 480 00:30:50,960 --> 00:30:54,960 Speaker 1: process of alerting the company, then it's the company's responsibility 481 00:30:55,000 --> 00:30:58,920 Speaker 1: to respond and get that problem fixed. And if that 482 00:30:59,000 --> 00:31:01,480 Speaker 1: doesn't work, you know someone else is going to find 483 00:31:01,480 --> 00:31:04,120 Speaker 1: that exploit and take advantage of it. So I think 484 00:31:04,160 --> 00:31:06,720 Speaker 1: the only reasonable thing for a hacker to do 485 00:31:06,880 --> 00:31:08,880 Speaker 1: is to come forward and say, hey, I gave them 486 00:31:08,920 --> 00:31:11,520 Speaker 1: every opportunity to fix this, they didn't do it, so 487 00:31:11,560 --> 00:31:13,560 Speaker 1: maybe now they'll do it, now that the whole world 488 00:31:13,600 --> 00:31:18,280 Speaker 1: knows about it. Um, kind of a brutal approach. But the 489 00:31:18,920 --> 00:31:21,720 Speaker 1: flip side of that is, if a company doesn't fix 490 00:31:21,800 --> 00:31:26,360 Speaker 1: that problem, someone else will find it, and they might 491 00:31:26,440 --> 00:31:30,360 Speaker 1: not have the moral compass to reach out to the 492 00:31:30,400 --> 00:31:33,880 Speaker 1: company, and instead might exploit it and harm either the 493 00:31:33,920 --> 00:31:39,000 Speaker 1: company or the company's customers, or both, in the process. Okay, 494 00:31:39,040 --> 00:31:42,480 Speaker 1: that's it for this episode and the news for Tuesday, 495 00:31:42,520 --> 00:31:46,640 Speaker 1: August one.
If you have suggestions for topics I should 496 00:31:46,640 --> 00:31:48,960 Speaker 1: cover in future episodes of tech Stuff, reach out to me. 497 00:31:49,040 --> 00:31:51,280 Speaker 1: The best way to do that is over on Twitter. 498 00:31:51,720 --> 00:31:53,920 Speaker 1: The handle we use for the show is tech stuff 499 00:31:54,240 --> 00:32:03,360 Speaker 1: HSW, and I'll talk to you again really soon. Tech 500 00:32:03,360 --> 00:32:06,800 Speaker 1: Stuff is an I Heart Radio production. For more podcasts 501 00:32:06,840 --> 00:32:09,600 Speaker 1: from I Heart Radio, visit the i Heart Radio app, 502 00:32:09,720 --> 00:32:12,880 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.