Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and it is time for the tech news for Thursday, April eighth, twenty twenty one. Yes, that pause is me looking at the calendar, because every day is the same to me, but the news keeps changing, so I guess that's something, right? And you know, there are spy films that start off with a villain sabotaging some sort of high-security, high-value target, like a nuclear power plant, for example. But it turns out that the same thing can be done by a gooey ocean critter. I'm talking about salps, which are a type of invertebrate. They're called tunicates. They look kind of like jellyfish, but they are a different species. They use water jet propulsion in order to get around, so they siphon in water and siphon it back out again and squirt ahead. Anyway, a whole bunch of them have gummed up the water intake valves for two nuclear reactors in South Korea, which is not, you know, great.
The nuclear plants use water to cool the reactors as part of the heat exchange that's at the heart of power generation, so those reactors are currently offline. Salp numbers usually swell in early summer, so we're actually seeing an early surge here. Now, whether that has anything to do with climate change is an unresolved question. It would be premature for us to draw conclusions without a lot more data and looking at trends over time. This could just be an abnormality and not an indication of big changes. It's kind of like how, if you had an unusually cool day in late May here in Atlanta, you wouldn't say that's counter-evidence to climate change, that global warming is not a thing or whatever (although no one really wants to use the term global warming anymore; that's kind of reductive). But anyway, you wouldn't say that, because weather and climate are not interchangeable.
The story of the salps, I think, is a little amusing, because the thought that this little sack-like critter could shut down a nuclear power plant sounds, you know, kind of bizarre and funny, but it's obviously serious business, both in terms of the impact on power generation and changes in wildlife routines. One story I did not report on earlier this week was about how hackers have been scraping information pertaining to more than half a billion Facebook accounts, including Mark Zuckerberg's. As it turns out, someone collected the data back in twenty nineteen, and it includes stuff like names, occupations, location, marital status or relationship status, and in some cases phone numbers. Facebook says that the hackers likely used a contact import tool to scrape data from various profiles, which means no one actually penetrated Facebook's systems and then rooted around for information that way. Instead, they just pulled it off Facebook directly. So, prior to twenty nineteen, it was possible to do kind of a reverse search of Facebook.
You could plug a phone number into a contact importer, and it would then search Facebook to see if there was a profile that matched that phone number, which would give you not just the personal information of the account; it would confirm the link between the phone number and that person, right? And if you automated this and you just went through, you know, a robo-dial list of phone numbers, you could collect a huge amount of data. Now, Facebook changed that, but the damage had already been done. The data was already collected, and it's not much of a relief to hear that the company has, you know, already done something about it when the information is already out in the wild. It's one of those shutting-the-barn-door things after the horse has already kind of bolted. You can check to see if you were affected by going to Have I Been Zucked, that's z-u-c-k-e-d dot com, and you can search against the database of all the stuff that was collected. So you can put in your phone number, your email address, or whatever. I tried my information.
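To make that mechanism concrete, here's a minimal sketch of why the reverse lookup was so dangerous. Everything in it is hypothetical (the mock_contact_import function and the toy directory are stand-ins, not real Facebook APIs); the point is just that a contact importer answers "which of these numbers have accounts?", so feeding it an automated run of numbers links phone numbers to profiles at scale:

```python
# Hypothetical sketch of abusing a contact-import feature as a reverse search.
# None of these names are real Facebook APIs -- they only illustrate the idea.

def mock_contact_import(phone_numbers, directory):
    """Stand-in for a contact importer: returns the profile matching each
    submitted number, which is exactly the number-to-person link described."""
    return {n: directory[n] for n in phone_numbers if n in directory}

# A toy number -> profile mapping, standing in for the service's user base.
directory = {"+15550001": {"name": "Alice"}, "+15550003": {"name": "Bob"}}

# An attacker feeds sequential, robo-dial-style numbers through the importer.
candidates = [f"+1555000{i}" for i in range(5)]
matches = mock_contact_import(candidates, directory)
print(matches)  # every hit ties a phone number to a profile
```

Facebook's fix, as the story notes, was to change the importer so it could no longer be driven this way, but that does nothing for data scraped before the change.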
I was not in that twenty nineteen hack, but that was just pure luck. It wasn't that I did anything, you know, to prevent it from happening. I was just lucky that I wasn't part of that grab. Another bad Facebook story coming your way: Insider has an article titled "Facebook did not hire Black employees because they were not a 'culture fit,' report says." The article is well worth reading. I think the headline kind of gives away the idea of what was going on, but we'll dive in. So the Equal Employment Opportunity Commission received reports from a few Black applicants who were seeking jobs with Facebook, and they all reportedly had the qualifications necessary to do the posted jobs, but they were all turned away, apparently after being told they would not fit in with the corporate culture of Facebook. And it's hard to view this as anything other than grossly racist.
A Washington Post article quoted an operating manager over at Facebook who said he was told that the people he recruited into the company needed to be a culture fit, but, quote, "Unfortunately, not many people I knew could pass that challenge because the culture here does not reflect the culture of Black people," end quote. Organizations like the Society for Human Resource Management have pushed back against companies that have culture-fit requirements in their job postings. They've argued that this effectively creates an excuse to hire mostly white applicants and turn away people of color, while attempting to maintain a sort of plausible deniability that there's any kind of racial discrimination going on. Facebook representatives have said that their company has policies in place to detect and eliminate discrimination, but this culture-fit thing strikes me as a way of kind of getting around that. Maybe not a conscious way; it might not even have been consciously malevolent or malicious, but it has the effect of being discriminatory, and that's a problem.
On Tuesday, the messaging app Signal rolled out a new beta in the UK with a feature that, well, let's say it's caused a little bit of a fuss. Signal is an encrypted messaging service; it offers end-to-end encryption between people using it. It's one that a lot of users have flocked to in order to switch to a messaging app that doesn't commoditize relationships. When it was revealed that Facebook's WhatsApp was going to be collecting data about usage (maybe not even personal data, but enough data about usage that would ultimately go to Facebook proper and other Facebook properties), a lot of people said, you know what, peace out, I'm going to go use something else, and a lot of them went to Signal. And Signal's sales pitch is that it lets you communicate privately and securely with your contacts, and there aren't really any bells and whistles thrown in. Except now there appears to be one: a cryptocurrency called MobileCoin.
Critics say that Signal is commercially exploiting its user base: introducing a company-controlled cryptocurrency, pumping up the value of that digital currency, then pushing it all off onto users and pocketing profit from it. Stephen Diehl, that's D-I-E-H-L, wrote a great blog post that critiques this move, and it laments the fact that Signal seems to have taken a drastic turn from its philosophical anchor point. It's called "Et tu, Signal?" I really recommend you read it. His perspective, which I find pretty compelling, is that there's this growing trend in tech in general to rely on these digital tokens, whether they are cryptocurrencies or NFTs. That trend sees a small group of people trade digital goods a few times, thus driving up the perceived value of those digital goods. Then they can cash out; you know, they can build up some wealth, cash out, and then they have that wealth in some other form. And meanwhile, those digital assets are in danger of having a total collapse in value, similar to what we have seen with Bitcoin in the past.
Bitcoin's riding high, but there have been times where we've seen the value of bitcoin dip really low from a pretty big high, so there's no guarantee that it's going to maintain that. While some might survive the ups and downs and stick around (and by some I mean cryptocurrencies and NFTs), there's never a guarantee that they will stick around. And the more we see companies employ these kinds of strategies, the more likely we're going to see governments take a closer look at the whole thing and potentially try to pass legislation to address it, particularly if people are at risk of losing significant amounts of money due to these types of strategies. And as we all know, it is not easy to address the consequences of technologies once they are out in the wild. It is way too easy to pass laws that have their own unintended consequences. The laws might be coming from a good place, but they can often miss the actual problem and cause other problems as a result. So, not necessarily a great move.
Oh, and just to link the Facebook privacy story with the Signal story, there's another story that a lot of people are having fun with right now, and that is that by using that scraped data, some folks found out something about Mark Zuckerberg, whom I mentioned earlier; his data was among that more-than-half-a-billion-accounts list. It turns out he's been using the Signal messaging app, which is interesting, because Facebook is the owner of WhatsApp, which is a Signal competitor. But if you run the company that owns a messaging app, why would you be using a different messaging app than the one you actually own as part of your company? Why would you use a different one for your personal communication? Well, Signal has end-to-end encryption, and unlike WhatsApp, it's not combing through user data, or at least it's not supposed to, in an effort to sell ads to users. So perhaps Zuckerberg prefers his messaging services to be a little less intrusive while wanting a different set of rules for everybody else. I don't know, I'm just spitballing here.
Amplifi Media gave Apple Podcasts a thorough look recently and found out some stuff that I didn't find all that surprising. Apple Podcasts recently hit a milestone of having more than two million different podcast titles on it, but Amplifi discovered that twenty six percent of all those podcasts have only published a single episode. Thirty seven percent had published two or fewer; so, in other words, we went from twenty six percent having published just one to thirty seven percent having published two or fewer. And then forty four percent had published three or fewer. So what about shows that make it to ten or more episodes? Like, how many podcasts on Apple Podcasts have published more than ten episodes? Well, you're looking at just thirty six percent of all the podcasts up on Apple Podcasts, which tells us that there are a ton of shows that have not hit ten episodes on there. This didn't shock me, because podcasting takes a lot of work and energy.
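As a quick sanity check, those quoted percentages are cumulative, so you can back out the share of shows in each bucket. This little sketch just does that arithmetic with the numbers quoted above:

```python
# Back out per-bucket shares from the cumulative figures quoted in the report:
# 26% published exactly one episode, 37% published two or fewer,
# 44% published three or fewer, and 36% published more than ten.
cumulative = {1: 26, 2: 37, 3: 44}   # % of shows with <= N episodes
more_than_ten = 36                   # % of shows with > 10 episodes

exactly_two = cumulative[2] - cumulative[1]         # shows with exactly 2
exactly_three = cumulative[3] - cumulative[2]       # shows with exactly 3
four_to_ten = 100 - cumulative[3] - more_than_ten   # shows with 4..10

print(exactly_two, exactly_three, four_to_ten)      # 11 7 20
```

So only about a fifth of the catalog sits in that four-to-ten-episode middle ground, which squares with the point that most shows stall very early.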
I can name at least three shows I personally started recording and stopped before I hit episode number four. And what this tells me is that there are a lot of people who are trying to podcast, but many of them get discouraged for whatever reason and they give up on it. Of course, some of those shows might not have been abandoned. Some of them could have easily just been one-offs or very limited series. For example, Till Death Do Us Blart, a show in which the hosts get together to watch Paul Blart: Mall Cop 2 every year, had to run for four years straight before having more than three episodes, and it currently has eight if you also include the trailer from twenty fifteen and a bonus episode from twenty twenty. Meanwhile, TechStuff is currently closing in on fourteen hundred episodes, and that's if you're not counting classics. If you do count the classic episodes that we publish on Fridays, well, at that point you're talking about one thousand, five hundred twenty nine episodes.
This will be one thousand, five hundred thirty one, because I'm pretty sure the Wednesday episode hasn't published as I record this. And I think, out of all of those, I'm only not in one of them. I mean, I'm in every single one except one. I need to lie down for a moment. A recent court filing in a criminal case illustrates how sometimes smart city implementations can be dangerously stupid. And let me be clear here: I'm not saying it's dumb to update city infrastructure. I'm saying it's dumb to do it poorly. In this case, the incident in question happened back in twenty nineteen. A man named Wyatt Travnichek attempted to sabotage the water cleaning system of Ellsworth County in Kansas. Travnichek had been working with the county, which had installed software in the county's water cleaning system that let employees log into that system remotely. The purpose was to let employees monitor systems without having to be on the physical premises, but the software also allowed employees to make changes to that system, like shutting down various subsystems.
So Travnichek, after having left his job, logged into the system months later. His access had never been revoked or limited in any way, so he logged in and he started to shut stuff down, allegedly with the intent to cause harm. He was found out and arrested, and if he's found guilty, he could serve up to twenty years in prison for this. Fortunately, no one was actually harmed in this case, but it really illustrates the need for proper security measures in critical systems. That includes everything from the power grid, where we know foreign agents have infiltrated various systems, to our water systems, to communication systems and beyond. Good security is hard. It requires work, it requires maintenance, it requires updating passwords and changing logins. It includes revoking access when someone no longer merits having access. It's essentially all the stuff that we should be doing in our private lives with our own systems, but a lot of us don't do that, because it's work, it's tedious, and there are only so many hours in a day.
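The control this story says was missing, revoking access when someone leaves, is mechanically very simple. Here's a minimal sketch (all names hypothetical, nothing from the actual Kansas system) of the kind of offboarding check that would have caught the dormant account:

```python
# Minimal, hypothetical offboarding check: compare the accounts that still
# have remote access against the current-employee roster and flag the rest.
current_employees = {"alice", "bob"}
remote_access_accounts = {"alice", "bob", "former_operator"}

def accounts_to_revoke(accounts, employees):
    """Accounts with remote access whose owners are no longer employed."""
    return accounts - employees  # set difference: access holders not on roster

print(accounts_to_revoke(remote_access_accounts, current_employees))
# {'former_operator'}
```

Run on a schedule, a check like this turns "we forgot to revoke his login" from a months-long gap into something caught at the next audit.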
But if we insist on making these various systems accessible remotely, we have to take those measures into account, or else we will suffer very serious consequences. This could have been a lot worse. In the tell-us-something-we-didn't-already-know category, the Broadband Internet Technical Advisory Group reports that broadband usage surged between thirty and forty percent in most of the United States during the pandemic, with some areas seeing up to a sixty percent surge. Seems like when a lot of folks are stuck at home, they turn to the Internet when it comes to things like getting work done, or finding entertainment, or researching ways to get vaccinated, and so on. And also, while downstream data was way up, so was upstream activity. Again, no big surprise. I'm sure a lot of you have been on your share of video meetings over the course of the last year. And I mean, working from home for me means that I'm regularly uploading large audio files to my producer, Tari, and that adds up pretty quickly.
While the infrastructure in most cities handled the increased load without too much of an issue, rural areas were a different story. Some of those were relying on older networking devices that weren't able to keep up with the demand as easily as the stuff we have in metropolitan areas. And what this tells me is that stuff like data caps sounds like it's more about getting cash out of customers and less about guaranteeing a good Internet experience. If the US had a surge and we didn't see home service slow to a crawl across the country, it seems pretty disingenuous to say that data caps are a necessity. Amazon's Twitch service has a new policy in place that could see Twitch streamers get banned indefinitely for things that they do offline. That is, you might have a perfectly acceptable streaming record, but you might do something really bad out there in the real world, and if Amazon finds out about it, you might find that your stream has been shut down. Now, I don't object to this, by the way, but it is interesting to me. Twitch streamers can achieve a great deal of fame.
They can attract a large and young audience, and if a streamer has been found to have engaged in some seriously bad behaviors, like being involved in acts of deadly violence or terrorist activities, or committing sexual assault, Twitch now has rules laid out to ban that user from streaming. According to Twitch, it will work with a law firm when cases arise to determine if any claims that were made against that streamer have validity to them, and that all has to happen before Twitch does anything else. Only on confirmation that the streamer was actually involved in these activities will Twitch then act on the account. According to Twitch, a person who has been found to have engaged in these offline activities poses a potential threat to the Twitch community in general, and that is unacceptable. Other platforms like Facebook and YouTube also sometimes take offline behavior into consideration when dealing with accounts, but it's more common to see most platforms focus solely on the behavior and activities that happen on the platform itself and not elsewhere.
Finally, in the second McElroy family reference in this episode, let us all bow our heads in a moment of silence for Yahoo Answers. The platform, which allows people to ask questions ranging from the mundane to the unintelligible, with answers that frequently match or exceed the questions, is going to go away on May fourth, wiped clear off the Internet. Star Wars Day. I mean, really, guys, that's low. Anyway, Yahoo Answers has been a part of the web since two thousand and five, and it's been a part of the podcast My Brother, My Brother and Me (there's the reference to our McElroys), which originally launched back in twenty ten. And sure, there were really dumb questions and even dumber answers on Yahoo Answers, and sure, most of those questions you could probably find answered elsewhere. And sure, some people used Yahoo Answers not to ask or answer questions but to create a kind of rich metafiction of weirdness. But without Yahoo Answers, we never would have had the question "how is babby formed?"
And countless people too scared to ask a question that they really wanted to know the answer to, and then potentially look foolish, were able to go to Yahoo Answers and ask it there, and sometimes they might even get a useful response. Now, there are other sites that do sort of what Yahoo Answers was doing, though usually not with the same level of surreal, dadaistic activity. As for MBMBaM, that's My Brother, My Brother and Me, to y'all who have never listened to that show: I suspect the show is going to be just fine. Yahoo Answers served a specific role on that show; well, two of them, really. One was to help buff up listener questions, because, of course, in the early days the show was still building its audience, and so it only had so many questions per episode; Yahoo Answers helped fill that out. But the other was to sign off every episode with an unanswered, and arguably unanswerable, question from Yahoo Answers. There may now need to be a new way to close out shows, but I believe in the McElroys.
As for Yahoo Answers, I'm sad to see it go. It was bizarre and weird, sometimes disturbing, but it also felt like a pretty accurate representation of the psyche of the Internet. So shine on, you crazy diamond. And that's it for today's episode. That's all the news that I care to report for Thursday, April eighth, twenty twenty one. If you have suggestions for things I should tackle in episodes of TechStuff, let me know. The best way to get in touch is on Twitter; the handle for the show is TechStuff HSW, and I'll talk to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.