Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and it's time for the tech news for Tuesday, October nineteenth, twenty twenty one. Let's get to it.

Speaker 1: Members of the United States House Judiciary Committee released a statement saying they are considering a criminal investigation into Amazon. This relates to the report I mentioned last week, in which Reuters revealed that internal Amazon documents it had come into possession of showed that the company had copied products in India to create Amazon-branded versions of those same products, and then used Amazon's search algorithm to promote the Amazon-owned versions over the competing brands. So essentially, Amazon said, this particular product is making a lot of money, let's make our own version, and then let's make sure that people see that over the industry leader.
Speaker 1: Reuters reported that, again, this was happening in India, not necessarily in other parts of the world, but the five members of the House Judiciary Committee who were concerned about this reached out to Amazon CEO Andy Jassy over the matter, saying that these reports contradict sworn testimony that Amazon executives, including Jeff Bezos, gave to Congress in the past. Now, for its own part, Amazon reps say that the reporting Reuters has published is inaccurate and that the executives who provided testimony before the US Congress did so truthfully. But again, the documents that Reuters says it has received suggest that at least two top Amazon executives were aware of the practices in India, and they paint a pretty anticompetitive picture. Meanwhile, in India, a trade group is petitioning the Prime Minister of the country to sanction Amazon in light of the Reuters report.

Speaker 1: Facebook is changing how it charges for ads, and the news isn't great for advertisers. Okay, so let's talk about how Facebook's ads work, in a nutshell, from the advertiser perspective.
Speaker 1: Usually I'm talking about this from the user point of view, but now we're looking at it from the other way. So let's say we've got an advertiser. In fact, let's say we're being super lucky and it's a legitimate business, because if you're not lucky, it's a scam company that's fleecing customers. I run into those all the time on Facebook. They all lead to companies in China that ship out cheap knockoffs, or sometimes just junk, to people, and the cost of returning it would be more than what you spent on the item. Anyway, that's neither here nor there. Facebook doesn't really care one way or the other if the company is legit or not, because it turns out all that money spends the same whether it comes from a legit source or an illegitimate one. You can still spend that money. Anyway, these advertisers set down a budget for Facebook and they submit an ad, so it costs money right up front to run an ad campaign, which makes sense, and the companies tell Facebook what actions would represent a successful ad delivery.
Speaker 1: Maybe the brand wants to count every engagement, like they're saying, we want to see how many likes we can get and we're gonna pay X amount per like. Or maybe they only want to count the times that someone actually clicks through on an ad, the click-through rate. And Facebook runs the ad for a certain amount of time, like thirty days, and at the end of that campaign, Facebook tallies up all the hits on that ad based on whatever criteria the client had selected at the beginning. Then Facebook gives the client a bill for delivering that many hits. So this can get pretty expensive if the ad is really effective, and if the company is asking for something that requires a particularly challenging user behavior. Like, clicking through on an ad is typically a pretty big step compared to, you know, just seeing the ad in your feed. So Facebook could charge a few dollars per click, and if tens of thousands of people actually click through on that ad, that bill can get pretty big. Well, here's where things take a turn for advertisers moving forward.
Speaker 1: Facebook has decided that it will now treat Instagram and Facebook accounts as separate accounts, even if the same user clicks on the same ad both in Instagram and in Facebook. Before, Facebook would treat these as a single account. Like, if you have a Facebook and an Instagram account, and you were to interact with the same ad on both, in the past Facebook said, well, that's just one user, so it only counts once. Now they're saying, no, it's going to count twice because, you know, why not, from Facebook's point of view. This strikes a lot of people as weird, because if you're serving the same ad to the same person a couple of different times, it's not like you've just created a new customer each time, right? You can only become a new customer once. But now, if someone engages with an ad on Instagram and on Facebook, the company will treat that as if that one person is actually two people. So, in other words, ad clients can potentially get charged twice for the same person seeing the same ad on two different platforms, both of which are owned by Facebook.
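The change described above boils down to whether ad clicks are deduplicated per user or counted once per user-and-platform pair. Here's a minimal sketch of that difference; the function, the event shape, and the two-dollar price are all made up for illustration and are not Facebook's actual API or rates:

```python
# Toy sketch of the billing change: does the same user clicking the same ad
# on two platforms count once or twice? All names and prices are hypothetical.

def bill(events, price_per_click, dedupe_across_platforms):
    """Sum billable clicks from (user_id, platform) ad-click events.

    With dedupe_across_platforms=True (the old behavior), the same user
    clicking on both Facebook and Instagram is billed once; with False
    (the new behavior), each platform's click is billed separately.
    """
    if dedupe_across_platforms:
        billable = {user for user, _platform in events}  # one charge per user
    else:
        billable = set(events)  # one charge per distinct (user, platform) pair
    return len(billable) * price_per_click

clicks = [("alice", "facebook"), ("alice", "instagram"), ("bob", "facebook")]
old_bill = bill(clicks, price_per_click=2.00, dedupe_across_platforms=True)
new_bill = bill(clicks, price_per_click=2.00, dedupe_across_platforms=False)
```

With the old counting, alice's two clicks cost the advertiser one charge; with the new counting, the same three click events produce three charges.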
Speaker 1: Facebook says this will only happen if the user hasn't had their Facebook and Instagram accounts linked in the Accounts Center. Now, you might wonder, what's the Accounts Center? Well, this is an optional way for users to connect their Facebook, Instagram, and Messenger accounts together. By the way, it's a fully optional feature. It's up to the user to engage it, and that feature has not been fully deployed. Only a selection of Facebook users have even had the option to activate it. So by default, most folks do not have their accounts linked together, and even when this gets rolled out in full deployment, it still will mean that users will have to take the step to enable that option. I'm guessing that Facebook is banking on the fact that most people won't bother to do this. Plus, there are plenty of folks who already don't like the idea of linking their activities together for the purposes of ads. Which means that companies that are advertising on Facebook can look forward to paying twice per user for those who interact with an ad on more than one platform.
Speaker 1: I will not be shocked to see some opposition to this plan, but this is, you know, just the early days for it.

Speaker 1: Speaking of Facebook, over the past few years, the company has shifted to depend more upon AI algorithms in order to detect and potentially remove posts that contain harmful content, or, you know, stuff that just plain violates Facebook's policies, and that includes stuff like, you know, depictions of violence, hate speech, acts of cruelty, and so forth. But The Wall Street Journal reports that Facebook's AI just isn't really up to the task. Guy Rosen, who is the head of integrity at Facebook, says that as recently as this past spring, one in every two thousand posts on Facebook still contained hate speech. Facebook started using AI because there's so much content posted to Facebook that it's not really feasible to rely upon human beings to police everything, and that makes sense. Unfortunately, the robot police in this case aren't really good at their jobs.
Speaker 1: One senior research scientist posted in two thousand nineteen that, by his estimation, the AI was removing posts that represented about two percent of all the hate speech that was actually posted onto Facebook. Missing ninety-eight percent of posts that violate the ban on hate speech is pretty darn lousy by any metric. And when the AI can't draw a firm conclusion about whether or not a particular post violates Facebook's rules, the default response is just to reduce how frequently that post will pop up in other people's news feeds. You know, you kind of deemphasize it in the algorithm. So it's not muting the post, it's not removing the post or banning the user. It's just making sure that that particular post that may include hate speech doesn't spread quite as much on the platform. That's, you know, not good. Facebook reps claim that the company is working to improve the algorithms, but based upon the posts of other research scientists, it sounds like there isn't much cause to be optimistic that any improvement will be significant enough to make a serious dent in the spread of hate speech on the platform.
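The fallback behavior described above, remove when confident and quietly demote when uncertain, amounts to a thresholded policy on a classifier score. Here's a sketch of that idea; the function name and the thresholds are hypothetical, not Facebook's actual system:

```python
# Illustrative three-way moderation policy driven by a classifier's
# confidence score. Thresholds are made-up values for the sketch.

def moderate(hate_speech_score, remove_threshold=0.9, demote_threshold=0.5):
    """Map a classifier confidence score in [0, 1] to a moderation action."""
    if hate_speech_score >= remove_threshold:
        return "remove"  # clear violation: take the post down
    if hate_speech_score >= demote_threshold:
        return "demote"  # uncertain: show the post in fewer feeds
    return "leave"       # likely fine: do nothing
```

The catch the reporting highlights is the middle band: anything the model isn't sure about stays up, just distributed a bit less.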
Speaker 1: Things to remember here: artificial intelligence is a difficult and complicated discipline. It is not always up to the tasks we assign to it, and there are some pretty awful people out there who take advantage of this fact and spread harmful posts on platforms like Facebook, knowing that, you know, the company's own measures aren't really up to the task. They have a really good chance of getting that message to spread, whether they specifically, you know, adhere to an ideology that they really believe in and are trying to promote, or they're just trying to cause chaos, as in the case of things like Russian misinformation trolls. So yeah, good things to remember.

Speaker 1: Speaking of Russia, the Russian government is pressuring companies like Facebook and Google to comply with orders to delete illegal content from their platforms. In Google's case, the government is threatening a fine of up to twenty percent of Google's Russian turnover.
Speaker 1: By the way, I actually had to look up the word turnover, because I think of turnovers as either a baked good or the number of employees who leave a business within a given amount of time. Like, that's what turnover is. But turnover in this case refers to the amount of money a business brings in over an amount of time. I don't know why we use that instead of a word like revenue. But I mean, hey, that's the English language, right? Hey, English language, you're confusing. And I say that as a native speaker. Anyway, at least some of the content, this illegal content that Google refuses to remove, is stuff that's critical of the Russian government. So in many ways, this is different from the pressure that these same platforms face when it comes to spreading hate speech, and it falls more in line with a government attempting to suppress any resistance to that government's authority. Now, I should also add, I don't know what all the illegal content is that Russia alleges Google has failed to remove.
Speaker 1: Some of that could fall into categories like hate speech or misinformation, things like that, stuff that we would, you know, typically think should be removed in general. I don't know how much of it falls into those categories versus just plain old, you know, government censorship that is designed to keep those in power in power. But yeah, Google is facing some pretty massive fines in Russia if it doesn't take those steps. We will have to see how that develops. I've got a lot more news to cover, but before we get to that, let's take a quick break.

Speaker 1: The Greenidge Generation power plant in New York State is the latest to come under the scrutiny of environmentalist groups, and this is because that particular facility has partnered with bitcoin mining operations. Now, the power plant had shut down, it was defunct, but it came back online in two thousand seventeen, and it houses more than fifteen thousand computer servers that are engaged in bitcoin mining.
Speaker 1: As a reminder, bitcoin is based off a proof-of-work mining model, and from a very high level, just to kind of cover this, that means the system generates what amounts to a number guessing game, and the system makes this number increasingly more difficult to guess as more computational power joins the system, because the goal is to keep the solution time to about ten minutes. So it should take around ten minutes to guess the right answer. Well, as the value of bitcoin goes up, more people spend more money to put together computer networks that are just trying to get the correct answer, because they're going to make a lot of cash in the form of bitcoin. Well, that means the system has to make the number harder to guess in order to keep that ten-minute goal in place. And this cycle feeds on itself as long as the currency remains really valuable, and right now it's, you know, around sixty thousand dollars per coin.
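The "number guessing game" works roughly like this sketch: miners try nonce after nonce until a hash falls below a target value, and the target shrinks when answers come in faster than the ten-minute goal. This is a toy illustration only; real bitcoin double-hashes block headers with SHA-256 and retargets every 2016 blocks, and the numbers here are scaled way down:

```python
import hashlib

# Toy proof-of-work: guess nonces until the hash, read as an integer,
# falls below the target. A smaller target means a harder guess.

def mine(data, target):
    """Return the first nonce whose sha256(data:nonce) is below target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def retarget(old_target, actual_seconds, goal_seconds=600):
    """If the last round was solved faster than the goal (more hash power
    joined), shrink the target so the next round is harder; if slower, grow it."""
    return old_target * actual_seconds // goal_seconds

easy_target = 2 ** 248  # roughly 1 hash in 256 succeeds at this target
winning_nonce = mine("block-42", easy_target)
# Solutions came in twice as fast as the goal, so the target is halved.
harder_target = retarget(easy_target, actual_seconds=300)
```

The self-feeding cycle from the episode shows up in `retarget`: more miners means faster solutions, which means a smaller target, which means every miner has to do more work for the same ten-minute cadence.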
Speaker 1: Meanwhile, all these computers need electricity to run, and if the expense of running the system is more than what you're getting from mining, then the whole thing falls apart, right? You're spending more money than you make in mining. And when you've got fifteen thousand servers, you need a lot of electricity. So a lot of bitcoin mining operations have looked to defunct power plants, like the Greenidge Generation power plant had been, in order to bring them back online specifically to provide electricity for bitcoin mining operations, and then perhaps sell any additional electricity to the local power grid. Now, bitcoin mining operations do this because it brings the cost of electricity way down. If the cost is higher, then that cuts into your profits, right? And eventually you might get to a point where maintaining all the equipment, plus spending all the money on electricity, is more than what you get out of mining, and you've got to cut your business, or else your costs are greater than your profits. Or actually, you don't have profits; your costs are greater than your revenue.
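That break-even logic, mining revenue versus the electricity bill, is simple arithmetic. Here's a back-of-the-envelope sketch; every number in it (wattage per server, power rates, daily bitcoin yield, coin price) is made up for illustration, not data about Greenidge:

```python
# Hypothetical daily profitability check for a mining operation:
# revenue from coins mined minus the cost of powering the servers.

def daily_profit(servers, watts_per_server, price_per_kwh, btc_per_day, btc_price):
    """Mining revenue minus electricity cost for one day, in dollars."""
    kwh_per_day = servers * watts_per_server * 24 / 1000
    electricity_cost = kwh_per_day * price_per_kwh
    revenue = btc_per_day * btc_price
    return revenue - electricity_cost

# 15,000 servers at a hypothetical 1,400 W each burn about 504,000 kWh a day,
# so the margin swings heavily on the electricity rate.
profit_cheap = daily_profit(15_000, 1_400, price_per_kwh=0.03,
                            btc_per_day=5.5, btc_price=60_000)
profit_retail = daily_profit(15_000, 1_400, price_per_kwh=0.12,
                             btc_per_day=5.5, btc_price=60_000)
```

Same hardware, same coins mined: at the cheap rate the power bill is a quarter of what it is at the retail rate, which is exactly why miners chase sources of cheap electricity like a power plant of their own.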
Speaker 1: Now, that means that these previously defunct fossil fuel power plants are coming back online, and thus they're generating more pollution. Bitcoin mining operations say that they make sure they purchase carbon offsets in order to remain carbon neutral, but a lot of environmentalists have said that carbon offsets tend to encourage more reliance on fossil fuels and that this is harmful behavior. Essentially, that companies use them as an excuse to not make the changes needed to get off the dependence on fossil fuels, and in the meantime, we just continue to dump more carbon into the atmosphere. And sure, we might use carbon offsets to fund things that can remove carbon from the atmosphere, but that's further in the future, and the harm that those carbon emissions can do is more immediate. So, in other words, carbon offsets can be a solution for the long term, as long as you are also backing off on carbon emissions.
Speaker 1: But that's not what we're seeing here, and so the environmentalists are calling upon New York politicians to deny an air permit for Greenidge, and to set an example both for the state of New York as well as the country overall. They have warned that not doing that will seriously undermine state and national efforts to cut carbon emissions drastically over the next few years. I'm inclined to agree with the activists on this one. And while bitcoin miners might argue that their operations bring revenue to the state and create high-paying jobs, I would counter that when you've got increased carbon emissions, that kind of negates those benefits in the long term. I mean, what good is money if all the shops end up being either underwater or on fire?

Speaker 1: South Korea has adjusted its targets for reducing carbon emissions by twenty thirty.
Speaker 1: Speaking of carbon emissions, previously the country had a goal of reducing carbon emissions by twenty six point three percent by twenty thirty, but now the country is upping that to a commitment to reduce them by forty percent by twenty thirty, with the overall goal to become carbon neutral by twenty fifty. Now, this is a really significant change. South Korea is a country that depends heavily on coal power plants; they generate a large share of the electricity that the country relies upon, and it has a long way to go to get to the goals it's set. But South Korea's government has created an aggressive plan that calls for coal power generation to drop to twenty one point eight percent by twenty thirty, and renewables would go from the current six point two percent of power generation to thirty percent by then. It is a very aggressive green plan. It will be interesting to see if the country can follow through.

Speaker 1: The US Department of Justice has leveled charges against Mark Forkner, who was the chief technical pilot for Boeing. Now, the heart of the matter relates to the seven thirty seven Max aircraft.
Speaker 1: You might remember that there were two tragic crashes, one near Jakarta, Indonesia, and another one in Ethiopia, that involved seven thirty seven Max aircraft, and the aircraft's Maneuvering Characteristics Augmentation System, or MCAS, was at fault. Now, MCAS is meant to stabilize flight. The seven thirty seven Max has a design that means the nose of the aircraft tends to go up during flight as you are moving forward, and this can cause the plane to climb or even to reach a point where the aircraft could, you know, suffer a massive stall. Engineers added the MCAS system to counteract this tendency and thus force the nose back down to a level orientation. But the problem was that the system would start pushing the nose down prematurely, like during takeoff. It would mistakenly identify that the aircraft was at a dangerous angle of attack (that's the orientation of the aircraft in relation to the direction it's traveling). Worse, Boeing had failed to include information about MCAS in various training manuals.
Speaker 1: In fact, the whole point of the seven thirty seven Max was to decrease the amount of time that pilots would have to spend in training in order to fly the aircraft. So pilots had no idea how to shut off the system, and they lost control of the aircraft, thus leading to these tragic plane crashes. The DOJ says that Forkner presented false and incomplete information to various airlines regarding MCAS, and again, this was in order for Boeing to sell the aircraft with, essentially, the advertising line of: you won't have to take your pilots off duty to complete hundreds of hours, or dozens of hours, of training in order to fly this aircraft; their training on other aircraft will be sufficient, and they'll just have to complete maybe a few hours, and that's it, and then they're back in the air flying passengers around. So, in other words, it was being sold as a way to save in the long run while also upgrading your fleet. And now the DOJ is saying that Forkner, specifically, was leaving out critical information that could have saved lives.
321 00:20:50,600 --> 00:20:54,280 Speaker 1: A grand jury has indicted Forkner, who faces multiple counts 322 00:20:54,359 --> 00:20:59,040 Speaker 1: of wire fraud. The United States Federal Communications Commission, or FCC, 323 00:20:59,359 --> 00:21:03,200 Speaker 1: has already created rules for telecom companies in the United 324 00:21:03,200 --> 00:21:07,040 Speaker 1: States regarding spam phone calls, and the major carriers have 325 00:21:07,160 --> 00:21:11,880 Speaker 1: already been required to comply with those orders. Smaller 326 00:21:11,920 --> 00:21:14,960 Speaker 1: carriers have a little bit longer. They have until, 327 00:21:15,160 --> 00:21:17,760 Speaker 1: I think, June of next year to implement those changes, 328 00:21:18,400 --> 00:21:21,560 Speaker 1: and the goal is to cut down on spam calls. 329 00:21:21,880 --> 00:21:24,280 Speaker 1: Now the FCC is looking to do the same thing 330 00:21:24,359 --> 00:21:29,000 Speaker 1: with text messages. FCC Chairwoman Jessica Rosenworcel says that 331 00:21:29,080 --> 00:21:32,920 Speaker 1: robotexts aimed at tricking people into downloading 332 00:21:32,920 --> 00:21:36,600 Speaker 1: malware or sharing personal information have been on the rise, 333 00:21:37,080 --> 00:21:40,800 Speaker 1: and the FCC and telecom companies have an obligation to respond. 334 00:21:41,400 --> 00:21:44,680 Speaker 1: The FCC will propose rules for this response, which 335 00:21:44,840 --> 00:21:47,520 Speaker 1: the public will be able to review and comment upon 336 00:21:48,040 --> 00:21:51,560 Speaker 1: before they are implemented. This is just the normal operations 337 00:21:51,560 --> 00:21:53,800 Speaker 1: for the FCC. 
You might remember that during 338 00:21:53,840 --> 00:21:58,399 Speaker 1: the whole net neutrality battles, there were times where the 339 00:21:58,440 --> 00:22:02,119 Speaker 1: FCC opened up things for public comment that became 340 00:22:02,160 --> 00:22:05,159 Speaker 1: a whole mess depending upon which administration was in charge. 341 00:22:05,200 --> 00:22:08,320 Speaker 1: There were cases where there were a bunch of 342 00:22:08,520 --> 00:22:13,440 Speaker 1: comments that were suspected of being fake, that were intended 343 00:22:13,480 --> 00:22:17,600 Speaker 1: to push one agenda over another. Anyway, the whole point 344 00:22:17,600 --> 00:22:20,280 Speaker 1: of that process is to give the public an opportunity 345 00:22:20,400 --> 00:22:24,320 Speaker 1: to weigh in and to, you know, argue what 346 00:22:24,400 --> 00:22:27,239 Speaker 1: things matter most to them. So in this case, I 347 00:22:27,280 --> 00:22:30,119 Speaker 1: hope that the public responds by saying yes, I would 348 00:22:30,119 --> 00:22:33,240 Speaker 1: like to have fewer spam text messages. At this point, 349 00:22:33,480 --> 00:22:35,440 Speaker 1: I just want my phone to work whenever I need 350 00:22:35,480 --> 00:22:39,000 Speaker 1: it to work, and I don't want people to be contacting me. Specifically, 351 00:22:39,040 --> 00:22:42,320 Speaker 1: I don't want spam to be contacting me. We have 352 00:22:42,400 --> 00:22:45,560 Speaker 1: a few more news stories to cover. Before we get 353 00:22:45,600 --> 00:22:57,440 Speaker 1: into those, let's take another quick break. 
The Sinclair Broadcast Group, 354 00:22:57,520 --> 00:23:01,320 Speaker 1: which operates lots of television stations across the United States, 355 00:23:01,480 --> 00:23:04,960 Speaker 1: and which is also known for pushing conservative talking points 356 00:23:05,000 --> 00:23:09,520 Speaker 1: through its numerous regional stations, you know, requiring local staff 357 00:23:09,600 --> 00:23:13,800 Speaker 1: to repeat those particular talking points, has been hit 358 00:23:13,960 --> 00:23:17,320 Speaker 1: with a ransomware attack. Now, before I get into this story, 359 00:23:17,440 --> 00:23:19,080 Speaker 1: I do want to say that while I do not 360 00:23:19,320 --> 00:23:23,359 Speaker 1: agree with Sinclair's political philosophy, and I definitely don't agree 361 00:23:23,400 --> 00:23:27,240 Speaker 1: with its policy of requiring staff to repeat those talking points, 362 00:23:27,560 --> 00:23:31,879 Speaker 1: I also condemn ransomware attacks in general. Right, I don't 363 00:23:32,640 --> 00:23:35,400 Speaker 1: think that that is the right way to go. Two 364 00:23:35,480 --> 00:23:39,439 Speaker 1: wrongs do not make a right. Anyway, this attack happened 365 00:23:39,440 --> 00:23:43,400 Speaker 1: this past Saturday and disrupted operations at numerous stations around 366 00:23:43,440 --> 00:23:47,680 Speaker 1: the United States. It affected the stations' ability to stream content, 367 00:23:47,880 --> 00:23:52,000 Speaker 1: it disrupted services like email and even telephone service at 368 00:23:52,040 --> 00:23:55,119 Speaker 1: some of these locations. 
In addition, the hackers appear to 369 00:23:55,160 --> 00:23:59,560 Speaker 1: have accessed data on Sinclair's systems and locked Sinclair out 370 00:23:59,560 --> 00:24:02,520 Speaker 1: of those systems. At the time of this recording, it's 371 00:24:02,520 --> 00:24:05,640 Speaker 1: not entirely clear what data that might be or how 372 00:24:05,680 --> 00:24:08,960 Speaker 1: it will affect Sinclair moving forward. I've not seen any 373 00:24:09,000 --> 00:24:12,720 Speaker 1: reports on how much the hackers have demanded in ransom. 374 00:24:12,760 --> 00:24:16,080 Speaker 1: I mean, if it's a ransomware attack, that's typically part 375 00:24:16,080 --> 00:24:20,080 Speaker 1: of it, but I haven't seen any reference to 376 00:24:20,119 --> 00:24:23,879 Speaker 1: the demands, nor have I found any information on 377 00:24:23,960 --> 00:24:27,399 Speaker 1: which hacker group is responsible. At least as of the 378 00:24:27,440 --> 00:24:31,880 Speaker 1: recording of this podcast, no group has claimed responsibility. Now, 379 00:24:31,880 --> 00:24:34,800 Speaker 1: it might be tempting to assume that the hackers have 380 00:24:34,920 --> 00:24:39,159 Speaker 1: an ideological beef with Sinclair, but it's also possible that 381 00:24:39,240 --> 00:24:42,920 Speaker 1: this was just an attack of opportunity, unguided by any 382 00:24:42,960 --> 00:24:47,119 Speaker 1: sort of ideological stance. But as always, my advice is 383 00:24:47,680 --> 00:24:52,560 Speaker 1: never pay the ransom. It encourages future attacks. This 384 00:24:52,640 --> 00:24:55,560 Speaker 1: is easier said than done for a lot of organizations 385 00:24:55,760 --> 00:24:59,360 Speaker 1: because it could be critical infrastructure that's affected by the attack, 386 00:25:00,040 --> 00:25:03,120 Speaker 1: and if there's not an easy fix to reverse that, 387 00:25:03,840 --> 00:25:07,840 Speaker 1: then what do you do? 
But paying that ransom 388 00:25:07,880 --> 00:25:11,399 Speaker 1: really just sends the message of, Hey, your tactic worked 389 00:25:11,680 --> 00:25:13,800 Speaker 1: and you made money off of it, so you should 390 00:25:13,800 --> 00:25:17,440 Speaker 1: do it again. Not a great situation for the rest 391 00:25:17,440 --> 00:25:20,960 Speaker 1: of us. Now, on the subject of ransomware, the FBI, 392 00:25:21,160 --> 00:25:25,840 Speaker 1: along with the Cybersecurity and Infrastructure Security Agency or CISA, 393 00:25:26,080 --> 00:25:28,840 Speaker 1: as well as the National Security Agency or NSA, 394 00:25:29,480 --> 00:25:34,440 Speaker 1: have issued a joint advisory about the BlackMatter ransomware 395 00:25:34,680 --> 00:25:38,480 Speaker 1: gang and how that group works. So the goal is 396 00:25:38,520 --> 00:25:43,359 Speaker 1: to alert various organizations, essentially big businesses, to the tactics 397 00:25:43,359 --> 00:25:46,879 Speaker 1: that this group uses in an effort to bolster cybersecurity 398 00:25:47,119 --> 00:25:52,560 Speaker 1: and prevent future attacks. BlackMatter has traditionally offered bribes 399 00:25:52,760 --> 00:25:56,959 Speaker 1: to employees who work for really big organizations, sometimes up 400 00:25:57,000 --> 00:26:00,920 Speaker 1: to a hundred thousand dollars for these employees to share 401 00:26:01,080 --> 00:26:05,080 Speaker 1: login credentials. Essentially, they're saying, if you have a 402 00:26:05,200 --> 00:26:08,679 Speaker 1: key to the vault, we will pay you a hundred 403 00:26:08,680 --> 00:26:11,919 Speaker 1: thousand dollars to let us in. The group does have 404 00:26:11,960 --> 00:26:16,280 Speaker 1: some restrictions, however. They apparently don't want to target nonprofit organizations. 405 00:26:16,720 --> 00:26:20,199 Speaker 1: They don't want to target hospitals or healthcare companies in 406 00:26:20,240 --> 00:26:25,000 Speaker 1: that space. 
They say that government organizations are off limits. 407 00:26:25,119 --> 00:26:27,479 Speaker 1: Companies that are in the defense industry are off limits, 408 00:26:27,520 --> 00:26:30,359 Speaker 1: stuff like that, so they're really looking to target like 409 00:26:30,480 --> 00:26:33,359 Speaker 1: big businesses that are outside of those industries. You know, 410 00:26:33,400 --> 00:26:38,280 Speaker 1: like financial institutions could totally be a target, big 411 00:26:38,320 --> 00:26:41,280 Speaker 1: telecommunications companies, that kind of stuff. Any of those 412 00:26:41,280 --> 00:26:45,199 Speaker 1: other companies could be considered fair game. And their attack 413 00:26:45,320 --> 00:26:48,720 Speaker 1: methods include good old standbys, such as using compromised 414 00:26:48,760 --> 00:26:52,160 Speaker 1: login credentials to gain unauthorized access to a system. That's why 415 00:26:52,160 --> 00:26:55,600 Speaker 1: they will offer that hundred-thousand-dollar payout in 416 00:26:55,680 --> 00:26:59,000 Speaker 1: order to get access to what are considered to be 417 00:26:59,080 --> 00:27:02,200 Speaker 1: really high value targets. This is the sort of thing 418 00:27:02,240 --> 00:27:07,280 Speaker 1: that the average person can prevent. So first, you know, 419 00:27:07,520 --> 00:27:10,240 Speaker 1: you don't give in to the temptation to sell out a 420 00:27:10,320 --> 00:27:13,520 Speaker 1: company for a hundred thousand dollars. That kind of stuff 421 00:27:13,760 --> 00:27:18,600 Speaker 1: can be tracked. It's more likely to get you into 422 00:27:18,640 --> 00:27:21,080 Speaker 1: serious trouble than that hundred grand is worth. I mean 423 00:27:21,119 --> 00:27:24,439 Speaker 1: a hundred thousand dollars. That's a lot of money, and 424 00:27:24,480 --> 00:27:27,520 Speaker 1: I can see where the temptation would be. 
But 425 00:27:27,680 --> 00:27:32,160 Speaker 1: not many people have any idea of how to hide 426 00:27:32,920 --> 00:27:36,679 Speaker 1: a sudden flow of a hundred thousand dollars into their 427 00:27:36,720 --> 00:27:40,200 Speaker 1: bank accounts. So really, for those who want to try 428 00:27:40,200 --> 00:27:43,080 Speaker 1: and keep things safe, they need to set strong passwords. 429 00:27:43,160 --> 00:27:46,000 Speaker 1: They need to keep those passwords safe. One thing you 430 00:27:46,000 --> 00:27:48,840 Speaker 1: should not do, for example, is write your password down 431 00:27:48,880 --> 00:27:51,280 Speaker 1: on a sticky note and attach that sticky note to 432 00:27:51,320 --> 00:27:55,080 Speaker 1: your computer monitor. Not that I've personally seen that happen. 433 00:27:56,680 --> 00:28:00,119 Speaker 1: The agencies revealed that the BlackMatter group also does 434 00:28:00,160 --> 00:28:02,440 Speaker 1: something that a lot of other groups will not do. 435 00:28:03,560 --> 00:28:06,960 Speaker 1: A lot of ransomware groups will encrypt data backups 436 00:28:07,200 --> 00:28:09,840 Speaker 1: so that the backups are not accessible, 437 00:28:09,880 --> 00:28:12,960 Speaker 1: but BlackMatter goes 438 00:28:12,960 --> 00:28:18,080 Speaker 1: a step further. They will overwrite the data backup systems, 439 00:28:18,119 --> 00:28:22,679 Speaker 1: so they will essentially erase all that backup data. That 440 00:28:22,720 --> 00:28:26,399 Speaker 1: means that companies are left with encrypted systems and no backups, 441 00:28:26,760 --> 00:28:29,280 Speaker 1: encrypted or otherwise. It's a 442 00:28:29,280 --> 00:28:32,520 Speaker 1: pretty brutal attack, and obviously the best approach is to 443 00:28:32,560 --> 00:28:35,439 Speaker 1: try and prevent an attack from happening in the first place. 
444 00:28:36,080 --> 00:28:38,880 Speaker 1: There are more details, and I would urge anyone out 445 00:28:38,880 --> 00:28:41,760 Speaker 1: there who oversees large and critical networks to read the 446 00:28:41,840 --> 00:28:45,280 Speaker 1: joint advisory. You can find it at the CISA website. 447 00:28:45,640 --> 00:28:49,960 Speaker 1: It is alert AA21-291A. 448 00:28:50,280 --> 00:28:55,440 Speaker 1: Catchy, right? But seriously, if you are at 449 00:28:55,440 --> 00:28:59,200 Speaker 1: all responsible for those sorts of networks, I highly recommend 450 00:28:59,240 --> 00:29:02,520 Speaker 1: you read the advisory. It could potentially save you 451 00:29:02,600 --> 00:29:06,480 Speaker 1: and your organization a lot of trouble. Finally, as part 452 00:29:06,480 --> 00:29:09,920 Speaker 1: of the Artemis program, which is the NASA program that 453 00:29:09,960 --> 00:29:12,880 Speaker 1: includes sending astronauts back to the Moon for the first 454 00:29:12,880 --> 00:29:16,600 Speaker 1: time since the early nineteen seventies, NASA plans to establish 455 00:29:16,680 --> 00:29:21,560 Speaker 1: a communications network that it is currently calling LunaNet. 456 00:29:21,600 --> 00:29:24,080 Speaker 1: Now, according to NASA, the goal is to have a system 457 00:29:24,120 --> 00:29:27,320 Speaker 1: that will allow for communications on and around the Moon 458 00:29:27,720 --> 00:29:30,200 Speaker 1: in a way that's similar to how WiFi works here 459 00:29:30,240 --> 00:29:34,000 Speaker 1: on Earth. Now, I'm using vague language to describe this 460 00:29:34,560 --> 00:29:38,520 Speaker 1: because what NASA has done involves setting the goals of 461 00:29:38,560 --> 00:29:41,760 Speaker 1: what this system should be capable of doing, rather than 462 00:29:41,840 --> 00:29:44,920 Speaker 1: laying out the design of the system itself. 
And this 463 00:29:45,000 --> 00:29:48,160 Speaker 1: is because the agency has released a specification on what 464 00:29:48,240 --> 00:29:51,240 Speaker 1: the system should be able to do, and it's looking 465 00:29:51,280 --> 00:29:56,280 Speaker 1: to engage in technical discussions with various telecommunication and internet 466 00:29:56,640 --> 00:29:59,480 Speaker 1: experts in an effort to design a system that will 467 00:29:59,600 --> 00:30:02,600 Speaker 1: make those goals a reality. The system will need to 468 00:30:02,600 --> 00:30:05,680 Speaker 1: take into account stuff that isn't necessarily as big a 469 00:30:05,720 --> 00:30:10,240 Speaker 1: concern here on Earth, like solar flare activity. So it 470 00:30:10,360 --> 00:30:15,160 Speaker 1: is possible for a particularly powerful solar flare to be 471 00:30:15,200 --> 00:30:18,160 Speaker 1: strong enough to affect electrical systems here on Earth and 472 00:30:18,200 --> 00:30:22,240 Speaker 1: to disrupt them. It's like an electromagnetic pulse that 473 00:30:22,320 --> 00:30:24,800 Speaker 1: can wipe out things like a power grid if it 474 00:30:24,840 --> 00:30:28,120 Speaker 1: were really powerful and it hit Earth at just the 475 00:30:28,200 --> 00:30:33,080 Speaker 1: right time. But our magnetosphere and Earth's atmosphere protect 476 00:30:33,160 --> 00:30:36,080 Speaker 1: us quite a bit here on Earth. Out in space, 477 00:30:36,120 --> 00:30:39,720 Speaker 1: it's a different story. You don't necessarily have that protection, 478 00:30:39,840 --> 00:30:44,160 Speaker 1: so the system needs to be resilient. 
More importantly, the 479 00:30:44,240 --> 00:30:47,520 Speaker 1: system should be able to alert astronauts of an impending 480 00:30:47,720 --> 00:30:50,760 Speaker 1: solar flare ahead of time, giving them the 481 00:30:50,840 --> 00:30:53,720 Speaker 1: chance to prepare for that faster than they would if 482 00:30:53,760 --> 00:30:57,080 Speaker 1: the news first had to come from mission control on Earth, 483 00:30:57,520 --> 00:31:00,520 Speaker 1: and any delay at all can be a disaster risk in 484 00:31:00,720 --> 00:31:05,040 Speaker 1: these sorts of events. Assuming that LunaNet all comes together, 485 00:31:05,200 --> 00:31:07,200 Speaker 1: the astronauts of the future will be able to use 486 00:31:07,200 --> 00:31:10,959 Speaker 1: the network to communicate with each other, even as 487 00:31:10,960 --> 00:31:13,800 Speaker 1: some of the folks wander around the Moon and others 488 00:31:13,880 --> 00:31:17,560 Speaker 1: remain in lunar orbit or beyond. It should also allow 489 00:31:17,640 --> 00:31:21,480 Speaker 1: astronauts to conduct more extensive operations on the Moon's surface, 490 00:31:22,000 --> 00:31:25,120 Speaker 1: using the network to aid in navigation so that they 491 00:31:25,120 --> 00:31:29,520 Speaker 1: can range further from home base and still return easily. Right, 492 00:31:29,680 --> 00:31:32,200 Speaker 1: you know, so they don't get lost on the Moon's surface. 493 00:31:32,680 --> 00:31:34,960 Speaker 1: There's not a whole lot of landmarks that you can 494 00:31:35,000 --> 00:31:38,080 Speaker 1: look at besides things like craters and stuff. 
So the 495 00:31:38,120 --> 00:31:42,920 Speaker 1: idea is that this communication system will allow for more 496 00:31:44,080 --> 00:31:48,000 Speaker 1: robust and extensive lunar operations, and that could help with 497 00:31:48,160 --> 00:31:50,880 Speaker 1: science, that could help with establishing Moon bases, and that 498 00:31:50,960 --> 00:31:54,000 Speaker 1: sort of stuff in the future. So we will have 499 00:31:54,040 --> 00:31:57,000 Speaker 1: to wait and see how this unfolds. I think it's 500 00:31:57,000 --> 00:32:00,840 Speaker 1: pretty cool and I look forward to hearing about 501 00:32:00,840 --> 00:32:05,640 Speaker 1: the Looney Communications Network. All right, that is it for 502 00:32:05,720 --> 00:32:08,680 Speaker 1: the tech news I have for you today, Tuesday, October 503 00:32:08,760 --> 00:32:12,120 Speaker 1: nineteenth, twenty twenty-one. If you have suggestions for topics I should 504 00:32:12,120 --> 00:32:15,960 Speaker 1: cover in future episodes of TechStuff, whether it's a technology, 505 00:32:16,160 --> 00:32:20,000 Speaker 1: a trend, a company, anything like that, let me know. 506 00:32:20,320 --> 00:32:22,080 Speaker 1: The best way to do that is to reach out 507 00:32:22,120 --> 00:32:25,120 Speaker 1: on Twitter. The handle for the show is TechStuffHSW, 508 00:32:25,320 --> 00:32:29,400 Speaker 1: and I'll talk to you again really soon. 509 00:32:34,880 --> 00:32:37,880 Speaker 1: TechStuff is an iHeartRadio production. For more 510 00:32:37,960 --> 00:32:41,360 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 511 00:32:41,480 --> 00:32:44,640 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.