Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, March thirty-first, twenty twenty-two. We should all be thankful that a news episode did not fall on April first, because who knows what would have happened then. I do know what's going to happen now: we're going to complain a lot about big tech, because there are a lot of stories that just continue to put a lot of companies in a pretty bad light. I would love for that not to be the case, but here we are, so let's get started.

Speaker 1: Now, a lot of the news, like I said, is going to be about big tech shenanigans in this episode, though not as many stories as I originally had lined up. In fact, I threw out a ton of stories because the episode was starting to go too long, and also I was starting to get beaten down by it. So anyway, one of the things that I read about today which I found interesting is an organization called the Tech Oversight Project, which has created a wiki specifically to help folks read up on the various issues surrounding big tech over the last several years, like things that big tech companies have done that have been, you know, not good. The wiki is fittingly called the Big Tech Wiki, and you can read a great article about this resource and the organization behind it over on Gizmodo dot com. The article is titled "Watchdog Group Publishes Encyclopedia of All the Nasty Things Big Tech Has Done," and it's by Mack DeGeurin. But one caveat, because the article itself reveals this: it's not actually all the nasty things big tech has done, because even with more than ninety pages of documentation, it's just scratching the surface.
Speaker 1: But the wiki collects this documentation about things like anticompetitive practices, the spread of misinformation, how these companies are funding political lobbying groups to shape legislation in their favor, and controversial partnerships with other entities, such as, for example, Google's work with the U.S. military, and stuff like facial recognition, which could potentially be weaponized. It's a good article, so I do recommend you check it out, and the resource, the actual wiki itself, can help you catch up on the exhausting amount of skullduggery that's been going on in the big tech sector for quite some time. I should also add that the project receives funds from the Omidyar Network, which politically is a left-leaning organization. Now, I say that simply because I think it's always good to keep in mind the perspective that was used to cover certain issues. For that matter, and no surprise, I lean pretty hard left myself, so I'm not saying that my perspective is the correct one. I know how I feel, but it would be way too much hubris for me to say my way is the right way. But you should check out the article and the wiki if you want to get angry at big companies like Google, Meta, Apple, Microsoft, those kinds of things.

Speaker 1: Now, along those lines, CNBC published an article titled "How Google and Amazon bankrolled a grassroots activist group of small business owners to lobby against Big Tech oversight." That's a heck of a long headline, and it's just the kind of thing that the Big Tech Wiki would cover. So in that article, CNBC reporters reveal that the lobbying group is called the Connected Commerce Council, and it's supposed to represent small businesses, while Amazon and Google are simply referred to as partners in the group. But apparently at least some of the businesses listed as members of the group never actually joined it.
Speaker 1: Many had not even heard of the group, according to them, and yet their businesses were listed on the roster. Now, the implication is that this makes this an astroturf campaign. Astroturfing is a term used to describe a situation where companies create what is supposed to look like a grassroots political movement, but is in actuality an attempt to push against legislation that would restrict or regulate the companies, or in some cases, to push for legislation that would give those companies more advantages. Astroturfing is pretty insidious. It's also a pretty common tactic. We've also seen big tech companies try to leverage small businesses as a way to excuse certain policies and practices. Meta has done this a lot as well, claiming that certain restrictions to its advertising strategy would harm small businesses. You know, they try to position it so it's not about hurting the bottom line of the big company, but rather it's the equivalent of "think of the children." That's really what it comes down to, and it can be pretty shady stuff. Now, that being said, this article also mentions that there are legitimate small businesses that really are part of the organization, so it's not like it's just a dummy group or anything like that. There are actual small businesses represented in the ranks that really do want to be there, and there are several that are in favor of the group and the group's policies, and they say that the group keeps them up to date on proposed legislation that could impact their businesses. So you might ask, is this a real astroturf case, or does it only appear that way due to the involvement of big tech companies that are particularly politically active? And I honestly don't know the answer to that.

Speaker 1: In a similar story, The Washington Post published an article revealing that Meta has been paying a consulting firm called Targeted Victory, and Targeted Victory primarily caters to US Republican candidates during election cycles.
Speaker 1: Specifically, Meta hired the consulting firm to create a kind of smear campaign against rival social media platform TikTok. Now, according to the Post, the goal was to reach out to various news outlets, and we're primarily talking about regional or local news outlets, and then convince them to run pieces criticizing TikTok and making various claims about TikTok's potential for harm, particularly for younger people. Now, some of those claims I think have some merit to them. I do think social platforms can facilitate harm, not that they're necessarily harmful just by themselves, but that, you know, they are very effective at transmitting harm. In fact, Meta thinks this too about its own platforms, according to those internal documents that Frances Haugen leaked last year. But some of the narratives pushed by Targeted Victory are at best insincere and at worst are an outright form of misinformation. According to the Post, some of the stories Targeted Victory pushed were about harmful trends that had supposedly originated and propagated across TikTok, when in fact, in at least a couple of those cases, those trends actually got their start on Facebook. So, in other words, the stuff Facebook was indirectly accusing TikTok of, through Targeted Victory, were actually examples of Facebook's own shortcomings.

Speaker 1: Now, to be fair, Meta slash Facebook does have some distance from these efforts, because it essentially paid Targeted Victory to do the dirty work, so Facebook's not directly involved. And the goals of that dirty work are twofold. One is to try to level the playing field a little bit between Meta and TikTok, because, as we've seen with Meta's financial report earlier this year, the company has been struggling to attract younger users and has cited TikTok as being one of the big reasons for that. Another goal is to deflect attention away from Meta and onto someone else, because Meta's been at the center of a target for a while now, for good reason.
Speaker 1: But the company would really rather someone else take that place, and TikTok would sure be a nice substitute. And, you know, TikTok, being a company that's owned by a larger Chinese conglomerate, is a pretty good target if that's your goal. I mean, there are some legitimate concerns to have about TikTok, so it's not like all of these attacks have no substance to them. There are reasons to be concerned. Now, my own point of view is that TikTok isn't that great and it does merit some scrutiny, but then I say the same thing about Meta. Both of them need to be scrutinized, potentially regulated, and certainly held accountable when they do things that are harmful. Anyway, it's interesting to see big tech behaving more and more like the ugliest facets of the political process. And by interesting I mean discouraging, but not surprising.

Speaker 1: Staying on this topic a bit longer, Global Witness, which has frequently been a thorn in Meta's side, released a report yesterday saying Facebook's algorithm appears to be amplifying climate denial posts rather than offering up links to more reliable sources on the subject of climate change. So they actually ran a bit of an experiment. They created a couple of dummy accounts, Jane and John, both of which were supposed to represent climate skeptics. The John account was set to follow legit scientific organizations, so it was liking pages that belonged to actual, credible scientific groups and institutions. The Jane account was directed to like a couple of pages related to climate change skepticism. Then they sat back and looked to see what kind of content was being recommended in the respective news feeds, and they saw that Jane's news feed began, you know, showing way more climate denial content. And on top of that, two thirds of the pages that included climate change misinformation were not labeled as such, so there was no warning.
Speaker 1: There was nothing saying, you know, this is not necessarily reliable information and you should really look to such and such a place to get more reliable info. Now, the feed for John didn't have this problem. John didn't get recommendations for, you know, posts that included climate denialism. So while John was seeing more information from legitimate sources, Jane saw progressively more extremist content on the subject. So once again we see how Facebook's algorithm, coupled with Meta's insufficient flagging process, leads to the amplification of misinformation. We'll be coming back to that in just a moment, because we have another story that touches on this. But before we get to that, let's take a quick break.

Speaker 1: So before the break, I talked about how Meta was failing to sufficiently label climate change misinformation as such. Well, it's not doing much better when it comes to preventing disinformation about the ongoing war in Ukraine, either. According to the Center for Countering Digital Hate, the CCDH, Facebook only manages to label about twenty percent of all posts pushing misinformation and conspiracy theories surrounding the ongoing war in Ukraine, so that means eighty percent of those posts are just slipping by unlabeled. These include messages such as a claim that the United States has been supplying bioweaponry to Ukraine to use against Russian soldiers, a claim that has no basis in evidence or real support, and yet is a conspiracy theory that is propagating pretty quickly across social platforms like Facebook. So four out of the five posts that are pushing this and similar misinformation campaigns are going through without Facebook labeling the post with so much as a "missing context" label, let alone an outright "false information" label. Imran Ahmed, the head of the CCDH, said, "If our researchers can identify false information about Ukraine openly circulating on its platform, it is within Meta's capability to do the same."
Speaker 1: That's a pretty sick burn, because, I mean, it's hard to argue against that statement, right? If some outside group can come in and say, look, we're finding it all over your platform, and it's your platform, clearly you should be able to find it too. How do you argue against that? So Ahmed went on to reiterate that platforms like Facebook profit off of misinformation, which I kind of touched on before the break, and, you know, we've said it many times: misinformation quote unquote drives engagement, and engagement is a major metric that Facebook relies upon while executing its revenue strategy. So engagement, from a revenue standpoint, is good, and you just have to distance yourself from what type of engagement you're talking about. It's unfortunate, and again, it's not new. We've talked about this many times on this show.

Speaker 1: And we are not yet done with Meta. The company, and another one called Sama, S-A-M-A, which Meta contracts with in order to run content moderation operations on Facebook in Africa, have been named in a lawsuit in Kenya, and the plaintiff, Daniel Motaung, says that Sama violated Kenya's laws around employee health, safety, and privacy. So content moderation on Facebook is a really tough gig, and in some regions it can be downright traumatizing, because it's your job to look through stuff that gets flagged on Facebook and figure out, okay, does this in fact violate Facebook's policies? And in some cases it is incredibly evident that it violates policies. But in the process of reviewing the content, you're exposed to some really dreadful stuff. Motaung says that the first piece of content he remembers moderating had a video of a beheading in it. Now, I know that if I were exposed to that kind of content, it would definitely have a massively negative psychological impact on me, to put it lightly. Motaung says that Sama deceived employees. It gave them kind of a bait and switch offer.
Speaker 1: According to Motaung, the employees were told they were going to work at a call center, and then once they signed on, they found out, no, it's not a call center, you're actually going to do content moderation on Facebook. Motaung also says that Sama fell far short of Kenya's requirements for employers to offer sufficient mental health resources to their employees. And this is not the first time we've seen complaints relating to the mental health of people who are tasked with the job of content moderation. So we'll have to see what the outcome of this particular lawsuit is, and whether or not it will precipitate any meaningful change in the process of content moderation and in how the companies that are tasked with doing that are held accountable for employee welfare.

Speaker 1: Okay, now we'll add Apple into the mix of stories here, while still, by the way, also heaping criticism on Meta and Facebook, or at least pointing out something that's troubling. Bloomberg reports that both Meta and Apple have handed over user data to hackers who fraudulently submitted emergency data requests. Now, tech companies generally try to hold off on just handing user data over to authorities, as a means of establishing trust with users. Right? If you find out that a particular platform is frequently sharing user information with any agency out there, it's probably going to give you the heebie-jeebies. Well, when companies are compelled by law, they will do it. I mean, obviously they don't want to break the law, and an emergency data request represents an urgent need for information. It could be a case where a person has gone missing, for example; it could literally be life or death. So emergency data requests, unlike other types of authority data requests, do not require a court order. And a hacker group called Recursion Team is thought to be at least partly responsible for these fraudulent requests.
Speaker 1: Now, on the one hand, it's easy to accuse companies like Apple and Meta of not doing due diligence to ensure that incoming requests are actually legitimate. But on the other hand, the very nature of emergency requests means that a speedy response can be absolutely critical. I think in this particular case the real problem lies in the process more than with Apple's and Meta's actions, and the fact that it was possible for hackers to compromise that process is something we should really look at.

Speaker 1: Over in the UK, a case involving a Twitter user reminded me of the limits of free speech and how they are different in different parts of the world. Joseph Kelly was found guilty of sending a quote unquote grossly offensive tweet, and a judge subsequently sentenced Kelly to one hundred fifty hours of community service. So you might wonder, well, what was this tweet that merited that kind of sentence? Well, it had to do with Sir Tom Moore, a man who, leading up to his one hundredth birthday, was doing laps around his garden. He did a hundred laps around his garden, and it was all a way to raise money for the UK's National Health Service in the early days of the pandemic. Sir Tom passed away not too long afterward, and one day after his passing, Kelly posted his tweet, which read "the only good Brit soldier is a deed one, burn old fella burn," because he's, you know, tweeting in a Scottish accent, which I will not do because I can't. Now, I think anyone could agree that that tweet was at the very least in very poor taste. You had a country in mourning, because Sir Tom had really inspired a lot of people, and then Kelly ends up saying this very, you know, unsympathetic thing. Now, the question is, did that break the law? Specifically, did it break Section 127 of the UK's Communications Act?
Speaker 1: Now, when that Act was passed, that section was originally meant to create accountability for people who were doing stuff like making obscene telephone calls. That's what it was meant to refer to. But in the years since, it has expanded to cover social media posts as well. The "grossly offensive" part really becomes tricky, simply because you have to decide what criteria you're using to determine whether something is grossly offensive. I mean, grossly offensive is a subjective thing, right? You might be offended by something I'm not offended by, and vice versa. Kelly, by the way, deleted his message just twenty minutes after he posted it, and his lawyer argued in the trial that Kelly had made the tweet while he was intoxicated, like he wasn't, you know, sober when he did it. But none of that managed to get him off the hook, and so now he's sentenced to do a hundred fifty hours of community service. For people in the United States, that probably comes as a shock, because here we could tweet something like that and, yeah, we might be called out for being insensitive, or, you know, having really bad taste, or just being tacky, or whatever it may be. But you wouldn't expect anyone to be held accountable and have to do community service in return for doing that. Now, in the UK there is a new bill that will become law later on; it will be coming into effect soon. It's called the Online Safety Bill, and it will end up replacing the UK's old Communications Act, so there will be a new set of rules. However, there are still measures in there with pretty vague language about how the nation could handle messages that are considered to be quote unquote harmful. Like, who determines what constitutes harm, and how do you determine accountability for those things? So if you are in the UK and you can't tweet something nice, don't tweet anything at all, I guess.

Speaker 1: Okay, I have a couple more stories that are less, you know, vitriolic, but we're going to take another quick break and we'll be right back.

Speaker 1: All right, let's get to the last couple of news stories for this episode. One of those is that Canadian politicians have drafted an emissions reduction plan that will require all new cars sold in Canada to be zero emission vehicles by the year twenty thirty-five. That's passenger cars, I should add. And this would put Canada on a growing list of countries that are setting similar deadlines for when car companies will no longer be allowed to sell new internal combustion engine vehicles in those countries. That list notably does not include the United States. There is no federal mandate that follows this trend, but there have been a number of states that have set their own deadlines for that. And honestly, once you get to a certain tipping point, there becomes a movement within the automotive industry where you would expect everyone to switch over to electric, or some other zero emission vehicle design, anyway, because it would just make more sense from a manufacturing standpoint to go that way rather than divide things up. So it may be that we don't ever see the US create a similar national policy, but if enough states follow that trend, then the effect will be the same. And like I said, Canada's policy only applies to passenger cars. Industry vehicles, like things that are being used for enterprise purposes, will have a longer timeline to convert over to zero emissions, which makes sense. I mean, if you're talking about things like heavy duty hauling vehicles or really strong construction vehicles, you're talking about stuff that has power needs that might not be met with the current zero emission systems in place. So that does make a little more sense.
Speaker 1: But yes, we are seeing another country say no more internal combustion engine vehicles here, at least no new ones after a certain date.

Speaker 1: And finally, some security researchers demonstrated that it's possible to hack into a communications satellite and broadcast a video feed to a large region. Now, to be clear, the researchers did this with permission, so it's not like they were secretly hacking into a satellite feed and taking it over and creating pirate satellite television. But they were given the opportunity to attempt to access a Canadian satellite that was no longer going to be used. It had passed out of its useful life expectancy, but it had not yet been transferred to a graveyard orbit. I talked about this briefly earlier this week in an episode about orbits. A graveyard orbit is an orbit where you push stuff when it's no longer useful, and that gets it out of the way so you can put more useful stuff in that orbit. And it's typically an orbit that just isn't really well suited for any practical purposes here on Earth. So pushing a communication satellite out to a graveyard orbit means it would no longer really align properly to transmit back to Earth. So if they had waited longer, this really wouldn't have been a possibility. But because that satellite was no longer in service but still reachable, it also meant there were no competing signals being sent to that satellite. So if you could send a signal to the satellite, it could then beam it back down to Earth. Now, accessing the satellite required using an uplink facility. This is essentially a place that has a powerful satellite dish antenna and a really powerful amplifier, and it means sending the right kind of signal, one that's strong enough to reach the satellite in question. You couldn't just do this with a simple radio antenna or something like that.
Speaker 1: You have to have a very concentrated, powerful beam of signal to go up and reach the satellite, and then the satellite did in fact beam that signal back down to Earth. So the researchers showed there are no actual security measures in place on the satellites themselves. There's no, like, password or authentication or anything like that. If you are capable of sending the signal to the satellite, then it will just do its job and send it back down to Earth. Now, you can kind of understand why there aren't any real protective measures on the satellites themselves, because in order to even get this to work, you first have to have access to something like an uplink facility, and that is not an easy thing to do. It's not like you can go to your local electronics store and buy a consumer electronics version of a massively powerful transmitter and amplification system. Plus, you'd have to have a way to identify where the satellite is and target it and track it. But the researchers showed it was at least possible, and in fact it's not even that difficult once they had access to the uplink center. So hackers could potentially get access to an uplink center and cause problems that way. And they also pointed out that, just like in the past, it's still possible to hijack a working communication satellite, one that's still in service, as long as you send a signal that's stronger than the official one. And in fact, this has happened in the past. In the mid nineteen eighties, if you lived on the East Coast and were a subscriber to HBO, it's possible that you witnessed this yourself, because, and I think it was nineteen eighty-six, there was a disgruntled technician working at an uplink facility in Florida who used that facility to override the official New York facility's HBO signal to a particular communication satellite.
Speaker 1: So, in other words, you have this facility in New York that's beaming the HBO feed up to a communication satellite, which is beaming that back down to Earth on the East Coast of the United States. This person in Florida decides, I'm going to use the Florida uplink center to override that signal. I'll just send a stronger signal to that satellite and then I'll have control. The technician, going by the name Captain Midnight, took over a few minutes of airtime on HBO and used it to, drumroll please, complain about how expensive it was to get HBO added on to consumer satellite services. Good use of time. Anyway, things haven't really changed that much since the nineteen eighties. It's still possible to take over a satellite by sending a stronger signal to that satellite, although you also run the risk of damaging a satellite in the process if the signals get to be too strong. This is not that different from how radio works. In fact, radio works exactly the same way. We saw that, and so did television; we saw that in the nineteen eighties as well, with the infamous Max Headroom incident. That was over-the-air broadcast, not satellite, but yeah, it's the same sort of thing. If you're able to send out a stronger signal than the official one over a particular frequency, then that's what people are going to get. That's how pirate radio can be a thing, and it's illegal. Anyway.

Speaker 1: In an era of state sponsored hacker groups and propaganda campaigns, this knowledge raises some troubling possibilities. Like, you could easily imagine a scenario where a country uses its own uplink facilities to target a satellite serving a nearby region that happens to be an adversary of that country, and to take over that satellite and broadcast propaganda, or even shut it down. You can easily imagine that, and the fact that there aren't these security measures on the satellites themselves makes that a possibility.
Speaker 1: Or you could even have state sponsored hackers trying to get access to uplink centers that are in other countries and achieve the same goal. So maybe this will lead to changes in security around satellites. I think making sure that the uplink centers are really secure is important, because again, if you don't have access to an uplink center, you're not going to send a signal strong enough in the first place to make it an issue. So protect those first. But I think it also might be time to start, you know, figuring out security for the satellites themselves.

Speaker 1: Okay, those are the news stories I chose to cover on Thursday, March thirty-first, twenty twenty-two. Like I said, there were a ton more, but yeah, I was already getting pretty grouchy, as you can tell, and I figured that this was a good mixture to share with all of you. If you have suggestions for topics I should cover in episodes of TechStuff, reach out to me on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.