Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Tuesday, December seventh, twenty twenty one.

Speaker 1: Rohingya refugees, survivors of a genocide campaign in Myanmar, have brought two class action lawsuits against Meta, you know, the company that used to be known as Facebook. One of those class action lawsuits is in the United States and the other one is in the United Kingdom, and together the lawsuits are seeking around a hundred fifty billion dollars in damages. I don't know why I put a D at the end of billion there. The Rohingya are a Muslim people, and extremist Buddhists in Myanmar, encouraged by the country's government, had sought to wipe out those people. Uh, the atrocities are just absolutely horrifying. Anyway, the lawsuits claim that Facebook exacerbated an already deadly situation in Myanmar, facilitating the spread of hate speech and misinformation, and that this contributed to various users in Myanmar growing more extreme and organizing acts of unspeakable violence and cruelty and brutality. They further allege that Facebook did nothing to remove posts from Myanmar's government offices that essentially called for the extermination of the Rohingya people. Further, the lawsuit alleges that Facebook's algorithm, which is the one I was talking about in yesterday's episode of TechStuff, encouraged people to join groups that aligned with user beliefs. So let's say there's someone in Myanmar who joins Facebook, and this person already harbors negative views about the Rohingya people. Facebook's algorithm starts to recommend groups for this person to join, and these are groups filled with others who harbor racist or xenophobic beliefs about the Rohingya. The user joins the group and becomes part of an echo chamber that escalates matters, driving people who are already prejudiced toward extremism and adding fuel to an already dangerous fire.
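To make that recommendation loop concrete, here's a toy sketch in Python. This is not Facebook's actual system; the group names, topics, and scoring are invented purely to illustrate how ranking groups by affinity with a user's existing engagement keeps feeding them more of the same.

```python
# Toy sketch of an engagement-driven group recommender. This is NOT
# Facebook's actual code; the data and scoring are invented to show how
# optimizing for affinity with existing beliefs can build an echo chamber.

def recommend_groups(user_topics, groups, top_n=2):
    """Rank groups by how much they overlap with what the user already engages with."""
    def affinity(group):
        return len(user_topics & group["topics"])
    return [g["name"] for g in sorted(groups, key=affinity, reverse=True)[:top_n]]

groups = [
    {"name": "Grievance Group", "topics": {"anti-minority", "nationalism"}},
    {"name": "Conspiracy Corner", "topics": {"anti-minority", "conspiracy"}},
    {"name": "Garden Club", "topics": {"gardening", "cooking"}},
]

# A user who already engages with hateful content is steered toward
# more of it, and each group joined reinforces the loop.
print(recommend_groups({"anti-minority"}, groups))
# ['Grievance Group', 'Conspiracy Corner']
```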
Speaker 1: And meanwhile, Facebook was profiting off the whole thing, because, as I mentioned in yesterday's episode, the whole business purpose of Facebook is to keep as many people on the platform for as long as possible to serve up as many ads as possible. So the lawsuit claims Facebook execs didn't care that the platform was contributing to a hostile situation, to put it lightly, because it was a profitable situation, at least in the company's ledger. The plaintiffs wrote, quote, to maximize engagement, Facebook does not merely fill users' news feeds with disproportionate amounts of hate speech and misinformation. It employs a system of social rewards that manipulates and trains users to create such content. End quote. The complaint goes on to say that Facebook was made aware of the anti-Rohingya material on the site as far back as two thousand thirteen, but failed to do anything about it. Here in the US, the plaintiffs hope to have the matter tried under Burmese law. Now, the reason for that, you know, under Burmese law instead of US law, is because the United States famously has the Communications Decency Act, which contains Section two thirty. Uh, that's the infamous section that grants platforms immunity from liability for the stuff that their users post, at least under certain situations. The case law on the matter is a little bit muddled.

Speaker 1: Meanwhile, some scientists and academics have sent an open letter to Meta slash Facebook and to Mark Zuckerberg, calling on the company to share research that staff have done to determine the mental health impact that products like Instagram, Facebook, and WhatsApp can have on the young, and by young, I mean like kids to teenagers. The group also asks that Meta seek out an independent third party to review that research to help protect against bias, because obviously any company that conducts research on itself has a potential conflict of interest going on there.
Speaker 1: This follows the news reports about leaked internal documents from the company that indicated Instagram use might be mentally harmful, particularly for young women, and that it could potentially lead to problems like body image issues, eating disorders, and anxiety. The documents obviously caused a lot of concern, but at the same time, the group of scientists says that without independent scrutiny, the methodology might not be rigorous enough to give insight into the scope of the problem, or even, you know, if there is a problem, because it's possible that those who were surveyed for the internal research were already dealing with mental health issues independent of their usage of Facebook's products. Maybe those products exacerbated existing issues, maybe they created them. It's impossible to say based upon the limited research. So the point that scientists are making is that it is a matter of concern and should be investigated more thoroughly and with transparency, and to behave otherwise is unethical, irresponsible, and potentially cruel. The letter points out that platforms like Facebook have a global reach, and thus any effects they might have upon mental health are a worldwide concern. But because big tech companies like Meta slash Facebook rarely share research with the scientific community, it is impossible to ascertain if an effect is present and, if so, to what extent. No word on if Zuckerberg has written back yet.

Speaker 1: Now let's give Facebook a little bit of credit in our next news story. And this news story is actually from last week, but it broke after we had published our news episode, and it's about how Facebook identified and removed a disinformation campaign that was originating out of China. Now, Facebook reps didn't go so far as to say that this was a state-backed disinformation campaign, but it was a campaign conducted by state-backed companies and it was amplified by state-backed media.
Speaker 1: But you know, Facebook left a little bit of wiggle room there before saying, like, that the Chinese government itself ordered the campaign. It's kind of like when you say the word allegedly before you say something about someone. I do that a lot on this show. Anyway, the campaign was centered on a Swiss biologist named Wilson Edwards, who had posted a claim that the United States government had resorted to quote, enormous pressure and even intimidation, end quote, to force scientists around the world to blame China for the origin of the coronavirus, essentially to implicate China as being the place where the coronavirus first emerged. Essentially, the campaign was saying that the scientific community had only directed any attention towards China at all because the US government was forcing it to. A bunch of accounts on Facebook emerged and engaged with this content, liking it and reposting it, and then the Chinese media picked up the story and ran with it. Only it turns out there is no Swiss biologist named Wilson Edwards, and hundreds of those Facebook accounts were created all around the same time. They were all, like, posting at close to the same time, sometimes with identical messages. In one case, an account accidentally posted the directions for how to spread this disinformation, showing that the whole thing was in fact a manufactured story. Facebook subsequently removed more than six hundred accounts connected with this attempt, and security officials say that this was pretty par for the course as far as Chinese disinformation campaigns go, uh, saying that it was pretty clumsy, it was shoddy work, and that these campaigns lack the sophistication and effectiveness that we typically see with Russian disinformation campaigns. So, in other words, the Chinese have room for improvement when it comes to spreading misinformation, and I have no doubt they will work hard to get up to speed with, you know, like, Russia.
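The tells described here, hundreds of accounts created around the same time and posting identical messages, are exactly the kinds of signals that simple coordination heuristics can flag. Below is a minimal sketch of that idea; the data, thresholds, and function names are invented for illustration, and real coordinated-inauthentic-behavior detection is far more sophisticated than this.

```python
from collections import Counter
from datetime import datetime, timedelta

# Minimal sketch of two coordinated-inauthentic-behavior signals:
# many accounts created in a narrow window, posting duplicate text.
# Data and thresholds are invented for illustration only.

accounts = [
    {"id": 1, "created": datetime(2021, 7, 1, 9, 0), "posts": ["US pressured scientists"]},
    {"id": 2, "created": datetime(2021, 7, 1, 9, 3), "posts": ["US pressured scientists"]},
    {"id": 3, "created": datetime(2021, 7, 1, 9, 5), "posts": ["US pressured scientists"]},
    {"id": 4, "created": datetime(2020, 2, 14, 12, 0), "posts": ["nice weather today"]},
]

def flag_coordinated(accounts, window=timedelta(hours=1), min_cluster=3):
    suspicious = set()
    # Signal 1: clusters of accounts created within a narrow time window.
    for a in accounts:
        cluster = [b for b in accounts if abs(b["created"] - a["created"]) <= window]
        if len(cluster) >= min_cluster:
            suspicious.update(b["id"] for b in cluster)
    # Signal 2: identical messages posted by several distinct accounts.
    text_counts = Counter(p for a in accounts for p in set(a["posts"]))
    for a in accounts:
        if any(text_counts[p] >= min_cluster for p in a["posts"]):
            suspicious.add(a["id"])
    return sorted(suspicious)

print(flag_coordinated(accounts))  # [1, 2, 3]; account 4 behaves organically
```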
Speaker 1: On a related note, Twitter and Meta slash Facebook have revealed that both of these companies have removed thousands of accounts connected to various state-backed disinformation campaigns and coronavirus misinformation campaigns. Twitter has removed around two thousand accounts that were spreading misinformation about human rights issues in China. That is, these were Twitter accounts that were pushing the state-backed messaging that there are no human rights issues going on in China. And Meta has removed three other coordinated inauthentic behavior, or CIB, operations in addition to the one I just covered a second ago. So that whole misinformation campaign about the United States pressuring the scientific community, that would be one of the coordinated inauthentic behavior, or CIB, operations. Meta also said that some campaigns involved creating a ton of accounts and then mass reporting a target account in an effort to get Meta to remove it. So, for example, let's say that there's an activist in Vietnam. They might find that their account has been targeted by one of these mass reporting campaigns in an effort to silence them. So, in other words, some group starts creating a bunch of accounts and uses all the accounts to report the activist with the goal of getting them banned or removed. Meta also identified another form of harassment it calls brigading. This is when a bunch of folks... some are brigading, I guess. Uh, this is when a bunch of folks, sometimes hundreds or thousands of people, coordinate to harass targets in an effort to silence them. It's kind of like... we've seen this before where it wasn't necessarily an organized campaign, things that have grown organically where a bunch of people pile on and start harassing a specific person.
160 00:10:18,679 --> 00:10:21,960 Speaker 1: I remember, like during the Me Too movement that was 161 00:10:22,400 --> 00:10:24,960 Speaker 1: really ugly, like there were a lot of women getting 162 00:10:24,960 --> 00:10:29,400 Speaker 1: targeted by groups of people who kind of organically, we're 163 00:10:29,480 --> 00:10:31,680 Speaker 1: doing this, Well, this is a more coordinated effort to 164 00:10:31,679 --> 00:10:35,240 Speaker 1: do that, one that potentially state backed, so it's a 165 00:10:35,240 --> 00:10:39,840 Speaker 1: mass intimidation tactic. Meta pointed out a situation that emerged 166 00:10:39,840 --> 00:10:43,400 Speaker 1: in Europe, specifically in Italy and in France, in which 167 00:10:43,480 --> 00:10:48,440 Speaker 1: anti vaxers were brigading against medical professionals and politicians and 168 00:10:48,480 --> 00:10:52,920 Speaker 1: folks like journalists. And you know, there was a time 169 00:10:53,160 --> 00:10:55,560 Speaker 1: when I really thought the idea of social networks, like 170 00:10:55,640 --> 00:10:58,920 Speaker 1: online social networks, that that was a good thing, that 171 00:10:59,040 --> 00:11:02,240 Speaker 1: it was connecting people together, that was the democracy of 172 00:11:02,280 --> 00:11:04,160 Speaker 1: the web, and it would allow for the freedom of 173 00:11:04,200 --> 00:11:07,360 Speaker 1: expression on a scale never before seen. But y'all, it's 174 00:11:07,440 --> 00:11:10,360 Speaker 1: really hard for me to look at online social networks 175 00:11:10,400 --> 00:11:14,080 Speaker 1: today as anything other than toxic. It could be that 176 00:11:14,160 --> 00:11:17,640 Speaker 1: I'm just getting too old and too cynical, but yeah, 177 00:11:17,840 --> 00:11:21,559 Speaker 1: um ugly stuff. I'm glad to see that the companies 178 00:11:21,640 --> 00:11:25,520 Speaker 1: are trying to fight against that. Obviously, it will be 179 00:11:25,559 --> 00:11:29,320 Speaker 1: a never ending battle that there will always be entities 180 00:11:29,360 --> 00:11:31,880 Speaker 1: out there that are trying to use these platforms to 181 00:11:32,000 --> 00:11:35,280 Speaker 1: further their own agenda. But I'm glad to see, at 182 00:11:35,320 --> 00:11:37,800 Speaker 1: least in this case, that there are some actions being taken. 183 00:11:38,520 --> 00:11:42,440 Speaker 1: Microsoft announced that it has sees servers that the company says, 184 00:11:42,480 --> 00:11:44,439 Speaker 1: we're being used by a group of hackers based out 185 00:11:44,440 --> 00:11:49,200 Speaker 1: of China. Microsoft has called the group Nickel, and the 186 00:11:49,240 --> 00:11:52,959 Speaker 1: company says it had been tracking Nickel's activities since two 187 00:11:53,000 --> 00:11:56,360 Speaker 1: thousand and sixteen. The group have been focused on gathering 188 00:11:56,400 --> 00:11:59,120 Speaker 1: intelligence with these servers, primarily as a way to find 189 00:11:59,160 --> 00:12:03,439 Speaker 1: effective means to silence or compromise targets. 
Speaker 1: So you might say, well, what kind of targets? Well, you know, this is a China-based hacker group. So, as I'm sure all of you have guessed, the targets are typically people and entities that the Chinese government has viewed as being a problem, including human rights organizations and journalists. The hacker group typically performs reconnaissance on these targets to determine what kind of software and technology they are using. That gives them the information they need in order to start hunting for potential exploits that they can use to compromise the targets' devices and perhaps do surveillance or lock down the devices, et cetera. Microsoft obtained a court order allowing the company to seize the servers and redirect traffic so that it doesn't go to Nickel's hackers. Instead, the traffic will go to Microsoft, which will then use the information to study the hackers' techniques and potentially their goals. Now, does that mean that Nickel is out of commission because all these servers got seized? Not hardly. Microsoft warns that the group is still very much capable of acting against targets. This was just one part of their operations that's now been disrupted. Essentially, the intelligence-gathering part of their operations has hit a snag. So the hackers still have the tools to attack targets through malware and exploits; they just don't have the same ability to gather information about their targets as they used to, and even that is probably a temporary situation. Microsoft has become one of the leading companies crusading against hackers, particularly hackers that are financially motivated and that target companies. And you might wonder, well, how did this happen? Why did Microsoft take the torch on this one? Well, you gotta remember a lot of those companies happen to be Microsoft customers, and in some cases, Microsoft is doing this because it doesn't want its own products to be, uh, the gateway that hackers use to get access to a potential client's systems. So there's a pretty big motivator going on right there.
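Seizing infrastructure and rerouting its traffic for study is commonly called sinkholing. Here is a minimal sketch of the concept; the domains, addresses, and structure are invented for illustration and say nothing about how Microsoft's actual operation worked.

```python
# Minimal sketch of DNS sinkholing, the general technique behind
# redirecting seized command-and-control (C2) traffic to a server the
# defender controls. All domains and addresses here are invented.

SEIZED_C2_DOMAINS = {"evil-update.example", "c2-beacon.example"}
SINKHOLE_IP = "203.0.113.10"  # defender-controlled logging server

captured_requests = []

def resolve(domain, real_dns):
    """Resolve a domain, diverting seized C2 domains to the sinkhole."""
    if domain in SEIZED_C2_DOMAINS:
        captured_requests.append(domain)  # log the lookup for later analysis
        return SINKHOLE_IP
    return real_dns.get(domain, "0.0.0.0")

real_dns = {"example.com": "93.184.216.34"}
print(resolve("evil-update.example", real_dns))  # 203.0.113.10 (sinkholed)
print(resolve("example.com", real_dns))          # normal resolution
print(captured_requests)  # traffic the defenders now get to study
```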
Speaker 1: Alright, we have some more news stories to cover in this episode. Before we get to that, let's take a quick break.

Speaker 1: We're back. So about a year ago, word got out that a Russian-backed hacker group... and yeah, I know, there's a lot about hackers in this episode. Not my fault, they just keep making the news. Anyway, a year ago, this Russian-backed hacker group, actually two groups, had managed to compromise a product from the company SolarWinds, and through an exploit, were able to target around one hundred of SolarWinds' high-profile clients. They were able to infect thousands of clients, but they were specifically targeting around a hundred of them, which included like nine federal agencies in the United States. The hackers were able to create what was called a supply chain attack. So they were focusing not on their ultimate targets, right, that list of one hundred organizations. Instead, they were focusing on a trusted vendor that all of those targets were using. They all were buying software from SolarWinds. So the thought the hackers had was, well, if we can infect the software that everyone is purchasing, then we can infiltrate all these systems without having to target each one individually. By compromising the vendor, the hackers were able to exploit the trust that these companies had in that vendor and then use maliciously altered products to gain access to the targeted systems. Now, the hacking groups responsible for that attack are targeting cloud solution providers, or CSPs. These are companies that, as the name says, provide cloud computing solutions to customers. So the idea is: infiltrate the CSP, and, you know, make sure you're hiding from detection, and then you can snoop on that CSP's customers. You can potentially develop techniques to infiltrate systems that way. So much of companies' work nowadays has moved to the cloud, so it's another way to get access to targets without having to focus on them individually.
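A standard defense against this kind of tampering is to verify that a downloaded update matches a cryptographic hash or signature the vendor publishes separately. Here's a minimal sketch of that check; the data is invented, and it's worth noting the real SolarWinds compromise happened inside the vendor's own build process, before signing, which is exactly why downstream checks alone weren't enough.

```python
import hashlib

# Minimal sketch of verifying a vendor update against a pinned hash
# published out of band. Data here is invented for illustration; a
# supply chain attack that tampers with the build *before* the vendor
# signs and hashes it, as with SolarWinds, defeats this check.

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_update(update_bytes: bytes, expected_sha256: str) -> bool:
    """Refuse to install an update whose hash doesn't match the pinned value."""
    return sha256_of(update_bytes) == expected_sha256

clean_build = b"vendor software v1.2.3"
tampered_build = clean_build + b" + hidden backdoor"

pinned_hash = sha256_of(clean_build)  # value the vendor publishes separately

print(verify_update(clean_build, pinned_hash))     # True: safe to install
print(verify_update(tampered_build, pinned_hash))  # False: reject
```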
Speaker 1: Security researchers have pointed out that the hacker groups are quick to develop and adopt new techniques, which makes them particularly difficult to combat. It's also a drastic contrast with the shoddy disinformation campaign out of China we talked about earlier. They're night and day, really.

Speaker 1: Here in the United States, there's an ongoing political battle that frames itself as a freedom of speech battle, and what it really comes down to is whether social networking platforms have the right to moderate content that is objectionable. Now, I don't want to get too political on here. First of all, I think everyone here knows how I lean politically. It's not really a secret, and I don't want to impart my bias on this story. But essentially, this was, ah... there was a law in Texas that was proposed and passed and signed into law, created by conservatives who objected to platforms like Facebook and Twitter that were removing posts and banning accounts belonging to mostly conservative politicians who were adamant on passing on misinformation on everything from the elections to COVID-nineteen. Texas passed a law that said platforms would be banned from doing that. They were saying that the platforms were effectively censoring, uh, individuals. Well, now a federal judge has placed an injunction against that law, meaning it cannot be enforced, saying that the law, somewhat hypocritically, effectively censors the social network platforms when it says they're not allowed to moderate user-submitted content. The judge said that the law is quote, replete with constitutional defects, including unconstitutional content and speaker based infringement on editorial discretion, and onerously burdensome disclosure and operational requirements, end quote. Now, according to Ars Technica, the law as it was written wouldn't apply to conservative social networks like Parler and Gab, because this law specifically targeted platforms that had at least fifty million monthly users, and Parler and Gab are below that threshold.
Speaker 1: One Texas senator proposed lowering that to twenty-five million monthly users, and that would have in fact lumped Gab and Parler in with all the others, but the Senate rejected that proposal, which does make it sound like this was less about guaranteeing the right of free speech for conservatives and more about restricting the moderating policies of larger social networks.

Speaker 1: Over in the UK, the High Court has ruled that Uber's business model is unlawful and the company must comply with rules set out by the Private Hire Vehicles Act. That act actually predates Uber; it was first passed into law back in nineteen ninety eight. UK courts had previously rejected Uber's claim that drivers are contractors and not workers. According to UK courts, they are workers, and thus Uber is responsible for meeting obligations as an employer. This new ruling says that Uber's attempts to have its contractual approach made lawful are just done and dusted. It's a no-go, which means Uber will need to make fundamental changes in how it operates in the UK in order to be a legal business. See, part of Uber's argument was that when a passenger uses Uber to book a ride, Uber acted only as, like, an agent to connect a driver to a passenger and then handle payment, and that's it, and that the actual contract of passage would just be between the driver and the passenger and Uber would be out of the picture. But the court says, no, mate, that's not how it works, innit? Uber is part of the contract and does have the legal obligations relating to that contract. I don't know why they talk like that. And so now ride-hailing companies like Uber, at least in London, will have to pay more taxes per trip, which likely means we're gonna see price hikes for Uber in London in the near future.
Speaker 1: It might also mean that Uber will reevaluate operating in the UK at all, because it could be that, you know, without having these loopholes, or at least in this case, uh, illegalities in place, Uber will be unable to, quote unquote, disrupt... I mean, compete with taxi services. So we'll have to see.

Speaker 1: The company Samsung is making a big change, but it's just an internal change. The company is dividing into two internal divisions, so still one company, but two internal divisions. One is going to focus on semiconductors and the other is going to focus on consumer electronics like televisions and phones. Each division will have its own CEO. So this organizational change will likely mean that we won't see much change on the outside, at least nothing that's really obvious. But the thought is this will allow each internal division to have a more unified and focused strategy in its individual market, rather than having to craft an overall strategy for a large company with very different divisions. When you do that, when you're trying to create big strategies for companies that have a lot of diverse departments, it can mean that you end up with a strategy that's okay, but not really great for any one division, because you can't prioritize anything. If you do, then you end up leaving others behind. So this is a way to really prioritize each of those divisions and focus on them without having to worry about leaving something else behind.

Speaker 1: Canon, the camera company, has announced it has developed an image sensor that can take high-quality photographs even in the dark. Now, that's a pretty huge development. It's a tricky one because image sensors are, you know, by definition, reliant on light. The sensor can reportedly capture color photographs, and it uses a technology called a single-photon avalanche diode. Now, I would love to tell you how that works, but honestly, it's beyond me.
Speaker 1: Apparently it acts kind of like an amplifier, except instead of taking an incoming electric signal and then boosting it with a transistor or vacuum tube, this tech can take a single incoming photon and use it to generate a large number of electrons. The company plans to enter mass production with the sensor late next year, and I'm really curious to learn more about this tech, though I can't claim I'll actually understand how it works any better than I do right now, which is to say I don't, I don't, I don't.

Speaker 1: In a previous news episode, I talked about how NASA launched a special spacecraft that will act as kind of like a battering ram in an effort to move an asteroid out of its orbital path around a different asteroid. This asteroid is not on any sort of collision course with the Earth, but the project is a test to see if this would be a working strategy to deflect any incoming meteor or asteroid so that it doesn't hit the Earth in the future. Well, adding onto that story is the fact that NASA has unveiled an impact monitoring algorithm called Sentry-II, and yes, it is replacing an earlier algorithm called Sentry. So the purpose of this algorithm is to analyze the paths of NEAs, that stands for near-Earth asteroids, and then calculate impact probabilities, like what is the probability that one day this particular asteroid will collide with the Earth. Now, as we know, the Earth has been hit with massive objects from space before, so it's really just a matter of time before it happens again. However, that matter of time could be measured in millions of years. Still, it's best to know about these things as early as possible to give us the best chance to prevent an impact from happening. Sentry-II is meant to calculate those risks to give scientists enough warning time to develop a plan should any NEA have a high probability of colliding with the Earth.
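The core idea behind impact monitoring can be illustrated with a toy Monte Carlo estimate: sample the uncertain close-approach distance many times and count how often the asteroid comes within one Earth radius. This is not Sentry-II's actual algorithm, and every number below is invented; it just shows what turning orbital uncertainty into an impact probability means.

```python
import random

# Toy Monte Carlo impact-probability estimate. NOT NASA's Sentry-II
# algorithm; all numbers are invented to illustrate the general idea of
# converting orbital uncertainty into a probability of collision.

EARTH_RADIUS_KM = 6371.0

def estimate_impact_probability(mean_miss_km, uncertainty_km, trials=100_000):
    """Sample the predicted miss distance and count how often Earth is hit."""
    hits = 0
    for _ in range(trials):
        # Model the close-approach miss distance as normally distributed.
        miss = random.gauss(mean_miss_km, uncertainty_km)
        if abs(miss) <= EARTH_RADIUS_KM:
            hits += 1
    return hits / trials

# A hypothetical NEA predicted to pass 50,000 km away, give or take 30,000 km.
print(f"impact probability ~ {estimate_impact_probability(50_000, 30_000):.4f}")
```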
Speaker 1: Of course, this algorithm will only work with the NEAs we know about, and a lot of NASA's other work in this field is all about expanding our knowledge of the neighborhood and looking out for other possible planet party crashers.

Speaker 1: Finally, I've talked on the show several times about quantum computing and how it has the potential to revolutionize how we deal with certain computational problems. I've also mentioned that it could, with the right algorithmic approach, spell an end to modern cryptography. Well, now a company called Cambridge Quantum is making hay while the sun shines, announcing that it has developed a way to generate superior cryptographic keys. Now, from what I can tell, these are still traditional cryptographic keys. They're just a more robust version with a lot more randomness thrown in. So random number generation is a very tricky thing to do with machines. Because machines rely on sets of instructions in order to process operations, it is hard to write instructions that create randomness. Usually we just manage pseudo-randomness. In fact, you could argue that instructions are kind of anti-random, right? They are ordered. Anyway, these are not quantum cryptographic keys. I did once see a presentation about quantum cryptography that was so fascinating and incomprehensible that I cannot even start to explain it here. So this is not the cryptography of the future necessarily. However, the company does imply that its platform will generate cryptographic keys that are more secure than the ones that we typically use today, and will also be able to generate them at a rate faster than other methods. So that's pretty cool.

Speaker 1: And that's the news for Tuesday, December seventh, twenty twenty one. If you have suggestions for topics I should cover in future episodes of TechStuff, please reach out to me. The best way to do that is over on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production.
Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.