1 00:00:04,360 --> 00:00:12,360 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,400 --> 00:00:16,000 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:16,040 --> 00:00:20,119 Speaker 1: I'm an executive producer with iHeartRadio and how the tech 4 00:00:20,160 --> 00:00:23,000 Speaker 1: are you. It's time for the Tech News for Tuesday, 5 00:00:23,079 --> 00:00:29,080 Speaker 1: March fourteenth, two thousand twenty three, and first up, Happy 6 00:00:29,200 --> 00:00:34,160 Speaker 1: Pi Day, everybody. That's three fourteen, Pi Day, at least 7 00:00:34,200 --> 00:00:38,280 Speaker 1: the way we do dates here in the United States. 8 00:00:38,640 --> 00:00:41,319 Speaker 1: All right, today we're going to be looking at a 9 00:00:41,320 --> 00:00:44,240 Speaker 1: lot of updates to stories that we've been following for 10 00:00:44,280 --> 00:00:47,559 Speaker 1: a while. I should probably start by saying that the 11 00:00:47,640 --> 00:00:52,440 Speaker 1: collapse of Silicon Valley Bank, or SVB, and the US 12 00:00:52,520 --> 00:00:57,320 Speaker 1: government's response to guarantee depositors access to their funds continues 13 00:00:57,360 --> 00:01:01,800 Speaker 1: to arguably be the biggest story right now. 14 00:01:02,200 --> 00:01:05,960 Speaker 1: There are countless articles and explainers on it, including yesterday's 15 00:01:06,000 --> 00:01:09,480 Speaker 1: episode of this very podcast. So if you want to 16 00:01:09,520 --> 00:01:12,399 Speaker 1: know more, you can listen to yesterday's episode. I talk 17 00:01:12,480 --> 00:01:15,319 Speaker 1: all about the bank and what led to the collapse.
18 00:01:15,680 --> 00:01:17,679 Speaker 1: I will say that it was pretty interesting to see 19 00:01:17,720 --> 00:01:21,039 Speaker 1: a bunch of important folks in tech and investment cry 20 00:01:21,040 --> 00:01:24,320 Speaker 1: out for a bailout when some of these folks are 21 00:01:24,720 --> 00:01:29,479 Speaker 1: the same ones who typically criticize those kinds of decisions. 22 00:01:30,120 --> 00:01:32,480 Speaker 1: I guess when it's your money, it's important enough for 23 00:01:32,520 --> 00:01:35,040 Speaker 1: a bailout, but if it's someone else's money, they should 24 00:01:35,040 --> 00:01:38,800 Speaker 1: have known better. Anyway, I'm grateful that the relief provided 25 00:01:38,840 --> 00:01:41,399 Speaker 1: by the government means that folks who work for companies 26 00:01:41,600 --> 00:01:46,119 Speaker 1: that banked with SVB can continue to be paid, because 27 00:01:46,160 --> 00:01:49,040 Speaker 1: goodness knows, none of this is their fault. Also, to 28 00:01:49,560 --> 00:01:53,520 Speaker 1: those same tech billionaires who helped fuel the panic that 29 00:01:53,640 --> 00:01:56,680 Speaker 1: led to the collapse in the first place, you're all idiots. 30 00:01:57,320 --> 00:01:59,520 Speaker 1: I know that's just an opinion, but at this point 31 00:01:59,880 --> 00:02:01,520 Speaker 1: I feel like I could say it like it's a fact: 32 00:02:01,840 --> 00:02:06,200 Speaker 1: they fueled the crisis that they were scared of. Like, 33 00:02:06,240 --> 00:02:10,480 Speaker 1: they are the reason why it happened. Their attempts to 34 00:02:10,680 --> 00:02:14,720 Speaker 1: avoid the problem created the problem. All right, enough of that. 35 00:02:15,200 --> 00:02:18,720 Speaker 1: Earlier today, news broke that Meta plans another round of layoffs, 36 00:02:19,280 --> 00:02:24,040 Speaker 1: this time affecting around ten thousand staff. Last year, Meta 37 00:02:24,160 --> 00:02:28,120 Speaker 1: laid off eleven thousand employees.
These new layoffs are going 38 00:02:28,200 --> 00:02:31,720 Speaker 1: to stretch over the next few months. Mark Zuckerberg explained 39 00:02:31,720 --> 00:02:35,360 Speaker 1: that the company is restructuring and will be quote focused 40 00:02:35,400 --> 00:02:41,760 Speaker 1: on flattening our orgs, canceling lower priority projects, and reducing 41 00:02:41,960 --> 00:02:46,240 Speaker 1: our hiring rates end quote. The earlier round of layoffs 42 00:02:46,320 --> 00:02:48,920 Speaker 1: last year already took a big swing at reducing the 43 00:02:49,200 --> 00:02:53,519 Speaker 1: vertical dimensions of Meta's hierarchy, you know, flattening it out. 44 00:02:53,760 --> 00:02:55,600 Speaker 1: So it sounds to me like this move is really 45 00:02:55,639 --> 00:02:59,880 Speaker 1: going to affect middle management a lot. Zuckerberg also warned 46 00:03:00,440 --> 00:03:03,040 Speaker 1: that he suspects the economic conditions that we're in right 47 00:03:03,120 --> 00:03:07,480 Speaker 1: now could continue for several years. Apparently, one area that 48 00:03:07,600 --> 00:03:11,320 Speaker 1: Zuckerberg is still gung ho on supporting would be any 49 00:03:11,360 --> 00:03:16,600 Speaker 1: area relating to the Metaverse initiative, including augmented and virtual 50 00:03:16,639 --> 00:03:21,720 Speaker 1: reality projects. And maybe those projects really will pay off 51 00:03:21,960 --> 00:03:24,040 Speaker 1: in the long run. And maybe I'm just a negative 52 00:03:24,080 --> 00:03:26,120 Speaker 1: nancy about this whole thing, but if you've been listening 53 00:03:26,160 --> 00:03:29,240 Speaker 1: to Tech Stuff, you know I've been worried about it. 54 00:03:29,880 --> 00:03:34,200 Speaker 1: I've been worried about the potential for success down the road, 55 00:03:34,720 --> 00:03:37,840 Speaker 1: and also worried about it creating an even deeper and 56 00:03:38,000 --> 00:03:42,720 Speaker 1: wider digital divide than what already exists.
To put a 57 00:03:42,800 --> 00:03:45,880 Speaker 1: cherry on top of all this, after this announcement went public, 58 00:03:45,960 --> 00:03:50,320 Speaker 1: Meta's stock price rose by around five percent, and here's 59 00:03:50,320 --> 00:03:54,440 Speaker 1: where you can just insert a random rant by yours 60 00:03:54,480 --> 00:03:58,040 Speaker 1: truly about how layoffs often make investors happy because they 61 00:03:58,080 --> 00:04:01,640 Speaker 1: look at it as reduced costs and thus more profit. 62 00:04:02,160 --> 00:04:04,280 Speaker 1: But I know I'm a broken record, so we'll just 63 00:04:04,480 --> 00:04:08,600 Speaker 1: keep on going. Sticking with Meta, the company has warned the 64 00:04:08,640 --> 00:04:12,480 Speaker 1: Canadian government that should the government pass the proposed Online 65 00:04:12,520 --> 00:04:17,080 Speaker 1: News Act, Meta will block links to news sites on 66 00:04:17,160 --> 00:04:20,839 Speaker 1: Instagram and Facebook in Canada. So this is similar to 67 00:04:20,880 --> 00:04:24,800 Speaker 1: a situation that unfolded in Australia. And what we have 68 00:04:24,960 --> 00:04:28,800 Speaker 1: here is a pretty complicated situation, all right. So let's 69 00:04:28,800 --> 00:04:32,279 Speaker 1: start with this proposed law. It would allow media companies 70 00:04:32,320 --> 00:04:36,719 Speaker 1: to negotiate with platforms like Facebook to secure a revenue 71 00:04:36,800 --> 00:04:40,760 Speaker 1: sharing agreement for links to news sources on Facebook. So 72 00:04:40,800 --> 00:04:44,480 Speaker 1: when links to news sources appear on Facebook, the revenue 73 00:04:44,520 --> 00:04:48,320 Speaker 1: that Facebook generates from people scrolling through all of that 74 00:04:48,680 --> 00:04:52,520 Speaker 1: would be shared, some percentage of that, 75 00:04:52,560 --> 00:04:56,200 Speaker 1: with the media companies.
The media companies are arguing that 76 00:04:56,240 --> 00:04:58,960 Speaker 1: their work is showing up on Facebook, which means the 77 00:04:59,040 --> 00:05:03,240 Speaker 1: content is being seen by Facebook users and presumably adding 78 00:05:03,360 --> 00:05:07,640 Speaker 1: value to their Facebook experience. But the media companies aren't 79 00:05:07,640 --> 00:05:10,720 Speaker 1: seeing any rev share for their content being shown on 80 00:05:10,720 --> 00:05:15,040 Speaker 1: Facebook's platform. And yes, if someone does click through the 81 00:05:15,120 --> 00:05:18,400 Speaker 1: link and goes to the article, well then the media 82 00:05:18,400 --> 00:05:21,480 Speaker 1: company gets the page impression that way, right, like 83 00:05:21,560 --> 00:05:25,039 Speaker 1: the little counter goes up by one. But the 84 00:05:25,120 --> 00:05:28,560 Speaker 1: media companies are saying that they're providing value to platforms 85 00:05:28,560 --> 00:05:32,160 Speaker 1: like Facebook, but they're not getting the rev share back 86 00:05:32,240 --> 00:05:37,360 Speaker 1: in return, so that Meta is profiting off of the 87 00:05:37,400 --> 00:05:41,119 Speaker 1: media companies without giving anything back. And Meta's argument is, hey, 88 00:05:42,120 --> 00:05:45,640 Speaker 1: we don't post those links. That's not us. Those are 89 00:05:45,680 --> 00:05:48,720 Speaker 1: posted by our users, and we can't be held accountable 90 00:05:48,760 --> 00:05:51,159 Speaker 1: for what our users post. So as long as those 91 00:05:51,480 --> 00:05:56,479 Speaker 1: posts don't violate any of our policies, it's allowed. That's 92 00:05:56,520 --> 00:05:59,279 Speaker 1: on the user. And essentially Meta is saying that the 93 00:05:59,320 --> 00:06:03,039 Speaker 1: company isn't exploiting these news sources at all because they're 94 00:06:03,080 --> 00:06:05,520 Speaker 1: not the ones posting them.
On top of all this, 95 00:06:06,240 --> 00:06:09,320 Speaker 1: we have the very real problem of funding for journalism. 96 00:06:10,279 --> 00:06:13,120 Speaker 1: That there's a lack of funding, and that lack of 97 00:06:13,120 --> 00:06:15,800 Speaker 1: funding means that in the absence of good journalism, we 98 00:06:15,880 --> 00:06:19,920 Speaker 1: end up with bad, cheaper journalism, and that in turn 99 00:06:20,000 --> 00:06:24,039 Speaker 1: means people are less informed overall, and they're more likely 100 00:06:24,080 --> 00:06:28,960 Speaker 1: to encounter misinformation and bias and propaganda, and that collectively 101 00:06:29,200 --> 00:06:32,520 Speaker 1: this represents a decline in the public good. So, like 102 00:06:32,640 --> 00:06:37,880 Speaker 1: I said, this is a complicated situation. I honestly don't 103 00:06:37,960 --> 00:06:41,360 Speaker 1: know what the right approach is. Now, down in Australia, 104 00:06:41,480 --> 00:06:45,159 Speaker 1: Meta eventually agreed to a final version of a similar law. 105 00:06:45,800 --> 00:06:49,640 Speaker 1: Results there have been mixed. I wouldn't say that it 106 00:06:49,720 --> 00:06:52,960 Speaker 1: was a huge success on either side, and Meta seems 107 00:06:53,000 --> 00:06:56,120 Speaker 1: determined to fight similar legislation in Canada and here in 108 00:06:56,120 --> 00:06:58,600 Speaker 1: the United States. So we'll have to see where this goes. 109 00:06:59,000 --> 00:07:01,880 Speaker 1: And again, I don't know what the right way forward 110 00:07:02,000 --> 00:07:06,080 Speaker 1: is here. This one is really complicated. Okay, pop quiz time. 111 00:07:06,880 --> 00:07:12,120 Speaker 1: What do the United States, Taiwan, India, Canada, the European 112 00:07:12,240 --> 00:07:15,720 Speaker 1: Union and the Netherlands, which I understand is in the EU, 113 00:07:15,800 --> 00:07:17,720 Speaker 1: but this is a little bit different.
What do they 114 00:07:17,720 --> 00:07:21,840 Speaker 1: all have in common? Well, all of these countries and 115 00:07:22,200 --> 00:07:25,840 Speaker 1: the EU have banned TikTok to some extent on the 116 00:07:25,880 --> 00:07:30,440 Speaker 1: grounds of national security concerns. Most of those countries and 117 00:07:30,640 --> 00:07:35,120 Speaker 1: organizations have restricted the ban to government owned devices, so 118 00:07:35,280 --> 00:07:37,760 Speaker 1: that just means that government employees are not allowed to 119 00:07:37,760 --> 00:07:41,680 Speaker 1: install TikTok on their work device, but they could still 120 00:07:41,760 --> 00:07:43,760 Speaker 1: put it on their personal device if they wanted to. 121 00:07:44,320 --> 00:07:47,240 Speaker 1: India did go a little harder on this, but then 122 00:07:47,320 --> 00:07:50,000 Speaker 1: India has also been banning China based apps for a 123 00:07:50,040 --> 00:07:52,720 Speaker 1: while and has had border disputes with China in the 124 00:07:52,760 --> 00:07:55,640 Speaker 1: recent past, so you know, you can see where that 125 00:07:55,640 --> 00:08:00,560 Speaker 1: would be a more extreme stance than in other countries. Well now, 126 00:08:00,600 --> 00:08:03,560 Speaker 1: the regional government of New South Wales in Australia may 127 00:08:03,600 --> 00:08:07,040 Speaker 1: soon be joining this list of government agencies that ban 128 00:08:07,200 --> 00:08:10,600 Speaker 1: the use of TikTok on government owned devices.
The Guardian 129 00:08:10,680 --> 00:08:13,840 Speaker 1: reports that while there is a policy that requires government 130 00:08:13,880 --> 00:08:17,240 Speaker 1: employees to clear any app with their superiors before they 131 00:08:17,240 --> 00:08:20,280 Speaker 1: can install it on a government owned device, there is 132 00:08:20,360 --> 00:08:25,200 Speaker 1: no policy specifically about TikTok, but this could potentially change. 133 00:08:25,440 --> 00:08:28,560 Speaker 1: At the moment, each department within the New South Wales 134 00:08:28,640 --> 00:08:32,680 Speaker 1: government has the authority to decide its own policies, so 135 00:08:32,720 --> 00:08:35,720 Speaker 1: it's kind of a department by department basis, but there 136 00:08:35,760 --> 00:08:39,000 Speaker 1: could potentially be a government wide ban that would trickle 137 00:08:39,080 --> 00:08:43,080 Speaker 1: down to every single department. That is a possibility. Apparently, 138 00:08:43,120 --> 00:08:46,400 Speaker 1: New South Wales is also looking at the overall Government 139 00:08:46,400 --> 00:08:51,080 Speaker 1: of Australia for some guidance on this as well, so 140 00:08:51,120 --> 00:08:53,920 Speaker 1: we'll have to see if New South Wales joins the list. 141 00:08:54,520 --> 00:08:57,760 Speaker 1: Ars Technica, which is one of my favorite websites. I mean, 142 00:08:57,800 --> 00:09:01,199 Speaker 1: I'm being serious here. It is fantastic. It has 143 00:09:01,240 --> 00:09:05,360 Speaker 1: a great piece titled Why Sony says it can't trust 144 00:09:05,440 --> 00:09:10,720 Speaker 1: Microsoft's Call of Duty offer: One word, Bethesda. So this 145 00:09:10,880 --> 00:09:13,679 Speaker 1: all has to do with the planned acquisition of Activision 146 00:09:13,720 --> 00:09:18,040 Speaker 1: Blizzard by the aforementioned Microsoft.
You might remember that Microsoft 147 00:09:18,080 --> 00:09:22,160 Speaker 1: announced this planned deal last year and that the companies 148 00:09:22,200 --> 00:09:26,479 Speaker 1: had hoped to complete the acquisition by this summer. Activision 149 00:09:26,520 --> 00:09:31,400 Speaker 1: Blizzard produces several popular video game franchises, including Call of Duty, 150 00:09:32,040 --> 00:09:36,959 Speaker 1: and this has prompted Sony to protest vociferously against this acquisition, 151 00:09:37,360 --> 00:09:41,040 Speaker 1: and Sony has been lobbying regulators in the EU to 152 00:09:41,280 --> 00:09:44,640 Speaker 1: block the acquisition from happening, saying that it would be 153 00:09:44,760 --> 00:09:49,520 Speaker 1: a reduction in competition and create an unfair monopoly situation 154 00:09:49,559 --> 00:09:53,760 Speaker 1: with Microsoft. Now currently, regulators in the EU are considering 155 00:09:53,840 --> 00:09:57,600 Speaker 1: Microsoft's appeal that's designed to try and rescue this deal. 156 00:09:58,040 --> 00:10:01,480 Speaker 1: The decision on that is supposed to be made by 157 00:10:01,960 --> 00:10:06,840 Speaker 1: late April. Microsoft says that it has extended an offer, 158 00:10:06,960 --> 00:10:11,360 Speaker 1: a guarantee that Microsoft will produce Call of Duty for 159 00:10:11,400 --> 00:10:14,600 Speaker 1: Sony's own consoles and release the title on the same 160 00:10:14,679 --> 00:10:17,400 Speaker 1: day and date as when it will come out for 161 00:10:17,480 --> 00:10:20,520 Speaker 1: Microsoft's consoles as well as PCs, so in other words, saying 162 00:10:20,880 --> 00:10:24,120 Speaker 1: Sony will get equal access to the titles. There 163 00:10:24,120 --> 00:10:26,920 Speaker 1: would be nothing different about them. They won't be inferior. 164 00:10:26,960 --> 00:10:29,760 Speaker 1: They'll come out the same day.
We will put this 165 00:10:29,800 --> 00:10:32,800 Speaker 1: agreement in writing and we will guarantee it for ten years. 166 00:10:33,640 --> 00:10:38,520 Speaker 1: Sony's argument is that Microsoft has already shown its hand 167 00:10:39,080 --> 00:10:44,959 Speaker 1: with Bethesda, which is another game producer that Microsoft previously acquired, 168 00:10:45,400 --> 00:10:49,960 Speaker 1: and Sony's pointing out that Microsoft has announced that 169 00:10:50,120 --> 00:10:54,440 Speaker 1: upcoming Bethesda titles like Starfield and the next Elder Scrolls 170 00:10:54,480 --> 00:10:58,200 Speaker 1: game are going to be exclusive to PCs and Xbox consoles, 171 00:10:58,240 --> 00:11:02,120 Speaker 1: that Sony console owners will not get access to these games, 172 00:11:02,720 --> 00:11:06,719 Speaker 1: and that this shows you can't take Microsoft 173 00:11:06,720 --> 00:11:09,559 Speaker 1: at their word. Microsoft is like, hey, we never once 174 00:11:09,640 --> 00:11:12,559 Speaker 1: said that we were not going to do that with Bethesda. 175 00:11:12,679 --> 00:11:15,920 Speaker 1: We never made a promise. The EU regulators kind of 176 00:11:15,960 --> 00:11:19,319 Speaker 1: assumed that that was what Microsoft was going to do, 177 00:11:19,360 --> 00:11:21,040 Speaker 1: that it was going to have these out for all 178 00:11:21,080 --> 00:11:24,840 Speaker 1: the platforms, but Microsoft never actually said that. They're like, 179 00:11:25,080 --> 00:11:27,120 Speaker 1: but this time we are saying that, so you can't 180 00:11:27,160 --> 00:11:29,280 Speaker 1: say that we were dishonest because we never made a 181 00:11:29,360 --> 00:11:32,680 Speaker 1: claim otherwise. And in this case, we are putting down 182 00:11:32,720 --> 00:11:36,839 Speaker 1: in writing this agreement.
So it's a pretty complicated and 183 00:11:37,200 --> 00:11:40,920 Speaker 1: highly charged conversation going on, and I recommend you go 184 00:11:41,040 --> 00:11:43,240 Speaker 1: check out the piece in Ars Technica if you want 185 00:11:43,240 --> 00:11:45,640 Speaker 1: to learn more. Again, it's really well done. It's called 186 00:11:45,679 --> 00:11:48,559 Speaker 1: Why Sony says it can't trust Microsoft's Call of Duty 187 00:11:48,600 --> 00:11:52,920 Speaker 1: offer: One word, Bethesda. Okay, we're going to take a 188 00:11:52,960 --> 00:11:54,719 Speaker 1: quick break. When we come back, we've got some more 189 00:11:54,760 --> 00:12:07,720 Speaker 1: news to talk about, all right. Next up, Insider reports 190 00:12:07,760 --> 00:12:11,840 Speaker 1: that the law firm Edelson has filed a class action 191 00:12:12,080 --> 00:12:15,640 Speaker 1: lawsuit against the AI company Do Not Pay. Now, y'all 192 00:12:15,720 --> 00:12:19,520 Speaker 1: might remember that Do Not Pay's CEO Joshua Browder has 193 00:12:19,520 --> 00:12:23,640 Speaker 1: made some big claims slash publicity stunts relating to his 194 00:12:23,679 --> 00:12:27,440 Speaker 1: company's AI system, which is meant to assist people in 195 00:12:27,720 --> 00:12:32,160 Speaker 1: doing stuff like drafting legal documents and fighting parking tickets, 196 00:12:32,160 --> 00:12:34,360 Speaker 1: this kind of thing. It's been around since like 197 00:12:34,400 --> 00:12:38,560 Speaker 1: twenty fifteen. The class action lawsuit alleges that Do Not 198 00:12:38,679 --> 00:12:42,680 Speaker 1: Pay is performing legal duties without first securing a law 199 00:12:42,720 --> 00:12:45,760 Speaker 1: degree or a license. Further, that the tool is not 200 00:12:45,880 --> 00:12:49,720 Speaker 1: subjected to any supervision by a real lawyer, and that 201 00:12:49,800 --> 00:12:54,840 Speaker 1: the whole thing has not ever been barred in any jurisdiction.
202 00:12:55,400 --> 00:12:58,560 Speaker 1: Barred in this case doesn't mean prevented. It means the 203 00:12:58,679 --> 00:13:02,480 Speaker 1: service has not passed the bar to serve as legal 204 00:13:02,520 --> 00:13:05,400 Speaker 1: counsel in any jurisdiction. This is something lawyers have to do. 205 00:13:05,920 --> 00:13:08,240 Speaker 1: Just because you're a lawyer in one jurisdiction doesn't mean 206 00:13:08,240 --> 00:13:10,320 Speaker 1: that you can practice law in another one. You have 207 00:13:10,360 --> 00:13:14,840 Speaker 1: to pass the bar in that jurisdiction. So, as a result, 208 00:13:15,360 --> 00:13:18,240 Speaker 1: the law firm Edelson is saying Do Not Pay is 209 00:13:18,440 --> 00:13:21,880 Speaker 1: practicing law without a license. Browder says his company will 210 00:13:21,880 --> 00:13:24,559 Speaker 1: fight the lawsuit, and that the whole reason he believes 211 00:13:24,559 --> 00:13:26,880 Speaker 1: in the company in the first place is that he 212 00:13:27,000 --> 00:13:30,680 Speaker 1: sees lawyers as manipulating and monopolizing the legal system to 213 00:13:30,800 --> 00:13:35,560 Speaker 1: benefit themselves rather than their clients. Essentially, that lawyers exist 214 00:13:35,600 --> 00:13:39,120 Speaker 1: to make lawyers rich, and that your average citizen ends 215 00:13:39,200 --> 00:13:42,199 Speaker 1: up being the victim of this legal system, and that 216 00:13:42,280 --> 00:13:45,800 Speaker 1: Do Not Pay was meant to provide an alternative to 217 00:13:46,080 --> 00:13:49,920 Speaker 1: securing a lawyer. But then the law firm is saying, well, 218 00:13:50,040 --> 00:13:52,200 Speaker 1: that may be true, but you know, you still have 219 00:13:52,200 --> 00:13:54,959 Speaker 1: to play by the rules, and you're not, so we'll 220 00:13:55,000 --> 00:13:59,880 Speaker 1: see where this goes from here.
A ransomware group called ALPHV, 221 00:14:00,480 --> 00:14:04,880 Speaker 1: which uses malware that's called BlackCat, claims that it 222 00:14:04,920 --> 00:14:10,760 Speaker 1: has compromised Amazon's Ring, the doorbell security camera system. 223 00:14:10,840 --> 00:14:14,640 Speaker 1: It's been controversial because of reports that Amazon was sharing 224 00:14:15,200 --> 00:14:20,360 Speaker 1: video from customer Ring cameras to law enforcement without the 225 00:14:20,360 --> 00:14:24,120 Speaker 1: input of customers. There's a whole thing about that, but anyway, 226 00:14:24,160 --> 00:14:27,400 Speaker 1: Amazon itself says that it doesn't have any evidence of 227 00:14:27,440 --> 00:14:30,400 Speaker 1: a data breach in its systems, that this ransomware group 228 00:14:30,440 --> 00:14:35,360 Speaker 1: did not breach Amazon systems, but that they did attack 229 00:14:35,480 --> 00:14:39,280 Speaker 1: a third party vendor and further indicated that this vendor 230 00:14:39,400 --> 00:14:42,720 Speaker 1: does not actually have access to customer records. So apparently 231 00:14:42,760 --> 00:14:45,840 Speaker 1: there was a successful ransomware attack. It was to a 232 00:14:45,920 --> 00:14:50,840 Speaker 1: third party vendor relating to Amazon's Ring product, but according 233 00:14:50,840 --> 00:14:53,800 Speaker 1: to Amazon, it's not Amazon itself. It is not quite 234 00:14:53,840 --> 00:14:57,240 Speaker 1: clear what information the hackers were able to access. They 235 00:14:57,280 --> 00:15:00,560 Speaker 1: are threatening to leak that data should their demands not 236 00:15:00,640 --> 00:15:03,240 Speaker 1: be met. This is where I remind everyone it's always 237 00:15:03,240 --> 00:15:06,880 Speaker 1: a bad idea to pay off ransomware hackers because paying 238 00:15:06,880 --> 00:15:10,480 Speaker 1: a ransom proves that the attack is profitable.
It encourages 239 00:15:10,520 --> 00:15:15,400 Speaker 1: future attacks against you and others because it worked. Plus 240 00:15:15,440 --> 00:15:17,800 Speaker 1: there's never a guarantee that the attackers are actually going 241 00:15:17,840 --> 00:15:21,720 Speaker 1: to abandon plans to exploit compromised data down the road. 242 00:15:21,760 --> 00:15:25,960 Speaker 1: They might just engage in some double dipping, as it were. Also, 243 00:15:26,440 --> 00:15:30,200 Speaker 1: always practice good security protocol. Most 244 00:15:30,240 --> 00:15:32,920 Speaker 1: of the time, not all the time but most of the time, these attacks 245 00:15:32,960 --> 00:15:37,120 Speaker 1: come as the result of someone handing over access because 246 00:15:37,160 --> 00:15:40,960 Speaker 1: of a phishing attack or social engineering. So just practice 247 00:15:41,000 --> 00:15:44,480 Speaker 1: good security. That cuts back on a lot of these attacks, 248 00:15:44,520 --> 00:15:47,720 Speaker 1: not all of them. Some hackers are more clever and 249 00:15:47,880 --> 00:15:52,720 Speaker 1: find different ways to intrude on systems, but it will 250 00:15:52,760 --> 00:15:56,000 Speaker 1: cut back on a lot of them. India's IT Ministry 251 00:15:56,040 --> 00:15:59,760 Speaker 1: has proposed rules that will require smartphone companies and carriers 252 00:16:00,240 --> 00:16:05,120 Speaker 1: to make it possible for users to uninstall any preinstalled 253 00:16:05,120 --> 00:16:08,840 Speaker 1: apps on a smartphone, and then new smartphones will have 254 00:16:08,880 --> 00:16:11,480 Speaker 1: to be submitted to a government standards agency to make 255 00:16:11,520 --> 00:16:16,200 Speaker 1: sure that they meet whatever those security standards are. So 256 00:16:16,240 --> 00:16:18,640 Speaker 1: the reason the ministry gives for this move is related 257 00:16:18,680 --> 00:16:22,560 Speaker 1: to national security.
So the argument is that if phone 258 00:16:22,600 --> 00:16:27,840 Speaker 1: manufacturers like Apple make it impossible to uninstall certain apps, 259 00:16:28,200 --> 00:16:30,960 Speaker 1: maybe even just a small number of them, well, those 260 00:16:31,000 --> 00:16:34,160 Speaker 1: apps become prime targets for hackers because the hackers know 261 00:16:34,480 --> 00:16:38,280 Speaker 1: that everyone who has, say an iPhone, is definitely going 262 00:16:38,320 --> 00:16:42,040 Speaker 1: to have the phone app on that iPhone because 263 00:16:42,160 --> 00:16:46,320 Speaker 1: you can't uninstall it. So if you know that everybody 264 00:16:46,400 --> 00:16:48,320 Speaker 1: who has that kind of phone is going to have 265 00:16:48,320 --> 00:16:52,640 Speaker 1: the specific app, that gives you a target to aim for, 266 00:16:52,800 --> 00:16:56,160 Speaker 1: you try and find a vulnerability in that app that 267 00:16:56,280 --> 00:17:00,520 Speaker 1: you can exploit and potentially access any phone that has 268 00:17:00,560 --> 00:17:03,760 Speaker 1: that app on it. So the IT Ministry's solution is 269 00:17:03,760 --> 00:17:06,960 Speaker 1: to make every app uninstallable so that users could choose 270 00:17:07,000 --> 00:17:10,639 Speaker 1: a different app that has the exact same functionality to 271 00:17:10,680 --> 00:17:14,800 Speaker 1: handle those services. This would be a pretty big move 272 00:17:14,880 --> 00:17:17,800 Speaker 1: and it would potentially mean that, you know, in the future, 273 00:17:18,119 --> 00:17:23,360 Speaker 1: Indian iPhone owners could do something unprecedented: replace the 274 00:17:23,400 --> 00:17:26,439 Speaker 1: phone app with a third party option, not just adding 275 00:17:26,480 --> 00:17:32,440 Speaker 1: an option, but removing the native Apple phone capabilities.
Should 276 00:17:32,480 --> 00:17:35,399 Speaker 1: these rules go into effect, India will give manufacturers a 277 00:17:35,480 --> 00:17:41,000 Speaker 1: year to comply. Reportedly, the meetings that were held about 278 00:17:41,080 --> 00:17:47,040 Speaker 1: these rules included representatives from the major manufacturing companies, including Apple. 279 00:17:47,560 --> 00:17:52,000 Speaker 1: So yeah, really interesting to see where this goes from here. Finally, 280 00:17:52,080 --> 00:17:55,600 Speaker 1: there's some hubbub surrounding a Samsung feature introduced in the 281 00:17:55,640 --> 00:17:58,600 Speaker 1: Galaxy S twenty Ultra smartphone. It's also in the S 282 00:17:58,680 --> 00:18:02,760 Speaker 1: twenty one. The feature is called Space Zoom, and the 283 00:18:02,800 --> 00:18:06,480 Speaker 1: way Samsung claims it works is that you take your 284 00:18:07,080 --> 00:18:09,520 Speaker 1: smartphone, right, and you point that son of a gun 285 00:18:09,600 --> 00:18:12,320 Speaker 1: at that thar moon and you take a photo using 286 00:18:12,320 --> 00:18:16,320 Speaker 1: the Space Zoom feature, which Samsung casually calls the one 287 00:18:16,400 --> 00:18:20,000 Speaker 1: hundred x Space Zoom.
So presumably this means you could 288 00:18:20,000 --> 00:18:23,920 Speaker 1: take a photo at one hundred times magnification and then 289 00:18:24,040 --> 00:18:28,320 Speaker 1: this app, this feature, the Space Zoom feature, according to Samsung, 290 00:18:28,840 --> 00:18:32,040 Speaker 1: captures a whole bunch of photos in quick succession, so 291 00:18:32,080 --> 00:18:34,480 Speaker 1: it takes a ton of pictures of the Moon in 292 00:18:34,520 --> 00:18:37,959 Speaker 1: just a split second, right, and then it uses AI 293 00:18:38,040 --> 00:18:41,040 Speaker 1: to take the best features of each photo and stitch 294 00:18:41,160 --> 00:18:44,879 Speaker 1: together a high quality image of the Moon so that 295 00:18:44,960 --> 00:18:47,440 Speaker 1: you get a much higher resolution picture of it. You 296 00:18:47,480 --> 00:18:50,720 Speaker 1: can see things like craters and stuff. So while the 297 00:18:50,760 --> 00:18:54,360 Speaker 1: preview image might just look okay, the final photo could 298 00:18:54,400 --> 00:18:58,520 Speaker 1: be pretty stunning, which is really cool, right, except there 299 00:18:58,520 --> 00:19:02,199 Speaker 1: are folks claiming that this is not what Samsung is 300 00:19:02,280 --> 00:19:07,760 Speaker 1: actually doing. They claim instead that Samsung is using neural 301 00:19:08,040 --> 00:19:11,840 Speaker 1: networks and a massive database of photos of the Moon that 302 00:19:11,880 --> 00:19:15,720 Speaker 1: were taken by high resolution telescopes, like a whole database 303 00:19:16,200 --> 00:19:19,920 Speaker 1: of high resolution moon photography, and they're using that as 304 00:19:20,160 --> 00:19:23,919 Speaker 1: reference material, and that the AI is not combining a 305 00:19:23,920 --> 00:19:27,919 Speaker 1: bunch of photos that you took.
Instead, what it's doing 306 00:19:28,440 --> 00:19:31,919 Speaker 1: is it's applying textures and sharpening features on the image 307 00:19:31,920 --> 00:19:36,119 Speaker 1: you took using AI, using the pre existing high definition 308 00:19:36,160 --> 00:19:40,119 Speaker 1: photographs as reference material. So the argument is that Samsung 309 00:19:40,200 --> 00:19:42,480 Speaker 1: is using false advertising when it comes to how this 310 00:19:42,560 --> 00:19:45,200 Speaker 1: feature actually works. So instead of it being this cool 311 00:19:45,320 --> 00:19:48,119 Speaker 1: kind of in camera system that's paired with AI to 312 00:19:48,160 --> 00:19:51,679 Speaker 1: create the best possible photo, it's more like it's a 313 00:19:51,760 --> 00:19:57,959 Speaker 1: photoshop by AI feature. So yeah, clever Samsung. All right, 314 00:19:58,359 --> 00:20:01,040 Speaker 1: that wraps up this episode of Tech Stuff. Hope you are 315 00:20:01,119 --> 00:20:04,000 Speaker 1: all well. If you have suggestions for future topics, reach 316 00:20:04,040 --> 00:20:06,080 Speaker 1: out to me. You can do so on Twitter. The 317 00:20:06,080 --> 00:20:09,919 Speaker 1: handle for the show is Tech Stuff HSW, or you 318 00:20:09,960 --> 00:20:12,440 Speaker 1: can let me know on the iHeartRadio app. It's free 319 00:20:12,440 --> 00:20:15,480 Speaker 1: to download and use. Just navigate over to Tech Stuff and 320 00:20:15,560 --> 00:20:18,040 Speaker 1: use that little microphone icon you see. You can leave 321 00:20:18,080 --> 00:20:20,439 Speaker 1: a thirty second voice message for me. Let me know 322 00:20:20,440 --> 00:20:22,120 Speaker 1: what you would like to hear in a future episode, 323 00:20:22,320 --> 00:20:30,959 Speaker 1: and I'll talk to you again really soon. Tech Stuff 324 00:20:31,080 --> 00:20:35,600 Speaker 1: is an iHeartRadio production.
For more podcasts from iHeartRadio, visit 325 00:20:35,640 --> 00:20:39,159 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 326 00:20:39,200 --> 00:20:40,119 Speaker 1: your favorite shows.