1 00:00:04,400 --> 00:00:07,760 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:11,920 --> 00:00:14,480 Speaker 1: Hey there, and welcome to TechStuff. I'm your host, 3 00:00:14,680 --> 00:00:19,720 Speaker 1: Jonathan Strickland. How the tech are you? We're gonna start 4 00:00:19,760 --> 00:00:24,120 Speaker 1: off today's news episode with a little story about digital surveillance. 5 00:00:24,560 --> 00:00:29,600 Speaker 1: Georgetown Law's Center on Privacy and Technology reports that the 6 00:00:29,640 --> 00:00:35,200 Speaker 1: Immigration and Customs Enforcement agency, or ICE, has been engaged 7 00:00:35,240 --> 00:00:39,879 Speaker 1: in widespread digital surveillance despite the fact that the agency 8 00:00:39,920 --> 00:00:43,560 Speaker 1: isn't supposed to do that. Now, essentially, ICE has been 9 00:00:43,680 --> 00:00:48,760 Speaker 1: using a loophole, so instead of conducting digital surveillance on 10 00:00:48,920 --> 00:00:53,360 Speaker 1: US residents itself, the agency has actually just been purchasing 11 00:00:53,560 --> 00:00:57,440 Speaker 1: massive amounts of data through various data brokers and other 12 00:00:57,520 --> 00:01:02,280 Speaker 1: private companies as well as public utilities. Now, according to 13 00:01:02,320 --> 00:01:06,479 Speaker 1: the researchers, ICE has accumulated photos covering nearly three quarters of all 14 00:01:06,520 --> 00:01:10,240 Speaker 1: adult residents who have a driver's license in the US. 15 00:01:10,840 --> 00:01:14,080 Speaker 1: Photos of those driver's licenses, that is. 16 00:01:14,600 --> 00:01:18,319 Speaker 1: The agency purchases information from utility companies, and that gives 17 00:01:18,360 --> 00:01:21,919 Speaker 1: ICE the ability to detect when someone moves by seeing 18 00:01:21,959 --> 00:01:26,760 Speaker 1: when someone activates service at a new address.
ICE is 19 00:01:26,800 --> 00:01:30,760 Speaker 1: part of the Department of Homeland Security, and the researchers 20 00:01:30,840 --> 00:01:36,160 Speaker 1: are arguing that the agency lacks proper congressional oversight. The 21 00:01:36,280 --> 00:01:40,200 Speaker 1: researchers have made a few recommendations to address this incredible 22 00:01:40,240 --> 00:01:44,760 Speaker 1: amount of digital surveillance, ranging from more thorough oversight of 23 00:01:44,840 --> 00:01:49,240 Speaker 1: the agency to reforming immigration law. Personally, I think this 24 00:01:49,360 --> 00:01:52,480 Speaker 1: really points to a need to create comprehensive laws about 25 00:01:52,480 --> 00:01:57,480 Speaker 1: who can sell data to whom and under what circumstances, 26 00:01:57,800 --> 00:02:01,680 Speaker 1: because as it stands right now, there's no need to 27 00:02:01,840 --> 00:02:06,720 Speaker 1: be some sort of skullduggery-oriented agency. You don't have 28 00:02:06,800 --> 00:02:09,799 Speaker 1: to be, you know, Mission: Impossible or anything in order 29 00:02:09,800 --> 00:02:14,040 Speaker 1: to spy on people. These agencies just purchase the information 30 00:02:14,080 --> 00:02:17,760 Speaker 1: they need from various sources, and through that they get 31 00:02:17,800 --> 00:02:21,320 Speaker 1: an incredibly detailed dossier on anyone they like. I mean, 32 00:02:21,360 --> 00:02:25,400 Speaker 1: our ability to analyze data has reached a point where 33 00:02:25,600 --> 00:02:29,000 Speaker 1: if you just have the information, you can sift through 34 00:02:29,040 --> 00:02:34,239 Speaker 1: it and put together a really comprehensive picture of who 35 00:02:34,280 --> 00:02:37,560 Speaker 1: someone is and where they've been. Big Brother has been 36 00:02:37,560 --> 00:02:42,720 Speaker 1: distributed and democratized. Yikes.
In other news on how US 37 00:02:42,760 --> 00:02:47,600 Speaker 1: departments are using technology, the Pentagon recently held a demonstration 38 00:02:47,960 --> 00:02:54,000 Speaker 1: via its Joint Counter Small Unmanned Aircraft Systems Office. I 39 00:02:54,000 --> 00:02:56,760 Speaker 1: had to be really careful about reading that. It's kind 40 00:02:56,800 --> 00:03:01,480 Speaker 1: of a mouthful. But in this demonstration, vendor teams 41 00:03:01,600 --> 00:03:06,840 Speaker 1: used high-powered microwave transmitters to bring down drones. So 42 00:03:06,960 --> 00:03:10,000 Speaker 1: the department has been assessing the possibility of using 43 00:03:10,000 --> 00:03:13,520 Speaker 1: high-powered microwave beams to bring down hostile drones. We're talking 44 00:03:13,520 --> 00:03:17,280 Speaker 1: military-grade drones here, not your average little quadcopters. 45 00:03:18,000 --> 00:03:21,320 Speaker 1: And this was the third set of demonstrations, all intended 46 00:03:21,360 --> 00:03:25,079 Speaker 1: to give the agency the information needed to determine capability 47 00:03:25,120 --> 00:03:29,600 Speaker 1: gaps, like where do they need to make improvements, and 48 00:03:29,639 --> 00:03:34,280 Speaker 1: to focus on developing the next generation of anti-drone weaponry. 49 00:03:34,760 --> 00:03:38,520 Speaker 1: The vendors had to demonstrate their respective technologies' effectiveness, which 50 00:03:38,560 --> 00:03:42,760 Speaker 1: included everything from threat detection to tracking to bringing down 51 00:03:42,800 --> 00:03:46,720 Speaker 1: targeted drones, sometimes two at a time. So this is 52 00:03:46,760 --> 00:03:50,080 Speaker 1: still very much in development, and while similar weapons may 53 00:03:50,120 --> 00:03:53,040 Speaker 1: already be deployed, what we're really looking at is the 54 00:03:53,120 --> 00:03:58,440 Speaker 1: refinement process intended to develop more effective anti-drone defense systems.
55 00:03:59,320 --> 00:04:01,840 Speaker 1: There have been a few people who have suggested that 56 00:04:01,920 --> 00:04:05,960 Speaker 1: Elon Musk's quest to purchase Twitter has had at least 57 00:04:06,080 --> 00:04:09,160 Speaker 1: something to do with his own tendency to post things 58 00:04:09,200 --> 00:04:13,440 Speaker 1: that the United States Securities and Exchange Commission, or SEC, 59 00:04:14,000 --> 00:04:17,320 Speaker 1: has objected to. Not that buying Twitter is going to 60 00:04:17,400 --> 00:04:20,760 Speaker 1: change that at all. But whether that factored into his 61 00:04:20,800 --> 00:04:24,240 Speaker 1: decision to purchase the company or not, the SEC is 62 00:04:24,279 --> 00:04:29,119 Speaker 1: once again investigating Musk in relation to Twitter. This time, 63 00:04:29,160 --> 00:04:33,120 Speaker 1: it's because Musk was late in submitting a form indicating 64 00:04:33,240 --> 00:04:37,560 Speaker 1: his intent to purchase more than five percent of Twitter's shares. 65 00:04:37,600 --> 00:04:39,839 Speaker 1: He ended up purchasing a little more than nine percent. 66 00:04:40,640 --> 00:04:44,120 Speaker 1: That's something that investors are supposed to do when they 67 00:04:44,160 --> 00:04:47,320 Speaker 1: plan to make that substantial an investment in a company: 68 00:04:47,400 --> 00:04:51,320 Speaker 1: file that form with the SEC. Musk apparently filed 69 00:04:51,320 --> 00:04:54,800 Speaker 1: the form at least ten days after he was supposed to, and, 70 00:04:54,839 --> 00:04:59,520 Speaker 1: according to the SEC, by not reporting the intended purchase 71 00:05:00,160 --> 00:05:02,640 Speaker 1: in a timely manner, Musk was able to buy that 72 00:05:02,720 --> 00:05:06,800 Speaker 1: stock essentially at a discount, potentially amounting to as much 73 00:05:06,839 --> 00:05:11,520 Speaker 1: as a hundred forty three million dollars.
How? Well, if 74 00:05:11,600 --> 00:05:16,760 Speaker 1: Musk had followed the proper protocol and submitted the form on time, 75 00:05:17,360 --> 00:05:21,000 Speaker 1: that submission would be public information, which means investors would 76 00:05:21,040 --> 00:05:23,680 Speaker 1: have seen that Musk was going to buy more than 77 00:05:23,800 --> 00:05:27,359 Speaker 1: nine percent of Twitter's stock, and that in turn could have 78 00:05:27,360 --> 00:05:30,320 Speaker 1: been enough to drive up the share price, which would 79 00:05:30,760 --> 00:05:35,599 Speaker 1: have increased the wealth of current shareholders or encouraged more 80 00:05:35,640 --> 00:05:40,279 Speaker 1: people to buy into Twitter stock ahead of that actual purchase. 81 00:05:41,080 --> 00:05:45,280 Speaker 1: And one shareholder has now filed a lawsuit against Musk, 82 00:05:45,400 --> 00:05:49,400 Speaker 1: essentially arguing that by filing late, Musk robbed shareholders of 83 00:05:49,440 --> 00:05:53,880 Speaker 1: potential wealth. The SEC could also seek its own lawsuit 84 00:05:53,960 --> 00:05:58,480 Speaker 1: against Musk, but neither the shareholder lawsuit nor a potential 85 00:05:58,680 --> 00:06:02,320 Speaker 1: SEC lawsuit would likely have any real impact on the 86 00:06:02,360 --> 00:06:07,359 Speaker 1: planned acquisition, which still requires shareholder and regulatory approval before 87 00:06:07,360 --> 00:06:12,839 Speaker 1: it can go through. Let's start talking about Meta. There 88 00:06:12,839 --> 00:06:15,360 Speaker 1: are a couple of stories to cover. Now, the 89 00:06:15,400 --> 00:06:18,720 Speaker 1: owner of Facebook, that being Meta, is looking to make 90 00:06:18,800 --> 00:06:22,720 Speaker 1: some cutbacks to its Reality Labs division, which is the 91 00:06:22,800 --> 00:06:29,360 Speaker 1: hardware-centric part of Meta that develops mixed reality hardware.
Apparently, 92 00:06:29,680 --> 00:06:33,279 Speaker 1: at an internal meeting, leaders told employees to expect upcoming 93 00:06:33,360 --> 00:06:37,400 Speaker 1: changes to be announced within a week or so, with 94 00:06:37,640 --> 00:06:40,679 Speaker 1: those announcements being specific cutbacks. And this is a pretty 95 00:06:40,760 --> 00:06:45,720 Speaker 1: dramatic turnaround from where we were in the fall. Back 96 00:06:45,760 --> 00:06:49,880 Speaker 1: then, that's when Zuckerberg spoke at length about how the 97 00:06:49,920 --> 00:06:53,760 Speaker 1: company was putting an intense focus on developing all things metaverse. 98 00:06:54,200 --> 00:06:56,440 Speaker 1: I mean, that's when they changed the company name from 99 00:06:56,440 --> 00:07:03,000 Speaker 1: Facebook to Meta. And that development includes making immersive hardware 100 00:07:03,040 --> 00:07:06,680 Speaker 1: that will give people the chance to experience the metaverse. 101 00:07:07,160 --> 00:07:11,320 Speaker 1: But you see, this dedicated focus on the metaverse comes 102 00:07:11,520 --> 00:07:15,400 Speaker 1: at a massive cost. And if you skip ahead a 103 00:07:15,400 --> 00:07:19,160 Speaker 1: few months to earlier this year, we learned that Facebook 104 00:07:19,200 --> 00:07:24,840 Speaker 1: had experienced a drop in active users. That news shocked investors, 105 00:07:24,880 --> 00:07:28,200 Speaker 1: who I guess clutched their pearls whenever certain numbers go 106 00:07:28,320 --> 00:07:31,720 Speaker 1: down instead of up. And I might be getting increasingly 107 00:07:31,760 --> 00:07:35,480 Speaker 1: grouchy about how the stock market works. Anyway, there was 108 00:07:35,520 --> 00:07:38,760 Speaker 1: a drop in confidence from the investor community, and Meta 109 00:07:38,840 --> 00:07:41,840 Speaker 1: reps said that the company would be cutting costs to 110 00:07:42,000 --> 00:07:46,960 Speaker 1: reduce the negative impact on shareholders.
Subsequently, news broke that 111 00:07:47,000 --> 00:07:49,640 Speaker 1: the company was slowing down on hiring for mid- to 112 00:07:49,760 --> 00:07:53,040 Speaker 1: senior-level positions within the company in general. And now 113 00:07:53,080 --> 00:07:55,840 Speaker 1: we hear that there are going to be cutbacks that will 114 00:07:55,880 --> 00:08:00,800 Speaker 1: put some of these Reality Labs projects on hold, at 115 00:08:00,840 --> 00:08:04,120 Speaker 1: least for a while. Some might end up being shelved permanently. 116 00:08:05,000 --> 00:08:07,680 Speaker 1: We do not yet know which of those projects are 117 00:08:07,720 --> 00:08:10,360 Speaker 1: going to be impacted. We had previously heard that the 118 00:08:10,400 --> 00:08:14,320 Speaker 1: division was working on four new headsets: two aiming at 119 00:08:14,320 --> 00:08:17,840 Speaker 1: a more premium space in the market, which really means, 120 00:08:18,040 --> 00:08:21,200 Speaker 1: you know, more features and more expensive, and two meant 121 00:08:21,240 --> 00:08:24,000 Speaker 1: for more of an entry-level space in the market. 122 00:08:24,160 --> 00:08:27,600 Speaker 1: Still expensive, but, you know, less so. Not sure if 123 00:08:27,640 --> 00:08:29,920 Speaker 1: any of those four are going to be impacted by 124 00:08:29,920 --> 00:08:32,680 Speaker 1: these cutbacks or if this is actually going to affect 125 00:08:32,679 --> 00:08:36,120 Speaker 1: projects that are further down the pipeline. Meta reps said 126 00:08:36,160 --> 00:08:39,400 Speaker 1: there are no current plans for employee layoffs at this time, 127 00:08:39,520 --> 00:08:41,960 Speaker 1: so that's good. I mean, I don't like Meta, but 128 00:08:42,160 --> 00:08:45,800 Speaker 1: I really hate seeing people lose their jobs.
In an 129 00:08:45,840 --> 00:08:50,319 Speaker 1: earlier tech News episode, I talked about how Facebook relied 130 00:08:50,559 --> 00:08:55,720 Speaker 1: on third party staffing services to hire content moderators in 131 00:08:55,800 --> 00:09:00,800 Speaker 1: places like Africa. Now, one former employee is bringing a 132 00:09:00,920 --> 00:09:05,400 Speaker 1: lawsuit against both Meta and one of those third party companies, 133 00:09:05,440 --> 00:09:09,719 Speaker 1: a tech outsourcing firm called Sama, which is based out 134 00:09:09,720 --> 00:09:15,040 Speaker 1: of San Francisco. This former employee argues that he was 135 00:09:15,120 --> 00:09:18,360 Speaker 1: hired under false pretenses because he says he was never 136 00:09:18,440 --> 00:09:21,959 Speaker 1: told that he would be working for Facebook and that 137 00:09:22,000 --> 00:09:25,640 Speaker 1: he would be doing content moderation when he interviewed for 138 00:09:25,720 --> 00:09:30,920 Speaker 1: the position. He relocated from South Africa to Kenya in 139 00:09:31,000 --> 00:09:33,679 Speaker 1: order to take this job and then discovered what it 140 00:09:33,760 --> 00:09:36,920 Speaker 1: was that he was meant to do. His lawsuit claims 141 00:09:37,000 --> 00:09:40,520 Speaker 1: that this practice is a violation of Kenya's anti human 142 00:09:40,559 --> 00:09:45,040 Speaker 1: trafficking laws. So the headlines about this all naturally say 143 00:09:45,040 --> 00:09:48,439 Speaker 1: that Facebook is being sued for engaging in human trafficking. 144 00:09:49,200 --> 00:09:51,760 Speaker 1: In fact, that's why I clicked on the story to 145 00:09:51,800 --> 00:09:54,280 Speaker 1: read up on what was going on. 
Now, I don't 145 00:09:54,280 --> 00:09:57,960 Speaker 1: know the particulars of the law in question. However, I 146 00:09:58,000 --> 00:10:02,320 Speaker 1: have to assume the fact that the former employee allegedly 147 00:10:02,400 --> 00:10:04,800 Speaker 1: had no idea who his employer was going to be 148 00:10:05,240 --> 00:10:08,000 Speaker 1: or what his actual position was going to be when 149 00:10:08,040 --> 00:10:11,600 Speaker 1: he relocated to another country has to factor into it. 150 00:10:12,040 --> 00:10:15,040 Speaker 1: In addition, the former employee claims he was fired after 151 00:10:15,080 --> 00:10:18,880 Speaker 1: he tried to organize other employees, so there are also 152 00:10:19,040 --> 00:10:22,559 Speaker 1: charges of union busting wrapped up in this too. Now, 153 00:10:22,600 --> 00:10:26,280 Speaker 1: on top of all that, the man says that exposure 154 00:10:26,320 --> 00:10:31,599 Speaker 1: to truly horrifying material, which ranged from child abuse to 155 00:10:32,400 --> 00:10:36,960 Speaker 1: videos of executions, has left him traumatized and impacted his 156 00:10:37,000 --> 00:10:39,880 Speaker 1: physical and mental health. Now, y'all, I can tell you 157 00:10:39,960 --> 00:10:43,960 Speaker 1: I could never be a content moderator on Facebook. Just 158 00:10:44,080 --> 00:10:47,600 Speaker 1: my exposure to the normal news is enough to cause 159 00:10:47,640 --> 00:10:50,440 Speaker 1: me mental distress, and that's the same stuff that everybody 160 00:10:50,480 --> 00:10:54,520 Speaker 1: else is seeing all the time.
Anyway, Meta reps have 161 00:10:54,640 --> 00:10:58,200 Speaker 1: claimed that the company takes employee welfare seriously and holds 162 00:10:58,240 --> 00:11:02,640 Speaker 1: its third-party staffing services accountable for providing fair pay, 163 00:11:02,800 --> 00:11:07,920 Speaker 1: benefits, and support, something that the lawsuit disputes. 164 00:11:08,520 --> 00:11:12,000 Speaker 1: And Sama, that third-party staffing service, for its part, 165 00:11:12,040 --> 00:11:15,640 Speaker 1: denies that it engaged in any anti-union behavior. So 166 00:11:15,679 --> 00:11:18,480 Speaker 1: I'll have to see where the story goes from here. Okay, 167 00:11:18,480 --> 00:11:20,600 Speaker 1: we've got some more news stories coming up, but first 168 00:11:20,880 --> 00:11:31,160 Speaker 1: let's take a quick break. Last year, the state of 169 00:11:31,240 --> 00:11:34,840 Speaker 1: Texas passed a law called HB twenty, which would 170 00:11:34,840 --> 00:11:38,440 Speaker 1: give Texans the right to sue social media platforms that 171 00:11:38,559 --> 00:11:43,400 Speaker 1: have fifty million or more active monthly users if those 172 00:11:43,520 --> 00:11:47,640 Speaker 1: users believe that they got a ban from those platforms 173 00:11:47,720 --> 00:11:51,160 Speaker 1: due to their political views. So, in other words, if 174 00:11:51,200 --> 00:11:55,080 Speaker 1: they think that they've been banned from a service because 175 00:11:55,080 --> 00:11:57,840 Speaker 1: of their political views, they can sue the company within 176 00:11:57,880 --> 00:12:01,439 Speaker 1: the state of Texas, if they are residents of Texas, 177 00:12:01,480 --> 00:12:05,720 Speaker 1: that is. The law also forbids social networks from cutting 178 00:12:05,760 --> 00:12:09,160 Speaker 1: off access to people in Texas in order to sidestep 179 00:12:09,240 --> 00:12:14,079 Speaker 1: this issue, which, y'all, is absolutely wild.
It's 181 00:12:14,160 --> 00:12:16,960 Speaker 1: left a lot of folks saying, where does Texas get 182 00:12:17,000 --> 00:12:21,520 Speaker 1: off litigating how a business in another state conducts itself? 183 00:12:21,800 --> 00:12:26,680 Speaker 1: Like, if the business is incorporated in, say, California, what gives 184 00:12:26,720 --> 00:12:31,160 Speaker 1: Texas the authority to prevent that company from not providing 185 00:12:31,160 --> 00:12:35,880 Speaker 1: service to Texas? Anyway. A federal judge previously issued an 186 00:12:35,880 --> 00:12:40,680 Speaker 1: injunction against HB twenty, preventing it from going into effect 187 00:12:41,160 --> 00:12:44,240 Speaker 1: until further judgment determined whether or not the law is, 188 00:12:44,440 --> 00:12:48,640 Speaker 1: you know, legal. And the Fifth U. S. Circuit Court 189 00:12:48,640 --> 00:12:53,040 Speaker 1: of Appeals has now overturned that injunction, and HB twenty 190 00:12:53,080 --> 00:12:56,760 Speaker 1: is now in effect in Texas. I suspect this is 191 00:12:56,800 --> 00:13:00,720 Speaker 1: just another stop on the journey to bringing this law 192 00:13:00,920 --> 00:13:04,760 Speaker 1: before the Supreme Court. But then, considering the current 193 00:13:05,080 --> 00:13:08,000 Speaker 1: makeup of the U. S. Supreme Court, I can't confidently 194 00:13:08,120 --> 00:13:11,760 Speaker 1: say that they will find the law to be unconstitutional, 195 00:13:12,640 --> 00:13:17,200 Speaker 1: even though it appears to be pretty unconstitutional.
The federal 196 00:13:17,320 --> 00:13:21,640 Speaker 1: judge who issued the injunction had pointed out that passing 197 00:13:21,640 --> 00:13:25,040 Speaker 1: the law infringed upon the freedom of speech of the 198 00:13:25,080 --> 00:13:29,280 Speaker 1: actual social media platforms; that these platforms have the right 199 00:13:29,400 --> 00:13:32,079 Speaker 1: to establish codes of conduct and then they have the 200 00:13:32,200 --> 00:13:35,480 Speaker 1: right to enforce those codes of conduct, and any state 201 00:13:35,559 --> 00:13:38,040 Speaker 1: coming in to say no, you can't do that is 202 00:13:38,080 --> 00:13:42,360 Speaker 1: akin to censorship. Which, you know, is ironic, because Texas 203 00:13:42,440 --> 00:13:46,920 Speaker 1: was positioning its law as being anti-censorship, but in 204 00:13:46,960 --> 00:13:51,199 Speaker 1: fact Texas was engaging in censorship against the social platforms 205 00:13:51,600 --> 00:13:55,360 Speaker 1: in order to do this. Anyway, HB twenty is just 206 00:13:55,400 --> 00:13:59,120 Speaker 1: a terrible law, full stop. It's a terrible law. On 207 00:13:59,200 --> 00:14:02,920 Speaker 1: top of that, one of the judges in the 208 00:14:03,000 --> 00:14:07,640 Speaker 1: Circuit Court of Appeals referred to Twitter as an Internet provider, 209 00:14:08,000 --> 00:14:12,080 Speaker 1: which is just plain wrong. Like, you can't get more wrong. 210 00:14:12,640 --> 00:14:17,120 Speaker 1: Twitter provides a service that's on the Internet, but it's 211 00:14:17,120 --> 00:14:21,760 Speaker 1: not an Internet provider. It's not providing Internet service. I 212 00:14:21,800 --> 00:14:24,920 Speaker 1: swear Texas is like the Upside Down in Stranger Things. 213 00:14:25,000 --> 00:14:28,160 Speaker 1: And I say that as a resident of Georgia, a 214 00:14:28,200 --> 00:14:33,560 Speaker 1: state that has its own share of totally backwards legislation.
Anyway, 215 00:14:34,120 --> 00:14:35,800 Speaker 1: I can't wait to see where the story goes in 216 00:14:35,840 --> 00:14:37,880 Speaker 1: the future, and my heart goes out to all the 217 00:14:37,880 --> 00:14:40,360 Speaker 1: folks in Texas who opposed this kind of stuff. I 218 00:14:40,440 --> 00:14:42,640 Speaker 1: understand how hard it can be to live in a 219 00:14:42,720 --> 00:14:47,040 Speaker 1: state where leaders make bad decisions that reflect poorly on you, 220 00:14:47,160 --> 00:14:49,640 Speaker 1: and you had nothing to do with it. Over in Europe, 221 00:14:49,680 --> 00:14:52,280 Speaker 1: we're seeing another example of how leaders, in an effort 222 00:14:52,320 --> 00:14:56,400 Speaker 1: to fight something that's truly hideous, in this case 223 00:14:56,600 --> 00:15:01,440 Speaker 1: images and video of child sexual abuse, are 224 00:15:01,440 --> 00:15:06,440 Speaker 1: pursuing rules that will have massive, unintended consequences and a 225 00:15:06,520 --> 00:15:11,200 Speaker 1: negative impact. So the EU has proposed founding a new 226 00:15:11,240 --> 00:15:15,360 Speaker 1: division to fight child abuse material online, as well as 227 00:15:15,400 --> 00:15:19,680 Speaker 1: requiring tech companies to, quote, detect, report, block and remove, 228 00:15:20,040 --> 00:15:24,360 Speaker 1: end quote, such material from their platforms. On the surface, 229 00:15:24,440 --> 00:15:29,080 Speaker 1: that all sounds fairly, you know, reasonable, right? To detect, report, block, 230 00:15:29,120 --> 00:15:32,720 Speaker 1: and remove child abuse material. And obviously this material is 231 00:15:32,760 --> 00:15:36,200 Speaker 1: incredibly harmful, and reporting it would be absolutely key in 232 00:15:36,320 --> 00:15:40,320 Speaker 1: order to track down the people responsible and to stop 233 00:15:40,400 --> 00:15:44,640 Speaker 1: them from abusing children.
But it's the detect part that 234 00:15:44,720 --> 00:15:50,520 Speaker 1: could be particularly disruptive and dangerous, for lots of reasons. 235 00:15:50,560 --> 00:15:56,120 Speaker 1: You see, for any platform that allows private messaging between 236 00:15:56,160 --> 00:15:59,640 Speaker 1: people on that platform, if that platform is ordered to 237 00:15:59,720 --> 00:16:03,040 Speaker 1: detect this kind of material, well, to do that, 238 00:16:03,080 --> 00:16:05,840 Speaker 1: it would have to scan messages that were being sent 239 00:16:05,960 --> 00:16:09,240 Speaker 1: between individuals on the platform to see if any of 240 00:16:09,320 --> 00:16:12,600 Speaker 1: that material was included in those messages. These could be 241 00:16:12,640 --> 00:16:16,920 Speaker 1: messages that people presumably count on being private and secure. 242 00:16:17,400 --> 00:16:20,480 Speaker 1: It would also mean that end-to-end encryption would 243 00:16:20,480 --> 00:16:25,840 Speaker 1: be pretty much impossible or rendered meaningless. A solid 244 00:16:25,840 --> 00:16:29,320 Speaker 1: end-to-end encryption service would be created in such a 245 00:16:29,360 --> 00:16:34,040 Speaker 1: way that no one other than the parties communicating would 246 00:16:34,040 --> 00:16:37,520 Speaker 1: be able to see the contents of that communication. That 247 00:16:37,600 --> 00:16:41,360 Speaker 1: includes the communications platform itself. So, in other words, if 248 00:16:41,400 --> 00:16:46,040 Speaker 1: company A allows true end-to-end encryption, company A 249 00:16:46,200 --> 00:16:49,280 Speaker 1: has no way of knowing what is being sent over 250 00:16:49,360 --> 00:16:54,640 Speaker 1: those messages. It cannot access those messages, it cannot scan them. 251 00:16:54,760 --> 00:16:56,560 Speaker 1: So the only way to get around this is to 252 00:16:56,600 --> 00:17:00,680 Speaker 1: do what's called client-side scanning.
That means you would 253 00:17:00,680 --> 00:17:04,680 Speaker 1: actually have to scan the material on the end devices 254 00:17:04,880 --> 00:17:08,160 Speaker 1: once the material has been decrypted. So that would mean 255 00:17:08,960 --> 00:17:13,480 Speaker 1: all users in the EU would have various companies scanning 256 00:17:13,480 --> 00:17:17,360 Speaker 1: their personal devices for signs of this kind of material. 257 00:17:18,000 --> 00:17:20,639 Speaker 1: Apple had actually proposed doing this and then put the 258 00:17:20,640 --> 00:17:26,800 Speaker 1: plans on hold after receiving some pushback from civil rights activists, 259 00:17:27,320 --> 00:17:29,959 Speaker 1: and those activists have argued that this kind of surveillance 260 00:17:30,000 --> 00:17:34,000 Speaker 1: is really dangerous. It removes all privacy. Now, there is 261 00:17:34,160 --> 00:17:38,160 Speaker 1: no denying that child abuse is horrible and it needs 262 00:17:38,200 --> 00:17:42,240 Speaker 1: to be stopped and prevented. But researchers are suggesting that 263 00:17:42,640 --> 00:17:45,840 Speaker 1: it might be better to use other methodologies rather than 264 00:17:46,400 --> 00:17:50,119 Speaker 1: stripping away the possibility of true end-to-end encryption, which 265 00:17:50,160 --> 00:17:54,520 Speaker 1: promotes human rights, and that those methodologies can be things 266 00:17:54,560 --> 00:17:59,800 Speaker 1: like analyzing metadata in order to detect criminal behavior and 267 00:17:59,840 --> 00:18:03,000 Speaker 1: to stop it. Obviously, this is one of those really 268 00:18:03,040 --> 00:18:07,040 Speaker 1: emotionally charged topics. It is hard to tackle. There are 269 00:18:07,119 --> 00:18:11,040 Speaker 1: no easy solutions, and it is very clear that a 270 00:18:11,119 --> 00:18:16,280 Speaker 1: solution is desperately needed, so it is a difficult situation 271 00:18:16,359 --> 00:18:19,920 Speaker 1: to kind of suss out.
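[Editor's note: the client-side scanning idea described above can be sketched in a few lines of Python. This is a hypothetical toy, not any vendor's actual system: it fingerprints files on the device with SHA-256 after decryption and compares them against a set of known-bad fingerprints. Real proposals, such as Apple's, or PhotoDNA-style tools, rely on perceptual hashes that survive resizing and re-encoding, whereas an exact cryptographic hash like this only catches byte-identical copies. All file contents and the database below are made up.]

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Hash the raw bytes of a file into a fixed-size hex fingerprint."""
    return hashlib.sha256(data).hexdigest()

def scan_on_device(files: list[bytes], known_bad: set[str]) -> list[int]:
    """Client-side scan: after messages are decrypted on the device,
    return the indices of any files whose fingerprint appears in the
    known-bad database. The platform never sees the plaintext; only
    the device performs the comparison."""
    return [i for i, data in enumerate(files) if fingerprint(data) in known_bad]

# Hypothetical data: a database seeded with one flagged file.
flagged = b"placeholder for known illegal material"
database = {fingerprint(flagged)}
inbox = [b"vacation photo", flagged, b"grocery list"]
print(scan_on_device(inbox, database))  # [1]
```

Note that changing even a single byte of the flagged file changes its SHA-256 fingerprint entirely, which is exactly why real systems use perceptual rather than cryptographic hashes, and part of why critics argue the matching lists and thresholds in such systems are so hard to audit.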
A couple of weeks ago, 272 00:18:20,119 --> 00:18:22,720 Speaker 1: we talked about how Netflix had a very rough call 273 00:18:22,840 --> 00:18:26,320 Speaker 1: with investors as the company reported its first net loss 274 00:18:26,359 --> 00:18:30,199 Speaker 1: in subscribers ever, and once again, you know, if numbers 275 00:18:30,200 --> 00:18:33,199 Speaker 1: go the wrong way, the stock market goes bananas. Anyway, 276 00:18:33,240 --> 00:18:36,600 Speaker 1: apart from this being another example of how reactionary things 277 00:18:36,720 --> 00:18:40,240 Speaker 1: are in the investment world, it pushed Netflix to talk 278 00:18:40,280 --> 00:18:43,280 Speaker 1: about the ways the company intends to drive up revenue 279 00:18:43,320 --> 00:18:47,479 Speaker 1: and reduce costs in the future. One of those ways 280 00:18:47,960 --> 00:18:51,600 Speaker 1: involved clamping down on the practice of password sharing, and 281 00:18:51,640 --> 00:18:57,000 Speaker 1: the other was to introduce ads to create different tiers 282 00:18:57,040 --> 00:19:01,359 Speaker 1: of subscriptions. Now, the company says we might see both 283 00:19:01,400 --> 00:19:04,640 Speaker 1: of these practices begin by the end of this year, 284 00:19:05,400 --> 00:19:08,240 Speaker 1: and we have an idea of how this is probably 285 00:19:08,240 --> 00:19:10,360 Speaker 1: going to play out. For one thing, Netflix has been 286 00:19:10,400 --> 00:19:14,040 Speaker 1: experimenting with a model in a few countries in which 287 00:19:14,040 --> 00:19:17,440 Speaker 1: households can pay a little bit extra in their monthly 288 00:19:17,520 --> 00:19:21,560 Speaker 1: subscriptions and in return, they are authorized to share their 289 00:19:21,600 --> 00:19:27,359 Speaker 1: password with at least one person outside of their household.
290 00:19:27,720 --> 00:19:32,200 Speaker 1: So this allows for the case where someone 291 00:19:32,280 --> 00:19:35,040 Speaker 1: who is not living under the same roof is able 292 00:19:35,080 --> 00:19:37,920 Speaker 1: to access the service, so you pay a little bit 293 00:19:37,920 --> 00:19:41,240 Speaker 1: more per month and you get a kind of adjunct 294 00:19:41,280 --> 00:19:44,959 Speaker 1: household member who just happens to live outside the house. 295 00:19:46,080 --> 00:19:49,200 Speaker 1: As for advertisements, the general belief is that Netflix would 296 00:19:49,240 --> 00:19:51,840 Speaker 1: do something similar to what we've seen other platforms do 297 00:19:51,960 --> 00:19:54,200 Speaker 1: in the past, which is that they offer a less 298 00:19:54,200 --> 00:19:58,719 Speaker 1: expensive subscription model for folks who opt to have ads 299 00:19:58,760 --> 00:20:02,320 Speaker 1: included in their service, and those who don't want ads 300 00:20:02,600 --> 00:20:04,720 Speaker 1: will just have to pay a little bit more per 301 00:20:04,760 --> 00:20:08,439 Speaker 1: month for a premium service. Whether this will address the 302 00:20:08,520 --> 00:20:13,160 Speaker 1: issue of subscriber loss remains to be seen. Okay, I've 303 00:20:13,160 --> 00:20:15,600 Speaker 1: got a couple more stories to go through. Before we 304 00:20:15,640 --> 00:20:25,760 Speaker 1: get to those, let's take another quick break. One service 305 00:20:25,840 --> 00:20:29,679 Speaker 1: that did not have Netflix's problems was Disney Plus. The 306 00:20:29,680 --> 00:20:33,560 Speaker 1: Mouse House reported that nearly eight million people joined as 307 00:20:33,640 --> 00:20:36,439 Speaker 1: new subscribers to Disney Plus over the first quarter of 308 00:20:36,480 --> 00:20:41,480 Speaker 1: twenty twenty-two, which was better than anticipated. This makes Disney 309 00:20:41,480 --> 00:20:45,640 Speaker 1: Plus a leading company when it comes to growth.
Keep 310 00:20:45,680 --> 00:20:48,159 Speaker 1: in mind, like, there are other services that are bigger, 311 00:20:48,560 --> 00:20:51,440 Speaker 1: they're just not growing at the same rate as Disney 312 00:20:51,440 --> 00:20:54,080 Speaker 1: Plus. Again, this is why I think growth 313 00:20:54,119 --> 00:20:58,040 Speaker 1: is a bad metric for success, because you could be 314 00:20:58,119 --> 00:21:01,399 Speaker 1: growing really fast, but the reason for that might be 315 00:21:01,600 --> 00:21:06,000 Speaker 1: that you're pretty small. So if you have two people 316 00:21:06,240 --> 00:21:09,840 Speaker 1: on your service and then six more people sign up, 317 00:21:10,160 --> 00:21:13,040 Speaker 1: you know that if you convert that over 318 00:21:13,080 --> 00:21:15,919 Speaker 1: to percentages, it looks really impressive, but when you look at 319 00:21:15,920 --> 00:21:19,760 Speaker 1: the actual numbers, maybe not so much. Anyway, despite the 320 00:21:19,800 --> 00:21:22,800 Speaker 1: fact that Disney Plus is growing very fast, faster than 321 00:21:23,240 --> 00:21:27,920 Speaker 1: services like HBO Max, the division is losing lots of money. 322 00:21:27,960 --> 00:21:31,479 Speaker 1: In fact, it's losing money faster than it was before. This, 323 00:21:31,560 --> 00:21:33,359 Speaker 1: by the way, is again one of the things that 324 00:21:33,400 --> 00:21:37,600 Speaker 1: really puzzles me about business. It convinces me that all 325 00:21:37,640 --> 00:21:40,879 Speaker 1: money is fake and we're all just playing pretend until 326 00:21:40,920 --> 00:21:43,679 Speaker 1: it all falls apart.
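[Editor's note: the growth-versus-size point above is easy to make concrete. A quick sketch, using the episode's own toy example of a two-person service alongside a hypothetical service with a 200-million-subscriber base; that base figure is invented purely for illustration.]

```python
def growth_pct(existing_subscribers: int, new_signups: int) -> float:
    """Percentage growth contributed by new sign-ups relative to the base."""
    return 100.0 * new_signups / existing_subscribers

# Tiny service: two subscribers, six sign-ups -> 300% growth.
print(growth_pct(2, 6))
# Big (hypothetical) service: eight million sign-ups on a
# 200-million base -> only 4% growth, despite far more new users.
print(growth_pct(200_000_000, 8_000_000))
```

Same arithmetic, wildly different headlines: the percentage flatters the small service even though the large one added more than a million times as many subscribers.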
But yeah, Disney Plus is growing 327 00:21:43,720 --> 00:21:47,360 Speaker 1: super fast and pulling in more money per subscriber on average, 328 00:21:47,359 --> 00:21:50,600 Speaker 1: but it's also losing money faster because the cost of 329 00:21:50,640 --> 00:21:55,440 Speaker 1: producing exclusive content for the platform is really high. We've 330 00:21:55,440 --> 00:21:58,679 Speaker 1: seen this with other services as well, Netflix in particular. 331 00:21:59,520 --> 00:22:05,960 Speaker 1: Streaming platforms attract subscribers primarily by securing exclusive content. Sometimes 332 00:22:06,000 --> 00:22:10,320 Speaker 1: that involves signing deals with specific studios so 333 00:22:10,320 --> 00:22:14,440 Speaker 1: that you have the exclusive right to stream that studio's content. 334 00:22:14,960 --> 00:22:19,679 Speaker 1: In other cases, it involves funding productions directly, and this 335 00:22:19,720 --> 00:22:24,199 Speaker 1: stuff is expensive. Disney is reportedly considering introducing an 336 00:22:24,200 --> 00:22:27,680 Speaker 1: ad supported subscription tier to Disney Plus in the future, 337 00:22:28,080 --> 00:22:33,560 Speaker 1: similar to what Netflix is considering. Yesterday, Google kicked off 338 00:22:33,600 --> 00:22:37,320 Speaker 1: its annual I/O event. That's the one in which the 339 00:22:37,359 --> 00:22:41,560 Speaker 1: company reveals new products and invites developers to become familiar 340 00:22:41,600 --> 00:22:45,560 Speaker 1: with various platforms so that those developers can create stuff 341 00:22:45,600 --> 00:22:48,440 Speaker 1: that can work on those platforms. This is where 342 00:22:48,760 --> 00:22:54,480 Speaker 1: developers learn about new features and upcoming operating system updates, 343 00:22:54,520 --> 00:22:57,600 Speaker 1: that kind of stuff.
Now, I've attended one I/O event. 344 00:22:57,920 --> 00:22:59,560 Speaker 1: That was the year that they brought in Flight of 345 00:22:59,600 --> 00:23:02,359 Speaker 1: the Conchords to play the after party. But I 346 00:23:02,359 --> 00:23:05,000 Speaker 1: didn't know who Flight of the Conchords were at the time, 347 00:23:05,280 --> 00:23:07,640 Speaker 1: and now I kick myself because if I had known, 348 00:23:08,119 --> 00:23:10,280 Speaker 1: I would have been at the whole set instead of 349 00:23:10,320 --> 00:23:11,879 Speaker 1: just at the tail end of it. Anyway, none of 350 00:23:11,920 --> 00:23:14,520 Speaker 1: that matters. Let's talk about some of the stuff that 351 00:23:14,600 --> 00:23:19,359 Speaker 1: Google showed off this year. So it teased the Pixel seven. 352 00:23:19,920 --> 00:23:21,880 Speaker 1: This is going to be the next generation of the 353 00:23:21,920 --> 00:23:26,000 Speaker 1: Pixel line of smartphones, the Android phones. The company plans 354 00:23:26,040 --> 00:23:28,600 Speaker 1: to release a Pixel seven and a Pixel seven Pro, 355 00:23:29,000 --> 00:23:31,040 Speaker 1: which is in line with what it's done over the 356 00:23:31,080 --> 00:23:34,080 Speaker 1: last couple of years. Both of those phones will sport 357 00:23:34,240 --> 00:23:38,320 Speaker 1: Google Tensor chips, but that's about all the information that 358 00:23:38,359 --> 00:23:40,960 Speaker 1: the company was willing to part with at this stage. 359 00:23:41,440 --> 00:23:44,720 Speaker 1: Google also announced the Pixel six A, which is meant 360 00:23:44,760 --> 00:23:47,520 Speaker 1: to be a more budget friendly version of the Pixel 361 00:23:47,520 --> 00:23:52,560 Speaker 1: six smartphone. It'll be slightly smaller and might lack a few 362 00:23:52,560 --> 00:23:58,199 Speaker 1: other features.
Uh, it will retail for four hundred forty-nine dollars, and 363 00:23:58,280 --> 00:24:00,560 Speaker 1: one thing that the Pixel six A will not have 364 00:24:01,000 --> 00:24:04,479 Speaker 1: is a headphone jack. Google has gone back and forth 365 00:24:04,840 --> 00:24:07,919 Speaker 1: when it comes to including a physical headphone jack, and 366 00:24:08,000 --> 00:24:11,280 Speaker 1: has even taken some potshots at Apple for having abandoned 367 00:24:11,400 --> 00:24:13,800 Speaker 1: the headphone jack in the past, but it looks like 368 00:24:13,800 --> 00:24:15,639 Speaker 1: Google is doing the same thing, at least with the 369 00:24:15,640 --> 00:24:19,600 Speaker 1: six A. Maybe that's because of another announcement that the 370 00:24:19,600 --> 00:24:23,760 Speaker 1: company made, which is that the Pixel Buds Pro earbuds, 371 00:24:23,800 --> 00:24:27,360 Speaker 1: a two hundred dollar set of earbuds with noise cancellation 372 00:24:27,720 --> 00:24:31,360 Speaker 1: and advanced microphones to allow for calls with a minimum 373 00:24:31,359 --> 00:24:34,560 Speaker 1: of interference, are coming out this year. At long last, 374 00:24:34,600 --> 00:24:39,760 Speaker 1: Google also unveiled the Pixel Watch, which will incorporate Fitbit technology. 375 00:24:40,280 --> 00:24:43,080 Speaker 1: That Fitbit connection is critical because it means Google can 376 00:24:43,119 --> 00:24:48,320 Speaker 1: rely upon tried and tested technology rather than developing everything 377 00:24:48,359 --> 00:24:51,320 Speaker 1: from scratch. The company said it would be available in 378 00:24:51,320 --> 00:24:54,240 Speaker 1: the fall, and it would be sold at a premium price, 379 00:24:54,920 --> 00:24:57,120 Speaker 1: which probably means if you have to ask, you can't 380 00:24:57,119 --> 00:25:00,960 Speaker 1: afford it, so I guess no Pixel Watch for me, then.
381 00:25:01,680 --> 00:25:05,280 Speaker 1: Google also showed off a prototype of some smart glasses 382 00:25:05,800 --> 00:25:09,760 Speaker 1: that would be able to display translations in real time, 383 00:25:10,200 --> 00:25:13,000 Speaker 1: meaning if you had two people and each of them 384 00:25:13,040 --> 00:25:15,439 Speaker 1: had a pair of these glasses, they could have a 385 00:25:15,560 --> 00:25:19,119 Speaker 1: conversation with one another even if neither of them spoke 386 00:25:19,200 --> 00:25:22,879 Speaker 1: the other person's language, which is pretty cool. Uh, there's 387 00:25:22,880 --> 00:25:26,400 Speaker 1: no telling if this prototype will ever become an actual product, 388 00:25:26,920 --> 00:25:29,760 Speaker 1: but it is a nice demonstration of how augmented reality 389 00:25:29,800 --> 00:25:34,040 Speaker 1: technology could have some real world applications outside of gimmicky stuff. 390 00:25:34,640 --> 00:25:38,400 Speaker 1: The I/O event continues today, but usually we see all 391 00:25:38,440 --> 00:25:41,119 Speaker 1: the big reveals on day one. But if anything huge 392 00:25:41,160 --> 00:25:43,359 Speaker 1: gets announced today, I'll be sure to cover it in 393 00:25:43,960 --> 00:25:48,800 Speaker 1: an episode next week. Finally, Bethesda, the video game studio 394 00:25:48,920 --> 00:25:53,199 Speaker 1: and publisher, tweeted that upcoming video game titles Redfall and 395 00:25:53,320 --> 00:25:57,720 Speaker 1: Starfield will not be coming out this year. The company 396 00:25:57,760 --> 00:26:01,119 Speaker 1: has chosen to postpone the publication of these games until the 397 00:26:01,160 --> 00:26:05,600 Speaker 1: first half of twenty twenty-three, citing the need to quote ensure 398 00:26:05,760 --> 00:26:09,480 Speaker 1: that you receive the best, most polished versions end quote.
399 00:26:10,080 --> 00:26:13,520 Speaker 1: Bethesda first announced Starfield at E3 way back in 400 00:26:14,440 --> 00:26:17,040 Speaker 1: twenty eighteen, calling it an all-new IP, so not 401 00:26:17,720 --> 00:26:20,879 Speaker 1: a continuation of any of the franchises Bethesda is known for, 402 00:26:21,359 --> 00:26:25,000 Speaker 1: you know, stuff like Fallout and Skyrim, or Elder Scrolls 403 00:26:25,040 --> 00:26:28,000 Speaker 1: I should say, and that this game puts players in 404 00:26:28,000 --> 00:26:31,960 Speaker 1: the role of space explorers. Uh, there's not a whole 405 00:26:32,000 --> 00:26:35,560 Speaker 1: lot more detail available about Starfield. There's a bit, but 406 00:26:35,720 --> 00:26:38,440 Speaker 1: you know, it's largely mysterious since it's a brand 407 00:26:38,440 --> 00:26:43,520 Speaker 1: new IP. Redfall is actually an Arkane Studios title. 408 00:26:43,760 --> 00:26:47,320 Speaker 1: Bethesda is the publisher, not the developer in this case, 409 00:26:47,840 --> 00:26:50,960 Speaker 1: and it's a first person shooter game similar to titles 410 00:26:50,960 --> 00:26:53,920 Speaker 1: like Left 4 Dead, in which players can assume one 411 00:26:54,080 --> 00:26:57,480 Speaker 1: of four playable characters and up to four people can 412 00:26:57,520 --> 00:27:02,080 Speaker 1: play cooperatively in a session. Instead of fighting 413 00:27:02,119 --> 00:27:05,400 Speaker 1: off zombie hordes like you do in Left 4 Dead, 414 00:27:05,440 --> 00:27:07,439 Speaker 1: where you're, you know, fighting off endless waves of 415 00:27:07,520 --> 00:27:12,120 Speaker 1: zombies on your way to a safe zone, the Redfall 416 00:27:12,160 --> 00:27:14,880 Speaker 1: game is going to be about fighting off vampiric 417 00:27:15,359 --> 00:27:19,400 Speaker 1: hordes as you presumably try to make your way to safety.
418 00:27:19,560 --> 00:27:22,200 Speaker 1: I'm told there's more to it than that, and I'm 419 00:27:22,240 --> 00:27:25,639 Speaker 1: being a bit snarky. It's my reaction is mostly based 420 00:27:25,680 --> 00:27:28,080 Speaker 1: off of a teaser video about Red Fall that I 421 00:27:28,080 --> 00:27:30,800 Speaker 1: saw a couple of years ago, where I thought, ah, 422 00:27:30,880 --> 00:27:34,480 Speaker 1: it's Left for Dead, but with vampires. So I'm sure 423 00:27:34,480 --> 00:27:38,960 Speaker 1: it's more complicated and sophisticated than that. So my apologies. 424 00:27:39,240 --> 00:27:42,520 Speaker 1: It was just my initial reaction. And that's it for 425 00:27:42,600 --> 00:27:46,560 Speaker 1: the Tech News for today. That's Thursday May twelve, two 426 00:27:46,640 --> 00:27:49,359 Speaker 1: thousand twenty two. Hope all of you are well. If 427 00:27:49,400 --> 00:27:51,760 Speaker 1: you have suggestions for topics I should cover in future 428 00:27:51,800 --> 00:27:55,040 Speaker 1: episodes of tech Stuff, Please leave me a note on Twitter. 429 00:27:55,440 --> 00:27:57,879 Speaker 1: The handle for the show is text Stuff. Hs W 430 00:27:58,359 --> 00:28:02,440 Speaker 1: definitely used the the tech show handle because I don't 431 00:28:02,520 --> 00:28:05,359 Speaker 1: check my own Twitter anymore. And I will talk to 432 00:28:05,359 --> 00:28:14,399 Speaker 1: you again really soon. Y tech Stuff is an I 433 00:28:14,520 --> 00:28:18,000 Speaker 1: Heart Radio production. For more podcasts from I Heart Radio, 434 00:28:18,320 --> 00:28:21,520 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 435 00:28:21,600 --> 00:28:23,119 Speaker 1: you listen to your favorite shows.