Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? It's time for the tech news for Thursday, April twenty seventh, twenty twenty three.

Speaker 1: Now, maybe you've seen this headline: "Man uses ChatGPT to pick winning lottery numbers." That is a heck of a claim, right? I mean, that is bound to get you to click through to the article. Of course, if you read the actual story, it deflates things a little bit. The story happened in Thailand. The winner received the equivalent of fifty nine bucks. It's not exactly enough to retire on. They have gone on to TikTok to talk about what they claim happened. But what about ChatGPT giving winning numbers? Well, first, I couldn't find any verification that it actually happened. But let's go ahead and assume the story is true. There's no reason for us to just think that it's all made up. The lottery drawing in this case involved four numbers, and apparently this man used some hypothetical situations while chatting with ChatGPT and also asked the bot to analyze past numbers and then suggest four numbers to pick for the upcoming lottery. Now, I don't have all the details, but I can say that for random events, you cannot predict the future based upon the past. Each event is independent from the others. If you flip a coin and it comes up heads, and it's a normal coin with a heads and a tails, you cannot predict whether the next coin flip will be heads or tails. I mean, you could try, but you can't be sure. There's no legitimacy to the guess. And that's because each flip is independent of all other flips. Over the long term, you should see a fifty-fifty distribution of heads versus tails, but from one flip to the next, there's no way of predicting it. So this is assuming you're working with actual random events.
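Since the whole lottery claim hangs on that independence point, here is a quick Python sketch, my own toy example rather than anything from the story, that simulates a fair coin. Conditioning on a streak of heads does nothing to the next flip, even though the long-run frequency settles near fifty-fifty:

```python
import random

random.seed(0)  # reproducible sketch

def heads():
    """One fair coin flip: True for heads, False for tails."""
    return random.random() < 0.5

# Independence: condition on having just seen three heads in a row and
# tally what the *next* flip does. If the past mattered, this estimate
# would drift away from 0.5; it does not.
next_after_streak = []
streak = 0
for _ in range(1_000_000):
    h = heads()
    if streak >= 3:
        next_after_streak.append(h)
    streak = streak + 1 if h else 0

print(f"P(heads | three heads just happened) is about "
      f"{sum(next_after_streak) / len(next_after_streak):.4f}")  # ~0.5

# Long run: the overall frequency still settles near fifty-fifty.
for n in (100, 10_000, 1_000_000):
    frac = sum(heads() for _ in range(n)) / n
    print(f"heads fraction over {n:>9,} flips: {frac:.4f}")
```

Same lesson for lottery balls as for coins: the long-run average is predictable, but no individual draw is.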
Speaker 1: So long as the lottery system is working properly, it should not matter what past numbers indicate. But if there is a bias in the system, then it would be possible that a computer could pick up on patterns that would otherwise be a little difficult to spot. It still would be a heck of a guess to land winning numbers based off patterns, because chances are it's not the exact same four numbers coming up over and over again. If that were the case, then I can guarantee you someone would come in and say, okay, we need to look at what's going on here. It's far more likely that a lottery commission would notice patterns before anyone else would, and would demand a full audit of the system to make sure that things were running fair and square. Also, ChatGPT only has access to information up to twenty twenty one. Anything from twenty twenty one forward isn't accessible. So if ChatGPT were drawing information from past lottery numbers, they would be very old. And to me, that suggests the lottery system would probably have been replaced or altered at some point in the years between twenty twenty one and twenty twenty three, and that these old patterns, if they did exist before twenty twenty one, probably shouldn't exist today. So even if those patterns were there, you couldn't draw from them and make a meaningful guess for today's numbers. Of course, it's possible this man was feeding more recent lottery numbers to ChatGPT manually. That could be the case. But again, unless something hinky is going on with the actual lottery system, there's really nothing but coincidence connecting the chatbot's suggestions to actual winning numbers. It's not like ChatGPT could analyze all the variables involved in the numbers being picked and then predict the most likely outcome. It doesn't even know all the variables. So I would caution anyone against putting their faith in a chatbot to help them win the lottery.
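To illustrate what "a computer picking up on bias" could look like in practice, here is a minimal, entirely hypothetical sketch. It assumes a simple pick-a-digit draw and uses SciPy's chi-square goodness-of-fit test; the weights and counts are my own illustration, not anything reported about the Thai lottery:

```python
import random
from collections import Counter
from scipy.stats import chisquare  # goodness-of-fit against uniform draws

random.seed(1)  # reproducible sketch

BALLS = list(range(10))  # digits 0-9, as in a pick-style lottery

def draw(weights, n_draws=5_000):
    """Simulate n_draws single-digit draws with the given per-ball weights."""
    counts = Counter(random.choices(BALLS, weights=weights, k=n_draws))
    return [counts[b] for b in BALLS]

fair   = draw([1.0] * 10)          # a properly working machine
biased = draw([1.3] + [1.0] * 9)   # ball 0 comes up roughly 30% too often

for name, observed in (("fair", fair), ("biased", biased)):
    result = chisquare(observed)   # null hypothesis: all balls equally likely
    print(f"{name:>6}: chi2 = {result.statistic:.1f}, p = {result.pvalue:.4f}")

# The biased machine leaves a statistical footprint (a tiny p-value) that
# the fair one typically does not. That is the kind of anomaly an auditor,
# or a model fed the draw history, could flag long before any fixed
# four-number pattern ever repeated.
```

Notice that even a detectable bias like this only shifts the odds slightly; it is nowhere near enough to hand anyone four exact winning numbers.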
Speaker 1: And I know I'm being a little bit flippant about this, but there really is a danger when folks start to treat AI as if it is magic, as if it somehow does not obey physical laws or can get around hard limitations on our knowledge. That opens up the chance for fraud, chicanery, and general scumbagginess.

Speaker 1: An organization called the Association for Mathematical Consciousness Science, or AMCS, has issued an open letter saying, and I am paraphrasing here: hey, before we do too much more work in AI and potentially create a conscious machine, we should really learn more about consciousness in general. Now, if you've been listening to Tech Stuff for a while, you've probably heard me talk about how human consciousness is a complicated, nuanced thing, and we don't fully understand it. And these experts are saying that we need that kind of understanding in order to have a better approach with AI, if and when it achieves consciousness. Now, for the record, right now we appear to be far from that kind of situation. Even seemingly sophisticated AI agents are, when you boil it down, following some conceptually simple processes. Now, there's a complex system supporting these processes, but if you really boil it down, it's all about statistical probabilities. So while an AI chatbot like ChatGPT can seem to communicate like a person, it actually lacks any consciousness or motivation or intent. But there's a fear that without a more thoughtful approach, serious mistakes could be made that could cause unintended harm, whether to people or to the discipline of AI. This is in line with what some other open letters we've seen from various experts and concerned individuals have said. And while I think it's still early to think about these things, it's actually better to think about them early than to wait until it's an active problem and then you're trying to find a solution.
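For a sense of what "it's all about statistical probabilities" means at miniature scale, here is a toy word-by-word sampler of my own invention. A real model like ChatGPT computes its probability table with a giant neural network over a huge vocabulary, but the generate-by-sampling loop is conceptually this shape:

```python
import random

random.seed(7)  # reproducible toy

# A made-up table of next-word probabilities, standing in for what a
# real language model would compute on the fly. No understanding here,
# just weighted dice rolls over whatever tends to come next.
next_word_probs = {
    "how":  {"the": 0.6, "are": 0.4},
    "the":  {"tech": 0.7, "news": 0.3},
    "tech": {"are": 0.5, "news": 0.5},
    "are":  {"you": 1.0},
    "news": {"today": 1.0},
}

def generate(word, max_len=6):
    """Grow a sentence by repeatedly sampling the next word."""
    out = [word]
    while word in next_word_probs and len(out) < max_len:
        candidates, probs = zip(*next_word_probs[word].items())
        word = random.choices(candidates, weights=probs, k=1)[0]
        out.append(word)
    return " ".join(out)

print(generate("how"))  # e.g. "how the tech are you"
```

The output can look fluent, but nothing in that loop wants, intends, or understands anything, which is the host's point.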
Speaker 1: TechCrunch has an article about the recent rise of, and reaction to, deepfake audio projects that digitally recreate an artist's voice to make new songs. Now, I've talked about how the music industry recoiled in horror when an AI-generated song called "Heart on My Sleeve" featured the AI-synthesized voices of Drake and The Weeknd, and the song went viral. Studios began to scramble to see what sort of legal protection they could cling to in order to prevent an avalanche of AI impersonators releasing music that sounds like a famous artist recorded it, when really that person had nothing to do with it. Fun fact: such legal protection doesn't really exist here in the United States. I mean, if someone were to decide they wanted to make Prince sing "The Hills Are Alive" from The Sound of Music, well, in that case there is a copyright issue, because that song exists under copyright. But for a brand new song, things get way more complicated, because music studios wouldn't own the rights to a song that's brand new. And while I bet that wouldn't stop studios from trying to use copyright strikes to take down stuff from people who made deepfakes of artists those studios represented, arguably that approach doesn't have much legal support, because there's no copyright infringement going on. Earlier this week, I mentioned that musical artist Grimes is taking a different approach. She's welcoming fans to use AI to recreate her voice for new songs, provided they give her a fifty percent share of royalties. Other artists are doing something similar. A few of them are just saying, hey, go nuts out there, do whatever you like. But the TechCrunch article brings up a critical point: for a lot of artists, this possibility is brand new, and tons of people probably aren't even aware that it's possible, and they haven't given their consent to being copied. And that consent issue is important. It gets to the heart of one of the really big problems with deepfakes.
Speaker 1: A deepfake can rob someone of agency, and it's become a really serious issue in cases where someone has used deepfake technology to make it seem as if a person has appeared in, say, an adult film, when in fact that person didn't. Digitally, that person is being forced to perform in an adult film without their consent. I really want to do a full episode about that issue in particular, but to do so, I feel I need to get some guests on who can give me their own firsthand perspective, because otherwise it's just me sitting at a microphone saying this sounds really bad. I think we already know it sounds really bad, but getting that firsthand perspective of what it is like, the reactions you get, the impact it can have on your life when someone else just uses technology to make it seem as if you did something you didn't do, I feel like there needs to be a deeper episode about that. Anyway, my hope is that once the dust settles around this AI deepfake music issue, we figure out a way for people to be able to give consent to being copied if they don't mind it, or to deny consent and receive protection against unwanted copies if that's their preference, because right now we lack the framework to deal with this situation.

Speaker 1: The Sydney Morning Herald reports that China's Cyberspace Administration has now unveiled rules that set strict boundaries for AI chatbots in China. I talked about this in a past episode, so really it's just the next stage in that development, where this administration has pushed out rules that do what you would expect: chatbots need to have socialist core values and they cannot contradict the state. Again, no big shock here. If you are familiar with China, you know that the country has a reputation for not being particularly lenient when it comes to questioning authority.

Speaker 1: And in an earnings call yesterday, Mark Zuckerberg had a bit to say about AI as well.
Speaker 1: First, that through AI, Meta has been able to better serve content to users. And by "better serve," I mean the AI helps to choose what content is served to specific users in an effort to keep them engaged longer, which really just boils down to: what do you give somebody that will keep them around, that will make them stay on the platform and see more ads and generate more revenue for the company? We've talked in the past about how Meta's various recommendation algorithms are designed with the goal of keeping people glued to their platforms, and it sounds like the AI tools are aiding in that effort, particularly on Instagram with the incorporation of Reels. In addition, the company plans to integrate AI into other features, potentially into stuff like chat. So it's possible we could see chatbots, or elements of chatbots, worked into tools like Messenger or WhatsApp. Considering how Snapchat users are currently review-bombing their app over the integration of an AI chatbot, I think Meta should tread lightly here. Okay, we've got more news stories to go, but let's take a quick break.

Speaker 1: Okay, we're back. A little bit more to say about that earnings call that Mark Zuckerberg had for Meta: it went really, really well for the company. Zuckerberg revealed that Meta outperformed expectations. It earned a profit of five point seven billion dollars for the first quarter of this year. Now, part of that is undoubtedly due to the fact that Meta has cut way back on costs, particularly in the form of, you know, salaries and stuff, because Meta famously laid off around twenty one thousand people starting late last year in the quest to make twenty twenty three the Year of Efficiency. But on top of that, Zuckerberg said that Facebook had nearly three billion monthly users this year, and I mean, yeah, the company is performing well from that perspective, no doubt about it. Zuckerberg also didn't just talk about AI, either.
Speaker 1: He said the company remains dedicated to the development of the metaverse. Whether anyone will want it by the time it gets here remains an open question.

Speaker 1: The UK's Competition and Markets Authority, or CMA, has blocked Microsoft's planned acquisition of Activision Blizzard. Now, technically this move just blocks that acquisition in the UK, but since we're talking about global entities here, it becomes a challenging situation for Microsoft to move forward. It also sets a precedent while other regulators, including ones in the US and the EU, are considering this deal. The CMA's justification for blocking the deal was a concern that Microsoft could create a monopolistic command of the cloud gaming market. Right now, there's not much of a cloud gaming market, but the thought is that it's precisely at this stage where regulators cannot allow a massive company, Microsoft being one of the world's largest, to reduce the number of potential companies in the space, or to prevent other competitors from accessing certain titles, which would give Microsoft an unfair market advantage in the process. Microsoft reps understandably criticized this result, with arguments ranging from "you don't understand this market" to "the UK is going to be sorry for doing this." Also, the company may appeal the CMA's decision, but, uh, let's just say that rarely works out in the long run. The CMA is not an organization known for reversing its decisions.

Speaker 1: Also, in Microsoft news, an on-by-default feature in the Microsoft Edge browser has been snooping on users. All right, so this feature is meant to let people who follow different content creators on different platforms aggregate their interests so that they have a single place they can go to look for updates. That way, instead of having to go platform to platform and check on the content creators you like to see if there's anything new, you would go to this one destination and all the new stuff would be updated there. This, by the way, is not a novel idea.
Speaker 1: I used to rely on tools that no longer exist that did this kind of stuff with news articles. It's really what RSS feeds were intended to do back in the day. But anyway, something went wrong with a recent update to Edge. See, the way this tool is supposed to work is that when you go to a page on a platform that supports this feature, and only then, Edge would send the URL to Microsoft through a domain called bingapis dot com. But a Reddit user with the handle hackermchackface, or mchackface, I'm sorry, I didn't mean to misidentify you, hackermchackface, posted that a recent update to Edge now has it where the browser sends the URL of every web page you visit to that domain. So it's not just the creator pages with this tool built in that trigger it to happen. It's happening on any page you go to. And as I said, this feature is on by default, so using Edge means you are broadcasting every URL you go to to Microsoft, which is a gross breach of privacy. Now, you can go into settings in Edge and turn this feature off, but the fact that it's on by default and it's pulling everything in the process ain't great. Microsoft reps have said that the company is aware of the problem and will, quote, "take appropriate action to address any issue," end quote.

Speaker 1: Finally, we all know that last year was a tough one for Netflix. The company revealed late last spring that it had a decline in subscribers for the first time in more than a decade, and then investor confidence dropped like a stone. Since then, Netflix has been implementing different strategies to improve revenue. One of those, famously, is cracking down on password sharing. This is something that Netflix has already implemented within certain markets like Spain. And now, a few months after Netflix introduced the initiative in Spain, research firm Kantar says it estimates more than one million former Netflix customers have canceled their subscriptions there in Spain.
Speaker 1: This could change Netflix's strategy of activating this password crackdown in other markets, including here in the United States. We are on the schedule to get Netflix's new rules this quarter. Now, to be clear, company reps at Netflix say they expected to see a decline in response to this password policy. But the loss of a million subscribers in Spain, which I imagine has to be a much smaller market than, say, the United States, where the impact would be far larger, well, I mean, it looks rough for Netflix. Honestly, I don't even know what the best approach is for the company. It expanded to the point of saturation in a lot of markets, and once you're there, it's really hard to achieve growth without doing stuff like hiking prices, but not hiking them so high that you convince folks to leave. Or you can do stuff like trying to crack down on loopholes like shared passwords, but that hasn't gone well for them either. I don't know what the right approach is. The streaming market in general is still one that is complicated and not fully understood, and a lot of companies are struggling with how to balance paying for prestige content to attract and keep users, and generating enough revenue to cover all the costs and make a profit on the side. It's hard.

Speaker 1: And that's it for the tech news for today, Thursday, April twenty seventh, twenty twenty three. I hope you're all well, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.