1 00:00:04,440 --> 00:00:12,360 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,400 --> 00:00:15,800 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,800 --> 00:00:18,799 Speaker 1: I'm an executive producer with iHeart Podcasts, and how the 4 00:00:18,880 --> 00:00:22,680 Speaker 1: tech are you? It's time for the tech news for 5 00:00:22,760 --> 00:00:27,720 Speaker 1: the week ending Friday, November twenty-second, twenty twenty four. 6 00:00:28,440 --> 00:00:32,760 Speaker 1: This week, the US Department of Justice filed proposals regarding 7 00:00:32,840 --> 00:00:36,000 Speaker 1: what to do about Google, and by what to do, 8 00:00:36,320 --> 00:00:38,760 Speaker 1: I mean how best to address the issue of Google 9 00:00:39,120 --> 00:00:43,239 Speaker 1: absolutely dominating web search, among other things. So this is 10 00:00:43,280 --> 00:00:48,440 Speaker 1: like an antitrust lawsuit proceeding, so, as expected, the 11 00:00:48,520 --> 00:00:53,560 Speaker 1: suggestions the DOJ has are pretty extensive. One really big 12 00:00:53,560 --> 00:00:56,840 Speaker 1: one is that the DOJ says Google should stop paying 13 00:00:56,840 --> 00:00:59,880 Speaker 1: out gobs of cash to other platforms in order to 14 00:00:59,880 --> 00:01:03,440 Speaker 1: make Google Search the default search tool on those platforms. 15 00:01:03,880 --> 00:01:08,800 Speaker 1: So Google literally spends billions of dollars every year on 16 00:01:08,840 --> 00:01:12,360 Speaker 1: this effort and pays companies like Apple to use 17 00:01:12,440 --> 00:01:16,199 Speaker 1: Google Search as the default on their various platforms, whether 18 00:01:16,240 --> 00:01:19,679 Speaker 1: that's iOS or whatever. And the DOJ argues that this 19 00:01:19,760 --> 00:01:24,319 Speaker 1: has led to Google essentially squashing competition and entrenching itself 20 00:01:24,600 --> 00:01:28,160 Speaker 1: as the dominant web search provider. But the proposal goes 21 00:01:28,240 --> 00:01:32,160 Speaker 1: further and says Google should also sell off its Chrome 22 00:01:32,400 --> 00:01:37,240 Speaker 1: browser entirely. Google introduced Chrome back in September two thousand 23 00:01:37,280 --> 00:01:40,119 Speaker 1: and eight, and since then it has become the most 24 00:01:40,160 --> 00:01:44,200 Speaker 1: popular web browser on the market. StatCounter estimates that 25 00:01:44,240 --> 00:01:47,280 Speaker 1: it holds around sixty-six percent of the market share 26 00:01:47,360 --> 00:01:50,480 Speaker 1: right now. The DOJ says that Google should not be 27 00:01:50,520 --> 00:01:54,160 Speaker 1: allowed to release some other browser during the term of judgment, 28 00:01:54,240 --> 00:01:56,440 Speaker 1: so as not to create a workaround. Like, oh yeah, we'll 29 00:01:56,480 --> 00:01:59,480 Speaker 1: sell off Chrome, we just happen to release a different 30 00:01:59,480 --> 00:02:03,080 Speaker 1: browser called Really Highly Polished Metal. Something like 31 00:02:03,160 --> 00:02:06,240 Speaker 1: that would be a no-no. Also, if Google doesn't 32 00:02:06,400 --> 00:02:10,560 Speaker 1: do what the DOJ says, then the department indicated that 33 00:02:10,600 --> 00:02:15,280 Speaker 1: it might come after Android, which is Google's mobile operating system.
Well, 34 00:02:15,320 --> 00:02:19,919 Speaker 1: Google has of course protested these proposals in the strongest 35 00:02:19,960 --> 00:02:22,280 Speaker 1: possible terms, even going so far as to suggest the 36 00:02:22,360 --> 00:02:25,960 Speaker 1: changes that the DOJ is demanding would essentially break the 37 00:02:26,040 --> 00:02:30,400 Speaker 1: Internet and harm America's standing in the global technology market. 38 00:02:30,800 --> 00:02:33,880 Speaker 1: Companies like Apple are probably feeling a little antsy too, 39 00:02:33,960 --> 00:02:37,440 Speaker 1: because if the DOJ's proposals do move forward, well, Apple 40 00:02:37,480 --> 00:02:40,519 Speaker 1: would be saying bye-bye to several billion dollars of 41 00:02:40,560 --> 00:02:43,760 Speaker 1: revenue every year, so that would be rough. The DOJ 42 00:02:44,040 --> 00:02:47,360 Speaker 1: argues that the payments to companies like Apple also end 43 00:02:47,440 --> 00:02:51,240 Speaker 1: up being incentives that Google is paying out. Essentially, Google 44 00:02:51,360 --> 00:02:56,040 Speaker 1: is incentivizing companies to not develop competing products on the 45 00:02:56,080 --> 00:02:59,440 Speaker 1: market, because these companies can make more money just taking 46 00:02:59,440 --> 00:03:02,480 Speaker 1: money from Google and using Google's tool than they would 47 00:03:02,520 --> 00:03:06,560 Speaker 1: if they developed their own products individually. These proposals are 48 00:03:06,800 --> 00:03:09,720 Speaker 1: just proposals right now. These are not rules that Google 49 00:03:09,800 --> 00:03:12,320 Speaker 1: is going to have to immediately abide by. In fact, 50 00:03:12,400 --> 00:03:14,800 Speaker 1: this is something that will be brought before a judge 51 00:03:14,800 --> 00:03:18,120 Speaker 1: in the United States later in twenty twenty five, possibly 52 00:03:18,160 --> 00:03:21,000 Speaker 1: in the second half of twenty twenty five. So we're 53 00:03:21,080 --> 00:03:24,840 Speaker 1: far from a point where Google is facing actual consequences 54 00:03:25,200 --> 00:03:28,320 Speaker 1: just yet. And seeing where the US elections went this 55 00:03:28,400 --> 00:03:31,120 Speaker 1: past year, I would be shocked if the government continues 56 00:03:31,160 --> 00:03:34,960 Speaker 1: to take such a strong stance against corporations and 57 00:03:35,000 --> 00:03:39,840 Speaker 1: anticompetitive practices moving forward. Generally speaking, Donald Trump has shown 58 00:03:39,920 --> 00:03:45,320 Speaker 1: himself to be very friendly toward corporations and less concerned about 59 00:03:45,320 --> 00:03:48,839 Speaker 1: things like, you know, anticompetitive behaviors, so it would 60 00:03:48,880 --> 00:03:53,080 Speaker 1: really surprise me if we see continued movement on those fronts. 61 00:03:53,400 --> 00:03:57,120 Speaker 1: But that being said, the Trump administration also has its 62 00:03:57,200 --> 00:04:01,840 Speaker 1: own axe to grind against Google for perceived bias. Not 63 00:04:01,880 --> 00:04:05,800 Speaker 1: saying there's actual bias, but they definitely perceive a 64 00:04:05,920 --> 00:04:09,840 Speaker 1: bias and have since identified Google as, like, an enemy 65 00:04:09,880 --> 00:04:15,760 Speaker 1: of the people for presenting search results that conservatives sometimes 66 00:04:15,920 --> 00:04:20,479 Speaker 1: argue are purposefully biased against them.
And so it could 67 00:04:20,480 --> 00:04:24,039 Speaker 1: be possible that we'll still see government action brought against Google, 68 00:04:24,080 --> 00:04:27,039 Speaker 1: but it will be driven more by retribution than by 69 00:04:27,040 --> 00:04:30,160 Speaker 1: a desire to break up a trust, which is kind 70 00:04:30,200 --> 00:04:34,040 Speaker 1: of a weird place to be. Bahima Abdel Rahman and 71 00:04:34,160 --> 00:04:37,800 Speaker 1: Joe Tidy have an article on the BBC World Service. It 72 00:04:37,839 --> 00:04:40,160 Speaker 1: was in Arabic, and so I had to rely upon 73 00:04:40,760 --> 00:04:44,560 Speaker 1: a translate feature, so hopefully everything that I read was 74 00:04:44,640 --> 00:04:48,000 Speaker 1: accurately translated. That can always be tricky, but it revealed 75 00:04:48,040 --> 00:04:52,080 Speaker 1: some pretty disturbing information about verified users on X, some 76 00:04:52,279 --> 00:04:55,400 Speaker 1: verified users, a small selection, and of course X is 77 00:04:55,440 --> 00:04:59,599 Speaker 1: the service formerly known as Twitter. So the reporters say 78 00:05:00,040 --> 00:05:04,400 Speaker 1: that a BBC investigation found verified users on X who 79 00:05:04,480 --> 00:05:07,560 Speaker 1: were sharing links to sites that traffic in images and 80 00:05:07,640 --> 00:05:12,160 Speaker 1: videos of child sexual abuse material, or CSAM. The reporters 81 00:05:12,240 --> 00:05:16,719 Speaker 1: state that once they alerted X to the accounts in question, 82 00:05:17,040 --> 00:05:21,160 Speaker 1: the service did ban those accounts, but questions remain about 83 00:05:21,200 --> 00:05:25,240 Speaker 1: the actual verification process, because one would hope that if 84 00:05:25,279 --> 00:05:28,760 Speaker 1: you're providing a check, like a verified check, and if 85 00:05:28,760 --> 00:05:33,560 Speaker 1: you're allowing verified accounts to have a much larger reach 86 00:05:33,760 --> 00:05:36,680 Speaker 1: than standard accounts, you would also have a process that 87 00:05:36,720 --> 00:05:39,400 Speaker 1: would include some sort of vetting to make sure that 88 00:05:39,400 --> 00:05:43,440 Speaker 1: those accounts in question aren't violating the law. The BBC 89 00:05:43,560 --> 00:05:46,560 Speaker 1: reported that at least one verified account had been posting 90 00:05:46,640 --> 00:05:50,240 Speaker 1: links to illegal material for at least six months without 91 00:05:50,279 --> 00:05:54,039 Speaker 1: any intervention on the part of X. The investigators found the 92 00:05:54,080 --> 00:05:57,159 Speaker 1: accounts by searching for a few keywords in quote unquote 93 00:05:57,240 --> 00:06:00,560 Speaker 1: Arabic dialect. These words have been used as a sort 94 00:06:00,600 --> 00:06:03,680 Speaker 1: of code among CSAM traffickers as a means of 95 00:06:03,720 --> 00:06:07,800 Speaker 1: finding one another without directly raising suspicion. According to the BBC, 96 00:06:07,920 --> 00:06:11,880 Speaker 1: the verified accounts didn't have very many followers, but because 97 00:06:11,920 --> 00:06:15,360 Speaker 1: those accounts were verified, their posts actually had a really 98 00:06:15,400 --> 00:06:19,440 Speaker 1: broad reach, and thousands of people saw these posts. So 99 00:06:19,480 --> 00:06:21,920 Speaker 1: while there weren't a lot of followers for the individual accounts, 100 00:06:22,200 --> 00:06:25,120 Speaker 1: they still had a pretty big impact.
And obviously the 101 00:06:25,200 --> 00:06:29,039 Speaker 1: story is absolutely horrifying, and in my opinion, it also 102 00:06:29,080 --> 00:06:32,279 Speaker 1: puts a spotlight on how X's approach to verification, where 103 00:06:32,279 --> 00:06:35,919 Speaker 1: it becomes just a simple paid option, is far inferior 104 00:06:36,040 --> 00:06:38,480 Speaker 1: to the way it worked in the old Twitter days. 105 00:06:38,720 --> 00:06:41,200 Speaker 1: But I guess I should stop beating that drum and 106 00:06:41,640 --> 00:06:47,960 Speaker 1: just accept that X is, in my opinion, totally inferior 107 00:06:48,240 --> 00:06:53,000 Speaker 1: to what Twitter used to be. Smriti Mallapaty, and my 108 00:06:53,080 --> 00:06:58,840 Speaker 1: apologies to Mallapaty, as I suppose I'm mispronouncing this name terribly. Anyway, 109 00:06:59,000 --> 00:07:02,440 Speaker 1: this reporter for Nature has an article that's titled A 110 00:07:02,560 --> 00:07:06,000 Speaker 1: Place of Joy: Why Scientists Are Joining the Rush to 111 00:07:06,040 --> 00:07:09,320 Speaker 1: Bluesky. Now, if you recall, Bluesky is an 112 00:07:09,360 --> 00:07:13,640 Speaker 1: alternative to X slash Twitter or Meta's Threads. It's closer 113 00:07:13,640 --> 00:07:17,520 Speaker 1: to Mastodon. It's like the federated version of 114 00:07:17,760 --> 00:07:21,400 Speaker 1: these services, which means that it's housed on multiple servers 115 00:07:21,760 --> 00:07:25,200 Speaker 1: that connect to one another, but it's not centralized the 116 00:07:25,240 --> 00:07:28,800 Speaker 1: way Twitter was or is. Bluesky itself started as 117 00:07:28,840 --> 00:07:32,280 Speaker 1: a project within Twitter. Jack Dorsey was behind the initial creation, 118 00:07:32,360 --> 00:07:35,080 Speaker 1: although he is no longer involved with the company. So 119 00:07:35,320 --> 00:07:38,960 Speaker 1: why are scientists migrating to Bluesky? Well, according to 120 00:07:39,000 --> 00:07:42,760 Speaker 1: the article, it's for several reasons. One, there's more control 121 00:07:42,840 --> 00:07:45,440 Speaker 1: over what you see on Bluesky. You're more likely 122 00:07:45,480 --> 00:07:48,560 Speaker 1: to see messages posted by people and accounts you actively follow, 123 00:07:48,960 --> 00:07:51,800 Speaker 1: rather than stuff that some algorithm is just shoving at 124 00:07:51,840 --> 00:07:55,800 Speaker 1: you for whatever reason, for example, prioritizing posts from folks 125 00:07:55,840 --> 00:07:58,480 Speaker 1: who paid to be verified over other stuff that you 126 00:07:58,520 --> 00:08:02,040 Speaker 1: actually want to see. Moderation is taken far more seriously 127 00:08:02,080 --> 00:08:04,320 Speaker 1: at Bluesky. If you block someone on Bluesky, 128 00:08:04,440 --> 00:08:06,720 Speaker 1: it is a real block. It's not the way it 129 00:08:06,760 --> 00:08:09,080 Speaker 1: is over on X. People you have blocked 130 00:08:09,440 --> 00:08:12,520 Speaker 1: on X can still see what you post, which is wild. 131 00:08:12,840 --> 00:08:15,800 Speaker 1: And it sounds like their reasons for leaving X are 132 00:08:15,840 --> 00:08:18,520 Speaker 1: pretty similar to my own reasons when I left X 133 00:08:18,560 --> 00:08:20,240 Speaker 1: a couple of years ago. If you still have an 134 00:08:20,240 --> 00:08:22,680 Speaker 1: account on X, I'm not throwing any shade at you. 135 00:08:23,000 --> 00:08:25,240 Speaker 1: I mean, lots of people have accounts for lots of reasons.
136 00:08:25,440 --> 00:08:27,880 Speaker 1: But it just became clear that X is not the 137 00:08:27,960 --> 00:08:30,800 Speaker 1: right place for me, and I think for a lot 138 00:08:30,840 --> 00:08:34,400 Speaker 1: of other folks, they're coming to a similar conclusion. Sticking 139 00:08:34,400 --> 00:08:37,679 Speaker 1: with Bluesky, Jonathan Vanian of CNBC has an article 140 00:08:37,679 --> 00:08:41,040 Speaker 1: in which Bluesky's CEO Jay Graber has explained that 141 00:08:41,080 --> 00:08:44,720 Speaker 1: Bluesky is quote unquote billionaire-proof, meaning that Bluesky 142 00:08:44,720 --> 00:08:47,720 Speaker 1: couldn't be bought and repurposed the way Elon Musk 143 00:08:47,800 --> 00:08:51,200 Speaker 1: bought and transformed Twitter. Bluesky's design is based on 144 00:08:51,320 --> 00:08:54,600 Speaker 1: an open-source approach with federated servers, as I mentioned, 145 00:08:54,720 --> 00:08:58,000 Speaker 1: so even if someone were buying up Bluesky servers, 146 00:08:58,080 --> 00:09:00,480 Speaker 1: you could create a new one and port your account 147 00:09:00,600 --> 00:09:03,520 Speaker 1: over to that one and maintain all your previous connections 148 00:09:03,559 --> 00:09:06,320 Speaker 1: with those whom you follow and those who follow you. 149 00:09:06,840 --> 00:09:09,920 Speaker 1: And Bluesky's approach appears to be resonating with lots 150 00:09:09,920 --> 00:09:12,360 Speaker 1: of people. The service has seen millions of people sign 151 00:09:12,440 --> 00:09:14,520 Speaker 1: up in the wake of the US elections. It now 152 00:09:14,600 --> 00:09:17,520 Speaker 1: has more than twenty-one million registered users, but that 153 00:09:17,679 --> 00:09:21,360 Speaker 1: is still minuscule compared to a service like Twitter. Twitter 154 00:09:21,600 --> 00:09:25,520 Speaker 1: has reported having hundreds of millions of monthly users. So 155 00:09:25,720 --> 00:09:29,040 Speaker 1: let's keep everything in perspective. Okay, we've got more news 156 00:09:29,080 --> 00:09:31,000 Speaker 1: to get through. Before we get to that, let's take 157 00:09:31,040 --> 00:09:44,760 Speaker 1: a quick break. We're back. The US Consumer Financial Protection 158 00:09:44,880 --> 00:09:48,959 Speaker 1: Bureau, or CFPB, issued rules that will group large digital 159 00:09:49,000 --> 00:09:52,440 Speaker 1: payment providers under the Bureau's regulations. So this is only 160 00:09:52,480 --> 00:09:55,520 Speaker 1: going to apply to digital payment apps that handle more 161 00:09:55,520 --> 00:09:59,560 Speaker 1: than fifty million transactions a year. That covers heavy hitters 162 00:09:59,600 --> 00:10:02,800 Speaker 1: like Google Wallet and Apple Pay, that kind of thing. Initially, 163 00:10:03,360 --> 00:10:08,080 Speaker 1: the CFPB was planning on a much more comprehensive list, 164 00:10:08,520 --> 00:10:12,160 Speaker 1: one that would include apps that handle five million transactions or 165 00:10:12,160 --> 00:10:14,880 Speaker 1: more per year, but it switched to this. So these 166 00:10:14,880 --> 00:10:17,760 Speaker 1: services are now going to be subject to regulation the 167 00:10:17,800 --> 00:10:19,960 Speaker 1: same way that banks and credit unions are here in 168 00:10:19,960 --> 00:10:23,640 Speaker 1: the United States.
The CFPB issued a statement saying that 169 00:10:23,679 --> 00:10:26,360 Speaker 1: the rules mean that consumers are going to receive more 170 00:10:26,400 --> 00:10:31,080 Speaker 1: protection as a result, including against actions like illegal account closures, 171 00:10:31,360 --> 00:10:35,120 Speaker 1: just meaning that if these digital payment systems take certain 172 00:10:35,440 --> 00:10:40,360 Speaker 1: actions that are questionable or illegal, the CFPB has the 173 00:10:40,440 --> 00:10:45,320 Speaker 1: authority to regulate that and to punish a company for 174 00:10:45,400 --> 00:10:47,880 Speaker 1: doing those kinds of things. This rule was just 175 00:10:47,920 --> 00:10:50,840 Speaker 1: to establish that they do have to follow these rules. 176 00:10:51,160 --> 00:10:53,640 Speaker 1: There's some rough news for gamers as we head into 177 00:10:53,640 --> 00:10:56,920 Speaker 1: the holidays. Nvidia announced that it is facing a potential 178 00:10:57,040 --> 00:11:00,480 Speaker 1: gaming GPU shortage this quarter. This is according to an 179 00:11:00,559 --> 00:11:04,600 Speaker 1: article by Hassam Nasir of Tom's Hardware. Nasir reports that 180 00:11:04,720 --> 00:11:08,120 Speaker 1: during the most recent Nvidia earnings call to shareholders, 181 00:11:08,240 --> 00:11:12,760 Speaker 1: Chief Financial Officer Colette Kress revealed a possible squeeze in 182 00:11:12,880 --> 00:11:16,680 Speaker 1: GPU supplies, one that would likely be addressed early next year, 183 00:11:16,920 --> 00:11:19,400 Speaker 1: and this could mean that finding graphics cards in the 184 00:11:19,400 --> 00:11:22,760 Speaker 1: short term, meaning the holiday season, particularly graphics cards that are 185 00:11:22,800 --> 00:11:27,360 Speaker 1: not exorbitantly expensive, might be a little tricky. Nasir writes 186 00:11:27,400 --> 00:11:30,839 Speaker 1: that one possible reason for this shortage could actually be 187 00:11:30,880 --> 00:11:34,000 Speaker 1: Nvidia's plan to launch a new series of cards, 188 00:11:34,120 --> 00:11:37,640 Speaker 1: a new generation of GPUs called Blackwell, and this will 189 00:11:37,640 --> 00:11:41,160 Speaker 1: happen in January of next year. So the reduction in 190 00:11:41,200 --> 00:11:44,520 Speaker 1: supply could partly be due to Nvidia wanting to set 191 00:11:44,559 --> 00:11:46,720 Speaker 1: the stage for a huge launch with a new generation 192 00:11:46,800 --> 00:11:48,480 Speaker 1: of cards next year. And I know it's a lot 193 00:11:48,520 --> 00:11:50,960 Speaker 1: easier to sell a bunch of new cards if there 194 00:11:51,000 --> 00:11:55,000 Speaker 1: aren't a bunch of previous-generation cards on sale for 195 00:11:55,200 --> 00:11:58,240 Speaker 1: a lower price on the market already. Tom Warren of 196 00:11:58,280 --> 00:12:01,000 Speaker 1: The Verge has an article explaining how Microsoft appears to 197 00:12:01,040 --> 00:12:03,640 Speaker 1: be urging Windows ten owners to upgrade to a new 198 00:12:03,679 --> 00:12:06,800 Speaker 1: computer in order to migrate to Windows eleven and to 199 00:12:06,840 --> 00:12:10,760 Speaker 1: take advantage of Copilot features.
Warren reports that some Windows 200 00:12:10,760 --> 00:12:14,320 Speaker 1: ten users are encountering full-screen pop-ups that not 201 00:12:14,440 --> 00:12:17,199 Speaker 1: only point out that Microsoft will be ending support for 202 00:12:17,280 --> 00:12:20,880 Speaker 1: Windows ten in October next year, but also that it 203 00:12:20,960 --> 00:12:23,160 Speaker 1: might be time to get a new machine, because, as 204 00:12:23,200 --> 00:12:26,360 Speaker 1: Warren points out, the messaging is suggesting, you know, well, 205 00:12:26,400 --> 00:12:28,840 Speaker 1: you need a new computer to run Windows eleven, because 206 00:12:29,280 --> 00:12:32,520 Speaker 1: Windows eleven has system requirements that a lot of older 207 00:12:32,520 --> 00:12:35,640 Speaker 1: computers just don't meet. But Warren also points out the 208 00:12:35,679 --> 00:12:39,640 Speaker 1: messaging could be considered a little misleading, because Microsoft will 209 00:12:39,679 --> 00:12:43,560 Speaker 1: continue to provide limited ongoing support for Windows ten. However, 210 00:12:43,640 --> 00:12:45,680 Speaker 1: you will have to pay thirty dollars a year to 211 00:12:45,760 --> 00:12:48,800 Speaker 1: get those extra updates. But yeah, I don't know how 212 00:12:48,800 --> 00:12:50,920 Speaker 1: I would feel if I were working on my computer 213 00:12:51,000 --> 00:12:53,840 Speaker 1: and I just got a message that completely took up 214 00:12:53,880 --> 00:12:56,960 Speaker 1: the entire screen that essentially is saying, hey, I know 215 00:12:57,080 --> 00:12:59,680 Speaker 1: stuff's real expensive right now, and you probably have a 216 00:12:59,679 --> 00:13:01,800 Speaker 1: lot of other things on your mind, but you should 217 00:13:01,840 --> 00:13:04,840 Speaker 1: really get a new computer. That would really cheese me off. Obviously, 218 00:13:05,200 --> 00:13:09,079 Speaker 1: the actual messages don't say that. I am liberally paraphrasing 219 00:13:09,080 --> 00:13:13,800 Speaker 1: and interpreting here. Now, switching over to Alfonso Maruccia of 220 00:13:13,880 --> 00:13:16,840 Speaker 1: TechSpot, let's talk about another article that details a 221 00:13:16,840 --> 00:13:19,400 Speaker 1: move by Microsoft that is rubbing people the wrong way. 222 00:13:19,520 --> 00:13:23,480 Speaker 1: That article is called The Official Bing Wallpaper App Does 223 00:13:23,520 --> 00:13:27,800 Speaker 1: Some Nasty Malware-Like Things to Windows. Yikes. So here's 224 00:13:27,840 --> 00:13:30,600 Speaker 1: the deal. The app is meant to let users swap 225 00:13:30,640 --> 00:13:33,840 Speaker 1: out their desktop wallpaper on their computers in a really 226 00:13:33,880 --> 00:13:37,120 Speaker 1: easy and seamless way. Only it seems to do stuff 227 00:13:37,120 --> 00:13:40,840 Speaker 1: that's not at all related to displaying wallpapers, you know, 228 00:13:40,880 --> 00:13:45,720 Speaker 1: stuff like decrypting cookies, including cookies saved in browsers other 229 00:13:45,760 --> 00:13:49,320 Speaker 1: than Microsoft Edge, like Chrome or Firefox. It also apparently 230 00:13:49,320 --> 00:13:53,800 Speaker 1: incorporates some sort of geolocation features and installs Bing Visual 231 00:13:53,880 --> 00:13:56,640 Speaker 1: Search on the computer, as well as prompts users to 232 00:13:56,679 --> 00:14:00,360 Speaker 1: make Edge their default browser and installs some sneaky browser 233 00:14:00,440 --> 00:14:03,960 Speaker 1: extensions in competing browsers like Chrome and Firefox.
So it 234 00:14:04,040 --> 00:14:08,400 Speaker 1: certainly sounds like the Bing Wallpaper app is drastically overstepping 235 00:14:08,440 --> 00:14:12,880 Speaker 1: itself here. I'm reminded of Sony and DRM, where Sony 236 00:14:13,000 --> 00:14:18,760 Speaker 1: inadvertently created malware with its digital rights management approach. That 237 00:14:18,960 --> 00:14:22,040 Speaker 1: sounds kind of like what we're talking about here. 238 00:14:22,280 --> 00:14:24,720 Speaker 1: I have no idea what the intent was, but it 239 00:14:24,800 --> 00:14:27,680 Speaker 1: definitely doesn't sound like it was a good move, 240 00:14:27,720 --> 00:14:31,880 Speaker 1: in my opinion. Maxwell Zeff of TechCrunch reports 241 00:14:31,880 --> 00:14:34,960 Speaker 1: that Apple is apparently developing an updated version of Siri 242 00:14:35,160 --> 00:14:38,280 Speaker 1: that will lean heavily on the large language model approach 243 00:14:38,320 --> 00:14:42,640 Speaker 1: to AI. This means that Siri would ideally become more conversational. 244 00:14:43,000 --> 00:14:45,800 Speaker 1: Presumably this will make it possible to use Siri to 245 00:14:45,920 --> 00:14:49,520 Speaker 1: interact with apps in a deeper, more complex way, but 246 00:14:49,600 --> 00:14:51,840 Speaker 1: it is going to take some time. Zeff says the 247 00:14:51,880 --> 00:14:54,400 Speaker 1: plan is for Apple to release this new version of 248 00:14:54,520 --> 00:14:58,440 Speaker 1: Siri in the spring of twenty twenty six. In the 249 00:14:58,680 --> 00:15:01,000 Speaker 1: do-as-I-say-and-not-as-I-do category, 250 00:15:01,080 --> 00:15:03,800 Speaker 1: our next story is about generative AI and why you 251 00:15:03,840 --> 00:15:07,960 Speaker 1: should not rely on it, especially for important stuff like, say, 252 00:15:08,280 --> 00:15:12,360 Speaker 1: filing expert testimony in a lawsuit that aims to 253 00:15:12,440 --> 00:15:17,200 Speaker 1: take on generative AI. All right, so not really generative AI, 254 00:15:17,280 --> 00:15:19,640 Speaker 1: deepfakes. It's related, but not the same thing. So 255 00:15:19,800 --> 00:15:22,840 Speaker 1: in Minnesota, there is a state law that makes it 256 00:15:22,960 --> 00:15:27,040 Speaker 1: illegal to knowingly disseminate deepfakes up to ninety days 257 00:15:27,080 --> 00:15:30,440 Speaker 1: before an election if the material in that deepfake 258 00:15:30,560 --> 00:15:34,200 Speaker 1: video was made with an intent to influence the election, 259 00:15:34,480 --> 00:15:37,480 Speaker 1: and if the subject of the deepfake video did 260 00:15:37,480 --> 00:15:42,120 Speaker 1: not consent to being in it. Christopher Kohls has challenged 261 00:15:42,280 --> 00:15:45,520 Speaker 1: this law, filing a lawsuit that argues it violates the 262 00:15:45,680 --> 00:15:49,080 Speaker 1: First Amendment, the freedom of speech guarantee in the US Constitution. 263 00:15:49,600 --> 00:15:54,000 Speaker 1: The state of Minnesota has tapped the director of Stanford University's 264 00:15:54,000 --> 00:15:57,080 Speaker 1: Social Media Lab, a guy named Jeff Hancock, to provide 265 00:15:57,120 --> 00:16:00,600 Speaker 1: expert testimony regarding the dangers of deepfake technology.
So 266 00:16:00,720 --> 00:16:04,760 Speaker 1: Hancock did, but apparently the testimony he submitted contains hints 267 00:16:04,800 --> 00:16:08,320 Speaker 1: that he himself relied on generative AI in order to 268 00:16:08,400 --> 00:16:11,600 Speaker 1: write it, which is a big old whoopsie. Hancock's testimony 269 00:16:11,680 --> 00:16:15,240 Speaker 1: cites a study that doesn't appear to actually exist, which 270 00:16:15,280 --> 00:16:19,200 Speaker 1: suggests it is an AI hallucination. Now, some of y'all 271 00:16:19,280 --> 00:16:22,120 Speaker 1: might remember that several months ago I did an episode 272 00:16:22,160 --> 00:16:25,800 Speaker 1: of Tech Stuff that was quote unquote written by generative AI, 273 00:16:26,200 --> 00:16:28,400 Speaker 1: and one of the things I found really upsetting when 274 00:16:28,440 --> 00:16:31,800 Speaker 1: I did this was that the AI invented experts in 275 00:16:31,880 --> 00:16:35,960 Speaker 1: order to present certain information as having academic validity. Like, 276 00:16:36,040 --> 00:16:39,080 Speaker 1: it was presenting a point of view and then inventing 277 00:16:39,080 --> 00:16:42,080 Speaker 1: a person to have apparently given that point of view. 278 00:16:42,200 --> 00:16:44,280 Speaker 1: But those experts, as far as I can determine, were 279 00:16:44,280 --> 00:16:47,040 Speaker 1: not real people at all. So the same sort of 280 00:16:47,080 --> 00:16:49,440 Speaker 1: thing appears to have happened here in this case with 281 00:16:49,480 --> 00:16:53,320 Speaker 1: the expert testimony, and at the very least, that is embarrassing. Now, 282 00:16:53,360 --> 00:16:56,440 Speaker 1: for the record, I do think deepfakes are incredibly 283 00:16:56,520 --> 00:17:00,200 Speaker 1: dangerous and that regulation is needed. I understand the First 284 00:17:00,240 --> 00:17:03,440 Speaker 1: Amendment argument, but if I can create a video that 285 00:17:03,520 --> 00:17:07,399 Speaker 1: appears to show you proclaiming beliefs that you absolutely do 286 00:17:07,520 --> 00:17:11,320 Speaker 1: not hold, or that shows you admitting to a crime 287 00:17:11,480 --> 00:17:14,960 Speaker 1: that you did not commit, or you calling for action 288 00:17:15,080 --> 00:17:17,399 Speaker 1: that you would never actually agree to, all of that 289 00:17:17,560 --> 00:17:20,280 Speaker 1: is a problem, right? Like, if I create something that 290 00:17:20,359 --> 00:17:23,360 Speaker 1: makes it seem like it's coming from you 291 00:17:23,440 --> 00:17:26,119 Speaker 1: and you are the one saying these things, that's not really, 292 00:17:26,600 --> 00:17:29,440 Speaker 1: in my opinion, a First Amendment thing, because the First 293 00:17:29,440 --> 00:17:32,600 Speaker 1: Amendment covers my freedom of expression. But if I'm using 294 00:17:32,640 --> 00:17:35,520 Speaker 1: deepfakes, it appears that I am co-opting your 295 00:17:35,600 --> 00:17:38,240 Speaker 1: freedom of expression to say whatever it is I want 296 00:17:38,280 --> 00:17:40,840 Speaker 1: you to say. At least that's my opinion. I'm not 297 00:17:40,880 --> 00:17:43,800 Speaker 1: an expert. I am not a legal expert by any means, 298 00:17:44,000 --> 00:17:48,040 Speaker 1: but yeah, I think the law actually has merit in 299 00:17:48,080 --> 00:17:52,000 Speaker 1: this case. But I guess that's a matter for the 300 00:17:52,000 --> 00:17:54,800 Speaker 1: courts to decide. And of course it kind of stinks.
301 00:17:54,800 --> 00:17:57,920 Speaker 1: I mean, it really stinks that the expert witness apparently 302 00:17:58,000 --> 00:18:01,560 Speaker 1: used generative AI to create their testimony, because it really 303 00:18:01,640 --> 00:18:06,080 Speaker 1: undermines their credibility and, I think, hurts the state's case, 304 00:18:06,440 --> 00:18:09,000 Speaker 1: and I don't want to see this go the other way. 305 00:18:09,560 --> 00:18:13,040 Speaker 1: One last story. Paige Gaully of Vice dot Com has 306 00:18:13,080 --> 00:18:16,800 Speaker 1: a piece titled AI Jesus Is Now Taking Confessions at 307 00:18:16,800 --> 00:18:19,639 Speaker 1: a Church in Switzerland. That headline, I think, is a 308 00:18:19,760 --> 00:18:23,280 Speaker 1: tad bit misleading. The AI Jesus is not meant to 309 00:18:23,400 --> 00:18:26,320 Speaker 1: take confession; at least, it's not meant to perform the 310 00:18:26,400 --> 00:18:32,800 Speaker 1: sacrament of confession. So instead, this AI-powered generative tool 311 00:18:33,040 --> 00:18:35,800 Speaker 1: is meant to communicate in a way that's aligned with, 312 00:18:35,960 --> 00:18:39,560 Speaker 1: at least, depictions of Jesus. I don't know which depiction 313 00:18:39,920 --> 00:18:42,320 Speaker 1: of Jesus, like I don't know which interpretation of the 314 00:18:42,359 --> 00:18:45,960 Speaker 1: Bible was used to create this particular AI chatbot. But 315 00:18:46,080 --> 00:18:49,440 Speaker 1: church attendees can go into a confessional booth and have 316 00:18:49,520 --> 00:18:55,200 Speaker 1: a conversation with an AI chatbot that's meant to emulate Jesus, 317 00:18:55,440 --> 00:18:58,080 Speaker 1: and they receive answers to their questions that are meant 318 00:18:58,119 --> 00:19:02,160 Speaker 1: to engage their spiritual thoughts and questions and things 319 00:19:02,200 --> 00:19:05,040 Speaker 1: like that. And it sounds like lots of people find 320 00:19:05,040 --> 00:19:08,679 Speaker 1: the experience actually pretty enlightening, but others have dismissed it 321 00:19:08,720 --> 00:19:11,520 Speaker 1: as just a gimmick. It actually reminds me a lot 322 00:19:11,560 --> 00:19:14,679 Speaker 1: of early chatbots, because those were programmed to mimic 323 00:19:14,720 --> 00:19:19,320 Speaker 1: specific kinds of social interactions, like talking to a psychoanalyst, 324 00:19:19,359 --> 00:19:23,080 Speaker 1: for example. So this doesn't exactly surprise me. But again, 325 00:19:23,160 --> 00:19:26,280 Speaker 1: this isn't to say that there's some sort of robo-powered, 326 00:19:26,320 --> 00:19:29,359 Speaker 1: coin-operated confessional booth or something. We haven't gotten to 327 00:19:29,400 --> 00:19:33,119 Speaker 1: that point. We're not quite at Futurama levels of absurdity 328 00:19:33,200 --> 00:19:35,399 Speaker 1: just yet. But you know, give it a year, we'll 329 00:19:35,440 --> 00:19:37,640 Speaker 1: see where we end up. That's it for the tech 330 00:19:37,680 --> 00:19:40,359 Speaker 1: news for this week, the week ending November twenty-second, 331 00:19:40,400 --> 00:19:43,200 Speaker 1: twenty twenty four. I hope all of you out there 332 00:19:43,240 --> 00:19:46,760 Speaker 1: are doing well, and I'll talk to you again really soon. 333 00:19:53,240 --> 00:19:57,920 Speaker 1: Tech Stuff is an iHeartRadio production.
For more podcasts from iHeartRadio, 334 00:19:58,240 --> 00:20:01,920 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 335 00:20:01,960 --> 00:20:03,040 Speaker 1: to your favorite shows.