Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are ya? It is time for the tech news for Thursday, December fourteenth, twenty twenty three. And first up, before I actually get into the news, I really have to give a huge shout out to Ars Technica. I reference them all the time, but for today's episode, I mean, article after article was popping up that really was relevant, and the quality of the work over at Ars Technica is top notch. Once again, I don't have any connection with Ars Technica. As far as I know, I don't know anyone who works there. I just really admire the work that they create. So a lot of the stuff today will be referencing them, as I will call out in story after story.

Speaker 1: Anyway, our first story actually is one of the ones that comes from Ars Technica. It was written by Ashley Belanger and it's titled "Elon Musk's X ad revenue reportedly fell $1.5 billion this year amid boycotts." That's one point five billion, with a B. The story elaborates on this, saying that analysts initially estimated that X, which was actually known as Twitter at the beginning of this year (that transformation happened this year; it seems like an eternity ago at this point), would generate around three billion dollars in ad revenue. That was the initial estimate for twenty twenty three. Instead, that number is closer to two point five billion in ad revenue. Now, keep in mind ads are just one source of revenue for Twitter, but it's the main source. Anyway, last year it was closer to four billion dollars in ad revenue. So yeah, the site has seen a massive drop, right? Four billion last year to two point five billion this year. And it was actually five billion the year before that.
Speaker 1: Now, this doesn't come as a huge surprise, because there have been numerous brands, ranging from globally recognized names to smaller niche companies, that have ceased to advertise on the platform in the wake of various scandals. Those range from Elon Musk personally elevating antisemitic conspiracy theories to Elon Musk deciding he would reinstate the account of Alex Jones while using the cover story that it was the will of the people, in a poll whose validity some might question. Anyway, Musk has pretty much taken the "you're killing my company" stance against these advertisers, accusing them of running a semi-organized campaign of destruction against X. He seems unwilling, or maybe unable, to acknowledge that these are consequences that stem from his own actions, and instead thinks of it as some sort of conspiracy to destroy Twitter. Just to argue that point: I don't see the logic in companies banding together to destroy Twitter. That makes no sense, because prior to Musk's takeover of the company, Twitter was looked at as a viable advertising platform. I mean, it was far from perfect, don't get me wrong. Twitter was a mess before Musk took it over. But I don't know many brands that are actually eager to decrease the number of options they have when it comes to advertising. They would rather be able to advertise everywhere, like including inside your brain, if they could. Anyway, please check out the article on Ars Technica if you want more in-depth analysis of the issue.

Speaker 1: I've got a lot more Musk-related stories to get through for this episode. My apologies for that, but that's just how this week played out. So our next story comes from CBC's Jonathan Montpetit, who reports that several Canadian companies have also chosen to halt advertising on X.
Speaker 1: So Montpetit says that CBC investigated several accounts on X that were linked to extremists, including accounts that belonged to white nationalists, and when they were looking at the posts on these accounts, they also saw ads for prominent companies appearing in that feed, side by side with the hateful content. This actually mirrors what several other organizations have claimed to have found in the past, including the Center for Countering Digital Hate. That was one of the organizations that X threatened to sue after the nonprofit pointed out there was a rise of hateful content in general on X, something that advertisers would likely object to. So X has been fighting against these accusations, or allegations, however you want to frame it. It's been fighting against these claims for the better part of the year. But organizations have found similar things; they've discovered commonalities. So either all of these organizations are lying, or at the very least manipulating data to support the narrative they want to tell, or things at X are pretty bad. Anyway, this story shows that X's issues with advertising extend beyond US borders.

Speaker 1: Now, on top of those issues, X continues to have sporadic technical problems. In an article titled "An X outage broke all outgoing links again," Richard Lawler of The Verge reports that yesterday afternoon, users on X found that clicking on any outgoing link in a post would only bring up an error message. Lawler reports that X appeared to resolve the issue about an hour after it had happened, though according to Lawler, there was never any official acknowledgment from the company that the problem had even happened at all. Now, obviously, technical issues like that are also a concern for advertisers, right? It doesn't do you any good to post an ad on X if, when someone who's actually interested in the ad clicks on a link, they get an error page instead. That's no good either.
Speaker 1: As for why this problem happened in the first place, I would wager that the massive layoffs that Musk oversaw at Twitter slash X mean that it's much more difficult for staff to anticipate, prevent, or even fix issues. But that's just my opinion. It's not something that I can point to as hard evidence. It just seems logical to me that when you lay off a massive amount of your engineering staff, then when problems crop up, it is a lot harder to address them in a timely manner. Still, it did resolve the issue in around an hour, which isn't terrible, I guess, especially when you consider the constraints that the team is working under. Those folks should receive huge accolades for getting the problem fixed within an hour. But you could also argue, well, this shouldn't have happened at all, and maybe it wouldn't have if Musk hadn't had that massive culling of staff earlier in the year.

Speaker 1: The woes don't stop for Musk yet. I don't think he's going to be counting twenty twenty three as one of the best years of his life the way it's going right now, at least not the end of it. It's been a pretty rocky ride. Anyway, Dhruv Mehrotra and Dell Cameron of Wired, and my apologies for butchering the names there, have a piece that's titled "Elon Musk's new monkey death claims spur fresh demands for an SEC investigation." That's a wild headline. Elon Musk's new monkey death claims. "Elon Musk's New Monkey Death Claims," to me, sounds like the name of a heavy metal band. But anyway, this is actually very serious and very sad business. It has to do with another of Musk's companies, the startup Neuralink. That's the company Musk founded that is dedicated to developing brain-computer interfaces, that is, a way for you to be able to interact with and control computer systems using thought, and that's all you have to use.
Speaker 1: So the piece covers how an organization called the Physicians Committee for Responsible Medicine disputes Elon Musk's claims that the primates that died as a result of Neuralink experimentation were already terminal cases. Musk said, you know, we've had to euthanize primates that we've experimented on, but these primates were already facing death. They were already in some form of terminal decline; perhaps they had cancer or some other condition that meant that they were dying. And he said that they specifically picked these animals knowing that they would have to be euthanized afterward, because they were already facing death. The advocacy group disputes that. They said that, you know, veterinary data points to the animals not having any sort of terminal condition before being put through the experimental procedures that then ultimately led to the animals being euthanized, and that instead what Musk was doing was just spouting off misinformation and lies in an effort to encourage investors to pour more money into the business; that he was misleading investors. And as such, they are calling upon the US Securities and Exchange Commission, or SEC, to open an investigation into Neuralink. If you want to know more about this, you should definitely read the article over in Wired, but I will warn you, it is incredibly upsetting. If you are tenderhearted like I am, it may disturb you. I make no apologies for being a sap, by the way. That's just who I am, and I love animals. So it was a very hard read, but I think a very important one.

Speaker 1: We're not done with Elon Musk's companies just yet. We actually have a couple more stories, but before we get to that, for goodness sakes, let's just take a quick break to thank our sponsors.

Speaker 1: All right. I mentioned that we still had a couple of stories about Elon Musk companies. In this case, the company we're talking about is Tesla. So it's been an eventful week for Elon Musk.
Speaker 1: Tesla had to issue, or will have to issue, a recall for practically all Teslas on US roads today. It's a software-update kind of recall. This means Tesla owners won't have to take their car into a dealership to have modifications performed on the vehicle, or even swap out their vehicle or anything like that. They won't have to do that. But it has to do with Tesla's advanced driver-assist features, namely Autopilot. So essentially, the National Highway Traffic Safety Administration, or NHTSA, has ordered Tesla to increase the number of warnings sent to drivers to make sure that they keep their attention on the road even when they engage Autopilot. The NHTSA determined that several Tesla owners were using Autopilot features outside of their approved operating parameters, and this represents a danger to both the driver and to others on the road. Now, Tesla has always included language in its official documents saying that these features are meant for specific use cases, like highway driving, right? It's not meant for everything. And drivers are also supposed to always remain focused on their surroundings. They're not supposed to just let the car take over. But then you also have folks like Elon Musk who complicate matters by hyping up the autonomous features of the technology, seemingly suggesting that they are more capable than what they really are. It's the same matter that's at the heart of an ongoing lawsuit between the State of California and Tesla. Anyway, once updated, Teslas that are in Autopilot mode will perform more frequent checks to ensure that the driver is still fully focused on what's going on and that their hands remain on the steering wheel. Otherwise, the vehicle may disengage Autopilot and force the driver to take over.

Speaker 1: And according to Reuters, Tesla faces more opposition in Sweden. This stems from the company's refusal to negotiate with collective bargaining groups that represent various workers, like mechanics, for example. I mentioned this in an earlier tech news episode.
Speaker 1: But in Sweden it is customary for these worker groups, essentially unions, to negotiate their own salaries and benefits with each employer. There's no national minimum wage or anything like that, so all agreements are made between the worker organizations and the employers. But Tesla hasn't played ball, and so multiple worker groups have now refused to cooperate with Tesla in order to apply pressure to the company. The latest of these is the Transport Workers' Union, which now says it will stop collecting waste from Tesla's workshops starting December twenty-fourth unless Tesla meets with these groups. Honestly, the whole situation between Tesla and the various Swedish worker groups is starting to sound like an It's Always Sunny in Philadelphia episode to me. And I know that Elon Musk traditionally has been very anti workers' unions, so it doesn't surprise me that it has reached this particular level.

Speaker 1: Now, in non-Elon news (you're welcome), a lawsuit claims that insurance company Humana has been relying upon an AI model to make decisions about agreeing to or denying care for patients, mostly the latter, and that this AI model has a failure rate of around ninety percent. Yikes. The whole story is on Ars Technica. Once again, big shout out to them. This one is by Beth Mole and it's titled "Humana also using AI tool with 90% error rate to deny care, lawsuit claims." The "also" in that headline refers to a similar accusation against insurance company UnitedHealth, which also makes use of this same AI model. That AI model is a product called nH Predict. It is named after the company that makes it, NaviHealth. The ninety percent error rate refers to the claim that, allegedly, when people have appealed a denial-of-care decision that was guided by this AI model, ninety percent of the time those decisions are reversed. So essentially, the AI model takes into account various factors to determine when benefits should end for specific medical issues.
Speaker 1: The lawsuit argues that the model is far too restrictive, that it's limited and cannot take into consideration particulars that are unique to each individual patient. So, as a result, patients find their benefits cut off prematurely or denied outright, and insurance companies are making use of this AI tool not because it is accurate, but actually for the opposite reason: because it is incredibly inaccurate, but inaccurate while erring on the side where insurance companies aren't having to pay out as much, so they can hoard all that money for themselves. There's a lot I could say about the insurance industry, but I won't, because this isn't the podcast for it. But the lawsuit accuses Humana of breach of contract, along with several other offenses, and we'll have to see where this goes from here. I'll just remind you that there are lots of legitimate, beneficial uses for artificial intelligence. There really are. There are ones that could have an incredibly positive impact on our lives. But there are other applications of AI that are unbalanced in how they benefit one party versus everybody else; in this case, the insurance industry versus anyone who's not an insurance company or an investor in an insurance company. So we do need to wrestle with that.

Speaker 1: Next up, well, guess what, I've got another piece from Ars Technica, once again by Ashley Belanger. Seriously, Ashley, where do you find the time to do all this stuff? Your articles are great. I just can't imagine. Gosh, you've got to work super hard. Anyway, the article is titled "Trains were designed to break down after third-party repairs, hackers find." And yeah, this is another piece that demonstrates how companies can jealously guard their ecosystems, right? We have the whole right-to-repair movement here in the United States that aims to break down those walls that companies will put up, like the John Deere tractor company.
Speaker 1: There are all these cases where farmers were arguing they should have the right to be able to maintain and repair their own equipment and not have to go to a specific authorized John Deere repair shop in order to get this stuff done. This falls into that same sort of category, but in this case it's about trains in Poland. So a hacker group called Dragon Sector investigated trains made by a Polish company called Newag. Dragon Sector was actually hired to look into this, because there was a railway company that was using these trains from Newag and was seeing a lot of breakdowns, and it wondered what was actually at the heart of it, because it didn't appear to be hardware related. It seemed to be software related, and they wondered if something hinky was going on, so they hired these hackers to look into it. The hackers say they found the reason: apparently the software running on these trains would detect if the trains were brought anywhere other than an official Newag facility for repairs. So if a railway company dared to use some third-party repair shop to perform maintenance or repairs on a Newag train, the software would detect that and then trigger failures, essentially bricking the train, which is kind of like how some smartphones would brick themselves if the phones detected that users were trying to, you know, jailbreak them. Newag has denied the hackers' claims and threatened a lawsuit against them, but the hackers maintain that their investigation has plenty of proof that Newag was fixing the game in its favor. Newag has also suggested that perhaps a competitor actually hacked into Newag's trains and altered the software to make it look like Newag was walling off its ecosystem, and that this is all an elaborate attempt to discredit Newag and to really heap dirt upon its name. That sounds like a Hail Mary kind of play, if you ask me.
Speaker 1: I'm not saying that it would be impossible, but it doesn't sound like the most likely reason why this software would include these kinds of features, assuming that the hackers are telling the truth. They also said they uncovered, essentially, a code that you could enter into the system that would clear all the issues. So if that's true, then what they're saying is, yeah, Newag sabotages its trains so that you're forced to use Newag to fix them, and then they just put in this code and everything's fixed, because they were the ones who put the problem there in the first place. That's the allegation. Very interesting case here. All right, I've got a couple more stories to go through. Before we can get to those, let's take another quick break.

Speaker 1: We're back. CNBC's Chelsea Cox has a piece titled "FCC votes to ban termination fees for cable and satellite services," and that's something that I think should make a lot of American customers happy, because previously, if you were a customer of a cable or satellite provider and you wanted out of your agreement (maybe you had entered into a two-year agreement and a year and a half in you wanted out), typically to get out you had to pay a fee, and sometimes it was a prohibitively hefty fee, for the privilege of ending that agreement. Personally, in my opinion, I have always thought of these fees as a way of discouraging even the meager amount of competition we see in this space here in the United States. I've said it many times: most Americans actually have very few options when it comes to stuff like cable providers. In my neighborhood, I have two options, and that's only if I don't want internet service as part of that package, right? If I want internet service as part of my cable package, really I only have one choice. The other one is so far behind that it's not really a viable choice in the first place.
Speaker 1: But one way to discourage competition is to lock customers into a multi-year agreement, and then, in order to make sure they stay there, you tack on this hefty fee to prevent anyone from severing that agreement early. Unsurprisingly, the FCC passed this measure right down party lines. Two members of the FCC voted against it, which is pretty typical. Like, I can't remember the last time I saw a matter where the FCC unanimously voted on a measure. It almost always comes down to party lines.

Speaker 1: Now, before I get to some article recommendations, I do have one final story. This one comes from NASA. Miles Hatfield posted a blog entry titled "Engineers are working to resolve issues with Voyager 1 computer." So the Voyager 1 spacecraft has traveled further into space than any other spacecraft we've ever sent up there. In fact, it's no longer in our Solar System; it is traveling through interstellar space. And a computer glitch in Voyager 1's Flight Data System, or FDS, appears to be the cause of this problem. The spacecraft can actually receive and execute commands that we send from here on Earth, so that part is still good, but it's unable to send information back to us, so we can't get any scientific or engineering data back from Voyager 1. We can just send stuff to it, which greatly limits its utility as a scientific instrument, obviously. Now, engineers are hoping that they can resolve this problem, but it's likely going to take weeks to come up with a solution, so it may be a while before we can find out if we can get Voyager to send information back to us again. And in fact, just sending a message between Earth and the spacecraft takes nearly a full day; it takes twenty-two and a half hours to go from Earth to Voyager 1, and that's because of how far away the spacecraft is from us.
Speaker 1: Here's hoping NASA can implement a fix, as the story of the Voyager spacecraft has been a truly remarkable one. I've got a couple of suggested reading recommendations for y'all. One of those, no surprise, comes from Ars Technica. This one was written by Jon Brodkin and it is titled "Ted Cruz wants to stop the FCC from updating data breach notification rules." So yeah, it's another tech-meets-politics news item, and it highlights what seems like an odd position from my perspective, because I would think that requiring companies to be prompt about reporting data breaches was an important component of keeping businesses, customers, investors, pretty much everybody, safer. But it gets more complicated than that. The other recommendation is actually a series of articles in The Verge, all posted under an umbrella topic titled "The Year Twitter Died." The Verge has done an amazing job of analyzing how Twitter slash X has navigated the year twenty twenty three, and it makes the case that Twitter is actually dead, if only because the service now has the name X, if nothing else. I won't lie, it is a lot to read. There are a lot of articles in this, but it's all very well done, and it gives a pretty thorough treatment of the various issues the company has faced this year.

Speaker 1: All right, that's it for the news today. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.