Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for the tech news for Tuesday, October twenty-fourth, twenty twenty-three.

Speaker 1: First up, The Hill reports that both Apple and Google have recently disabled live traffic updates in the region around and including the Gaza Strip as Israel pursues a ground war against Hamas. In the past, companies that provide real-time traffic updates have suspended operations in conflict areas in order to avoid turning into a method for gathering intelligence on things like military presence and troop movements and that sort of thing. Bloomberg cited anonymous sources saying that the Israel Defense Forces, or IDF, made this request to both Apple and to Google to temporarily disable live traffic notifications in Google Maps. Spokesperson Caroline Bordeaux stated that, quote, "anyone navigating to a specific place will still get routes and ETAs that take current traffic conditions into account," end quote. So the navigation tools will still have access to this live traffic data; they're just not going to display the location of any traffic bottlenecks or anything like that. They'll tell you how long it'll take you to get to your destination, but not what points along the route you should expect slow traffic. So that's just an update on things that are going on in the tech sphere in that particular conflict.

Speaker 1: Yesterday, the US government announced it has designated thirty-one technology hubs across America that will focus on innovation and be eligible to compete for millions of dollars in grant money. The Economic Development Administration, which is a government office I had not even heard of before today, said that the goal is to encourage regions that are already pursuing innovation in quote unquote critical technology ecosystems.
Speaker 1: The idea is that the government is encouraging research and development that will lead to technology and organizations that are all about job creation and improving national security, like removing our dependence upon other nations, for example, and also making the United States a leader in various technical fields. Areas of focus include some pretty obvious candidates like artificial intelligence and clean energy. That also includes stuff like biotechnology, medical technology, and quantum computing, among other disciplines. The thirty-one hubs span thirty-two states and Puerto Rico, so it sounds like at least one of those hubs straddles state lines. So I hope everybody can get along.

Speaker 1: The International Energy Agency announced today that, according to its estimates, renewable energy will make up nearly fifty percent of global electricity production by twenty thirty, and that by that time there will be ten times as many electric vehicles on the road as there are today. Now, that is encouraging, but we should also remember that our overall demand for electricity grows every year; it's not like it stays stable. So while renewable sources will take up a bigger piece of the pie, we're also talking about a much larger pie. To that end, analysts expect that we'll see demand for fossil fuels peak this decade. Also, electricity is just one way that we actually use energy. If we take all of our energy needs into account, they predict that the share that fossil fuels represent will go from eighty percent down to seventy-three percent. So there's a very long way to go in order to truly eliminate our dependence on stuff like fossil fuels. The agency warns that despite the move toward renewable sources, our continuing use of fossil fuels currently has us on a trajectory to see a global temperature increase of around two point four degrees Celsius.
Speaker 1: That's nearly a full degree above the target that was aimed at in the Paris Agreement, and it at least implies that we're going to see more major shifts in things like climate and, more granularly, weather patterns and severe weather events.

Speaker 1: Yesterday, Tesla disclosed in a financial filing that it is the subject of investigations from the US Department of Justice. So the DOJ is looking into the company for a whole bunch of different reasons, including how the company markets its driver-assist technologies, which Tesla refers to as Autopilot and Full Self-Driving, or FSD. The investigation is also looking into whether Tesla purposefully misled customers regarding how far their vehicles can travel on a full charge of the battery. It's also looking into company practices, like the ones that relate to quote unquote personal benefits and personnel decisions. So one of the personal benefits could be the rumored glass house for Elon Musk. I had totally forgotten about this story, y'all. And when I read that there was a glass house project, I thought, huh, I don't think I've ever heard the term glass house project. I wonder what that is. Is that like Skunk Works? No, we're talking about a literal house with glass walls, you know, the kind of house where you don't want to throw stones. The question is, if Elon Musk was actually using Tesla to fund and build this house for himself, that would violate laws regarding a company paying out benefits to an executive on top of their normal compensation. I'm sure the full investigation has a veritable laundry list of concerns, because Musk does have a reputation for, let us say, playing fast and loose with rules, as well as a history of campaigning to have fewer rules in general.

Speaker 1: Analyst Ming-Chi Kuo, whom I mentioned recently in another news episode, has some more juicy Apple news for us. Apple juice, anyway. He wrote a piece on Medium that says Apple is kicking into gear to get into the AI field.
Speaker 1: The company had lingered behind a bit when compared with competitors like Google and Microsoft, but that's pretty much par for the course for Apple. If you look back on many of Apple's products, like the big revolutionary products that Apple released, often you will see they were not the first to market. The iPod was not the first MP3 player, for example. Instead, the company would spend time figuring out how to best meet customer expectations, or how to blow them out of the water, or how to even create expectations where the customer wasn't aware they had any. Anyway, Apple has a lot of catching up to do, and Kuo's analysis says that the company will be spending an enormous amount per year to get there. So on the conservative end of the spectrum, Apple would be spending around a billion dollars a year to start working on AI implementations. Kuo suspects it's going to be closer to five billion dollars by the end of twenty twenty-four, so five billion split over two years, with the vast majority of that really spent in twenty twenty-four. Even then, Apple will still be trailing well behind companies like Microsoft. Those companies have been incredibly aggressive while pursuing and developing AI technologies. However, at least in the past, Apple has shown that its approach can have an appeal that its competitors just can't match, so maybe the same thing will be true with AI.

Speaker 1: And speaking of AI, Technology Review has an interesting article about how some digital artists are including elements that are invisible to the human eye but are intended to mess with generative AI. So generative artificial intelligence has to receive training in order to actually generate stuff. So with art, that includes feeding millions of images into the AI system so that it, in turn, can learn to create images.
Speaker 1: But this concerns artists, because it's their work that becomes the fodder to train these AI systems, and you can end up with generative AI that will lift stylistic elements, or sometimes complete actual imagery, from an established artist's work and then pass it off as the AI's own creation, or as someone else's if they don't even indicate that AI was used to make it. Meanwhile, you have artists concerned that no one asked them permission to use their art to train these AI models. They certainly haven't been compensated for the use of their work. And then the artist's own portfolio can become the fuel that ultimately robs that artist of their livelihood, or at the very least diminishes the value of the work they produce. So what's the solution? Well, some researchers have developed a tool they call Nightshade. Nightshade, by the way, that's a family of plants, but I'm pretty sure they're specifically referencing deadly nightshade, a plant that is in fact poisonous. And the tool makes changes to digital art pixels, changes that we humans don't detect. Like, it's invisible to us; we don't actually see that any changes have happened. However, the AI scraping these images totally can see that information, and the information is misleading the AI. It's giving the AI the wrong implication of how these various elements inside of art are created, and it would prompt generative AI to start making some mistakes when trying to replicate certain types of stuff. And ultimately, the more the AI references works that have this kind of stuff in them, the more wrong they get. So, for a simple example, you might have some invisible information there that is instructing AI in how to make a picture of a dog, but it's all incorrect information. Maybe the thing that it creates ends up looking more like a cat than a dog, or maybe it looks like a Lovecraftian monster out of the depths of horror and imagination. Who's to say.
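To make the idea of an invisible change concrete, here is a minimal sketch of an imperceptible pixel perturbation. To be clear, this is not Nightshade's actual algorithm; the real tool crafts targeted perturbations meant to push a model toward the wrong concept, while this sketch just adds tiny bounded random noise, and the file names are placeholders.

```python
# A minimal sketch of an imperceptible pixel perturbation. NOT Nightshade's
# actual algorithm: a real poisoning tool computes targeted perturbations,
# while this only shows how a change can be real data to a machine yet
# invisible to a person. File names are hypothetical.
import numpy as np
from PIL import Image

def perturb(path_in, path_out, epsilon=2, seed=0):
    """Shift every pixel by at most +/- epsilon levels (out of 255)."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed)
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape)
    # A shift of one or two levels is effectively invisible to a human
    # viewer, but it is present in the data a scraper ingests for training.
    out = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(out).save(path_out)

perturb("artwork.png", "artwork_shaded.png")
```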
Speaker 1: But the point is that if AI does start generating images like that just upon simple queries, it indicates two things. One, the AI company that created that tool trained it on art that has been put through this Nightshade tool, and that would indicate that, yeah, that AI company was doing this, was actually using artists' work, presumably without permission or compensation. And the proof is in the pudding: oh, your generative AI isn't creating pictures correctly. And of course, number two is the generative AI won't be creating pictures correctly and thus will be less useful as a tool. You can't really use AI to generate art about something specific if it thinks that a house is actually a boat or something along those lines. By the way, you can actually read more about this in the MIT Technology Review article titled "This new data poisoning tool lets artists fight back against generative AI." Okay, I've got more news stories to go through today, but before we do that, let's take a quick break.

Speaker 1: We're back. The Verge reports that both AMD and Nvidia are gearing up to produce ARM-based CPUs for Windows machines soon. That's significant because right now Microsoft partners with Qualcomm when it comes to ARM CPUs for Windows 11 machines. But it appears that in the not-too-distant future there will be a lot more competition in that space, and you may find yourself shopping for a new Windows PC and then you might have to compare different ARM-based processors with each other. This could also put pressure on Intel to innovate in its own processor family or else risk losing market share to AMD and to Nvidia, as well as to Qualcomm. I really like this news, because healthy competition tends to mean better products and better prices, both of which are great if you happen to be a consumer.

Speaker 1: Gizmodo reports that Microsoft has finally addressed an issue in its spreadsheet platform, Excel, that caused a headache in the field of genetic research.
Speaker 1: So the problem is that Excel thinks it knows better than the user and will automatically make certain changes to data if that data is in a particular format. So, for example, let's say you're a geneticist and you type in the characters S-E-P-T and the numeral one, SEPT1, all together, and of course what you mean is septin 1, which is a protein-coding gene that plays a part in cytokinesis and the maintenance of cellular morphology. Excel would not know that you are creating a spreadsheet about genes, so it would assume that when you typed SEPT1, what you actually meant was the first of September, and it might change that entry so it says 1-Sep. The same is true if you happen to be typing about membrane-associated ring-CH-type finger 1. Geneticists would encode that as MARCH1, M-A-R-C-H one. That, of course, would turn into 1-Mar. When your spreadsheet starts changing your gene codes into dates, that's a problem. And worse, Excel did not have a feature for you to turn that off, so you couldn't tell it, hey, stop converting my gene codes into dates. And it meant that when it got time for you to publish your work, you would have to go through and spend countless hours making corrections, because the spreadsheets you were using had changed all your codes into dates. So you had to figure out which of these should be dates and which of these are actually codes. You could easily make mistakes and miss stuff. It was a huge headache. In fact, it was such a huge headache that a few years ago the field overall changed its encoding system to work around this problem. But now Microsoft has created an option in Excel that lets you turn off the autoconversion feature, as long as you're not running macros on your spreadsheets, because then apparently it all breaks down.
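For a sense of what that cleanup looks like in practice, here is a minimal sketch, assuming a hypothetical CSV exported from Excel with a gene column, of how a researcher might flag cells that were silently converted into dates. The file name, column name, and the handful of mappings here are illustrative, not an exhaustive list of affected symbols.

```python
# A minimal sketch (hypothetical file and column names) for flagging gene
# symbols that Excel silently converted into dates in an exported CSV.
import csv

# How Excel typically renders a few date-converted gene symbols.
SUSPECT_DATES = {
    "1-Sep": "SEPT1",
    "2-Sep": "SEPT2",
    "1-Mar": "MARCH1",
    "1-Dec": "DEC1",
}

def flag_converted_genes(path, column):
    """Yield (row_number, cell, likely_gene) for cells that look like
    Excel date conversions of gene symbols."""
    with open(path, newline="") as f:
        # Row 1 is the header, so data rows start at 2.
        for row_num, row in enumerate(csv.DictReader(f), start=2):
            cell = (row.get(column) or "").strip()
            if cell in SUSPECT_DATES:
                yield row_num, cell, SUSPECT_DATES[cell]

for row_num, cell, gene in flag_converted_genes("expression_data.csv", "gene"):
    print(f"row {row_num}: '{cell}' was probably the gene symbol {gene}")
```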
Speaker 1: I'm not sure if this isn't a case of it being better late than never, but I do think it's an example of how an automated process can actually make things harder rather than easier. When you extend that to AI implementations, it can raise some really serious questions. Obviously, this is a much more simplified approach, right? This is a conversion for one tiny instance of data. But when you extend that out and you think about the sorts of things like the conclusions that AI can come to based upon the information it has access to, you can understand how sometimes the conclusions are not really ideal all the time, because sometimes AI makes mistakes, just like regular I does. And by I, I mean intelligence, not the pronoun. I wasn't making a grammatical error; I was making... I guess "joke" is being too kind. Uh, let's just move on.

Speaker 1: Over in the UK, the Public Accounts Committee recently highlighted a looming challenge for the energy sector in that country. This is according to The Register, and it all has to do with smart meters. Smart meters are supposed to monitor energy usage and maintain a communications channel between customer and energy provider. They're meant to keep an accurate tally on how much energy a customer is using, but also to detect stuff like interruptions in service and that kind of thing. Ideally, smart meters reduce the need to send actual people out to physical locations to take meter readings, and that ends up freeing them up to do other stuff. Not too long ago, the National Audit Office in the UK found that energy companies had rolled out smart meters to around fifty-seven percent of their customers, so that meant forty-three percent still did not have smart meters. This, by the way, was way behind schedule of the initial plan. Also, about nine percent of those smart meters were not working correctly.
Speaker 1: Now, the Public Accounts Committee built on top of this report and found that twenty percent of the smart meters that have been deployed are living on borrowed time, and that's because these meters are only compatible with aging cellular networks, primarily 2G and 3G. Meanwhile, telecommunications companies are planning on sunsetting those networks and shutting them down, and that means those smart meters, that twenty percent of already-deployed units, won't be so smart after that. So this shows there are lots of challenges still in front of the energy sector in the UK. Nearly half of the businesses and homes in the UK have yet to receive a smart meter at all, some of those who have one have one that doesn't work, and then a whole bunch of them, around seven million different locations, are going to need a new smart meter before the cellular networks completely shut down, or else it'll be like they have just a regular meter again. And considering that this whole project has had to push deadlines a few times already, I'm sure this comes as an extremely frustrating development. It also just reminds us that connected technologies are only useful for as long as the underlying connection exists, which I know is obvious on the face of it; we can take that for granted. But when we talk about things like the Internet of Things, there's a lot of future-proofing that needs to happen for the Internet of Things to remain useful. Because otherwise, if IoT devices are only compatible with older wireless technologies, for example, and we gradually sunset those because we've created better processes, better protocols for wireless communication, then if we can't update those IoT devices, they just become inert, and we have all this stuff that isn't doing anything. So that's just a good reminder for us to have. And it's very difficult to future-proof things, because you never really know how far along they're going to go.
Speaker 1: Sometimes, if you're lucky (not just lucky, but smart), you've designed your product in such a way that you can update it with things like firmware updates to keep it relevant longer. Ultimately, you're always going to have to swap stuff out; your goal is to make that as infrequent as you possibly can and to maximize the life cycle of the devices you've deployed. Anyway, this whole story was giving me flashbacks to the United States back when television broadcasts switched from analog to digital. It caused no end of confusion. Only a small number of people were actually affected by that, small in the relative sense, because they needed to get a converter to convert digital signals to an analog signal to feed into an old television in order to still pick up broadcast TV. For a while, everyone was thinking that they might need one of those converters, and very few people actually did.

Speaker 1: Dating in the modern world is hard, or so I'm led to understand. I personally do not date because my wife frowns upon it. Jokes! I've been happily married for more than twenty-five years at this point, so I am totally ignorant when it comes to navigating the dating world these days. To be honest, I wasn't exactly an expert back when I was single. Anyway, what if I told you that Tinder is introducing a feature that will let you turn friends and family into your own personal yenta? Yenta was the matchmaker in Fiddler on the Roof, to be clear. So Tinder's introducing this feature where you can list up to fifteen people on your account who will then be able to review potential matches based upon your Tinder profile, and then they can weigh in on which ones they think you should swipe left on or right on.
Speaker 1: Now, they won't actually be able to do the swiping for you, which is a good thing, because can you just imagine giving your parents access to your Tinder profile to be able to vote on whether or not you should ask that nice young person out on a date? What a nightmare scenario that is. I apologize if that caused a major anxiety spike just now. They also will not be able to message the other profile, so you don't have to worry about them reaching out on your behalf. What they can do is give their opinion about whether or not a person looks like a catch or if you should just nope out of that potential relationship. And they call this feature Matchmaker. Again, getting back to Fiddler on the Roof, I'm just having the song "Matchmaker, Matchmaker, make me a match" play through my head, so hopefully you are not aware of that song, so that it doesn't do the same to you. But I know that the rest of the day it's going to be playing in my head, partly because I was in a production of Fiddler on the Roof. Anyway, I think this sounds like Tinder is setting the stage for a future reality television program. Just imagine: it's a reality TV program where people can only go on a date with someone who matches on their Tinder profile and gets the most votes from that person's community of matchmakers. So, like, your group of friends and family all say this person is right for you, and then you are obligated to go on a date with that person. That sounds like a reality TV show ready to go. I'm just giving away ideas for free again, aren't I? You know what, I'm gonna do some soul-searching on that. When we come back, I've got a few more stories to talk about.

Speaker 1: Okay, we're back, so let me tell y'all a story. Way back in twenty twelve, a company called Cloud Imperium Games held itself a tiny little Kickstarter for a proposed science fiction space game that they called Star Citizen.
Speaker 1: The game's scope was to be astoundingly huge, and it would include both multiplayer elements and elements that would be sort of a single-player campaign, which became known as Squadron 42. Now, as Star Citizen grew and grew and grew in development, and sold more and more in-game products for a game that did not yet exist, and made more than half a billion dollars in funding (no joke, they've raised something like six hundred million dollars, I think, at this point), the company decided to spin off Squadron 42 to be its own standalone single-player campaign. But that campaign, just like the larger Star Citizen game, kind of just stayed in development year after year. And while supporters of Star Citizen have enjoyed releases of some content (there have been limited releases with gameplay elements that people can actually play, so I don't want to suggest that nothing has come out; that's not the case), they were still waiting for the full promised title. And for Squadron 42, that wait might, and I stress might, be getting close to the end, because the company has announced it is now feature complete. So that means the developers are done adding features into Squadron 42; they're not adding in any more gameplay elements or anything like that. However, it's not quite ready to ship yet. The developer said it's working toward a beta and ultimately a release once they are done finalizing the game, polishing the content, that kind of thing. They did not give a window for when that might be. They did release a new trailer, and I hope that the Star Citizen community that's been waiting for this content for more than a decade at this point is happy with the result, and that the rest of Star Citizen also ultimately comes out. I do not have a stake in this.
Speaker 1: I never supported the campaign, I didn't pre-order anything, and I have some very strong feelings about how this whole project has progressed over the years. Like, I have been very critical of Star Citizen in general, but I genuinely hope that the people who have poured money and their own hopes into this project end up being very happy. I would hate for this to continue on as just a never-ending development, and I worry about people being disappointed with the final product simply because they've been waiting so long and have put so much money into it. I'm reminded a little bit of Bethesda, where they had their own Starfield product come out, and there were a lot of people who felt that it did not live up to their expectations. Other people really like that game. I have yet to play it, so I haven't formed my own opinion. I know people who think it's a great game, and I know people who feel that it was a total disappointment, so I'm sure most people fall somewhere in the middle. But yeah, here's hoping that the folks who support Star Citizen, when they do get access to Squadron 42, are not left disappointed.

Speaker 1: Now, imagine that you have to provide tech support to a spacecraft that has left your solar system. That would be quite the long-distance service call. That's what NASA engineers are doing with Voyager 1 and Voyager 2. So both of those spacecraft are now in interstellar space; they have left the solar system. And NASA has sent some instructions to each of them in order to extend their ability to communicate with us all the way back here on Earth for as long as possible. So last year, Voyager 1 started to send some pretty weird communications about the spacecraft's telemetry. Voyager sends telemetry data through a system called the Attitude Articulation and Control System, or AACS.
Speaker 1: But it was passing this data through a computer system that wasn't working properly, and it was odd because it wasn't supposed to be passing the data through that specific computer system, and so it was sending back garbled data. Nothing exciting; it's not like the garbled data said, hey, Earth, send more I Love Lucy episodes, which would have been cool, because then aliens would be looking at us. No, it's just gibberish, really. And so the engineers have now beamed a patch to fix this issue with Voyager 1; it's a software patch. There is the lingering question as to why Voyager 1 was routing the data through this computer system in the first place. We still don't know that answer, but hopefully this will at least be a temporary fix for the problem. The team also came up with a new schedule for firing spacecraft thrusters for both Voyager 1 and Voyager 2, in order to reduce the frequency of those thruster ignitions, for a couple of reasons. One, obviously, there's a limited amount of fuel, right? Like, you don't have infinite fuel on the spacecraft. But the other one is that each time you fire the thrusters, it builds up a little bit of residue on the inlet tubes that feed into those thrusters. So the concern is that if they kept up with a more frequent schedule, that buildup would reach a point where it would interfere with the thrusters. So now they'll fire them less frequently and hopefully extend the useful life of the spacecraft. They're obviously not going to work forever, but the hope is to extend their lifespans a little bit more as they continue on into interstellar space.

Speaker 1: Now, I have a couple of articles that I'm going to recommend, but before I do, I have one final story, and that is that Martin Goetz has passed away at the age of ninety-three. Goetz was the first person to secure a US patent for software. He did that back in nineteen sixty-eight.
Speaker 1: Now, I've talked about this before, when I did an episode about the Patent Office, because the idea of patenting software was one that had some major hurdles to overcome. And that's because the US Patent Office has rules on what you can and cannot patent. One thing that you cannot patent is a mathematical process, and there are arguments to be made that software ultimately is really just a series of mathematical processes, and as such, software programs, et cetera, are not viable candidates for patents. That was the argument. Ultimately, the Patent Office rejected this idea, at least within certain parameters, and Goetz's work in data-sorting software became the first to receive a patent from the US Patent Office. Goetz really campaigned for this, in large part because independent software developers had no protection for their work. So let's say that someone came up with a really useful program, like one for sorting data on a mainframe computer system. Well, nothing would stop a massive company with practically unlimited resources from just copying or reverse-engineering that software and using it for themselves and never paying the developer anything. So Goetz pointed out that would be unfair. It's not fair to just stand by while very powerful tech companies could steal the work from independent programmers just because there are no protective measures in place. So his patent set a precedent, and it meant that programmers had a chance to secure their work and protect their livelihoods. And I think that's a good thing. So rest in peace, Martin Goetz.

Speaker 1: Now, that wraps up our stories, but there are a couple of articles that I would like to recommend you check out if you have the time. Again, I have no connection to these media outlets. I do not know the writers personally or professionally. I just thought these were good articles, and you should check them out if you've got time and you're interested. So first up is a piece by Jennifer Pattison Tuohy of The Verge.
Speaker 1: It's titled "Matter 1.2 is a big move for the smart home standard." Now, I'll have to do a full episode about Matter in the future and how the goal is to create a foundation for smart home technologies, but this article gives a great overview of where we're at currently in that effort. Secondly, I recommend Ashley Belanger's article in Ars Technica titled "Will ChatGPT's hallucinations be allowed to ruin your life?" This article covers how AI companies are trying to insulate themselves against stuff like defamation lawsuits, so that if their AI chatbot claims that, I don't know, you go around kicking puppies or something, you wouldn't be able to sue them for ruining your reputation. I think it's valuable to read that as well, because it very much plays upon the concepts of accountability and artificial intelligence agents. Very important, really. And it highlights something that is kind of in a gray area when it comes to the law. You know, we're starting to define AI in a legal sense through various court cases. For example, there was the court case that determined that AI is incapable of holding a copyright; you cannot copyright a work that was created by AI, because human authorship is a key component of copyright as it is defined today. So that's kind of how, you know, we have to define things in the court system. It has to be done by the decisions of judges in various trials. So this is a very important part of the adoption of artificial intelligence inside the United States.

Speaker 1: And that wraps up the tech news for Tuesday, October twenty-fourth, twenty twenty-three. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.