1 00:00:04,440 --> 00:00:12,280 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,280 --> 00:00:15,920 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,960 --> 00:00:18,960 Speaker 1: I'm an executive producer with iHeartRadio. And how the tech 4 00:00:19,000 --> 00:00:22,160 Speaker 1: are you? It's time for the tech news for Tuesday, 5 00:00:22,320 --> 00:00:28,320 Speaker 1: August eighth, twenty twenty three. And despite my voice catching there, 6 00:00:28,400 --> 00:00:31,200 Speaker 1: I feel a lot better than I did yesterday. Thanks 7 00:00:31,200 --> 00:00:31,800 Speaker 1: for asking. 8 00:00:32,159 --> 00:00:33,640 Speaker 2: Let's get to the tech news. 9 00:00:34,159 --> 00:00:37,120 Speaker 1: And we've got a lot of AI-related stories today. 10 00:00:37,320 --> 00:00:40,879 Speaker 1: No big surprise there. It has been the topic of 11 00:00:41,000 --> 00:00:46,479 Speaker 1: twenty twenty three, at least whenever Elon Musk isn't demanding 12 00:00:46,520 --> 00:00:50,680 Speaker 1: all the headlines. And it's also sadly no surprise that 13 00:00:50,720 --> 00:00:53,199 Speaker 1: one of the AI stories we're covering today has to 14 00:00:53,240 --> 00:00:57,360 Speaker 1: do with faulty facial recognition technology and the mistake of 15 00:00:57,440 --> 00:01:01,040 Speaker 1: relying on that tech for the purposes of law enforcement. 16 00:01:01,760 --> 00:01:06,200 Speaker 1: Porcha Woodruff, a Black woman in Detroit, found herself arrested 17 00:01:06,280 --> 00:01:10,560 Speaker 1: and detained for eleven hours when police acted on an 18 00:01:10,680 --> 00:01:14,680 Speaker 1: incorrect facial recognition match while seeking a suspect in a 19 00:01:14,720 --> 00:01:19,600 Speaker 1: carjacking and robbery case. Not only that, Porcha was eight 20 00:01:19,680 --> 00:01:23,200 Speaker 1: months pregnant, and the surveillance footage from the crime in 21 00:01:23,280 --> 00:01:27,319 Speaker 1: question showed a woman who very much was not eight 22 00:01:27,360 --> 00:01:30,160 Speaker 1: months pregnant. So not only was this a case of 23 00:01:30,200 --> 00:01:34,200 Speaker 1: facial recognition software giving a false positive, it's also a 24 00:01:34,240 --> 00:01:37,840 Speaker 1: case of cops apparently working under the assumption that a 25 00:01:37,880 --> 00:01:41,839 Speaker 1: woman can go from not visibly pregnant to eight months 26 00:01:41,840 --> 00:01:45,600 Speaker 1: pregnant in a very short amount of time, which is wild. 27 00:01:45,840 --> 00:01:50,040 Speaker 1: So here's kind of what unfolded. A man reported being 28 00:01:50,080 --> 00:01:53,920 Speaker 1: the victim of a robbery and carjacking. The police were 29 00:01:53,920 --> 00:01:57,080 Speaker 1: able to secure surveillance footage, and they used a tool 30 00:01:57,200 --> 00:02:02,720 Speaker 1: called DataWorks Plus to run matches against mug shots that 31 00:02:02,800 --> 00:02:06,680 Speaker 1: were stored in a police database. Woodruff had been arrested 32 00:02:06,680 --> 00:02:09,840 Speaker 1: back in twenty fifteen, so her mugshot was one of 33 00:02:09,880 --> 00:02:13,440 Speaker 1: the images in that database.
This tool pulled a match 34 00:02:13,600 --> 00:02:16,959 Speaker 1: between her mugshot and the surveillance footage, and the guy 35 00:02:16,960 --> 00:02:20,880 Speaker 1: who was robbed also mistook a photo of Woodruff, her 36 00:02:20,919 --> 00:02:24,320 Speaker 1: mugshot, as being the same person as the perpetrator. 37 00:02:24,720 --> 00:02:27,840 Speaker 1: So the police go and they arrest Woodruff, and of 38 00:02:27,840 --> 00:02:30,400 Speaker 1: course she was not involved in the crime, could not 39 00:02:30,560 --> 00:02:33,080 Speaker 1: have been, and just the fact that she was, you know, 40 00:02:33,160 --> 00:02:36,919 Speaker 1: eight months pregnant should have been the immediate giveaway that 41 00:02:36,960 --> 00:02:40,000 Speaker 1: this is not the same person. The New York Times 42 00:02:40,040 --> 00:02:44,040 Speaker 1: subsequently reported that Woodruff's case was the third in the 43 00:02:44,080 --> 00:02:48,800 Speaker 1: city of Detroit alone that resulted in a wrongful arrest 44 00:02:49,040 --> 00:02:53,360 Speaker 1: due to incorrect facial recognition matches, and that all three 45 00:02:53,400 --> 00:02:56,400 Speaker 1: of those cases, as well as three other cases that 46 00:02:56,480 --> 00:03:01,359 Speaker 1: were not in Detroit, all involved Black people. I think 47 00:03:01,400 --> 00:03:05,040 Speaker 1: it's safe to say that even the faulty facial recognition 48 00:03:05,080 --> 00:03:08,520 Speaker 1: technology is able to see a pattern emerging here, and 49 00:03:08,560 --> 00:03:13,320 Speaker 1: it's one of racial bias in surveillance and identification tools. Now, 50 00:03:13,360 --> 00:03:16,720 Speaker 1: as we have covered on this podcast, several cities and 51 00:03:16,840 --> 00:03:20,760 Speaker 1: jurisdictions have banned the use of facial recognition for law 52 00:03:20,840 --> 00:03:25,239 Speaker 1: enforcement purposes. Personally, I think that is merited. I can't 53 00:03:25,280 --> 00:03:29,440 Speaker 1: help but imagine what being wrongfully arrested must be like. 54 00:03:29,560 --> 00:03:33,440 Speaker 1: It's got to be incredibly traumatic and disruptive and 55 00:03:33,360 --> 00:03:36,520 Speaker 2: potentially cause lots of issues 56 00:03:36,040 --> 00:03:39,480 Speaker 1: in your life, and you at no point were at 57 00:03:39,520 --> 00:03:42,720 Speaker 1: fault for any of it. And if a technology is 58 00:03:42,800 --> 00:03:49,000 Speaker 1: disproportionately leading to that kind of thing, we should not 59 00:03:49,320 --> 00:03:54,880 Speaker 1: be using that technology for those purposes. I don't know 60 00:03:54,920 --> 00:03:58,160 Speaker 1: how there's any argument against that. If the tool is 61 00:03:58,480 --> 00:04:01,640 Speaker 1: leading to innocent people getting arrested and their lives getting 62 00:04:01,680 --> 00:04:02,840 Speaker 1: upended in the process, 63 00:04:03,320 --> 00:04:04,800 Speaker 2: you've got to stop using the tool. 64 00:04:05,560 --> 00:04:09,400 Speaker 1: British researchers showed how a deep learning algorithm, once trained, 65 00:04:09,400 --> 00:04:13,640 Speaker 1: would be able to decode keystrokes just from the sound 66 00:04:14,040 --> 00:04:18,400 Speaker 1: of typing. This isn't totally new. I've heard of these 67 00:04:18,480 --> 00:04:22,040 Speaker 1: kinds of attacks before.
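For the technically curious, here is a minimal sketch of how that kind of acoustic side-channel classifier could be put together. To be clear, this is an assumption-laden illustration and not the British researchers' actual pipeline; the sample rate, the spectrogram features, the classifier, and the function names below are all invented for the example, and it assumes you already have short, equal-length audio clips of individual keystrokes labeled with which key was pressed.

```python
# Illustrative sketch only -- not the researchers' actual model.
# Assumes fixed-length, pre-isolated keystroke clips with known labels.
import numpy as np
from scipy.signal import spectrogram
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

SAMPLE_RATE = 44_100  # assumed microphone sample rate in Hz

def keystroke_features(clip: np.ndarray) -> np.ndarray:
    """Turn one fixed-length keystroke clip into a flat log-spectrogram vector."""
    _, _, spec = spectrogram(clip, fs=SAMPLE_RATE, nperseg=512, noverlap=384)
    return np.log1p(spec).flatten()  # compress dynamic range, then flatten

def train_keystroke_model(clips, labels):
    """clips: list of equal-length numpy arrays; labels: which key was pressed."""
    features = np.stack([keystroke_features(c) for c in clips])
    x_train, x_test, y_train, y_test = train_test_split(features, labels, test_size=0.2)
    model = MLPClassifier(hidden_layer_sizes=(256, 128), max_iter=300)
    model.fit(x_train, y_train)
    print("held-out accuracy:", model.score(x_test, y_test))
    return model

# Once trained, an attacker would feed in new clips captured by a nearby phone:
# guessed_keys = model.predict(np.stack([keystroke_features(c) for c in new_clips]))
```

The research itself reportedly used a trained deep learning model, but the broad shape is what matters here: isolate each keystroke sound, turn it into a spectrogram-style representation, and classify which key it was.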
But imagine for a moment that 68 00:04:22,520 --> 00:04:25,880 Speaker 1: you are set up in some public space and someone 69 00:04:25,880 --> 00:04:29,160 Speaker 1: else happens to have their phone out and, you know, 70 00:04:29,240 --> 00:04:31,800 Speaker 1: you don't know this, but what they're actually doing is 71 00:04:31,839 --> 00:04:35,240 Speaker 1: activating the phone's microphone, and they're picking up on the 72 00:04:35,279 --> 00:04:39,359 Speaker 1: sound of you tippity-tappity typing away, and you're oblivious 73 00:04:39,360 --> 00:04:41,400 Speaker 1: to any threats. So like maybe you're being 74 00:04:41,400 --> 00:04:43,479 Speaker 1: careful with your screen or whatever, but you're not thinking 75 00:04:43,480 --> 00:04:48,039 Speaker 1: about the actual keystrokes. And meanwhile, a computer program on 76 00:04:48,080 --> 00:04:51,240 Speaker 1: the other end of that microphone is effectively transcribing everything 77 00:04:51,240 --> 00:04:55,880 Speaker 1: you've typed, potentially including your login information. This is an 78 00:04:55,920 --> 00:04:59,919 Speaker 1: acoustic attack, and I bet that makes all those mechanical 79 00:05:00,120 --> 00:05:03,600 Speaker 1: keyboard clacky-clacky types out there a little nervous. So 80 00:05:04,040 --> 00:05:06,400 Speaker 1: is it possible that some hacker out there could get 81 00:05:06,440 --> 00:05:09,920 Speaker 1: your login credentials just by having a computer listen 82 00:05:09,960 --> 00:05:14,040 Speaker 1: to you type? Technically, yes, it is possible. It is 83 00:05:14,080 --> 00:05:18,120 Speaker 1: not necessarily straightforward or easy to do, but it is possible. 84 00:05:18,760 --> 00:05:23,320 Speaker 1: So if you're at a location, then you need to 85 00:05:23,360 --> 00:05:26,240 Speaker 1: know that is a possibility. But a lot of 86 00:05:26,240 --> 00:05:28,479 Speaker 1: locations have a lot of other noise, so it's hard to 87 00:05:28,480 --> 00:05:31,400 Speaker 1: set up a microphone in such a way that you're 88 00:05:31,440 --> 00:05:34,599 Speaker 1: going to get a very clear recording of that sound. 89 00:05:35,160 --> 00:05:36,960 Speaker 1: Maybe if you're at a coffee shop, you should just 90 00:05:36,960 --> 00:05:39,480 Speaker 1: start making clacky-clacky noises with your mouth at the 91 00:05:39,520 --> 00:05:41,960 Speaker 1: same time as you type, you know, to throw off 92 00:05:42,000 --> 00:05:45,320 Speaker 1: any potential baddies who are trying to listen in on you. 93 00:05:45,800 --> 00:05:46,280 Speaker 2: I don't know. 94 00:05:46,720 --> 00:05:49,919 Speaker 1: It's a wild world out there. Reuters reports that the 95 00:05:49,920 --> 00:05:53,560 Speaker 1: Walt Disney Company has created a task force to research 96 00:05:53,600 --> 00:05:57,560 Speaker 1: how artificial intelligence could be used within that company, which 97 00:05:57,600 --> 00:06:01,159 Speaker 1: of course encompasses lots of different divisions. You've got the 98 00:06:01,279 --> 00:06:05,080 Speaker 1: entertainment division, you've got theme parks, you've got merchandising, you've 99 00:06:05,120 --> 00:06:10,080 Speaker 1: got advertising. There's tons of businesses underneath the umbrella of 100 00:06:10,120 --> 00:06:14,719 Speaker 1: Walt Disney. Well, the company currently has nearly a dozen 101 00:06:14,839 --> 00:06:19,200 Speaker 1: job openings that mention artificial intelligence research and development.
So 102 00:06:20,040 --> 00:06:21,960 Speaker 1: this does look like it's a big push and could 103 00:06:21,960 --> 00:06:25,960 Speaker 1: include things like Imagineering, but it also ranges to other stuff, 104 00:06:26,000 --> 00:06:30,240 Speaker 1: you know, from theme parks to advertising to Disney television. 105 00:06:30,440 --> 00:06:33,400 Speaker 1: And that last one is pretty notable because of 106 00:06:33,400 --> 00:06:36,320 Speaker 1: the contentious elements that are at the center of the 107 00:06:36,600 --> 00:06:41,120 Speaker 1: ongoing strikes in Hollywood, which, if you weren't aware, involve 108 00:06:41,160 --> 00:06:45,280 Speaker 1: both writers and actors. They're both on strike in Hollywood 109 00:06:45,320 --> 00:06:47,880 Speaker 1: right now. Well, one of the things they're striking about 110 00:06:48,560 --> 00:06:51,760 Speaker 1: is how studios should or should not use 111 00:06:51,880 --> 00:06:57,880 Speaker 1: artificial intelligence moving forward. Reuters cites some unnamed folks connected 112 00:06:57,920 --> 00:07:00,560 Speaker 1: to Disney, you know, people who didn't want to have this 113 00:07:00,640 --> 00:07:03,400 Speaker 1: get back to them, but they said that the company 114 00:07:03,440 --> 00:07:06,400 Speaker 1: really has little choice here, that if it doesn't incorporate 115 00:07:06,480 --> 00:07:10,760 Speaker 1: AI into its strategy, it runs the risk of becoming obsolete. 116 00:07:11,840 --> 00:07:13,840 Speaker 2: Maybe that's true. 117 00:07:13,960 --> 00:07:17,400 Speaker 1: But my knee-jerk reaction kind of is triggered because 118 00:07:17,440 --> 00:07:20,640 Speaker 1: back in two thousand and four, Disney famously shut down 119 00:07:20,680 --> 00:07:25,520 Speaker 1: its 2D animation studios, and it was unthinkable for 120 00:07:25,600 --> 00:07:30,400 Speaker 1: a company that had built its reputation on traditional 121 00:07:30,480 --> 00:07:34,000 Speaker 1: 2D animation to suddenly turn its back on it. It 122 00:07:34,040 --> 00:07:37,520 Speaker 1: has subsequently changed that, but for a while, it looked 123 00:07:37,560 --> 00:07:39,880 Speaker 1: like 2D animation and Disney were just things of 124 00:07:39,920 --> 00:07:42,880 Speaker 1: the past, and it just seemed like there was this 125 00:07:43,000 --> 00:07:47,120 Speaker 1: attitude among Disney executive leadership that computer animation was somehow 126 00:07:47,480 --> 00:07:50,840 Speaker 1: not just different from traditional hand-drawn 2D animation, 127 00:07:51,320 --> 00:07:55,600 Speaker 1: but innately better than 2D animated films. Like, audiences 128 00:07:55,640 --> 00:07:58,680 Speaker 1: didn't want to see 2D animation, they just wanted 129 00:07:58,720 --> 00:08:01,720 Speaker 1: computer animation, and for proof of that you would look 130 00:08:01,760 --> 00:08:06,720 Speaker 1: at Pixar. Here's the problem.
Pixar was investing heavily in 131 00:08:06,880 --> 00:08:11,720 Speaker 1: developing great stories to tell, and yes, the computer animation 132 00:08:12,240 --> 00:08:15,040 Speaker 1: was getting more and more impressive with every single film, 133 00:08:15,800 --> 00:08:19,960 Speaker 1: but they were really putting story first, whereas the animated 134 00:08:20,040 --> 00:08:23,480 Speaker 1: side over at Disney had fallen a long way since 135 00:08:23,520 --> 00:08:26,560 Speaker 1: the early days of the so-called Disney Renaissance, which 136 00:08:26,600 --> 00:08:29,560 Speaker 1: included movies like The Little Mermaid and Beauty and the 137 00:08:29,560 --> 00:08:32,640 Speaker 1: Beast and Aladdin. You know, if you look at nineteen 138 00:08:32,679 --> 00:08:35,319 Speaker 1: ninety five, for example, I would say I don't think 139 00:08:35,360 --> 00:08:38,160 Speaker 1: Toy Story, which came out in nineteen ninety five, was 140 00:08:38,640 --> 00:08:41,920 Speaker 1: a better movie than Pocahontas, which also came out in 141 00:08:42,040 --> 00:08:45,679 Speaker 1: ninety five, just because Toy Story was computer animated and 142 00:08:45,720 --> 00:08:48,680 Speaker 1: Pocahontas was hand-drawn. I think Toy Story was a 143 00:08:48,679 --> 00:08:51,839 Speaker 1: better movie because the script was better. But you know, 144 00:08:52,040 --> 00:08:54,280 Speaker 1: Pocahontas also had to follow up on the amazing work 145 00:08:54,280 --> 00:08:56,800 Speaker 1: of Ashman and Menken, and Ashman had passed away in 146 00:08:56,880 --> 00:08:59,680 Speaker 1: nineteen ninety one, so there are a lot of other 147 00:09:00,080 --> 00:09:03,920 Speaker 1: mitigating factors there. Anyway, Disney hasn't really commented on 148 00:09:04,240 --> 00:09:08,000 Speaker 1: how it plans to incorporate AI into its processes. It's 149 00:09:08,440 --> 00:09:11,560 Speaker 1: planning on doing it, but it hasn't talked about what 150 00:09:11,720 --> 00:09:14,600 Speaker 1: that might look like. So you know, you could have 151 00:09:14,679 --> 00:09:18,720 Speaker 1: AI incorporated into very mundane stuff, right, like automating things 152 00:09:18,720 --> 00:09:22,640 Speaker 1: like scheduling and finding the most efficient means to do that, 153 00:09:23,200 --> 00:09:28,199 Speaker 1: which isn't necessarily impacting the creative side of the business 154 00:09:28,200 --> 00:09:31,439 Speaker 1: that much, right? If you're using it to handle stuff 155 00:09:31,960 --> 00:09:35,520 Speaker 1: that is otherwise tedious and takes up a lot of 156 00:09:35,559 --> 00:09:40,359 Speaker 1: time but is easy enough to automate, that's not necessarily 157 00:09:40,400 --> 00:09:42,840 Speaker 1: a bad thing. That can end up making a company 158 00:09:42,840 --> 00:09:47,439 Speaker 1: more efficient without also displacing employees in the process, or 159 00:09:47,440 --> 00:09:50,839 Speaker 1: at least freeing those employees up to do more rewarding 160 00:09:50,960 --> 00:09:54,760 Speaker 1: work instead of something that's really tedious. The concern is 161 00:09:54,800 --> 00:09:57,959 Speaker 1: whether or not that use of AI could end up 162 00:09:58,920 --> 00:10:04,120 Speaker 1: replacing very important and creative roles for people, whether it's 163 00:10:04,200 --> 00:10:09,600 Speaker 1: actors or writers or Imagineers or whomever. So yeah, big 164 00:10:09,640 --> 00:10:15,480 Speaker 1: concerns about AI at Disney.
Yesterday, Zoom made a change 165 00:10:15,520 --> 00:10:18,280 Speaker 1: to its terms of service after receiving some pretty harsh 166 00:10:18,280 --> 00:10:23,040 Speaker 1: criticism from customers over the weekend. So around Sunday, people 167 00:10:23,080 --> 00:10:26,160 Speaker 1: who were paying attention to Zoom's terms of service started 168 00:10:26,200 --> 00:10:29,560 Speaker 1: to post screenshots of those terms, and they included a 169 00:10:29,600 --> 00:10:32,600 Speaker 1: passage saying the company has the right to collect, store, 170 00:10:32,920 --> 00:10:37,400 Speaker 1: and use quote unquote service-generated data. Now that alone 171 00:10:38,120 --> 00:10:41,360 Speaker 1: seems a bit concerning, right? Let's say your company uses 172 00:10:41,440 --> 00:10:44,800 Speaker 1: Zoom for business meetings. You probably don't like the idea 173 00:10:45,280 --> 00:10:49,480 Speaker 1: of Zoom potentially snooping in on your calls. On top 174 00:10:49,520 --> 00:10:51,400 Speaker 1: of that, the terms said that Zoom could use that 175 00:10:51,520 --> 00:10:55,839 Speaker 1: data essentially to train AI, and that really got people upset. 176 00:10:56,200 --> 00:10:58,640 Speaker 1: The thought that their video calls could be used as 177 00:10:58,720 --> 00:11:03,559 Speaker 1: material to train another AI model seemed invasive. Now, a Zoom 178 00:11:03,600 --> 00:11:06,559 Speaker 1: rep explained that the collection features are part of an 179 00:11:06,559 --> 00:11:10,960 Speaker 1: opt-in system: users can choose to enable generative AI 180 00:11:11,040 --> 00:11:14,840 Speaker 1: features such as transcription services, and the company does not 181 00:11:15,120 --> 00:11:19,959 Speaker 1: use any customer content without first gaining consent from 182 00:11:20,000 --> 00:11:23,559 Speaker 1: the customer. To that end, Zoom has now updated its 183 00:11:23,640 --> 00:11:26,880 Speaker 1: terms of use to clearly say it will only collect 184 00:11:26,920 --> 00:11:31,319 Speaker 1: audio and video data with consent from the user first. 185 00:11:31,920 --> 00:11:35,560 Speaker 1: In related news, Zoom leadership has also called on Zoom 186 00:11:35,559 --> 00:11:39,240 Speaker 1: employees who live within fifty miles of a Zoom office 187 00:11:39,960 --> 00:11:43,760 Speaker 1: to actually attend work in person at least two days 188 00:11:43,800 --> 00:11:44,120 Speaker 1: a week. 189 00:11:44,320 --> 00:11:44,920 Speaker 2: So that's right. 190 00:11:45,000 --> 00:11:48,600 Speaker 1: The company that made the tool touted as one of 191 00:11:48,600 --> 00:11:53,600 Speaker 1: the most important during lockdown, the one that enabled remote work, 192 00:11:54,520 --> 00:11:59,720 Speaker 1: is now restricting remote work at the corporate level at Zoom, 193 00:12:00,280 --> 00:12:04,240 Speaker 1: which is, you know, how the tables have turned, I 194 00:12:04,280 --> 00:12:07,800 Speaker 1: guess. Okay, we're going to take a quick break. When 195 00:12:07,800 --> 00:12:19,400 Speaker 1: we come back, we've got some more tech news. We're back, 196 00:12:19,480 --> 00:12:23,720 Speaker 1: and we actually have a couple of other AI stories to 197 00:12:23,880 --> 00:12:26,880 Speaker 1: finish up with before we move on to other tech news, 198 00:12:27,320 --> 00:12:29,680 Speaker 1: so we're not done with AI just yet.
AP News 199 00:12:29,720 --> 00:12:32,520 Speaker 1: reports that the folks over at Dungeons and Dragons, you know, 200 00:12:32,559 --> 00:12:34,840 Speaker 1: Wizards of the Coast, which in turn is part of Hasbro, 201 00:12:35,520 --> 00:12:40,000 Speaker 1: are now telling their artists not to use artificial intelligence 202 00:12:40,360 --> 00:12:43,920 Speaker 1: as part of the generative process to create fantasy art, 203 00:12:43,920 --> 00:12:47,439 Speaker 1: so they're telling the artists, hey, don't use AI when 204 00:12:47,480 --> 00:12:50,440 Speaker 1: you're making art for us. This comes after several D 205 00:12:50,480 --> 00:12:53,480 Speaker 1: and D fans raised questions about an illustration that included 206 00:12:53,520 --> 00:12:57,520 Speaker 1: a giant that they said looked a little weird, like 207 00:12:57,840 --> 00:13:00,400 Speaker 1: perhaps it had not been made by a human being, 208 00:13:00,960 --> 00:13:03,360 Speaker 1: and they asked, hey, was this made by a robot? 209 00:13:03,600 --> 00:13:04,959 Speaker 2: So D and D reached 210 00:13:04,679 --> 00:13:08,240 Speaker 1: out and contacted the artist, talked with them, found out 211 00:13:08,240 --> 00:13:12,280 Speaker 1: that yes, there was some use of AI generative features 212 00:13:12,320 --> 00:13:16,880 Speaker 1: to collaborate and make this art, and the company now 213 00:13:16,960 --> 00:13:19,800 Speaker 1: is clarifying rules on what can and cannot be used 214 00:13:19,800 --> 00:13:23,920 Speaker 1: to make fantasy art for the games. The particular piece 215 00:13:24,160 --> 00:13:28,240 Speaker 1: that triggered folks to ask, is this AI generated, 216 00:13:28,600 --> 00:13:32,079 Speaker 1: is actually going to be appearing soon in an expansion 217 00:13:32,080 --> 00:13:35,560 Speaker 1: book called Bigby Presents: Glory of the Giants. I 218 00:13:35,559 --> 00:13:39,520 Speaker 1: think it comes out next week in fact, so maybe 219 00:13:39,520 --> 00:13:41,640 Speaker 1: collectors will rush out and grab a copy in case 220 00:13:41,679 --> 00:13:45,680 Speaker 1: future editions remove the AI-generated giant person. But 221 00:13:45,760 --> 00:13:48,680 Speaker 1: you might ask why is D and D, why is 222 00:13:48,720 --> 00:13:52,640 Speaker 1: Wizards of the Coast, and perhaps by extension, Hasbro, saying 223 00:13:53,240 --> 00:13:56,080 Speaker 1: don't use AI to generate fantasy art? Why is that 224 00:13:56,120 --> 00:13:59,640 Speaker 1: a big deal? Well, part of it is about copyright 225 00:13:59,720 --> 00:14:03,880 Speaker 1: questions, like who owns the copyright to a machine-generated 226 00:14:03,920 --> 00:14:07,640 Speaker 1: piece of work? Obviously, Wizards of the Coast wants to 227 00:14:07,640 --> 00:14:10,640 Speaker 1: be able to copyright its stuff and not have 228 00:14:10,760 --> 00:14:14,720 Speaker 1: some other party claim ownership of something that's featured in 229 00:14:14,960 --> 00:14:19,080 Speaker 1: a Wizards of the Coast work. But also there's the 230 00:14:19,240 --> 00:14:25,320 Speaker 1: issue of, you know, copying another person's style, right?
If 231 00:14:25,440 --> 00:14:29,600 Speaker 1: artist A produces a ton of fantasy art and then 232 00:14:29,720 --> 00:14:34,800 Speaker 1: artist B uses a generative tool that coincidentally is referencing 233 00:14:35,120 --> 00:14:39,840 Speaker 1: artist A's work, and it's doing so extensively, and then 234 00:14:39,880 --> 00:14:43,000 Speaker 1: creates a new piece in the style of artist A, 235 00:14:43,200 --> 00:14:46,040 Speaker 1: well then it's almost like artist B is copying artist A. 236 00:14:46,720 --> 00:14:49,400 Speaker 1: And you could argue, well, then that means artist A 237 00:14:49,480 --> 00:14:52,320 Speaker 1: could have landed this gig and gotten a paycheck 238 00:14:52,320 --> 00:14:56,680 Speaker 1: out of it, but instead their work was, you know, 239 00:14:56,720 --> 00:15:00,920 Speaker 1: sort of repurposed and reimagined without their consent. This is 240 00:15:00,920 --> 00:15:05,560 Speaker 1: an ongoing issue with generative AI in the visual arts realm. 241 00:15:05,800 --> 00:15:09,120 Speaker 1: There's also a very similar effect, the same problem, that's 242 00:15:09,160 --> 00:15:13,800 Speaker 1: going on within the written word, right? There are authors 243 00:15:13,840 --> 00:15:17,640 Speaker 1: and poets who are arguing that AI models being trained 244 00:15:17,720 --> 00:15:24,600 Speaker 1: on published works are effectively copying these folks without their consent. So, 245 00:15:25,400 --> 00:15:29,200 Speaker 1: if you'll excuse me, I need to roll to see 246 00:15:29,200 --> 00:15:33,040 Speaker 1: if my AI-generated image of an elf will deceive 247 00:15:33,280 --> 00:15:41,640 Speaker 1: Wizards of the Coast and oh, critical fail. Okay, well, 248 00:15:42,480 --> 00:15:45,640 Speaker 1: I guess the fingers are all noodly and there's like 249 00:15:45,720 --> 00:15:48,440 Speaker 1: fifteen of them, so I guess that's a dead giveaway. 250 00:15:49,120 --> 00:15:49,480 Speaker 2: Okay. 251 00:15:49,960 --> 00:15:52,560 Speaker 1: You know, a lot of folks have voiced concerns about 252 00:15:52,600 --> 00:15:56,680 Speaker 1: AI and the possible dangers that it could bring, and 253 00:15:56,760 --> 00:16:00,480 Speaker 1: we can now count the Pope as one of those voices. 254 00:16:00,560 --> 00:16:05,200 Speaker 1: Pope Francis has called for a global reflection today that 255 00:16:05,400 --> 00:16:09,680 Speaker 1: is all about how AI could be really dangerous. Pope 256 00:16:09,680 --> 00:16:12,720 Speaker 1: Francis has said in the past he is largely unfamiliar 257 00:16:12,800 --> 00:16:16,240 Speaker 1: with modern technology, including stuff like computers and the Internet, 258 00:16:16,680 --> 00:16:19,160 Speaker 1: but that he also sees that these tools can be 259 00:16:19,320 --> 00:16:23,400 Speaker 1: incredibly helpful when they are put to appropriate use, which, 260 00:16:23,680 --> 00:16:26,560 Speaker 1: you know, is a refreshing take from someone who is 261 00:16:26,680 --> 00:16:31,760 Speaker 1: unfamiliar with technology, that they recognize tools aren't necessarily good 262 00:16:31,880 --> 00:16:35,400 Speaker 1: or bad in and of themselves. It's really all in how 263 00:16:35,440 --> 00:16:37,840 Speaker 1: we make use of those tools, and if we commit 264 00:16:38,040 --> 00:16:41,400 Speaker 1: to using them in ways that aren't harmful, we can 265 00:16:41,440 --> 00:16:44,960 Speaker 1: see great benefit.
But some of these tools, like AI 266 00:16:45,160 --> 00:16:48,480 Speaker 1: for example, have the potential to be very dangerous if 267 00:16:48,520 --> 00:16:51,960 Speaker 1: we are using them improperly or if we don't have 268 00:16:51,960 --> 00:16:55,040 Speaker 1: a full understanding of the consequences before we use them, 269 00:16:55,280 --> 00:16:58,600 Speaker 1: so we have to take extra care when we're working 270 00:16:58,680 --> 00:17:02,440 Speaker 1: with them. It's not that AI is not worthwhile or 271 00:17:02,640 --> 00:17:07,800 Speaker 1: could never do anything positive. That's clearly not the case. 272 00:17:08,440 --> 00:17:12,000 Speaker 1: We just have to be very, very methodical in our 273 00:17:12,080 --> 00:17:14,919 Speaker 1: approach to using AI, and right now you could argue 274 00:17:14,960 --> 00:17:19,199 Speaker 1: that is not what we're seeing. Apple has reportedly struck 275 00:17:19,240 --> 00:17:23,880 Speaker 1: a huge deal with chip manufacturer TSMC out of Taiwan 276 00:17:24,480 --> 00:17:29,040 Speaker 1: that will see Apple purchase essentially all of TSMC's chips 277 00:17:29,080 --> 00:17:33,240 Speaker 1: made with their next-generation manufacturing process, which is called 278 00:17:33,240 --> 00:17:38,000 Speaker 1: the three nanometer manufacturing process. Just as a reminder, once 279 00:17:38,080 --> 00:17:41,680 Speaker 1: upon a time, when we would use things like nanometers 280 00:17:41,840 --> 00:17:46,280 Speaker 1: to describe the chip manufacturing process, that actually referenced the 281 00:17:46,320 --> 00:17:49,399 Speaker 1: size of individual components found on the chips; you 282 00:17:49,400 --> 00:17:53,119 Speaker 1: would actually see stuff on the chips that measured at 283 00:17:52,800 --> 00:17:57,240 Speaker 1: that unit. But these days it's really just a naming convention. 284 00:17:57,440 --> 00:18:00,920 Speaker 1: It's really just to designate the generation of the chip 285 00:18:00,960 --> 00:18:04,399 Speaker 1: manufacturing process; the individual elements on the chips are not 286 00:18:04,800 --> 00:18:08,000 Speaker 1: three nanometers in size. That would end up being a 287 00:18:08,040 --> 00:18:12,440 Speaker 1: big disaster because of the way quantum physics works. So yeah, 288 00:18:12,520 --> 00:18:15,879 Speaker 1: just a reminder that the whole nanometer thing, it's just 289 00:18:15,920 --> 00:18:19,160 Speaker 1: a naming convention now. It doesn't actually reference anything other 290 00:18:19,280 --> 00:18:21,560 Speaker 1: than this is the newest one and it has to 291 00:18:21,640 --> 00:18:25,879 Speaker 1: keep getting smaller. So it does raise questions of do 292 00:18:25,960 --> 00:18:29,720 Speaker 1: we go down to the atomic scale once we get 293 00:18:29,760 --> 00:18:35,280 Speaker 1: past one nanometer? Anyway, according to The Information, Apple has 294 00:18:35,359 --> 00:18:40,920 Speaker 1: essentially ordered every single TSMC three nanometer chip, at least 295 00:18:40,920 --> 00:18:43,600 Speaker 1: in the short term, and by short term I mean 296 00:18:43,760 --> 00:18:48,119 Speaker 1: Apple will have exclusive use of chips made by that 297 00:18:48,160 --> 00:18:52,280 Speaker 1: manufacturing process from TSMC for about a year, and that 298 00:18:52,359 --> 00:18:55,959 Speaker 1: definitely gives Apple a leg up on the competition that 299 00:18:56,080 --> 00:18:59,800 Speaker 1: wants to use TSMC's chips. There are other fabricators out there.
300 00:19:00,119 --> 00:19:02,720 Speaker 1: TSMC is not the only game in town. It's just 301 00:19:02,800 --> 00:19:07,080 Speaker 1: the biggest one. Meanwhile, there is a political battle surrounding 302 00:19:07,119 --> 00:19:11,800 Speaker 1: TSMC's planned fabrication facility that would be here in the 303 00:19:11,880 --> 00:19:16,320 Speaker 1: United States, in Arizona. So the Taiwan-based company plans 304 00:19:16,400 --> 00:19:20,000 Speaker 1: this fabrication plant in Arizona, but just last month announced 305 00:19:20,040 --> 00:19:22,800 Speaker 1: that there was going to be a construction delay that 306 00:19:22,920 --> 00:19:27,000 Speaker 1: would last until twenty twenty five. The reason, according 307 00:19:27,040 --> 00:19:29,639 Speaker 1: to the company, is that there is a lack of 308 00:19:29,800 --> 00:19:32,840 Speaker 1: skilled workers here in the US who would be needed 309 00:19:32,960 --> 00:19:36,760 Speaker 1: to prepare and open the facility, not to work once 310 00:19:36,800 --> 00:19:39,879 Speaker 1: it is open, but to actually get everything in place 311 00:19:39,960 --> 00:19:44,400 Speaker 1: as well. The talent just isn't here, and instead 312 00:19:44,480 --> 00:19:48,200 Speaker 1: TSMC wants to bring around five hundred employees from Taiwan 313 00:19:48,800 --> 00:19:52,000 Speaker 1: to the United States to do that work instead. That 314 00:19:52,080 --> 00:19:55,159 Speaker 1: has led to US politicians weighing in, and they have 315 00:19:55,359 --> 00:19:59,359 Speaker 1: argued that these jobs should go to US workers. Now 316 00:20:00,359 --> 00:20:03,240 Speaker 1: there's a lot going on here, and it gets very 317 00:20:03,280 --> 00:20:07,119 Speaker 1: messy and it gets very political. But from a high level, 318 00:20:07,520 --> 00:20:12,639 Speaker 1: the US decades ago ceded, as in got rid of, 319 00:20:12,800 --> 00:20:18,199 Speaker 1: pretty much all major chip fabrication because it's expensive. It 320 00:20:18,280 --> 00:20:21,679 Speaker 1: is very expensive to build chip fabrication plants. You have 321 00:20:21,720 --> 00:20:24,919 Speaker 1: to update them constantly because, as we were just talking about, 322 00:20:25,000 --> 00:20:30,320 Speaker 1: you're always evolving the technology to make more powerful chips, 323 00:20:30,880 --> 00:20:33,879 Speaker 1: which means you've got to retool everything. Sometimes you have 324 00:20:33,920 --> 00:20:38,240 Speaker 1: to build totally new facilities, and that's a huge investment, 325 00:20:38,280 --> 00:20:40,439 Speaker 1: and a lot of US companies got out of that 326 00:20:40,520 --> 00:20:44,760 Speaker 1: game ages ago, and instead that work went to places 327 00:20:44,840 --> 00:20:51,800 Speaker 1: like Taiwan and TSMC in particular. So because America got 328 00:20:51,920 --> 00:20:56,239 Speaker 1: rid of, well, didn't totally get rid of chip fabrication, but 329 00:20:56,400 --> 00:20:59,200 Speaker 1: largely pushed that out to other places in the world, 330 00:21:00,280 --> 00:21:03,159 Speaker 1: you might say that TSMC could at least have a 331 00:21:03,200 --> 00:21:06,679 Speaker 1: partly legit point to make that the US lacks the 332 00:21:06,720 --> 00:21:12,040 Speaker 1: experts needed to open an advanced fabrication facility simply because 333 00:21:12,520 --> 00:21:15,320 Speaker 1: the US hasn't really been focused on that part of 334 00:21:15,440 --> 00:21:19,159 Speaker 1: chip manufacturing for a while.
Yes, in the US you 335 00:21:19,200 --> 00:21:22,760 Speaker 1: have a lot of people developing the next chips, designing 336 00:21:22,800 --> 00:21:27,159 Speaker 1: the next generation of chips, but the actual fabrication is 337 00:21:27,200 --> 00:21:31,240 Speaker 1: taking place elsewhere. So while the expertise is definitely in design, 338 00:21:32,200 --> 00:21:36,320 Speaker 1: the argument is it's not in creating the fabrication facilities, 339 00:21:36,480 --> 00:21:40,080 Speaker 1: and that's where the problem is. On the flip side, 340 00:21:41,040 --> 00:21:44,520 Speaker 1: there's a concern that the reason TSMC really wants to 341 00:21:44,520 --> 00:21:47,879 Speaker 1: bring in Taiwanese workers to the US is not just 342 00:21:47,960 --> 00:21:51,280 Speaker 1: because they have expertise in the area, but also because 343 00:21:51,280 --> 00:21:55,040 Speaker 1: they're less likely to resist a push to work really 344 00:21:55,119 --> 00:21:59,879 Speaker 1: long hours, including working weekends and stuff, whereas US workers 345 00:22:00,200 --> 00:22:04,240 Speaker 1: have this pesky habit of arguing that they need to 346 00:22:04,280 --> 00:22:07,840 Speaker 1: be, you know, fairly compensated and have work-life balance. 347 00:22:08,320 --> 00:22:11,239 Speaker 1: So in other words, there's a concern that TSMC is 348 00:22:11,280 --> 00:22:14,560 Speaker 1: really looking to exploit a workforce in an effort to 349 00:22:14,640 --> 00:22:18,520 Speaker 1: keep costs down and to speed up building out the facilities. 350 00:22:19,400 --> 00:22:22,760 Speaker 1: As for what happens once the facility opens, TSMC says 351 00:22:22,800 --> 00:22:26,359 Speaker 1: it's committed to providing around twelve thousand jobs and that 352 00:22:26,520 --> 00:22:29,879 Speaker 1: US employees will fill those roles, so that, like, the 353 00:22:29,920 --> 00:22:34,119 Speaker 1: actual jobs of working at the facility will go to 354 00:22:34,280 --> 00:22:38,280 Speaker 1: US citizens; it won't be Taiwanese workers 355 00:22:38,280 --> 00:22:42,119 Speaker 1: brought over to do that particular work. So yeah, like 356 00:22:42,160 --> 00:22:45,240 Speaker 1: I said, it is political. There is a technical side 357 00:22:45,240 --> 00:22:49,480 Speaker 1: to it too, but it's messy. And this is why 358 00:22:49,560 --> 00:22:53,080 Speaker 1: you can't just leave politics out of discussions of technology, 359 00:22:53,440 --> 00:22:57,199 Speaker 1: because politics affects us and it affects the tech sector 360 00:22:57,640 --> 00:23:01,080 Speaker 1: a lot. In fact, you know, we can often see 361 00:23:01,119 --> 00:23:04,120 Speaker 1: it most acutely in the tech sector. Not that it's 362 00:23:04,160 --> 00:23:07,359 Speaker 1: not impacting other sectors as well, it's just when it 363 00:23:07,440 --> 00:23:12,040 Speaker 1: hits tech, people take notice because it's high-profile stuff. 364 00:23:12,200 --> 00:23:15,080 Speaker 1: So we can't really avoid it here. I don't know 365 00:23:15,800 --> 00:23:18,080 Speaker 1: what the actual story is here. I mean, it 366 00:23:18,119 --> 00:23:21,720 Speaker 1: may very well be that TSMC could not find the 367 00:23:21,800 --> 00:23:25,359 Speaker 1: talent they needed in order to prepare the fabrication facility 368 00:23:25,359 --> 00:23:29,040 Speaker 1: properly here in the US. Maybe that's true, or maybe 369 00:23:29,040 --> 00:23:32,800 Speaker 1: it's not.
I just don't know, but I do know 370 00:23:32,880 --> 00:23:36,000 Speaker 1: that it is an ongoing issue right now. So we'll 371 00:23:36,040 --> 00:23:40,320 Speaker 1: have to see how that plays out in the short term. Okay, 372 00:23:40,880 --> 00:23:44,080 Speaker 1: I got a few more stories to cover, but before 373 00:23:44,119 --> 00:23:47,080 Speaker 1: I can get to that, let's take another quick break. 374 00:23:56,560 --> 00:23:57,240 Speaker 2: We're back. 375 00:23:57,960 --> 00:24:00,960 Speaker 1: Ars Technica has an article about how scientists at the 376 00:24:01,040 --> 00:24:04,920 Speaker 1: Lawrence Livermore National Lab in California have, for the second 377 00:24:05,000 --> 00:24:10,080 Speaker 1: time now, produced a fusion reaction that generated more energy 378 00:24:10,680 --> 00:24:14,920 Speaker 1: than was needed to initiate the reaction. We're gonna put 379 00:24:14,920 --> 00:24:17,320 Speaker 1: an asterisk on that because we're gonna come back to 380 00:24:17,359 --> 00:24:20,240 Speaker 1: it. Now, I have talked about fusion a lot on 381 00:24:20,280 --> 00:24:23,479 Speaker 1: the show, but just as a quick reminder, fusion occurs 382 00:24:23,480 --> 00:24:28,000 Speaker 1: when you take two lightweight atoms, like hydrogen atoms, for example, 383 00:24:29,280 --> 00:24:32,000 Speaker 1: and then you blast those atoms with enough pressure and/or 384 00:24:32,400 --> 00:24:36,040 Speaker 1: energy, like heat, in order to 385 00:24:35,960 --> 00:24:38,159 Speaker 2: fuse them into a new 386 00:24:38,040 --> 00:24:41,919 Speaker 1: atom, helium in this case. This is what happens in 387 00:24:42,000 --> 00:24:46,280 Speaker 1: the sun. You know, hydrogen is forged into helium at 388 00:24:46,320 --> 00:24:50,920 Speaker 1: a temperature of millions of degrees, as "How Does" or 389 00:24:50,960 --> 00:24:54,480 Speaker 1: "Why Does the Sun Shine?" would tell us. Anyway, in this process, 390 00:24:55,040 --> 00:24:57,480 Speaker 1: you also end up with a release of energy, right? 391 00:24:57,480 --> 00:25:00,320 Speaker 1: That's why the sun actually does shine. It's releasing that energy, 392 00:25:00,359 --> 00:25:03,679 Speaker 1: it's not just doing this process. So if you end 393 00:25:03,760 --> 00:25:07,880 Speaker 1: up with more energy than you used to start the reaction, 394 00:25:08,000 --> 00:25:11,359 Speaker 1: you've got a potentially viable alternative to other kinds of 395 00:25:11,359 --> 00:25:14,200 Speaker 1: power facilities, you know, like coal power plants, or even 396 00:25:14,280 --> 00:25:17,800 Speaker 1: things like solar and wind farms, or traditional nuclear power 397 00:25:17,800 --> 00:25:22,159 Speaker 1: plants, which rely on nuclear fission. That's the process of 398 00:25:22,240 --> 00:25:26,840 Speaker 1: splitting heavy atoms into lighter atoms, and that also releases 399 00:25:26,840 --> 00:25:28,720 Speaker 1: a huge amount of energy in the process, but that 400 00:25:29,720 --> 00:25:31,960 Speaker 1: also creates things like nuclear waste, which you have to 401 00:25:31,960 --> 00:25:34,639 Speaker 1: figure out how to process, or deal with, 402 00:25:34,680 --> 00:25:37,239 Speaker 1: or store. It comes with a lot of, 403 00:25:37,760 --> 00:25:44,679 Speaker 1: again, political issues that make that technology difficult to pursue.
404 00:25:45,000 --> 00:25:47,320 Speaker 1: Here in Georgia, we actually had a nuclear power plant 405 00:25:47,320 --> 00:25:50,880 Speaker 1: come online for the first time in decades, and it 406 00:25:50,920 --> 00:25:54,240 Speaker 1: was supposed to have been built like fifteen years ago, 407 00:25:54,280 --> 00:25:57,600 Speaker 1: I think at this point, somewhere around there. But there 408 00:25:57,600 --> 00:26:01,840 Speaker 1: were so many different delays, and then the cost 409 00:26:02,040 --> 00:26:06,280 Speaker 1: of building out the facility exploded as a result of that. 410 00:26:06,440 --> 00:26:09,840 Speaker 1: So even though the technology is proven, there are a 411 00:26:09,840 --> 00:26:14,359 Speaker 1: lot of drawbacks to nuclear fission. So nuclear fusion could 412 00:26:15,200 --> 00:26:18,720 Speaker 1: be a way to have an alternative that doesn't have 413 00:26:18,800 --> 00:26:26,960 Speaker 1: the same issues that nuclear fission has, but it's hard 414 00:26:27,000 --> 00:26:29,720 Speaker 1: to do. So researchers have managed to create a few 415 00:26:29,720 --> 00:26:33,720 Speaker 1: fusion reactions over the years, including some uncontrolled ones in 416 00:26:33,760 --> 00:26:37,840 Speaker 1: the testing of the fusion bomb, but typically when you 417 00:26:37,880 --> 00:26:40,320 Speaker 1: were trying to make a fusion reactor for the purposes 418 00:26:40,320 --> 00:26:43,560 Speaker 1: of power generation, the result was that you were getting 419 00:26:43,720 --> 00:26:46,879 Speaker 1: an energy output that was less than the amount of 420 00:26:47,000 --> 00:26:49,760 Speaker 1: energy you were using to initiate the reaction, meaning you're 421 00:26:49,800 --> 00:26:53,639 Speaker 1: operating at a net loss, right? You're using more energy 422 00:26:53,680 --> 00:26:56,720 Speaker 1: to start a reaction than you're getting out of the reaction. 423 00:26:57,160 --> 00:27:00,400 Speaker 1: That is not a viable way to generate power; you lose 424 00:27:00,520 --> 00:27:05,440 Speaker 1: energy in the process. This most recent experiment generated three 425 00:27:05,560 --> 00:27:09,800 Speaker 1: point one five megajoules of energy, and the laser, that 426 00:27:09,920 --> 00:27:14,960 Speaker 1: is, the lasers, that were used to initiate this reaction were 427 00:27:15,000 --> 00:27:19,760 Speaker 1: blasting two point oh five megajoules of energy 428 00:27:19,880 --> 00:27:22,879 Speaker 1: at the fuel. So two point oh five megajoules of energy is 429 00:27:22,920 --> 00:27:26,880 Speaker 1: hitting the fuel, and the reaction generates three point one five 430 00:27:26,880 --> 00:27:30,119 Speaker 1: megajoules of energy. That means you're getting about one and 431 00:27:30,119 --> 00:27:33,880 Speaker 1: a half times as much energy out as you're putting in 432 00:27:34,480 --> 00:27:37,199 Speaker 1: with the laser. Now, let's go back to that asterisk 433 00:27:37,280 --> 00:27:41,920 Speaker 1: I mentioned about, you know, creating more energy, or not creating, 434 00:27:41,960 --> 00:27:45,800 Speaker 1: but releasing more energy than you're pouring into it. Now, 435 00:27:45,800 --> 00:27:49,680 Speaker 1: the lasers did emit two point oh five megajoules of energy. However, 436 00:27:50,119 --> 00:27:53,440 Speaker 1: the draw from the power grid to power those lasers 437 00:27:54,160 --> 00:27:58,480 Speaker 1: was much, much, much larger.
So when you step back 438 00:27:58,520 --> 00:28:00,200 Speaker 1: and you say, all right, well how much energy did 439 00:28:00,200 --> 00:28:02,679 Speaker 1: it take for me to make a laser that could 440 00:28:02,720 --> 00:28:06,440 Speaker 1: emit two point zero five megajoules of energy, that's where 441 00:28:06,440 --> 00:28:08,159 Speaker 1: you start to see that you're having to use a 442 00:28:08,160 --> 00:28:11,440 Speaker 1: lot more power to get that three point one five 443 00:28:11,480 --> 00:28:14,520 Speaker 1: megajoules out of the reaction. So that means ultimately 444 00:28:14,520 --> 00:28:16,360 Speaker 1: it's a net loss when you look at it from 445 00:28:16,400 --> 00:28:21,600 Speaker 1: a big picture standpoint. According to Ars Technica, scientists think 446 00:28:21,640 --> 00:28:24,399 Speaker 1: we're going to have to hit energy generation that's around 447 00:28:24,400 --> 00:28:28,360 Speaker 1: thirty to one hundred times more than what the lasers 448 00:28:28,359 --> 00:28:30,720 Speaker 1: are blasting out in order for fusion to be a 449 00:28:30,840 --> 00:28:31,879 Speaker 1: viable power source. 450 00:28:32,400 --> 00:28:34,400 Speaker 2: That is a huge leap from 451 00:28:34,200 --> 00:28:35,919 Speaker 1: one and a half times, which is what we 452 00:28:35,960 --> 00:28:38,120 Speaker 1: saw in this most recent experiment. You know, we have 453 00:28:38,120 --> 00:28:40,680 Speaker 1: to get up to thirty to one hundred times in 454 00:28:40,760 --> 00:28:44,600 Speaker 1: order to reach efficiencies where we're able to get more 455 00:28:44,720 --> 00:28:47,160 Speaker 1: energy out than we put in. Plus we have to 456 00:28:47,200 --> 00:28:50,760 Speaker 1: make it something that can be repeated rapidly. Right now, 457 00:28:50,920 --> 00:28:54,720 Speaker 1: you're talking about months between experiments at this laboratory. A 458 00:28:54,840 --> 00:28:56,760 Speaker 1: power plant is going to need to do this many 459 00:28:56,800 --> 00:29:01,400 Speaker 1: times a second in order to continually generate energy, 460 00:29:01,920 --> 00:29:04,720 Speaker 1: or release energy, rather. I keep saying generate. 461 00:29:04,760 --> 00:29:07,200 Speaker 1: You know, energy can be neither created nor destroyed. It's 462 00:29:07,240 --> 00:29:10,800 Speaker 1: really just released or converted from one form to another. 463 00:29:11,280 --> 00:29:11,720 Speaker 2: Anyway. 464 00:29:11,880 --> 00:29:15,160 Speaker 1: You would have to make that sustainable in order to 465 00:29:15,200 --> 00:29:18,440 Speaker 1: do things like create electricity for people. Otherwise you would 466 00:29:18,440 --> 00:29:21,760 Speaker 1: just have these spikes and they wouldn't be useful for anything. Yeah, 467 00:29:21,960 --> 00:29:23,560 Speaker 1: you could say like, wow, we generated a ton of 468 00:29:23,640 --> 00:29:26,800 Speaker 1: energy there. We released a huge amount of energy. But 469 00:29:26,920 --> 00:29:31,120 Speaker 1: unless you can make that something that can consistently provide electricity, 470 00:29:31,480 --> 00:29:34,960 Speaker 1: it's not really that useful. However, if we are able 471 00:29:35,000 --> 00:29:38,200 Speaker 1: to crack that code, we would have an incredible future 472 00:29:38,240 --> 00:29:40,920 Speaker 1: ahead of us.
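To put a quick worked number on that gain figure, using only the values quoted in this story: the "target gain" compares the fusion energy released to the laser energy delivered to the fuel, and it deliberately ignores the far larger draw from the power grid needed to run the lasers in the first place.

$$ Q_{\text{target}} = \frac{E_{\text{fusion out}}}{E_{\text{laser in}}} = \frac{3.15\ \text{MJ}}{2.05\ \text{MJ}} \approx 1.54 $$

That is the roughly one and a half times figure. The thirty-to-one-hundred-times target, plus the grid power the lasers consume, is what still stands between this result and a practical power plant.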
So let's hope for it, and let's 473 00:29:40,960 --> 00:29:43,920 Speaker 1: also hope that it's not a perpetual twenty to thirty 474 00:29:44,000 --> 00:29:47,560 Speaker 1: years situation. You know, that's where scientists say we're twenty 475 00:29:47,560 --> 00:29:50,760 Speaker 1: to thirty years out from a technology maturing, but then 476 00:29:50,880 --> 00:29:53,280 Speaker 1: we never get there; like ten years later, we're still 477 00:29:53,320 --> 00:29:56,000 Speaker 1: twenty to thirty years out. Let's hope it's not one 478 00:29:56,040 --> 00:29:58,760 Speaker 1: of those cases. A while back, I talked about how 479 00:29:58,800 --> 00:30:02,320 Speaker 1: Boeing has had to delay testing its Starliner crew vehicle 480 00:30:03,040 --> 00:30:06,840 Speaker 1: with actual, you know, human astronauts after discovering some issues 481 00:30:07,040 --> 00:30:10,520 Speaker 1: with the capsule's design. So, the purpose of the Starliner 482 00:30:10,680 --> 00:30:14,480 Speaker 1: capsule, and it's a spacecraft that looks a lot like, 483 00:30:14,600 --> 00:30:17,560 Speaker 1: you know, the old Apollo capsules, the purpose of 484 00:30:17,560 --> 00:30:19,880 Speaker 1: the Starliner is to serve as a vehicle that 485 00:30:19,920 --> 00:30:23,600 Speaker 1: will take astronauts to and from stuff like the International 486 00:30:23,600 --> 00:30:27,200 Speaker 1: Space Station. Then you would have like the Orion capsule, 487 00:30:27,240 --> 00:30:30,520 Speaker 1: which is larger. This is the one NASA plans to 488 00:30:30,720 --> 00:30:34,800 Speaker 1: use for future Moon missions. However, last month, Boeing had 489 00:30:34,840 --> 00:30:38,000 Speaker 1: to scrap test plans for the Starliner after review 490 00:30:38,040 --> 00:30:41,200 Speaker 1: showed that the capsule's quote unquote soft links in its 491 00:30:41,200 --> 00:30:45,320 Speaker 1: parachute system failed to measure up to NASA's safety requirements. 492 00:30:45,360 --> 00:30:47,720 Speaker 1: So they had to go back to the drawing board 493 00:30:47,760 --> 00:30:52,280 Speaker 1: and fix that and replace those soft links, which 494 00:30:52,560 --> 00:30:56,960 Speaker 1: they now have said they've done. Plus, Boeing had used 495 00:30:57,000 --> 00:31:00,680 Speaker 1: some tape in the Starliner's wiring harnesses that 496 00:31:00,840 --> 00:31:03,640 Speaker 1: has been flagged as a potential fire hazard, in that under 497 00:31:03,680 --> 00:31:09,480 Speaker 1: certain conditions it can become flammable. So Boeing has subsequently, 498 00:31:10,120 --> 00:31:13,240 Speaker 1: you know, started to remove all that tape and replace 499 00:31:13,280 --> 00:31:16,040 Speaker 1: it with other stuff. There are a few areas where 500 00:31:16,080 --> 00:31:19,080 Speaker 1: Boeing says it's not feasible to actually 501 00:31:19,080 --> 00:31:22,080 Speaker 1: remove the tape because doing so would damage the vehicle 502 00:31:22,080 --> 00:31:26,640 Speaker 1: in the process. So instead they're coating the tape with 503 00:31:26,800 --> 00:31:31,120 Speaker 1: material that will be fire resistant so that, you know, 504 00:31:31,160 --> 00:31:34,880 Speaker 1: it still won't end up causing a potentially disastrous fire 505 00:31:35,120 --> 00:31:39,000 Speaker 1: inside the capsule. Now, all of this means that we're 506 00:31:39,040 --> 00:31:41,440 Speaker 1: looking at twenty twenty four at the earliest for a 507 00:31:41,480 --> 00:31:43,560 Speaker 1: test of the Starliner with a crew aboard.
That is 508 00:31:44,160 --> 00:31:47,680 Speaker 1: disappointing news for Boeing as well as for NASA, but 509 00:31:47,760 --> 00:31:51,440 Speaker 1: NASA can continue to depend heavily on SpaceX's Dragon two 510 00:31:51,600 --> 00:31:56,240 Speaker 1: vehicle in the meantime. In more optimistic space-related news, 511 00:31:56,600 --> 00:31:59,880 Speaker 1: NASA and the Department of Defense performed a recovery test 512 00:32:00,160 --> 00:32:04,720 Speaker 1: for the Orion crew module. So the Orion, like the 513 00:32:04,920 --> 00:32:09,520 Speaker 1: Apollo spacecraft of decades ago, is meant to splash down 514 00:32:10,040 --> 00:32:13,720 Speaker 1: in the ocean, specifically the Pacific Ocean, upon returning to Earth. 515 00:32:14,360 --> 00:32:17,760 Speaker 1: Once it splashes down, a retrieval team will then maneuver 516 00:32:18,240 --> 00:32:22,800 Speaker 1: to within a few thousand yards of the spacecraft and then 517 00:32:22,920 --> 00:32:27,080 Speaker 1: send retrieval teams to help the four astronauts exit the 518 00:32:27,160 --> 00:32:30,760 Speaker 1: vehicle safely. So this particular test is part of the 519 00:32:30,920 --> 00:32:34,280 Speaker 1: Artemis two mission. Artemis two will send astronauts around the 520 00:32:34,280 --> 00:32:35,880 Speaker 1: back side of the Moon for the first time in 521 00:32:35,960 --> 00:32:39,640 Speaker 1: many, many decades. Artemis three is the one where astronauts 522 00:32:39,680 --> 00:32:42,200 Speaker 1: will actually set boots on the Moon for the first 523 00:32:42,200 --> 00:32:45,240 Speaker 1: time in ages. So the goal is to retrieve the 524 00:32:45,280 --> 00:32:48,360 Speaker 1: crew safely in less than two hours after the capsule 525 00:32:48,360 --> 00:32:51,960 Speaker 1: has splashed down in the ocean. The process involves Navy 526 00:32:52,000 --> 00:32:54,520 Speaker 1: divers who first go and check the capsule to make 527 00:32:54,560 --> 00:32:59,920 Speaker 1: sure that it's safe to deploy the raft that's around 528 00:33:00,240 --> 00:33:04,360 Speaker 1: the capsule and for the crew to emerge from the capsule. 529 00:33:04,920 --> 00:33:08,600 Speaker 1: The raft is called the front porch, and it serves 530 00:33:08,600 --> 00:33:12,040 Speaker 1: as a platform for the crew to step out on 531 00:33:12,040 --> 00:33:15,400 Speaker 1: once they leave the capsule, and from that point a 532 00:33:15,800 --> 00:33:20,280 Speaker 1: different retrieval crew will actually fly out to the splash 533 00:33:20,320 --> 00:33:23,760 Speaker 1: site and airlift the Orion crew and then transport them 534 00:33:23,800 --> 00:33:26,720 Speaker 1: back to a recovery ship. Once the crew is safely 535 00:33:26,760 --> 00:33:31,360 Speaker 1: aboard the recovery ship, then engineering teams will connect the 536 00:33:31,360 --> 00:33:33,920 Speaker 1: Orion capsule to the ship so it can be towed 537 00:33:34,000 --> 00:33:37,280 Speaker 1: back to land. So the test was successful, which is 538 00:33:37,320 --> 00:33:40,560 Speaker 1: a good step toward Artemis two. I got a couple 539 00:33:40,560 --> 00:33:44,320 Speaker 1: of article recommendations for you before I sign off. One 540 00:33:44,560 --> 00:33:48,560 Speaker 1: is in The Verge. The article is titled "Why Thread 541 00:33:48,800 --> 00:33:52,960 Speaker 1: is Matter's biggest problem right now," and Jennifer Pattison 542 00:33:53,000 --> 00:33:56,280 Speaker 1: Tuohy wrote the piece.
This deals with the technologies that 543 00:33:56,360 --> 00:34:00,840 Speaker 1: serve as the foundation for home automation tech and explains 544 00:34:00,840 --> 00:34:03,280 Speaker 1: how some high-level decisions are making things perhaps a 545 00:34:03,280 --> 00:34:07,120 Speaker 1: little more complicated instead of simplifying them, which is what 546 00:34:07,400 --> 00:34:09,960 Speaker 1: Matter was really supposed to do. I'll have to do 547 00:34:10,000 --> 00:34:12,960 Speaker 1: a full episode about this in the future, but meanwhile, 548 00:34:13,040 --> 00:34:15,560 Speaker 1: this is a great start if you're wondering why is 549 00:34:15,600 --> 00:34:19,160 Speaker 1: home automation so darn complicated? Why are there so many 550 00:34:19,560 --> 00:34:24,719 Speaker 1: different competing systems using proprietary approaches where you can't have 551 00:34:24,840 --> 00:34:28,919 Speaker 1: interoperability between everything? This is a good way to kind 552 00:34:28,920 --> 00:34:33,840 Speaker 1: of get a ground-level understanding of that. The second article 553 00:34:33,880 --> 00:34:37,560 Speaker 1: I want to recommend is by Gregory Barber of Wired, 554 00:34:38,040 --> 00:34:41,680 Speaker 1: and it's titled "The Cloud Is a Prison. Can the 555 00:34:41,800 --> 00:34:45,520 Speaker 1: Local-First Software Movement Set Us Free?" So this piece 556 00:34:45,560 --> 00:34:49,000 Speaker 1: talks about how developers and consumers and corporations are grappling 557 00:34:49,080 --> 00:34:53,239 Speaker 1: with issues related to cloud platforms, and a movement that 558 00:34:53,280 --> 00:34:57,400 Speaker 1: could potentially bring about an alternative to cloud computing. And 559 00:34:57,440 --> 00:35:00,640 Speaker 1: spoiler alert, it relies on technology that has actually 560 00:35:00,719 --> 00:35:04,360 Speaker 1: been around for quite some time. But really interesting because, 561 00:35:04,440 --> 00:35:09,000 Speaker 1: you know, seeing this sort of seesaw movement between 562 00:35:09,280 --> 00:35:15,680 Speaker 1: centralized computing, decentralized computing, local computing versus cloud computing, 563 00:35:15,760 --> 00:35:18,160 Speaker 1: you can kind of start to see patterns in 564 00:35:18,200 --> 00:35:21,480 Speaker 1: the way people are using computers and what they do 565 00:35:21,560 --> 00:35:25,840 Speaker 1: when they encounter challenges in one model versus another. So 566 00:35:26,200 --> 00:35:28,520 Speaker 1: I highly recommend both those articles. As always, I have 567 00:35:28,600 --> 00:35:31,880 Speaker 1: no connection to either of those publications or the authors 568 00:35:31,920 --> 00:35:32,840 Speaker 1: behind those pieces. 569 00:35:33,200 --> 00:35:34,160 Speaker 2: I do not know them. 570 00:35:34,400 --> 00:35:36,640 Speaker 1: I just thought they were interesting and that if you 571 00:35:36,719 --> 00:35:38,960 Speaker 1: are into tech and you really want to learn more, 572 00:35:39,000 --> 00:35:42,799 Speaker 1: those are two good articles to read. Okay, that's it. 573 00:35:42,840 --> 00:35:45,720 Speaker 1: This was a long episode for a news episode. Probably 574 00:35:45,719 --> 00:35:49,160 Speaker 1: means Thursdays will be short. Here's hoping. I hope all 575 00:35:49,160 --> 00:35:51,560 Speaker 1: of you are well and I'll talk to you again 576 00:35:52,360 --> 00:36:01,640 Speaker 1: really soon. Tech Stuff is an iHeartRadio production.
For 577 00:36:01,719 --> 00:36:06,600 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 578 00:36:06,680 --> 00:36:08,720 Speaker 1: or wherever you listen to your favorite shows.