Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and it is time for the tech news for Tuesday, April 6, 2021. Let's get at it.

In the ongoing story about Amazon and the company's efforts to fight against unionization, the US National Labor Relations Board ruled against Amazon in a recent claim, stating that the company illegally fired two employees who had advocated for workers' rights and called the company to account for its environmental impact. Maren Costa and Emily Cunningham had both criticized their employer publicly before the company fired both women, a move that the Labor Board identified as retaliatory. Amazon representative Jaci Anderson said that the reason the company handed the employees their walking papers wasn't to punish them for criticizing Amazon, but rather for, quote, "repeatedly violating internal policies," end quote.
However, since at least one of those policies seems to be about speaking publicly about the business, according to The New York Times, this seems like circular logic to me, almost like a catch-22. Sure, you can criticize Amazon, it seems to say, but if you speak publicly about the business, you've violated an internal policy. Criticizing isn't against the rules; it's the speaking about the business that is. But that internal policy sounds like it is partly designed to protect the company from employees criticizing it publicly, at least in any detail. Anyway, I admit, as always, that I am biased when it comes to this sort of stuff, as I am pro-union and in favor of protections that benefit the employee over those that cater to the corporation.

A couple of weeks ago we saw reports about LG getting out of the phone business. Now it's official: the company is shutting down its smartphone division worldwide. It will continue to sell off the inventory that it does have, so all those phones made for this year will have to go.
And it sounds like we are never going to see production models of phones like the Rollable. That was the CES reveal of a smartphone that had a flexible OLED screen, and the Rollable could change size. It could extend the width of the phone and go from a sort of smartphone form factor to phablet-sized, or tablet if you prefer. I definitely prefer. The screen could actually expand by unrolling from underneath the display you're looking at as the different sides would extend. It was a nifty design, but I'm not sure that it was terribly practical. Moving parts always represent a potential failure point, and for phones, which occasionally suffer bumps and falls, I think it would have been a bit risky. Plus, if you wanted to use the shape-changing feature, you presumably wouldn't be able to put your phone in a case, because the case wouldn't be able to change shape. But it was still a very neat innovation in the phone space.
And now we're not really going to see it. But if you are in the market for an LG phone, you should keep your eyes peeled, because we could see some pretty aggressive sales in the very near future as they try to get rid of that inventory.

The Mars Perseverance rover has dropped off the Ingenuity copter on the surface of the Red Planet. You might remember that Ingenuity piggybacked, or I guess really piggy-bellied, on the Perseverance, and it represents a high-risk, high-reward experiment. Mars is a tough neighborhood. The atmosphere is thin, which means the rotors on the Ingenuity will have to rotate at a rate much higher than would be necessary here on Earth in order to generate sufficient lift to get the copter off the surface. It will also need to continue to charge its batteries. The Perseverance had been providing juice to Ingenuity since the Perseverance first touched down, but now Ingenuity is gonna have to use its own solar panels to do the same. That electricity isn't just going to power the rotors' motor, which is a fun thing to say: rotors' motor. Go ahead, try it anyway.
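As an aside, you can ballpark how much faster those rotors need to spin. Rotor lift scales roughly with air density times tip speed squared, so matching a given amount of lift in thinner air means raising the tip speed by the square root of the density ratio. The density figures below are rough, round numbers for illustration, not mission specs:

```python
import math

# Rough lift scaling: L is proportional to rho * v^2 for a fixed blade
# area and lift coefficient, so producing the same lift in thinner air
# requires tip speed higher by sqrt(rho_earth / rho_mars).
EARTH_AIR_DENSITY = 1.225  # kg/m^3, Earth at sea level
MARS_AIR_DENSITY = 0.020   # kg/m^3, rough Martian surface value

speed_factor = math.sqrt(EARTH_AIR_DENSITY / MARS_AIR_DENSITY)
print(f"Rotor tip speed must be roughly {speed_factor:.1f}x higher on Mars")
```

That factor of roughly eight is in the right ballpark for why Ingenuity's blades spin at thousands of rpm, versus the few hundred rpm typical of helicopter rotors on Earth.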
The electricity will also go to powering Ingenuity's onboard heater, as the temperature on Mars plummets at night; it dips down to negative 90 Celsius, or negative 130 Fahrenheit, whichever is greater. I kid, they're the same thing. The heater ensures that Ingenuity's mechanical parts don't freeze tight, which, as I understand it, would somewhat, you know, impede flight. Soon, but no sooner than April 11, the Ingenuity will attempt its first flight. This is really a proof-of-concept experiment with no guarantee of success, but if that first flight does succeed, NASA hopes to do a few more, as many as four more, before Ingenuity's batteries are drained to the point that it will just need to settle down on Mars and make a life for itself however it can. Or rather, it'll sit on Mars, rest in a job well done, and collect dust for the rest of eternity. We'll follow up on this once we hear about the first attempts and how they panned out.
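By the way, that Celsius-to-Fahrenheit equivalence from a moment ago is quick to verify with the standard conversion formula:

```python
# Verify that -90 degrees Celsius and -130 degrees Fahrenheit really
# are the same temperature, using F = C * 9/5 + 32.
def celsius_to_fahrenheit(celsius: float) -> float:
    return celsius * 9 / 5 + 32

print(celsius_to_fahrenheit(-90))  # -130.0
```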
Sticking with space: a farmer in the United States, in fact in the state of Washington, got a surprise last week when a piece of a rocket ship landed on their farm. Last Thursday, the second stage from a SpaceX Falcon 9 entered into an uncontrolled descent into the Earth's atmosphere after an otherwise successful launch. The Falcon 9 is a two-stage launch vehicle, and the first stage, the lower stage, is the one that is supposed to return to Earth in a controlled landing procedure so that the stage can then be refurbished and reused, which cuts way down on launch costs. But the upper stage, the second stage, isn't so lucky. Typically it has one of two fates: it's either pushed into an orbit where it's going to stick around for a while before eventually falling to Earth, or, more frequently, it is pushed so it deorbits in a controlled manner and breaks up over the ocean. But this time the second stage lacked enough fuel to do the ocean thing, so instead it fell a bit short and broke up over the Pacific Northwest in the US.
It was quite the display, and people all over the northwestern part of North America reported seeing multiple shooting stars. These were pieces of the second stage glowing as they fell back to Earth, and one such piece made it intact all the way down to the ground on this farm in Washington. While we knew about the reentry from last week, this bit about the piece making it down to the surface of Earth is new. Apparently it's part of a pressurized tank, which is no longer pressurized, I guess I should add. It landed in Grant County, Washington, and that is as specific as we can get about it, because the sheriff's office wisely decided to leave out more details in order to spare the farmer from looky-loos.

Google has made a big change to its policy regarding Android apps, but it might surprise some folks, like me, to learn about the implications of it. So there's a permission called QUERY_ALL_PACKAGES, and this allows an app that's installed on an Android device to get a list of all the other apps that are installed on that device.
And that seems like it could be a little excessive, at least in some cases, right? I mean, for example, if you have an app that gives you a weather report, should that weather report app also see which games you have on your phone? Or whether apps related to stuff like banking, or your personal health, or a real estate app or whatever, are also on that phone? Because that seems like it's probably not necessary, right? Well, Google, after more than a decade, seems to have reached the same conclusion. Now, for devices running Android 11 or later (and all apps coming out from this point forward have to target Android 11 or later), apps can't just have blanket access to QUERY_ALL_PACKAGES. Developers will have to defend why their app would need that level of access, and in some cases you can make an argument for it.
For example, I have a password vault app on my phone, and if I want that app to work alongside other apps, so that when I open up some other app that requires a password, like say a banking app, I can use this little option to just fill it out automatically, well, then the password vault is going to need to know what other apps are running on my device so that that interoperability will work. But I probably don't need a food delivery service app and a podcatching app to know about each other. That's probably not necessary. So this change means that developers will no longer know quite as much about the apps that their users have on their phones, which isn't necessarily a bad thing for the users. It cuts down opportunities for developers to target people directly without their permission, or sell information about them, and so forth. I think it's a long overdue move on Google's part, and I am glad that it's finally happening. I'm just surprised it took this long.
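For the curious, here is roughly what the developer side of this looks like under Android 11's package-visibility rules: instead of holding the blanket permission, an app declares in its manifest the specific packages or intents it needs to see, and QUERY_ALL_PACKAGES itself now requires a justification before Google Play will allow it. A minimal manifest sketch; the package name is purely illustrative:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- Narrow visibility: list only what this app genuinely
         needs to see (a hypothetical package, for illustration). -->
    <queries>
        <package android:name="com.example.somebank" />
    </queries>

    <!-- The blanket permission still exists, but apps targeting
         Android 11 or later must justify its use to Google Play:
    <uses-permission
        android:name="android.permission.QUERY_ALL_PACKAGES" />
    -->
</manifest>
```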
Kara Swisher conducted an interview with Apple CEO Tim Cook for The New York Times recently, and in that interview, Cook talked a bit about how Apple views the prospect of augmented reality. He said that AR could enhance conversations and give people the chance to integrate other stuff into live conversation. For example, maybe you're arguing with that friend of yours who just refuses to admit that they are totally full of crap, and you could pull up charts and graphs to back up your point, showing that they are in fact full of crap. It sounds like a lot of fun to me; conversations will be way better in the future. I also don't anticipate AR applications being a threat to security or privacy at all. I'm sure they're perfectly safe. Now, I've been on the record as being pro-AR, but it's within specific contexts. I actually get a bit squeamish when we talk about AR as a component in person-to-person interactions.
Sure, it could be helpful if, while chatting with a friend, I get a little digital reminder in my view that says this friend's birthday is coming up, or maybe they've got a dietary sensitivity, and since we're talking about restaurants, it's just reminding me of that so I don't end up making, you know, suggestions that wouldn't apply or whatever. But you could see where that stuff could actually really go wrong very quickly. Granted, this is not the use case that Cook was making to Kara Swisher, and you can make a very valid argument that I am creating a sort of straw man argument. I just see it as a potential slippery slope. And let's face it, having a conversation with me is already boring enough without me bringing charts and a bibliography into it. I just think of all those online message boards that are full of people demanding that other folks cite their sources and stuff like that, and how that could become a part of conversations moving forward, and all of that just makes me want to go back to bed.
A blind woman will receive $1.1 million in damages after having Uber drivers ditch her and her service dog on fourteen different occasions, leaving her stranded and preventing her from getting to important events and destinations, including just getting to her office to do work. In 2016, Uber reached a $2.6 million settlement for a similar legal case, but some of the incidents that this particular woman, Lisa Irving, experienced happened after that settlement was reached, which showed that Uber hadn't actually addressed the underlying problem. It represents discrimination against Irving because of her blindness, which is against the law. Irving will receive around $324,000; the rest of that $1.1 million is going to cover legal fees. This illustrates why the system in America is kind of screwy, because big companies can afford to go through this sort of thing, but it takes a lot for a private citizen to take on a big company in a lawsuit. Uber has since created a support form that customers can fill out should they experience similar issues in the future.
It doesn't prevent it from happening, but it gives customers a chance to specifically address, uh, an instance of the problem. And the very first question asks if a rider was denied a ride because of a service animal. So not truly a solution, but at least a move to try and address the issue.

And finally, The Independent reports that researchers at Brown University have created a brain-computer interface with wireless connectivity. And what's more, they say that this interface provides, quote, "single neuron resolution and full broadband fidelity," end quote. Brain-computer interfaces are fascinating devices. Now, as the name indicates, this is a technology that allows a person to interact with a computer system through thought alone. It's like having telepathy that works with computers. And typically this procedure includes intracranial surgery: doctors have to implant electrodes into the brain of the recipient, which is an incredibly invasive procedure, obviously, and it has its own set of risks, including potential infection, which is a huge risk factor.
So this process is required in order to get very precise readings on brain activity, because our skulls make it a bit tricky to detect brain waves with accuracy unless we're actively in an fMRI machine or something like that. In addition, we usually see these technologies in the form of wired connections between the interface that's attached to a patient and the related computer system they communicate with, and that creates more limitations. So this new approach removes the need for those physical wires. There could be wireless communication between patient and computer, which the researchers say opens up new possibilities and use cases. Getting a system like this to work requires a lot of adjustments, because you're training a patient on how to use the technology, but you're simultaneously training the technology to learn how the patient thinks. The interface software has to learn how to interpret brain waves and map those brain waves to specific outcomes.
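To make that mapping idea concrete, here is a toy sketch, emphatically not the Brown team's actual system, with simulated numbers standing in for neural recordings. It shows the classic linear-decoder approach from BCI research: fit a set of weights that translates neural firing rates into an intended movement signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training data: firing rates for 32 neurons over 500 time
# steps, plus the intended 2-D cursor velocity at each step. A real
# system records these while the patient attempts movements.
n_samples, n_neurons = 500, 32
true_tuning = rng.normal(size=(n_neurons, 2))      # hidden neuron tuning
rates = rng.normal(size=(n_samples, n_neurons))    # stand-in firing rates
velocity = rates @ true_tuning + 0.1 * rng.normal(size=(n_samples, 2))

# "Training the technology to learn how the patient thinks": fit
# decoder weights by least squares so rates @ decoder approximates
# the intended velocity.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

mse = float(np.mean((rates @ decoder - velocity) ** 2))
print(f"mean squared decoding error: {mse:.4f}")
```

Real decoders are more elaborate (Kalman filters, neural networks, closed-loop recalibration), but the core loop is the same: paired brain activity and intended action in, a predictive mapping out.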
It's a fascinating area of research and development, and the researchers hope that by removing these tethers they will be able to create more scenarios in which they can have more examples, gather more data, and learn more about brain-computer interactions, and thus design better algorithms to create more seamless connectivity, ultimately giving paralyzed patients more agency and communication capabilities. By the way, there are also private companies and a lot of business folks who are researching this technology, though arguably for less noble reasons. I personally remain convinced that Elon Musk is mostly looking to find some sort of electronic means to preserve his consciousness indefinitely, for example. It's just a feeling I get. And again, I have a bias, and it really comes out in these news episodes, doesn't it? I make no apologies for it; it's who I am. But draw your own conclusions. I'm not saying that I am correct in this; it's just the feeling I get, and I fully admit I could be wrong. But that wraps up the news for April 6, 2021, and I hope you are all doing well.
If you have any suggestions for things I should cover on future episodes of TechStuff, reach out to me on Twitter. The handle we use is TechStuffHSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.