Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, August twenty ninth, twenty twenty three. We got a lot of heavy talk about AI to get through today.

Speaker 1: First up, US Senator Chuck Schumer will host an insight forum focused on AI on September thirteenth. His office has confirmed that several important folks in tech will be present at this event. One of those is Elon Musk, who co-founded OpenAI before he left that organization to go make his own AI development team with blackjack and, well, never mind. Another is Mark Zuckerberg, which makes me wonder if Schumer is prepared to keep Musk and Zuckerberg apart, since you never know when they'll revert back to being high school alpha male types and start scrapping in the hallway between classes. Eric Schmidt is also supposed to be there; he's a former CEO of Google. Sundar Pichai, who is the current CEO of Alphabet, which is Google's parent company, is going to be there, and of course Sam Altman, the CEO of OpenAI itself, will take part in this. The conversation is going to be a lot around the risks and benefits of AI and the development of a US policy on regulating artificial intelligence. Now, these folks are not going to be the only ones there, which is a good thing, because if the only folks at the table happen to be the ones who are eager to avoid as much regulation as possible, you probably wouldn't make a whole lot of progress. Schumer's office says that there will be representatives from civil rights groups and worker advocacy groups and creatives and that sort of thing as well.
Speaker 1: The proceedings themselves will actually be done behind closed doors, so there won't be any reporters allowed inside while this is going on, but Schumer says his office will release essentially a summary of what went on during the discussions. I suspect the tech executives will do their best to reduce any impact of proposed regulations, because otherwise that kind of affects their bottom line. Now, I do think a serious discussion about artificial intelligence does need to happen as soon as possible, and I don't just mean generative AI. That gets a lot of headlines, but it is not the one and only application of artificial intelligence by far.

Speaker 1: For example, according to Jared Keller, who was writing for Military dot com, the US Army will potentially soon conduct tests in which they will mount the Army's new SIG Sauer XM7 squad rifle to a four-legged robot provided by Ghost Robotics. The Army already did a similar test with a robot from Ghost Robotics, and they used an M4A1 carbine in those tests. Representatives for the Army have said these tests are to explore human-machine interaction in Army operations, but they do not necessarily indicate that there is a plan for these robots to be, quote unquote, deployed downrange. That is, the Army might test the stuff out, but that doesn't necessarily mean that in the not-too-distant future, four-legged robots armed with machine guns will be blasting their way through combat zones. So let's just chill out. Critics have repeatedly voiced concerns about arming semi-autonomous and remotely controlled devices, arguing that it can lead to conflict escalation and that the act of ending a human life should be entirely up to another human, which, y'all, I understand on one hand, but it's really messed up to think about, right? Like, no, this should be left to a person, who will then be left to try and deal with that trauma.
Speaker 1: But at the same time you think, well, sure, it should be up to a person and not just automated, because that is really super dark and grim. But then, I am admittedly a hippie dippy type who isn't so big on the concept of ending human lives in the first place. Anyway, it's not just human rights advocates who have voiced concerns. Several robotics companies, including, famously, Boston Dynamics, have protested the move to weaponize the technologies that they work on. Last year, those companies released an open letter that in part reads, quote, we believe that adding weapons to robots that are remotely or autonomously operated, widely available to the public, and capable of navigating to previously inaccessible locations where people live and work raises new risks of harm and serious ethical issues. Weaponized applications of these newly capable robots will also harm public trust in the technology in ways that damage the tremendous benefits they will bring to society, end quote. Obviously, this has not stopped the US Army, and other nations are similarly experimenting with weaponized robotic platforms. So I guess you could say that unless everyone around the world agrees to back off on doing this and then actually follows through on that promise, the only other option you have is to develop the stuff yourself, which again is pretty grim.

Speaker 1: But wait, it gets even scarier. So, not to be outdone, the US Air Force is seeking a research budget to build at least a thousand unmanned aircraft that can operate autonomously, and I'm talking weaponized aircraft. These vehicles would serve as wingmen to human pilots and would provide support and cover during combat operations. They could also be sent on suicide missions to achieve combat goals in scenarios where the possibility of survival is approaching zero. As such, the aircraft would need to be autonomous and armed. One candidate for the vehicle that the Air Force could potentially use in this program comes from a company called Kratos Defense.
Speaker 1: It actually makes me wonder if the company chose that name after the character from the God of War franchise. That game came out in two thousand and five, and the company that is now known as Kratos Defense actually chose that name in two thousand and seven, two years after the game came out. If that's the case, yikes. Anyway, the platform itself is called Valkyrie, which is really another yikes. Valkyries were Odin's war maidens who would escort fallen warriors to Valhalla. The Air Force has been using the Valkyrie aircraft as a support platform for connectivity purposes, essentially acting like a network bridge between other aircraft and other autonomous vehicles that are under Air Force control. But current plans involve using a Valkyrie in a simulation to identify, chase down, and then take down a target over the Gulf of Mexico in a test of its capabilities, which is a triple yikes, really. As you can imagine, critics have protested this initiative as well, with the same sort of arguments that they make about the four-legged robots being armed by the Army. But the Air Force will be requesting a nearly six billion dollar budget to pursue this plan over the course of the next five years.

Speaker 1: Today is the first day of Google's Cloud Next conference, an event where AI will be one of many topics under discussion. Google will launch an interesting tool at this conference called SynthID. This tool applies a watermark to AI-generated images. The watermark is meant to go unnoticed by human eyes, so when you look at the picture, you don't see that there's a watermark there, but it's meant to be easily detectable with a detection tool, which is pretty clever. The watermark won't affect how we perceive the image, but it will reveal the image to be the product of AI generation. And Google says the watermark's design is such that you could edit the image, you could crop it, you could deform it, stretch it in various ways, and the watermark should be unaffected.
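To make that idea a little more concrete, here is a minimal toy sketch in Python of an invisible but machine-detectable watermark. This is purely illustrative and is not Google's method; Google has not published how SynthID works, and the keyed noise pattern, strength, and detection threshold below are made-up stand-ins. Notably, this naive version would not survive cropping the way SynthID reportedly does.

# Toy sketch only, NOT SynthID: embed a faint keyed pseudo-random pattern across
# the whole image, then detect it later by correlating against the same pattern.
import numpy as np

def embed_watermark(image, key, strength=2.0):
    """Add keyed +/- noise that is far too faint for a person to notice."""
    rng = np.random.default_rng(key)
    pattern = rng.choice([-1.0, 1.0], size=image.shape)
    return np.clip(image.astype(float) + strength * pattern, 0, 255).astype(np.uint8)

def detect_watermark(image, key, threshold=0.02):
    """Correlate against the keyed pattern; unmarked images score near zero."""
    rng = np.random.default_rng(key)
    pattern = rng.choice([-1.0, 1.0], size=image.shape)
    centered = image.astype(float) - image.mean()
    score = float(np.mean(centered * pattern)) / (image.std() + 1e-9)
    return score > threshold

# Usage sketch with a stand-in grayscale image:
photo = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
marked = embed_watermark(photo, key=42)
print(detect_watermark(marked, key=42))   # True: keyed pattern is present
print(detect_watermark(photo, key=42))    # False: no watermark found
# Unlike this toy, a production scheme ties the signal to the image content so
# it survives cropping, resizing, and other edits.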
Speaker 1: Google engineers haven't gone into a lot of detail about how this works, because they don't want to tip their hand too much, lest folks immediately find ways to game the system and get around the tool. But they have also said that this is really just the beginning for SynthID. This tool is going to face real-world tests. People will find ways around it; that's just a fact, and the engineers at Google are saying as much, and that's going to prompt changes and improvements on Google's side. That's just how things work. It's essentially the exact same pattern we saw with CAPTCHAs. There are lots of reasons that you would want to employ a tool like this, ranging from preventing the spread of misinformation with deepfakes to avoiding the problem of just mixing up images of actual real-world things with AI-generated images of stuff that may or may not exist. So there are a lot of different practical applications for this technology. I'm sure we'll hear a lot more about it as the Cloud Next conference continues.

Speaker 1: General Motors will be talking more about how it is using conversational AI as the Cloud Next conference continues, specifically with the OnStar service. OnStar is a connected feature built into some vehicles that lets the driver get support for all sorts of things, ranging from the quasi-trivial to the very serious. GM is using conversational AI to handle the more mundane, low-urgency requests. You know, like if you want to use OnStar to help guide you in navigating to your final destination in your car, that doesn't have to be a human being actually managing that. That could be an AI agent helping you with that task.
Speaker 1: For stuff that's more important, like reporting a crash or asking that, you know, someone like an EMT be sent to your location, those calls get routed to human operators, which is totally understandable. And by offloading the low-urgency stuff to AI, GM says it has decreased the wait time to get in touch with human operators, and obviously that's a good thing if you really need to speak to someone in the event of an emergency. According to GM, the response to the AI assistance has been mostly positive among drivers, and this is the kind of implementation I can really get behind: using AI to offload less important tasks so that people with specialized knowledge and training can handle the more important ones, particularly ones that benefit from a human touch. Okay, we've got a lot more news stories to go, but let's take a quick break.

Speaker 1: Okay, we're back, and next up we've got another story with artificial intelligence, along with some arguably dumb real-world behavior. At least in my opinion, it's pretty dumb. So Elon Musk, owner of X, formerly known as Twitter, and the CEO of Tesla, live-streamed a demonstration of Tesla's upcoming Full Self-Driving version twelve software with him sitting in the driver's seat of a Tesla. This version of Full Self-Driving has yet to be released to Tesla owners, and during the demo, Musk broke a California law which says you're not supposed to have a phone in your hand while you're operating a vehicle. Elon Musk definitely did do that. Follow-up on that: police are not going to pursue Musk for this, because no police officer directly witnessed it happening. That's a prerequisite for charging someone; if it's just a video or whatever and cops did not see it at the time, they will not go after Musk. Plus, even if they did, the penalty for your first offense can be as low as a twenty dollar fine, so it's not like it would have meant much anyway.
Speaker 1: Also, Musk was technically violating Tesla's own policies, because the company says that Full Self-Driving is a hands-on feature and that drivers are supposed to keep their hands on the steering wheel at all times, and Elon Musk definitively did not do that. So he was defying his own company's policies. But anyway, let's put all that aside for now. During this demonstration, at one point Musk actually had to take control of the Tesla to prevent it from running a red light. That's not a great moment when you're demonstrating the supposed full self-driving capability of your vehicle. Musk also used Google to look up Mark Zuckerberg's address and then showed it on camera, but he said that doesn't amount to doxing anyone, because you could just google it the way he did. To be fair to Tesla, there were several segments of the drive in which the vehicle navigated through construction zones and roundabouts and didn't have any performance issues. Musk also pointed out that Teslas now rely solely on optical cameras rather than sensors like lidar. Another interesting note is that Musk did this demonstration while his company is preparing to defend itself in the first of a couple of upcoming court cases arguing that the company's driver-assist features led to fatal accidents. The first court case should begin in California in mid September. It's a civil lawsuit that stems from a twenty nineteen accident in which a Tesla owner named Micah Lee died when his Tesla, which was in Autopilot mode, veered off a highway. It collided with a tree and then burst into flames. Two passengers in Lee's car suffered various injuries but survived the crash. The second court case is scheduled for October in Florida and centers on a different crash that happened in twenty nineteen. That's when Stephen Banner's Model three failed to detect a big rig truck that was crossing the road ahead of him, and his Tesla collided with the trailer, which killed Stephen Banner.
Speaker 1: I can't imagine the lawyers at Tesla are super thrilled about Elon Musk showing off Full Self-Driving on a live stream, especially in a demonstration that required him to take over to avoid running a red light, all while they are simultaneously preparing for these court cases.

Speaker 1: Microsoft will soon release version one seventeen of the Edge web browser and will actually be removing some features in the process. Microsoft said the decision to remove the tools was to, quote, improve end user experience and simplify the more tools menu, end quote, as reported by The Verge. Truth be told, I hadn't even heard of these features at all, so it's quite possible that very few people are making use of them. Then again, according to at least some statistical analysis firms, Microsoft Edge commands just five percent of the web browser market in total, so you could argue very few people are making use of Edge, full stop. Anyway, the features affected are Picture Dictionary, Citations, Math Solver, Kids Mode, and grammar tools. If you are one of the few elite that this actually will affect, you have my condolences. Microsoft will push this update out of the beta phase in mid September.

Speaker 1: Karl Bode of Techdirt wrote a piece explaining how e-bike companies, through a trade organization called People for Bikes, have lobbied lawmakers in the United States to make exceptions for e-bikes in various right to repair laws. Essentially, these companies are trying to make sure that they can maintain control of the entire ecosystem for their products, rather than open up so that customers can either perform their own maintenance and repairs or seek those from an independent repair shop. The argument that the group has been making is one we have heard before: that this is really for the customer's safety.
Speaker 1: The group argues that allowing people to do their own maintenance and repair could lead to an increased risk of stuff like fires, and while e-bikes have been one of those electronic products that have had problems with batteries catching on fire, that has had more to do with poor manufacturing processes than anything else. In fact, when pressed to cite figures about how many fires were the result of an e-bike owner trying to do their own repairs, a rep for the group said that the stories were quote unquote anecdotal, which is another way of saying, I don't have any evidence that this is actually a thing. And apparently these lobbying efforts have been pretty effective, with e-bikes getting exceptions in several right to repair laws around the United States, though not, as Techdirt reports, in Minnesota. However, the Minnesota law did make exceptions for game consoles, medical equipment, and cars.

Speaker 1: Back in two thousand and eight, California voters approved initial funding for high speed rail within the state. This is the same system that would later prompt Elon Musk to say the whole thing was a huge waste of money and that a hyperloop system would be faster and more effective. Of course, the hyperloop failed to materialize, and despite several companies trying to make it a thing, it has never manifested, at least not in the way that Musk initially promoted it. Anyway, in the meantime, over those years, the project for high speed rail has moved forward, though very slowly, with the state engaged in construction across hundreds of miles in California while still working to receive environmental approval for some key stretches. And now that project is officially putting out an RFQ, or Request for Qualifications, to look for vendors who would provide the actual trains that will travel on those rails once they are finished. Interested companies will need to respond to the RFQ by November.
Speaker 1: The California High Speed Rail Authority will consider the candidates and then narrow the search in early twenty twenty four. To qualify, the companies will have to be able to build trains that can operate at speeds of two hundred and twenty miles per hour, with tests as high as two hundred and forty two miles per hour, at least according to Los Angeles news outlet KTLA 5. The company selected will have to build all the trains for the system and provide access to spare parts for thirty years, which is important. Here in Atlanta, we have a train system where the company that made the trains doesn't exist anymore, so getting replacement parts requires a lot more work. The Authority's goal is to have the high speed rail service in action by twenty thirty. Personally, I have my doubts that it will be ready by then, simply because these projects are so huge and complicated, and made even more complex through local and state politics, which change with every election. But here's hoping California is able to see this project come to completion and perhaps serve as a model that other states could follow. The lack of high speed rail lines across the United States is pretty embarrassing.

Speaker 1: Our final story is about how some hackers say they have infiltrated a company called WebDetetive, which makes spyware, and they have subsequently deleted all the device information that the company had, which will make it impossible for WebDetetive to collect additional data from those compromised devices, according to Engadget. That means around seventy six thousand devices will no longer be spying on their owners. Not all heroes wear capes; some of them wear hoodies.

Speaker 1: All right, that's it for the tech news for Tuesday, August twenty ninth, twenty twenty three. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.