Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, September twelfth, twenty twenty three. So here in the United States, the White House has proposed commitments on how to go about developing and then deploying various artificial intelligence products. These are voluntary commitments, so there's no law or anything here. There's nothing holding anyone to these things. It's more like a pledge that AI companies will promise to do their very best not to release AI products that actively make life worse, that kind of thing. Anyway, several companies have already signed this pledge, and some joined the group this week. The new ones that joined include IBM and Adobe and Nvidia. This follows other really big companies like Google and Microsoft, along with OpenAI, who signed on to these commitments back in July. Another one that signed this week, perhaps a bit surprisingly, was Palantir. And what is it with scary tech companies taking their names from J. R. R. Tolkien's works?
Speaker 1: Because Palantir comes from The Lord of the Rings, and that's a software company that has associations with stuff like predictive policing. Think like Minority Report, but without the psychics. They also do deep analytics to look at things like crime statistics and potential terrorism and things like that. Then you've got another Tolkien-based defense company. This is not one that signed the AI agreement, but there's Anduril, which is Palmer Luckey's defense company. That one's working on military drones. Anyway, considering Palantir's history, I'm a little bit skeptical regarding how sincere their commitment to the responsible development of AI might be. Maybe that's not fair, but again, I don't think predictive policing is fair either, so we're all in the wrong here. But let's get back to these commitments. Generally speaking, they are some fairly broad promises to essentially not create malicious or dangerous AI, at least not on purpose.
Speaker 1: It's largely being looked at as sort of a placeholder while the US Congress debates any actual laws that would govern the development and deployment of AI, because, you know, the president can't just make laws. That's the job of the legislative branch. The president will then, you know, sign a bill into law, but the president cannot make laws on their own. So that's why this is, you know, sort of these wishy-washy voluntary commitments. Still, I guess it's better than nothing. Time will tell, I suppose.

Speaker 1: A big tech court case begins today in the United States. So in this corner, we've got the US Department of Justice, and their opponent, from parts unknown (well, not unknown, it was a garage in California): it's Google. The lawsuit in this case is a really big antitrust lawsuit, so this has to do with monopolies, you know, not the kind where if you roll doubles you get to go again. The last time the US government went after a tech company like this was way back in nineteen ninety eight.
Speaker 1: That's when US courts initially found Microsoft guilty of violating the Sherman Antitrust Act, and ultimately the judgment was that the company would need to break up into two different companies, one that focused on the operating system and one that focused on software. But an appeals court later took the teeth out of that. Microsoft was allowed to remain a cohesive whole and continue to grow and consolidate and such. Now Google is in the government's crosshairs. The argument is that Google has used its dominant position to maintain a stranglehold on the web search business. Now, keep in mind, Google's real business is mostly about displaying ads. It's just that web search results are the vehicle through which Google serves ads, so the search part is sort of beside the point. The whole business for Google is the ad business. But most analyses suggest that Google commands more than ninety percent of the market share for web searches, so it definitely seems like we're talking monopoly territory here.
Speaker 1: I mean, if more than nine out of ten web searches are using your product, I don't know how you can argue you don't hold a monopoly. Google's lawyers argue that the company actually has plenty of competition. I mean, you know, first of all, you've got Bing. Everybody knows Bing, right? But beyond that, they argue that there are other services that serve a similar purpose to web searches; it's just that they're not obviously web searches. So maybe it's a service like Yelp, where you're using Yelp to figure out where you're going to go eat out at night, or maybe it's a service like Amazon, where you're searching for a specific product to purchase. And so Google's argument is, like, that's our competition, right? So therefore we're not a monopoly; there are all these other services that are doing similar things. I don't think those arguments are terribly convincing personally, but then I'm not a judge, so maybe I would be persuaded if I were. Anyway, this case is really just to determine if Google has done anything that could be considered illegal, like if they have broken the Sherman Antitrust Act or something like that.
Speaker 1: If the trial determines that Google has done that, then there will have to be another trial to figure out what to do about Google. Also, I've said that name so many times that my smart speaker is now listening to me, because I heard it chime a second ago. So if you heard that, it's just a hazard of the business. Now, Google is not the only big tech company facing some legal scrutiny recently. Ashley Gjovik, whose name I know I'm butchering and I will continue to do so, was once an employee of Apple, and she has brought several civil cases against Apple over the last couple of years. So, in twenty twenty one, Apple fired her, said that she had violated the company's intellectual property policies, and accused her of leaking proprietary information to the public. However, before she was fired, she had come forward as a whistleblower, and she had complaints about Apple's effects on the environment, saying that their environmental policy was not reflective of what the company was actually doing.
Speaker 1: She also argued that the company had severe problems with harassment and discrimination within its corporate culture. And so she says that she was fired not because of supposedly leaking proprietary information, but rather that this was a retaliatory strike against her to silence her after she came forward with these whistleblower complaints. Now, retaliatory action is a pretty big no-no for companies. She then brought some civil cases against Apple, and they have slowly made their way through the system. She actually won one of those cases and got a preliminary win in another one. However, the court systems have been very slow, and there is a statute of limitations on these civil cases, so that means her cases are actually going to expire; they will no longer be viable. And it's not her fault, because she can't just make the investigations pick up the pace. So she has now filed a civil lawsuit against Apple to keep her efforts alive as she awaits the results of various investigations looking into the claims of retaliation.
Speaker 1: The lawsuit she has filed is a RICO lawsuit. RICO stands for the Racketeer Influenced and Corrupt Organizations Act, and there are federal RICO laws and state RICO laws here in the United States. RICO laws add in some pretty hefty penalties, but they only cover charges that are part of the operations of a criminal organization, so it's for very specific circumstances. You can't just use racketeering charges for anything; it has to be, in this case, within the realm of operations of a criminal organization. And usually we see RICO applied to stuff like gangs and organized crime. Famously, here in Georgia, we are seeing RICO charges applied to the alleged conspiracy to overturn the results of the twenty twenty election here in my home state. So her lawsuit against Apple is essentially saying Apple is effectively operating like a criminal organization. She posted a lengthy blog entry on this, and I will quote the relevant RICO passage here. It's a long quote, but it's from a very long piece. So here we go. Quote.
Speaker 1: If an employer terminates an employee in a way that constitutes an indictable criminal act that is on the enumerated list of predicate acts for RICO, it can then establish a RICO case with additional related predicate acts. Apple fired me in an egregious violation of eighteen USC Sections fifteen twelve and fifteen thirteen, with me even complaining of witness intimidation as they did the deed. With that predicate act, I was able to weave in their additional violations of fifteen twelve and fifteen thirteen, their wire fraud and mail fraud, my complaints about securities fraud, and my complaints about state criminal bribery and extortion. This also opened the door to include predicate acts of Tom Moyer's criminal bribery charge, Nancy Heinen's securities fraud charges, and Gene Levoff's securities fraud convictions. I also found a surprising doorway into integrating my toxic tort claims and Apple's many environmental law violations, which I will write about in detail later. End quote. And again, like I said, that's a very long quote, but it's part of a much, much longer piece where she explains her process.
Speaker 1: No word yet on how Apple will respond to this lawsuit, but yeah, this is a pretty hefty lawsuit to be brought against Apple. Whether it goes anywhere, who's to say? But we will see. Okay, I've got a lot more news to cover before we get to that. Let's take a quick break.

Speaker 1: This week, TikTok has rolled out e-commerce features on its platform in the United States. So users in the US will now see a Shop tab, and if so inclined, they can use that to view and purchase items. It gives brands more incentives to work with TikTok, to seek out influencers in the space, and to form strategic partnerships. And for folks who are using TikTok, it means they can shop, shop, shop till they TikTok, or something. This is nothing new in other parts of the world. TikTok has had similar e-commerce features live in places like the UK, Singapore, Malaysia, the Philippines, Thailand, and Vietnam for a while now.
Speaker 1: So it's interesting to me, because business is clearly going on for TikTok here in the US, even as we still occasionally see various US government officials make efforts to shut TikTok down or force it to be divested from its parent company, ByteDance. So yeah, interesting stuff if you're a TikTok user.

Speaker 1: Now, y'all, I'm headed off to Vegas in a little bit more than a week for a work trip. Actually, it's technically two work trips, but those work trips are back to back, so I just made it one big one. So for one week, I'm going to be recording in a hotel room in Las Vegas. It's really living the lush life. Anyway, I'm going to be staying at a place that is owned by MGM Resorts, and that has me a little bit concerned, because this week the company experienced some sort of undefined, quote unquote, cybersecurity issue. The issue has already been a pretty serious one.
Speaker 1: The company had to replace its homepage with a message explaining that its websites were now offline, and instead gave people various phone numbers if they needed to talk with someone at one of the MGM resorts to do things like, you know, make a reservation or whatever. And while I'm familiar with the Vegas properties under MGM, it turns out this cybersecurity issue was organization-wide. It's not just the MGM properties in Vegas; the ones in Atlantic City and Detroit have felt at least some effects from this too. So, according to a couple of sites I was reading, some digital room keys stopped working, and the resorts' rewards program went offline. That's a pretty big deal for lots of gamblers who, you know, use those rewards programs to eke out every single potential benefit they can get from gambling and shopping at a particular resort. And at least some MGM-operated casinos were having issues with their slot machines. These days, you know, most slot machines use digital video rather than physical spools. There are some that still use spools, but they're all commanded by a microchip.
Speaker 1: But some pictures shared on X, formerly known as Twitter, showed slot machines that were offline; they had error messages on their screens. Beyond that, apparently some ATMs were also offline, and according to at least one X user, the cashier window in an MGM resort had shut down as well, which is a big ol' ouch. The rumor mill has circulated the idea that the issue might stem from a ransomware attack, but no such confirmation has been made by MGM Resorts itself.

Speaker 1: Now let's pop on over to X slash Twitter for a second. According to an analysis firm called NewsWhip, it looks like X has been purposefully limiting the reach of posts containing links to articles on the New York Times. NewsWhip saw that, starting in July, there was this precipitous drop in how frequently New York Times articles were being shared across X, while the engagement that the New York Times had on other social media platforms remained stable. So it was just on X where this dip was happening.
Speaker 1: So unless the population of X slash Twitter just started to behave out of line with people everywhere else, it is a somewhat understandable conclusion to assume that X has done something behind the scenes to limit the spread of New York Times material across the platform. If we assume that X is intentionally limiting the reach of New York Times articles, it does once again seem to go against Elon Musk's claimed commitment to free speech. Now, I know I say this a lot, but I'm going to say it again anyway. To me, it seems that Musk's idea of free speech is only applicable to speech that he agrees with, and everything else is fair game. Anyway, as far as I can tell, the Times has yet to receive an explanation as to why there's been this big change in engagement and traffic on X, and the paper said it quote would be concerned by targeted pressure applied to any news organization for unclear reasons, end quote.

Speaker 1: Over at a different Musk company, that being Tesla, it appears that there was a bit of a mutiny going on in the last few years, kind of a low-key mutiny in the grand scheme of things.
Speaker 1: Autoblog reported on this, citing biographer Walter Isaacson, who is releasing a biography of Musk this week. Apparently, Franz von Holzhausen, who serves as design chief at Tesla, told Isaacson that some engineers within Tesla really hated the design of the Cybertruck. I'll remind you the Cybertruck was announced in twenty nineteen, to kind of hilarious results, and yet has not reached a single customer, although theoretically they could be coming out any day now. We actually don't really have that much real information about the specs and capability of this vehicle, which is odd considering how close it supposedly is to being delivered. You know, last we heard, it was going to be shipped by the end of the month. Anyway, back to the story. So apparently these designers took issue with the proposed Cybertruck back in the design phase and decided that they would not be a part of it. Instead, they secretly began designing their own electric pickup truck. Now, whatever happened to those designs, or even whatever happened to the people who made them, I can't say.
Speaker 1: The Cybertruck continues to be a divisive topic in general. Some folks kind of dig the weird, angular body; others absolutely hate it. I'm kind of indifferent to it. I'm more interested in a company's ability to hold itself to deadlines and schedules, which I imagine is a lot easier when the person in charge isn't as mercurial as Musk appears to be.

Speaker 1: The state of California has passed a bill that effectively will ban autonomous heavy-duty vehicles from operating on public roads if those vehicles don't have a human being ready to take over in an instant. So essentially, this bill, if passed into law, will ban autonomous trucking, at least any autonomous trucking that doesn't also have a human driver in addition to the autonomous systems. It passed the state Senate thirty-six to two. Now it goes to California Governor Gavin Newsom's desk. If he signs the bill, then it will become law. However, Newsom has a history of siding with tech companies. He's very tech-company friendly, so it's possible he'll choose to veto the measure. In fact, all indications seem to suggest that's what he will do.
Speaker 1: The bill is kind of at the center of an ongoing, heated debate within California. So you've got people who represent the interests of truckers, you know, like the Teamsters union, who argue that the companies in the autonomous vehicle space that have been pushing for autonomous trucking are doing so without having a proven safety record, and that it's largely an attempt to appeal to shareholders and investors. The technology has taken longer to evolve than these companies anticipated, and in order to give their investors something to hold on to, they're saying, all right, we're going to jump into full deployment now. And so that's why the Teamsters say that they are behind the bill that would ban this. They also, of course, point out that it could have a huge negative impact on the trucking industry as far as the truckers themselves are concerned; it could take away their livelihoods.
Speaker 1: So the bill calls for more extensive studies on vehicle performance and safety, and the California Department of Motor Vehicles would have to make a determination as to how safe or unsafe the tech was by January first, twenty twenty nine, or after five years of testing, whichever comes first. The DMV is not thrilled with that measure; they have argued that it would have a chilling effect on innovation. If Newsom does veto the bill as expected, it would require a two-thirds majority in both the California Senate and the State Assembly to override the veto and pass it into law. That hasn't happened since nineteen seventy-nine. But based on what I'm seeing, the California Assembly initially voted in favor sixty-nine to four when they first passed this bill, and the Senate approved it thirty-six to two. So really, the support is there if they wanted to try and override a veto; I just don't know if they would want to do that. Okay, that's it for the news for Tuesday, September twelfth, twenty twenty three. I hope you're all well, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production.
Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.