1 00:00:04,440 --> 00:00:12,520 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,520 --> 00:00:16,360 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:16,400 --> 00:00:20,080 Speaker 1: I'm an executive producer with iHeartRadio. And how the tech 4 00:00:20,120 --> 00:00:23,760 Speaker 1: are you? It's time for the tech news for Thursday, 5 00:00:23,840 --> 00:00:28,360 Speaker 1: June twenty second, twenty twenty three. And let's start off 6 00:00:28,840 --> 00:00:34,480 Speaker 1: with a dumb, juvenile story involving billionaires fronting on each 7 00:00:34,479 --> 00:00:37,720 Speaker 1: other in an effort to determine which one is the alpha. 8 00:00:38,400 --> 00:00:41,760 Speaker 1: By the way, just saying that made me feel gross. 9 00:00:42,120 --> 00:00:46,239 Speaker 1: But let's get to the story. Elon Musk, always the 10 00:00:46,240 --> 00:00:50,400 Speaker 1: picture of rationality and maturity, tweeted out that he'd be up 11 00:00:50,440 --> 00:00:54,720 Speaker 1: for a cage fight with Meta founder and CEO Mark Zuckerberg, 12 00:00:55,120 --> 00:00:58,640 Speaker 1: and then Zuckerberg replied on Instagram, because of course these 13 00:00:58,680 --> 00:01:04,400 Speaker 1: billionaires are going to go to their own platforms, quote, send me location, 14 00:01:04,680 --> 00:01:10,399 Speaker 1: end quote, indicating that he too was interested in fisticuffs. Musk then 15 00:01:10,520 --> 00:01:19,200 Speaker 1: posted again on Twitter, quote, Vegas Octagon, end quote. So why is there beef? Well, 16 00:01:19,240 --> 00:01:22,800 Speaker 1: it's probably because Meta is in the process of launching 17 00:01:22,840 --> 00:01:27,840 Speaker 1: a Twitter alternative, currently rumored to be called Threads.
The 18 00:01:27,959 --> 00:01:31,760 Speaker 1: Verge quoted Meta's chief product officer Chris Cox as saying, 19 00:01:31,880 --> 00:01:35,760 Speaker 1: quote, We've been hearing from creators and public figures who 20 00:01:35,800 --> 00:01:39,759 Speaker 1: are interested in having a platform that is sanely run, 21 00:01:40,319 --> 00:01:44,479 Speaker 1: that they believe that they can trust and rely upon 22 00:01:44,840 --> 00:01:49,400 Speaker 1: for distribution, end quote. And in a podcast with Lex Fridman, 23 00:01:49,960 --> 00:01:54,800 Speaker 1: Zuckerberg threw some more shade in Elon's direction, saying, quote, 24 00:01:55,120 --> 00:01:58,000 Speaker 1: I've always thought that Twitter should have a billion people 25 00:01:58,120 --> 00:02:02,120 Speaker 1: using it, end quote. And seeing the popularity of Meta's 26 00:02:02,160 --> 00:02:06,800 Speaker 1: Facebook platform, you can imagine why. Whether this will actually escalate 27 00:02:06,880 --> 00:02:10,760 Speaker 1: into some real fight remains to be seen, though. I 28 00:02:10,800 --> 00:02:13,799 Speaker 1: think we're not likely to get that. I think it's 29 00:02:13,919 --> 00:02:16,799 Speaker 1: just going to be posturing. But if we do get 30 00:02:16,800 --> 00:02:19,720 Speaker 1: that fight, well, Zuckerberg has actually been training in martial 31 00:02:19,800 --> 00:02:22,440 Speaker 1: arts for a while and participating in tournaments and even 32 00:02:22,480 --> 00:02:27,400 Speaker 1: winning a few, and Elon is Elon. I'm not saying 33 00:02:27,440 --> 00:02:31,040 Speaker 1: it would be a good fight, but I'd be dishonest 34 00:02:31,080 --> 00:02:33,680 Speaker 1: if I said I found the concept of billionaires pitted 35 00:02:33,720 --> 00:02:37,400 Speaker 1: against each other for our entertainment to be, you know, bad. 36 00:02:38,160 --> 00:02:41,160 Speaker 1: I'm all for it. Beat the rich, as it were.
37 00:02:42,600 --> 00:02:45,840 Speaker 1: I've got more to say about both Twitter and Meta today, 38 00:02:45,919 --> 00:02:50,840 Speaker 1: so let's stick with Twitter first. Australia's Online Safety Commissioner 39 00:02:51,040 --> 00:02:56,959 Speaker 1: Julie Inman Grant says that one third of all complaints 40 00:02:57,120 --> 00:03:01,560 Speaker 1: about online hate that her office receives relate to Twitter. Now, 41 00:03:02,440 --> 00:03:05,079 Speaker 1: keep in mind, Twitter is not as big a platform 42 00:03:05,440 --> 00:03:09,240 Speaker 1: as some others out there, like Facebook and TikTok. It's smaller, 43 00:03:09,320 --> 00:03:12,560 Speaker 1: so to be the source of one third of all 44 00:03:12,560 --> 00:03:17,200 Speaker 1: complaints of online harassment, that is significant. And then Inman Grant 45 00:03:17,240 --> 00:03:20,000 Speaker 1: goes on to say that not only is Twitter failing 46 00:03:20,040 --> 00:03:23,120 Speaker 1: to rein in hate speech and abuse, but the service's 47 00:03:23,200 --> 00:03:27,960 Speaker 1: decision to reverse several account bans means that there's now 48 00:03:28,000 --> 00:03:30,800 Speaker 1: a rise of hate groups returning to Twitter, and that 49 00:03:31,000 --> 00:03:35,200 Speaker 1: gives those groups more momentum, which is not great for society. 50 00:03:35,640 --> 00:03:38,560 Speaker 1: Twitter has had a hard time holding on to trust 51 00:03:38,600 --> 00:03:43,920 Speaker 1: and safety leadership. Two trust and safety leaders have left 52 00:03:44,080 --> 00:03:47,600 Speaker 1: the company in the last year, and at least from 53 00:03:47,640 --> 00:03:52,240 Speaker 1: an outsider's perspective, the company doesn't seem terribly concerned about 54 00:03:52,280 --> 00:03:57,720 Speaker 1: trust and safety. They pretty much wiped out those offices, 55 00:03:58,720 --> 00:04:02,520 Speaker 1: leaving leaders with like no one to lead.
The company 56 00:04:02,640 --> 00:04:06,880 Speaker 1: has twenty eight days to respond to Inman Grant's request 57 00:04:07,040 --> 00:04:12,840 Speaker 1: here to answer for this problem of online harm on 58 00:04:12,920 --> 00:04:15,560 Speaker 1: their platform, or else they could face a fine of 59 00:04:15,640 --> 00:04:19,679 Speaker 1: up to seven hundred thousand dollars Australian. Now that's about 60 00:04:19,680 --> 00:04:23,000 Speaker 1: four hundred and seventy five thousand, three hundred bucks American. 61 00:04:23,480 --> 00:04:27,240 Speaker 1: And y'all, let's be serious. In the corporate world, that's 62 00:04:27,279 --> 00:04:29,520 Speaker 1: not much. I mean, that's a huge amount of money 63 00:04:29,560 --> 00:04:32,640 Speaker 1: to someone like you or me, probably definitely me, I'm 64 00:04:32,680 --> 00:04:35,719 Speaker 1: assuming you maybe not, in which case, hey, good job. 65 00:04:36,240 --> 00:04:39,600 Speaker 1: But it's not a whole lot for big companies like Twitter. 66 00:04:39,640 --> 00:04:41,960 Speaker 1: And moreover, as we're going to learn in just a moment, 67 00:04:42,920 --> 00:04:46,240 Speaker 1: Twitter hasn't exactly been current on its bills in general, 68 00:04:46,320 --> 00:04:49,040 Speaker 1: so the odds of collecting that money aren't certain even 69 00:04:49,120 --> 00:04:52,479 Speaker 1: if the regulators do decide to find Twitter. This is 70 00:04:52,520 --> 00:04:57,000 Speaker 1: part of the big problem with regulatory agencies around the 71 00:04:57,040 --> 00:04:59,960 Speaker 1: world is that they have a limit to their authority. 72 00:05:00,440 --> 00:05:04,440 Speaker 1: Which makes sense. You don't want them to have limitless authority. 73 00:05:04,440 --> 00:05:08,200 Speaker 1: That would be chaos. 
But the problem is that the 74 00:05:08,480 --> 00:05:12,680 Speaker 1: amount they are able to fine companies or to hold 75 00:05:12,680 --> 00:05:17,920 Speaker 1: them accountable is tiny in comparison with those companies' capacity 76 00:05:18,320 --> 00:05:21,440 Speaker 1: to just pay it to go away, or to ignore it. 77 00:05:22,279 --> 00:05:26,280 Speaker 1: But now let's talk about those unpaid bills. Mark Schobinger, 78 00:05:26,440 --> 00:05:30,159 Speaker 1: who formerly was Twitter's senior director of compensation, has filed 79 00:05:30,200 --> 00:05:34,760 Speaker 1: a class action lawsuit accusing the company of withholding payments 80 00:05:35,120 --> 00:05:38,880 Speaker 1: due to current and former employees. My guess is, when 81 00:05:38,960 --> 00:05:42,960 Speaker 1: the former director of compensation is saying something's wrong about 82 00:05:42,960 --> 00:05:47,120 Speaker 1: the company's compensation, we should probably listen. Anyway. According to 83 00:05:47,160 --> 00:05:50,279 Speaker 1: the lawsuit, Twitter had promised employees that for twenty twenty 84 00:05:50,279 --> 00:05:54,600 Speaker 1: two they would receive half of their target bonuses, which 85 00:05:54,640 --> 00:05:59,440 Speaker 1: is better than nothing. Except employees got nothing. They never 86 00:05:59,520 --> 00:06:02,280 Speaker 1: got the payout that they had been promised. So the 87 00:06:02,360 --> 00:06:05,800 Speaker 1: lawsuit argues that this constitutes a breach of contract and 88 00:06:05,839 --> 00:06:09,000 Speaker 1: Twitter should be held accountable to pay the promised amount, 89 00:06:09,200 --> 00:06:11,520 Speaker 1: plus probably damages on top of that, if I were 90 00:06:11,560 --> 00:06:14,440 Speaker 1: to guess. This is just the latest in an avalanche 91 00:06:14,440 --> 00:06:19,200 Speaker 1: of lawsuits leveled against the company.
Literally hundreds of lawsuits 92 00:06:19,560 --> 00:06:24,320 Speaker 1: are forming against Twitter, and that includes lawsuits from the 93 00:06:24,320 --> 00:06:27,960 Speaker 1: former bosses of Twitter, the people who ran Twitter before 94 00:06:28,000 --> 00:06:30,760 Speaker 1: Elon Musk took over. They say the company owes more 95 00:06:30,800 --> 00:06:33,880 Speaker 1: than a million dollars in legal fees to them. Plus 96 00:06:33,920 --> 00:06:36,920 Speaker 1: there are various vendors and landlords who have accused Twitter 97 00:06:36,960 --> 00:06:39,440 Speaker 1: of not paying bills. You know, they recently got an 98 00:06:39,440 --> 00:06:44,280 Speaker 1: eviction notice out of Colorado due to not paying the 99 00:06:44,320 --> 00:06:47,080 Speaker 1: bill for the office space there. So I'm very curious 100 00:06:47,360 --> 00:06:50,480 Speaker 1: what Twitter's new CEO is going to do about all this, 101 00:06:51,000 --> 00:06:55,680 Speaker 1: if she does anything about it at all. Switching over 102 00:06:55,720 --> 00:06:59,360 Speaker 1: to Meta, the company's independent oversight board has a request: 103 00:06:59,680 --> 00:07:03,279 Speaker 1: the board wants Meta to evaluate how it approaches prevention 104 00:07:03,800 --> 00:07:07,279 Speaker 1: with regards to posts that call for political violence. This 105 00:07:07,400 --> 00:07:09,600 Speaker 1: is in the wake of an incident from last year. 106 00:07:10,640 --> 00:07:14,160 Speaker 1: There was a video showing a Brazilian military leader telling 107 00:07:14,200 --> 00:07:17,120 Speaker 1: people they should quote unquote hit the street in an 108 00:07:17,120 --> 00:07:21,520 Speaker 1: attempt to incite violence during the twenty twenty two Brazilian election.
109 00:07:22,160 --> 00:07:25,680 Speaker 1: Meta kept the post up, but then the board selected 110 00:07:25,720 --> 00:07:28,880 Speaker 1: that case as one for it to examine, and then 111 00:07:29,000 --> 00:07:31,760 Speaker 1: Meta took the post down. So now the board is 112 00:07:31,760 --> 00:07:34,640 Speaker 1: telling Meta that the company needs to reassure the public 113 00:07:34,920 --> 00:07:38,040 Speaker 1: that it has the proper policies and processes in place 114 00:07:38,600 --> 00:07:40,600 Speaker 1: to prevent that kind of stuff in the future, and 115 00:07:40,640 --> 00:07:44,960 Speaker 1: to actually follow through on those policies, particularly as we 116 00:07:45,000 --> 00:07:48,480 Speaker 1: head toward another election year here in the United States. Now, 117 00:07:48,520 --> 00:07:51,800 Speaker 1: I should also add that the oversight board, which can 118 00:07:51,840 --> 00:07:56,400 Speaker 1: review Meta's choices to remove or to allow posts, doesn't 119 00:07:56,480 --> 00:08:00,880 Speaker 1: actually have the authority to force change. It can only make 120 00:08:01,000 --> 00:08:05,280 Speaker 1: recommendations to Meta on courses of action, and the company 121 00:08:05,360 --> 00:08:08,680 Speaker 1: can choose to either heed those recommendations or to ignore 122 00:08:08,720 --> 00:08:12,440 Speaker 1: them, because the board's suggestions are non-binding. And you 123 00:08:12,520 --> 00:08:15,600 Speaker 1: might say, why even have an oversight board if it 124 00:08:15,640 --> 00:08:19,120 Speaker 1: doesn't have the authority to enforce change? And really the 125 00:08:19,160 --> 00:08:22,760 Speaker 1: answer is because it's a form of self-regulation: 126 00:08:22,960 --> 00:08:27,000 Speaker 1: if Meta shows that it can self-regulate, then that 127 00:08:27,160 --> 00:08:32,559 Speaker 1: could potentially prevent government regulators from wading into the whole matter.
128 00:08:33,040 --> 00:08:35,560 Speaker 1: So you could view the oversight board as Meta's attempt 129 00:08:35,760 --> 00:08:39,880 Speaker 1: to fix problems before government agencies start getting into things 130 00:08:39,920 --> 00:08:44,720 Speaker 1: as well. Let's move on to Amazon. So the US 131 00:08:44,720 --> 00:08:49,280 Speaker 1: Federal Trade Commission, or FTC, has filed a lawsuit against Amazon, 132 00:08:49,360 --> 00:08:52,360 Speaker 1: alleging that the company uses various tactics to lure or 133 00:08:52,400 --> 00:08:55,439 Speaker 1: trick people into signing up for Amazon Prime, and then 134 00:08:55,480 --> 00:08:59,120 Speaker 1: the company does its darndest to prevent people from canceling 135 00:08:59,160 --> 00:09:03,520 Speaker 1: their service. The FTC says that Amazon used insidious means 136 00:09:03,520 --> 00:09:06,480 Speaker 1: to get folks to unknowingly sign up for Prime membership 137 00:09:06,520 --> 00:09:11,600 Speaker 1: without their consent. That's pretty darn underhanded. If true, this 138 00:09:11,640 --> 00:09:15,680 Speaker 1: would violate the Restore Online Shoppers' Confidence Act, which yeah, 139 00:09:15,720 --> 00:09:17,680 Speaker 1: I mean if a company is tricking people into paying 140 00:09:17,720 --> 00:09:20,040 Speaker 1: one hundred and thirty nine bucks a year and then 141 00:09:20,120 --> 00:09:22,240 Speaker 1: doing its best to prevent people from backing out of that, 142 00:09:22,880 --> 00:09:25,920 Speaker 1: then yes, consumers are going to lose confidence in online 143 00:09:25,960 --> 00:09:30,240 Speaker 1: shopping with that company. Amazon representative Heather Layman denied the 144 00:09:30,320 --> 00:09:33,280 Speaker 1: allegations and said the FTC is just plain wrong about 145 00:09:33,280 --> 00:09:36,480 Speaker 1: both the facts and the law.
The FTC countered that 146 00:09:36,559 --> 00:09:39,959 Speaker 1: Amazon purposefully made choices to grab more subscribers and then 147 00:09:40,000 --> 00:09:43,800 Speaker 1: hold them prisoner because it would impact the company's bottom 148 00:09:43,840 --> 00:09:47,200 Speaker 1: line to allow them to back out easily. Also, Insider 149 00:09:47,280 --> 00:09:51,000 Speaker 1: reports that an internal document from Amazon referenced the cancellation 150 00:09:51,120 --> 00:09:55,560 Speaker 1: process as Iliad. That's a reference to the epic poem 151 00:09:55,720 --> 00:09:59,120 Speaker 1: by Homer about a span of time that happens during 152 00:09:59,160 --> 00:10:02,760 Speaker 1: the Trojan War, and that doesn't exactly evoke images of 153 00:10:02,800 --> 00:10:06,600 Speaker 1: a smooth and painless process. Also, why the heck would 154 00:10:06,640 --> 00:10:10,240 Speaker 1: anyone use a name like that to describe a cancellation process, 155 00:10:10,280 --> 00:10:13,720 Speaker 1: even internally? It just sounds like it's asking for trouble. 156 00:10:13,840 --> 00:10:18,360 Speaker 1: You should name it something like beach day or coffee break, y'all. 157 00:10:18,520 --> 00:10:21,480 Speaker 1: This is kind of like the folks at FTX allegedly 158 00:10:21,559 --> 00:10:26,120 Speaker 1: naming an internal group chat quote unquote wire fraud. It's funny until 159 00:10:26,120 --> 00:10:29,800 Speaker 1: the Feds come a-knockin'. Anyway, the FTC is 160 00:10:29,960 --> 00:10:33,600 Speaker 1: seeking civil penalties against Amazon, as well as a permanent 161 00:10:33,679 --> 00:10:36,360 Speaker 1: injunction that would forbid the company from using those kinds 162 00:10:36,400 --> 00:10:39,199 Speaker 1: of tactics in the future. And as I mentioned earlier, 163 00:10:39,240 --> 00:10:42,640 Speaker 1: Amazon denies the charges entirely, so we will see where 164 00:10:42,679 --> 00:10:46,840 Speaker 1: this goes.
Meanwhile, the US Congress is also interested in 165 00:10:46,880 --> 00:10:51,600 Speaker 1: holding Amazon accountable, this time for conditions at Amazon's distribution centers. 166 00:10:52,080 --> 00:10:56,120 Speaker 1: The Senate Committee on Health, Education, Labor, and Pensions said 167 00:10:56,120 --> 00:11:00,439 Speaker 1: that Amazon warehouse workers filed more serious injury claims than 168 00:11:00,679 --> 00:11:06,680 Speaker 1: all other US warehouse workers combined. Yikes, that does sound 169 00:11:06,679 --> 00:11:10,080 Speaker 1: like there's a serious systemic problem that is contributing to 170 00:11:10,160 --> 00:11:12,839 Speaker 1: injuries and a negative impact on quality of life for 171 00:11:12,960 --> 00:11:17,160 Speaker 1: Amazon warehouse employees. Senator Bernie Sanders says the problem is 172 00:11:17,240 --> 00:11:21,360 Speaker 1: worse than it sounds: that Amazon funnels injured employees through 173 00:11:21,360 --> 00:11:25,120 Speaker 1: an on-site medical clinic at warehouses, and the clinical 174 00:11:25,200 --> 00:11:27,520 Speaker 1: staff are encouraged to get workers back to work as 175 00:11:27,600 --> 00:11:31,800 Speaker 1: quickly as possible while underreporting serious injuries. If true, 176 00:11:31,840 --> 00:11:34,880 Speaker 1: this is approaching some of the truly awful work conditions 177 00:11:35,160 --> 00:11:38,080 Speaker 1: you might hear about in something like a Charles Dickens novel. 178 00:11:38,920 --> 00:11:41,640 Speaker 1: It explains why we've seen a couple of cases where 179 00:11:41,679 --> 00:11:45,400 Speaker 1: distribution center employees have voted to unionize. It also helps 180 00:11:45,440 --> 00:11:50,640 Speaker 1: explain Amazon's incredible turnover rate at these places.
Sanders has 181 00:11:50,640 --> 00:11:54,680 Speaker 1: called on Amazon CEO Andy Jassy to appear and address 182 00:11:54,760 --> 00:11:58,080 Speaker 1: these allegations, and gave him a deadline of July fifth 183 00:11:58,120 --> 00:12:01,040 Speaker 1: to do it. He also indicated it was possible that 184 00:12:01,160 --> 00:12:05,200 Speaker 1: Congress would call upon Jassy or even Amazon's founder Jeff 185 00:12:05,240 --> 00:12:09,280 Speaker 1: Bezos to testify about warehouse safety and worker injuries in 186 00:12:09,320 --> 00:12:11,679 Speaker 1: front of all of Congress. So we'll have to see 187 00:12:11,920 --> 00:12:15,880 Speaker 1: if that happens. Okay, we're gonna take a quick break 188 00:12:15,880 --> 00:12:17,960 Speaker 1: to thank our sponsors, but we'll be back with more 189 00:12:18,000 --> 00:12:28,840 Speaker 1: news in just a moment. We're back, and we've got 190 00:12:28,840 --> 00:12:31,520 Speaker 1: just a couple more stories to cover. The protests over 191 00:12:31,640 --> 00:12:35,040 Speaker 1: on Reddit continue, though they've taken some odd turns. For 192 00:12:35,120 --> 00:12:37,960 Speaker 1: those just joining this story, Reddit made a change to 193 00:12:38,040 --> 00:12:41,600 Speaker 1: its API, its application programming interface, which is how third 194 00:12:41,640 --> 00:12:46,320 Speaker 1: party developers can create tools that then access Reddit. The 195 00:12:46,440 --> 00:12:48,240 Speaker 1: change means that developers will have to pay a 196 00:12:48,280 --> 00:12:52,480 Speaker 1: fee as their tools reference Reddit, and the more popular 197 00:12:52,480 --> 00:12:55,840 Speaker 1: and active apps could rack up considerable fees, perhaps in 198 00:12:55,840 --> 00:12:58,480 Speaker 1: the millions of dollars per year, and that in turn 199 00:12:58,880 --> 00:13:01,840 Speaker 1: has prompted several popular apps to close up shop and 200 00:13:01,880 --> 00:13:05,120 Speaker 1: go dark.
More than eight thousand subreddits on the site 201 00:13:05,160 --> 00:13:08,840 Speaker 1: protested Reddit's policy changes, as well as how Reddit CEO 202 00:13:09,000 --> 00:13:12,760 Speaker 1: Steve Huffman has handled the situation, primarily by dismissing it 203 00:13:12,800 --> 00:13:15,360 Speaker 1: and threatening moderators who have participated in the protest with 204 00:13:15,400 --> 00:13:19,760 Speaker 1: a ban. Reddit generates revenue through advertising, so the protesters 205 00:13:19,880 --> 00:13:22,760 Speaker 1: have tried to hit Reddit where it hurts: the wallet. 206 00:13:23,440 --> 00:13:27,559 Speaker 1: Some subreddits switched to private mode and essentially eliminated all 207 00:13:27,640 --> 00:13:30,600 Speaker 1: traffic to the subreddit, which obviously cuts off ad dollars 208 00:13:30,600 --> 00:13:33,600 Speaker 1: that way, but others went a different route. They went 209 00:13:33,920 --> 00:13:35,640 Speaker 1: dark for a couple of days, but when they came back, 210 00:13:35,679 --> 00:13:39,560 Speaker 1: they switched a tag on the subreddit community to turn 211 00:13:39,600 --> 00:13:44,160 Speaker 1: it into an NSFW, or Not Safe For Work, subreddit, 212 00:13:44,640 --> 00:13:48,280 Speaker 1: and Reddit does allow those kinds of communities on its platform. However, 213 00:13:48,840 --> 00:13:53,280 Speaker 1: it does not pair NSFW communities with advertising, and 214 00:13:53,320 --> 00:13:56,680 Speaker 1: that makes sense because your typical company is probably not 215 00:13:56,800 --> 00:14:02,240 Speaker 1: eager to have their products associated with pornography or other NSFW material, 216 00:14:02,840 --> 00:14:06,240 Speaker 1: and redditors have been flooding these communities with, you know, 217 00:14:07,120 --> 00:14:11,120 Speaker 1: NSFW content, which ensures that no advertising is going 218 00:14:11,160 --> 00:14:13,719 Speaker 1: to happen in those communities while this is going on.
219 00:14:13,880 --> 00:14:18,120 Speaker 1: And again, this is a direct strike against Reddit's revenue source, 220 00:14:18,160 --> 00:14:22,360 Speaker 1: and as such, the platform has removed several moderators, although 221 00:14:22,360 --> 00:14:25,600 Speaker 1: it did later reinstate some of those, in an effort 222 00:14:25,600 --> 00:14:28,760 Speaker 1: to fight back against this protest, and it's worked for 223 00:14:28,960 --> 00:14:33,400 Speaker 1: some subreddits, which have dropped the NSFW tag. There have 224 00:14:33,480 --> 00:14:36,520 Speaker 1: been cases where some of the subreddit moderators have said 225 00:14:36,800 --> 00:14:39,800 Speaker 1: they acted without the actual support of the community, and 226 00:14:39,920 --> 00:14:43,360 Speaker 1: I can understand a moderator changing it back then, because 227 00:14:43,640 --> 00:14:45,240 Speaker 1: really you need to get the buy-in of the 228 00:14:45,240 --> 00:14:48,560 Speaker 1: community before you make a move like that. But other 229 00:14:48,600 --> 00:14:51,320 Speaker 1: communities are still going strong with this protest, and 230 00:14:51,360 --> 00:14:54,320 Speaker 1: it's not like Reddit has an endless supply of replacement 231 00:14:54,400 --> 00:14:58,000 Speaker 1: mods to put in the spot where they have banned 232 00:14:58,040 --> 00:15:01,160 Speaker 1: other moderators. I'm not sure if Reddit is ultimately going 233 00:15:01,200 --> 00:15:03,760 Speaker 1: to win or lose this battle. I just know the 234 00:15:03,800 --> 00:15:09,360 Speaker 1: fight isn't over yet. Ford's CEO Jim Farley threw some 235 00:15:09,440 --> 00:15:13,200 Speaker 1: more shade toward Elon Musk, who must be feeling pretty 236 00:15:13,240 --> 00:15:15,680 Speaker 1: much in the dark by now. Anyway, this doesn't have 237 00:15:15,720 --> 00:15:18,160 Speaker 1: anything to do with Twitter.
Instead, it focuses on one 238 00:15:18,240 --> 00:15:22,640 Speaker 1: of Musk's other companies, namely Tesla, and its long-delayed 239 00:15:22,800 --> 00:15:26,520 Speaker 1: Cybertruck. Ford is preparing to launch its own, also 240 00:15:26,840 --> 00:15:31,160 Speaker 1: delayed electric truck, the F-150 Lightning, and 241 00:15:31,240 --> 00:15:33,240 Speaker 1: when he was asked if he felt that the Cybertruck 242 00:15:33,280 --> 00:15:37,800 Speaker 1: is a significant competitive threat to the Lightning, Farley 243 00:15:37,840 --> 00:15:41,200 Speaker 1: was quick to dismiss such notions. He said, and I quote, 244 00:15:41,880 --> 00:15:45,240 Speaker 1: I make trucks for real people who do real work, 245 00:15:45,360 --> 00:15:49,800 Speaker 1: and that's a different kind of truck. Now I get 246 00:15:49,840 --> 00:15:53,120 Speaker 1: what he's saying, but it does strike me as wrong 247 00:15:53,440 --> 00:15:57,160 Speaker 1: to say that folks in Silicon Valley aren't real people. 248 00:15:57,520 --> 00:15:59,880 Speaker 1: I mean they are, I think. I mean, some of 249 00:15:59,880 --> 00:16:02,280 Speaker 1: them might be robots, and there's probably one or two 250 00:16:02,280 --> 00:16:04,720 Speaker 1: who are holograms, but I bet most of them are 251 00:16:04,800 --> 00:16:08,160 Speaker 1: real people. Anyway. Farley's point is that he feels the 252 00:16:08,160 --> 00:16:11,280 Speaker 1: Cybertruck fails to meet the needs that most truck 253 00:16:11,360 --> 00:16:14,840 Speaker 1: owners have, like people who actually make practical use of 254 00:16:14,880 --> 00:16:17,560 Speaker 1: their trucks, and it's not like just a status symbol 255 00:16:17,640 --> 00:16:19,640 Speaker 1: or something.
I think that's what he was getting at, 256 00:16:20,120 --> 00:16:25,680 Speaker 1: and that conversely, Ford's upcoming truck meets all the needs 257 00:16:25,720 --> 00:16:32,040 Speaker 1: of your typical truck-driving, you know, real person. However, 258 00:16:32,080 --> 00:16:34,800 Speaker 1: all that being said, the Lightning will use Tesla's specs 259 00:16:34,840 --> 00:16:38,720 Speaker 1: for charging ports, and that could actually become the EV 260 00:16:38,960 --> 00:16:41,680 Speaker 1: charging standard, at least here in the United States, or 261 00:16:41,680 --> 00:16:45,480 Speaker 1: at least one of very few standards. Farley said that 262 00:16:45,560 --> 00:16:48,280 Speaker 1: this was just a decision that was good because it's 263 00:16:48,280 --> 00:16:51,560 Speaker 1: good for customers. And I agree. No one wants competing 264 00:16:51,680 --> 00:16:55,560 Speaker 1: charging standards, because just imagine you're driving your electric vehicle 265 00:16:56,000 --> 00:16:58,560 Speaker 1: and you come up to a charging station because your 266 00:16:59,160 --> 00:17:01,760 Speaker 1: car's juice is getting a little low, and then you 267 00:17:01,800 --> 00:17:06,040 Speaker 1: see that the charging station isn't compatible with your electric vehicle. 268 00:17:06,119 --> 00:17:08,480 Speaker 1: That would be a nightmare, right? It's not like when 269 00:17:08,520 --> 00:17:10,679 Speaker 1: you drive up to a gas station and you're like, oh, 270 00:17:11,320 --> 00:17:15,360 Speaker 1: these pumps don't work with this car. You want something 271 00:17:15,400 --> 00:17:17,960 Speaker 1: that is as much of a universal standard as it 272 00:17:18,040 --> 00:17:20,960 Speaker 1: can be. So I think that it's a wise move. 273 00:17:21,960 --> 00:17:23,879 Speaker 1: As for the ultimate fate of the Cybertruck, I 274 00:17:24,000 --> 00:17:27,200 Speaker 1: just don't know.
I mean, it's been widely reported that 275 00:17:27,280 --> 00:17:30,520 Speaker 1: the delays have more to do with design issues that 276 00:17:30,600 --> 00:17:33,800 Speaker 1: have cropped up during development than anything else, so 277 00:17:33,840 --> 00:17:35,840 Speaker 1: my advice would be hold off on buying the first 278 00:17:35,840 --> 00:17:37,960 Speaker 1: ones off the lot and see how things shake out 279 00:17:38,000 --> 00:17:41,840 Speaker 1: over time. Okay, that's it for the news this week. 280 00:17:41,920 --> 00:17:45,399 Speaker 1: I'm off next week on vacation, but I managed to 281 00:17:45,480 --> 00:17:49,959 Speaker 1: record three new episodes in advance, so Monday through Wednesday 282 00:17:50,040 --> 00:17:52,600 Speaker 1: are new episodes; you'll only be getting a rerun on Thursday, 283 00:17:53,200 --> 00:17:56,160 Speaker 1: plus the classic episode on Friday. But that's a weekly thing. 284 00:17:56,920 --> 00:18:01,000 Speaker 1: Fair warning. However, I wrote and recorded next week's episodes 285 00:18:01,359 --> 00:18:03,320 Speaker 1: while I was under the influence of an over the 286 00:18:03,359 --> 00:18:06,439 Speaker 1: counter headache medication, which normally wouldn't be an issue except 287 00:18:06,440 --> 00:18:10,159 Speaker 1: I accidentally took the PM version of that medication, so 288 00:18:10,200 --> 00:18:13,800 Speaker 1: I got real loopy. I'm gonna publish those episodes anyway, 289 00:18:13,840 --> 00:18:16,560 Speaker 1: because you only live once, by golly, so be prepared 290 00:18:16,560 --> 00:18:19,560 Speaker 1: to learn about symbolic logic and The Mighty Boosh. I 291 00:18:19,600 --> 00:18:22,119 Speaker 1: hope you're all well, and I'll talk to you again 292 00:18:22,920 --> 00:18:31,720 Speaker 1: really soon. Tech Stuff is an iHeartRadio production.
For more 293 00:18:31,760 --> 00:18:36,480 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 294 00:18:36,520 --> 00:18:42,000 Speaker 1: wherever you listen to your favorite shows.