1 00:00:04,360 --> 00:00:12,239 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, 2 00:00:12,240 --> 00:00:15,960 Speaker 1: and welcome to TechStuff. I'm your host, Jonathan Strickland. 3 00:00:16,000 --> 00:00:19,079 Speaker 1: I'm an executive producer with iHeartRadio. And how the tech 4 00:00:19,160 --> 00:00:22,520 Speaker 1: are you? It's time for the tech news for Tuesday, 5 00:00:22,600 --> 00:00:28,560 Speaker 1: April eleventh, twenty twenty-three, and we've got a ton of 6 00:00:28,720 --> 00:00:33,280 Speaker 1: news to cover today, with a big chunk dedicated to Twitter. 7 00:00:33,440 --> 00:00:39,240 Speaker 1: It's been a while, as Staind once said, since I 8 00:00:39,360 --> 00:00:41,640 Speaker 1: dragged Twitter over the coals. But we've got a whole 9 00:00:41,680 --> 00:00:44,680 Speaker 1: bunch of Twitter news. But first, I wanted to talk 10 00:00:44,760 --> 00:00:49,560 Speaker 1: about an intelligence leak in the United States, which kind 11 00:00:49,560 --> 00:00:52,199 Speaker 1: of lends itself to all sorts of snarky jokes. But no, 12 00:00:52,360 --> 00:00:57,800 Speaker 1: this is about top secret documents that have been leaked. 13 00:00:57,960 --> 00:00:59,920 Speaker 1: And apparently this is a thing that's been going on 14 00:01:00,200 --> 00:01:04,240 Speaker 1: since at least January. And you may have heard that 15 00:01:04,440 --> 00:01:08,280 Speaker 1: images of top secret documents, some of them apparently edited, 16 00:01:08,760 --> 00:01:11,880 Speaker 1: have escaped out into the world, and they're causing some 17 00:01:12,000 --> 00:01:16,920 Speaker 1: real problems. Among them: Ukraine reportedly had to change some 18 00:01:17,040 --> 00:01:21,160 Speaker 1: of its strategy with regard to Russia's ongoing war with Ukraine, 19 00:01:21,959 --> 00:01:25,880 Speaker 1: and Russia is expressing how upset it is that the West 20 00:01:26,240 --> 00:01:30,959 Speaker 1: and NATO have been supporting Ukraine, which seems like a 21 00:01:31,080 --> 00:01:33,680 Speaker 1: really poorly kept secret in the first place. But whatever. 22 00:01:34,120 --> 00:01:36,560 Speaker 1: The tech side of this is really what I 23 00:01:36,600 --> 00:01:40,360 Speaker 1: want to focus on, because these documents weren't submitted to 24 00:01:40,520 --> 00:01:44,679 Speaker 1: WikiLeaks or anything like that. They appear to have 25 00:01:44,959 --> 00:01:50,000 Speaker 1: initially emerged on a Discord server. Now, y'all may know 26 00:01:50,240 --> 00:01:53,760 Speaker 1: that the original concept behind Discord was to act as 27 00:01:53,760 --> 00:01:58,920 Speaker 1: a means of communication, primarily voice chat for gamers, because 28 00:01:59,120 --> 00:02:03,280 Speaker 1: in-game voice systems are frequently not very good. But 29 00:02:03,400 --> 00:02:08,720 Speaker 1: obviously Discord has grown beyond the gaming community, and apparently, 30 00:02:09,080 --> 00:02:11,840 Speaker 1: in a Discord server that appeared to at least partly 31 00:02:11,880 --> 00:02:18,120 Speaker 1: be dedicated to a YouTube figure from the Philippines, someone 32 00:02:18,160 --> 00:02:23,160 Speaker 1: started sharing top secret documents.
Who that person is and 33 00:02:23,200 --> 00:02:25,760 Speaker 1: how they got access to the documents and why they 34 00:02:25,800 --> 00:02:28,919 Speaker 1: felt the need to share them is unknown, at least 35 00:02:28,960 --> 00:02:32,200 Speaker 1: to me, though there's speculation that it might have been 36 00:02:32,240 --> 00:02:36,799 Speaker 1: a low-level official who had access to such documents 37 00:02:36,800 --> 00:02:39,240 Speaker 1: through their job and also was a member of this 38 00:02:39,320 --> 00:02:45,480 Speaker 1: particular server. Anyway, the leaks expanded beyond that one Discord server, 39 00:02:45,600 --> 00:02:49,880 Speaker 1: which has since been deleted. There were new leaks that 40 00:02:49,960 --> 00:02:53,560 Speaker 1: surfaced in a Discord server that was dedicated to Minecraft, 41 00:02:53,600 --> 00:02:57,160 Speaker 1: of all things, and then it spread to other platforms 42 00:02:57,200 --> 00:03:01,800 Speaker 1: like 4chan. Now, obviously this is a huge concern 43 00:03:01,840 --> 00:03:04,320 Speaker 1: to the US government. It could potentially be a threat 44 00:03:04,360 --> 00:03:07,560 Speaker 1: to national security. So the Department of Justice and the 45 00:03:07,560 --> 00:03:10,840 Speaker 1: Department of Defense are really interested to learn more about 46 00:03:10,840 --> 00:03:14,799 Speaker 1: the source of that leak. Now, clearly the fault does 47 00:03:14,840 --> 00:03:18,440 Speaker 1: not lie with the platforms themselves. Rather, it lies with 48 00:03:18,480 --> 00:03:21,880 Speaker 1: whomever was responsible for posting images of the documents in 49 00:03:21,880 --> 00:03:24,800 Speaker 1: the first place. You can read more about the story 50 00:03:25,280 --> 00:03:28,400 Speaker 1: in the New York Times, and Aric Toler has a 51 00:03:28,440 --> 00:03:33,400 Speaker 1: great article on Bellingcat, and it's titled From Discord to 52 00:03:33,480 --> 00:03:37,600 Speaker 1: 4chan: The Improbable Journey of a US Intelligence Leak. 53 00:03:37,720 --> 00:03:40,920 Speaker 1: So I highly recommend checking that out if you want 54 00:03:40,960 --> 00:03:43,800 Speaker 1: to learn more, and I'm sure we'll continue to cover 55 00:03:43,880 --> 00:03:48,080 Speaker 1: it as the story develops. Okay, now I've got a 56 00:03:48,080 --> 00:03:51,600 Speaker 1: whole suite of Twitter updates to deliver. My guess is they'll 57 00:03:51,640 --> 00:03:55,520 Speaker 1: take up the bulk of this episode. And before I 58 00:03:55,560 --> 00:03:57,560 Speaker 1: really jump into it, I have to give a shout 59 00:03:57,600 --> 00:04:02,400 Speaker 1: out to Casey Newton's newsletter called Platformer, which grouped together 60 00:04:02,840 --> 00:04:06,720 Speaker 1: a ton of these resources. Now I've talked about Casey's 61 00:04:06,720 --> 00:04:10,120 Speaker 1: work on this show before, and he continues to do 62 00:04:10,160 --> 00:04:14,200 Speaker 1: an amazing job. Platformer is well worth the subscription. It 63 00:04:14,320 --> 00:04:17,880 Speaker 1: is a newsletter you subscribe to; it's a paid subscription. 64 00:04:18,120 --> 00:04:21,320 Speaker 1: And I do not personally know Casey. I have no 65 00:04:21,800 --> 00:04:25,360 Speaker 1: connection to him, and I also don't have any connection 66 00:04:25,440 --> 00:04:28,760 Speaker 1: to Platformer other than the fact that I'm a subscriber myself.
67 00:04:28,800 --> 00:04:31,159 Speaker 1: I'm just saying that if you're really into tech news 68 00:04:31,200 --> 00:04:37,080 Speaker 1: and like a deep dive into it, it's a good resource. Anyway, 69 00:04:37,360 --> 00:04:41,000 Speaker 1: Casey gives his opinion on Twitter, and he thinks that 70 00:04:41,080 --> 00:04:43,200 Speaker 1: we might be seeing sort of the middle of the 71 00:04:43,360 --> 00:04:46,120 Speaker 1: end for Twitter. You could argue the beginning of the 72 00:04:46,200 --> 00:04:48,719 Speaker 1: end was even before Elon Musk made his move to 73 00:04:48,800 --> 00:04:54,200 Speaker 1: acquire it, but Elon is certainly a big part of 74 00:04:54,240 --> 00:04:59,919 Speaker 1: why Casey feels that the platform could potentially be heading 75 00:05:00,200 --> 00:05:06,080 Speaker 1: to an ignominious end, mostly due to Musk's tendency to 76 00:05:06,279 --> 00:05:10,520 Speaker 1: make mercurial changes to the platform and then frequently reverse 77 00:05:10,600 --> 00:05:15,200 Speaker 1: those changes after facing harsh criticism, and then the platform 78 00:05:15,240 --> 00:05:18,599 Speaker 1: itself seems to be failing with various functions and features. 79 00:05:19,200 --> 00:05:21,960 Speaker 1: So let's cover some of the stories about Twitter and 80 00:05:22,000 --> 00:05:25,599 Speaker 1: how that forty-four billion dollar investment is going. First 81 00:05:25,640 --> 00:05:28,159 Speaker 1: up is a story that Casey actually didn't cover in 82 00:05:28,200 --> 00:05:31,279 Speaker 1: the most recent Platformer, as I believe the news broke 83 00:05:31,400 --> 00:05:35,440 Speaker 1: after he had already sent that newsletter out to be published. 84 00:05:35,960 --> 00:05:40,599 Speaker 1: And that's the fact that Twitter Incorporated doesn't exist anymore. 85 00:05:41,000 --> 00:05:45,160 Speaker 1: So Twitter the service still exists, but the company doesn't, 86 00:05:45,520 --> 00:05:49,080 Speaker 1: because some court documents revealed that on the corporate back end, 87 00:05:49,440 --> 00:05:55,160 Speaker 1: another one of Musk's companies, X Corp, has absorbed Twitter Incorporated. 88 00:05:55,760 --> 00:05:59,800 Speaker 1: This is according to Nitish Pahwa and Mark Joseph Stern, 89 00:06:00,440 --> 00:06:05,560 Speaker 1: who jointly published a piece in Slate yesterday afternoon. And 90 00:06:05,640 --> 00:06:10,520 Speaker 1: here's how the whole story unfolded. A woman named Laura Loomer, 91 00:06:10,600 --> 00:06:15,440 Speaker 1: who's known for her right-wing politics and social media presence, 92 00:06:16,000 --> 00:06:19,560 Speaker 1: sued Twitter for banning her account a few years ago. 93 00:06:20,120 --> 00:06:24,120 Speaker 1: She claims that the ban violated racketeering laws. 94 00:06:24,600 --> 00:06:28,000 Speaker 1: The Slate authors say that, in their opinion, her case 95 00:06:28,080 --> 00:06:32,719 Speaker 1: has no merit, and based upon just the very surface-level 96 00:06:32,760 --> 00:06:36,760 Speaker 1: glance I got, I am inclined to agree with them. 97 00:06:37,080 --> 00:06:41,480 Speaker 1: This seems like a case that really has no teeth 98 00:06:41,560 --> 00:06:44,279 Speaker 1: to it.
But one thing that the case does is 99 00:06:44,279 --> 00:06:49,040 Speaker 1: it requires Twitter to submit corporate disclosure statements to the court, 100 00:06:49,680 --> 00:06:52,800 Speaker 1: and then by looking at court filings, you can actually 101 00:06:52,839 --> 00:06:55,400 Speaker 1: see those, and one of those statements revealed the fact 102 00:06:55,440 --> 00:07:01,800 Speaker 1: that the Twitter company doesn't exist anymore. But anyway, even 103 00:07:01,800 --> 00:07:06,800 Speaker 1: though Twitter is now part of X Corp, the lawsuit still 104 00:07:06,839 --> 00:07:11,880 Speaker 1: remains viable. It just transfers to X Corp instead of Twitter Incorporated. 105 00:07:12,520 --> 00:07:15,119 Speaker 1: And you could argue that the corporate move really doesn't 106 00:07:15,200 --> 00:07:17,559 Speaker 1: change anything other than the name of the company that's 107 00:07:17,680 --> 00:07:21,640 Speaker 1: running Twitter the service. All the debt, all the lawsuits, 108 00:07:22,000 --> 00:07:24,920 Speaker 1: all of the troubles still exist. They just transfer to 109 00:07:25,000 --> 00:07:29,280 Speaker 1: this other entity. AppleInsider further goes on to suggest 110 00:07:29,320 --> 00:07:32,480 Speaker 1: that perhaps this move is another sign that Musk intends 111 00:07:32,480 --> 00:07:35,880 Speaker 1: for Twitter to become an everything app, that Twitter will 112 00:07:35,920 --> 00:07:39,880 Speaker 1: eventually become X, and this app will support all sorts 113 00:07:39,880 --> 00:07:44,040 Speaker 1: of transactions beyond just posting about lunch and alienating advertisers, 114 00:07:44,680 --> 00:07:48,880 Speaker 1: and be handling stuff like shopping and payment transfers and all 115 00:07:48,920 --> 00:07:52,480 Speaker 1: this other kind of stuff. So Twitter Incorporated is dead; 116 00:07:52,720 --> 00:07:57,000 Speaker 1: all hail X Corp, I guess. Let's cover one of 117 00:07:57,000 --> 00:07:59,560 Speaker 1: the things that's going wrong at Twitter. Over the last 118 00:07:59,560 --> 00:08:03,480 Speaker 1: week or so, according to TechCrunch, several users have 119 00:08:03,560 --> 00:08:08,600 Speaker 1: reported seeing Twitter Circle messages popping up in the For 120 00:08:09,240 --> 00:08:12,600 Speaker 1: You tab on Twitter, and you might be wondering what 121 00:08:12,640 --> 00:08:15,720 Speaker 1: all that means, because I sure was, which just shows 122 00:08:15,720 --> 00:08:17,480 Speaker 1: you how out of the loop I've been with Twitter 123 00:08:17,520 --> 00:08:21,520 Speaker 1: in general. So a Twitter Circle is like a group 124 00:08:21,560 --> 00:08:25,480 Speaker 1: of close friends or trusted acquaintances. Twitter Circles are meant 125 00:08:25,480 --> 00:08:29,560 Speaker 1: to allow you to post to smaller, more private groups 126 00:08:29,720 --> 00:08:33,240 Speaker 1: of people. The messages are not meant to go out 127 00:08:33,280 --> 00:08:38,640 Speaker 1: to Twitter's general population, and yet apparently some Twitter Circle 128 00:08:38,720 --> 00:08:42,319 Speaker 1: messages have done just that. So if you were browsing 129 00:08:42,320 --> 00:08:45,880 Speaker 1: the For You tab in Twitter, you might actually see 130 00:08:45,880 --> 00:08:49,600 Speaker 1: some messages that you weren't ever meant to see. And 131 00:08:49,720 --> 00:08:53,000 Speaker 1: you could imagine how that could be a big problem.
132 00:08:53,040 --> 00:08:54,800 Speaker 1: I mean, if the same thing were to happen to 133 00:08:54,960 --> 00:08:59,760 Speaker 1: direct messages, all sorts of shenanigans could follow. TechCrunch's 134 00:08:59,800 --> 00:09:03,400 Speaker 1: Amanda Silberling points out that getting a response from Twitter 135 00:09:03,960 --> 00:09:08,200 Speaker 1: is pretty much impossible because Elon Musk famously set the 136 00:09:08,200 --> 00:09:12,240 Speaker 1: PR email address to auto-reply to requests by sending 137 00:09:12,320 --> 00:09:16,800 Speaker 1: a poop emoji, because that's professional and mature. Oh, and 138 00:09:16,920 --> 00:09:20,120 Speaker 1: just assume for all the Twitter stories I'm talking about 139 00:09:20,120 --> 00:09:23,560 Speaker 1: today that at some point within that story the author 140 00:09:23,600 --> 00:09:26,000 Speaker 1: pointed out that it's impossible to get an official response 141 00:09:26,040 --> 00:09:29,160 Speaker 1: from Twitter that isn't a poop emoji, because trust me, 142 00:09:29,920 --> 00:09:34,080 Speaker 1: almost every story I read while writing this includes that, 143 00:09:35,240 --> 00:09:40,520 Speaker 1: let's call it, nugget of information. Some former Twitter executives, 144 00:09:40,880 --> 00:09:46,000 Speaker 1: including the former CEO of the company, are now suing Twitter, 145 00:09:46,440 --> 00:09:51,080 Speaker 1: or rather X Corp now, I guess, because they have been 146 00:09:51,120 --> 00:09:55,200 Speaker 1: saddled with legal fees that I guess the company should 147 00:09:55,679 --> 00:09:59,880 Speaker 1: have covered. This includes legal fees for various government 148 00:10:00,000 --> 00:10:03,640 Speaker 1: investigations into Twitter that were leading up to Musk's purchase 149 00:10:03,640 --> 00:10:07,120 Speaker 1: of the company, which, as I'm sure you all remember, 150 00:10:07,880 --> 00:10:13,200 Speaker 1: was in itself a total chaotic mess filled with legal wrangling. Anyway, 151 00:10:13,240 --> 00:10:16,520 Speaker 1: it should come as no surprise that Twitter slash X 152 00:10:16,600 --> 00:10:20,520 Speaker 1: Corp has not paid these legal fees, because, well, the 153 00:10:20,520 --> 00:10:24,880 Speaker 1: company pretty much stopped paying all of its bills. It 154 00:10:25,000 --> 00:10:27,760 Speaker 1: stopped paying rent on its office space, it stopped 155 00:10:27,800 --> 00:10:32,840 Speaker 1: paying for contract work, and stopped paying for janitorial staff, etc. 156 00:10:34,520 --> 00:10:37,200 Speaker 1: And heck, we also know the company laid off or 157 00:10:37,280 --> 00:10:40,920 Speaker 1: drove away about seventy-five percent of its workforce since 158 00:10:41,280 --> 00:10:45,160 Speaker 1: late last year, and according to this lawsuit, the company 159 00:10:45,240 --> 00:10:49,520 Speaker 1: is contractually bound to pay these legal fees for former 160 00:10:49,559 --> 00:10:54,120 Speaker 1: executives. These are the executives that Elon Musk famously fired 161 00:10:54,360 --> 00:10:58,280 Speaker 1: the day he took possession of the company. So this 162 00:10:58,320 --> 00:11:01,679 Speaker 1: matter has now gone to court, and we'll see how 163 00:11:01,720 --> 00:11:06,400 Speaker 1: long it sits there while Twitter continues to not pay 164 00:11:06,440 --> 00:11:09,680 Speaker 1: the bills.
All right, So now let's talk about a 165 00:11:09,679 --> 00:11:13,040 Speaker 1: couple of instances where Elon Musk instituted a change and then, 166 00:11:13,200 --> 00:11:17,880 Speaker 1: upon receiving backlash, reversed or at least altered that change. 167 00:11:17,920 --> 00:11:22,800 Speaker 1: And first up is how Musk handled NPR. So, for 168 00:11:22,840 --> 00:11:26,280 Speaker 1: those who don't know, NPR, National Public Radio, is a 169 00:11:26,400 --> 00:11:30,880 Speaker 1: media outlet here in the United States. So late last week, 170 00:11:31,200 --> 00:11:36,520 Speaker 1: Twitter appended the label "state-affiliated media" to the NPR 171 00:11:36,600 --> 00:11:39,920 Speaker 1: Twitter account, which I think is a pretty blatant attack 172 00:11:40,080 --> 00:11:46,280 Speaker 1: on NPR's reputation. The phrase "state-affiliated media" typically refers 173 00:11:46,320 --> 00:11:49,160 Speaker 1: to a media outlet that is under the direct control 174 00:11:49,520 --> 00:11:53,240 Speaker 1: of a government, often with the implication that the media 175 00:11:53,280 --> 00:11:57,120 Speaker 1: outlet is biased and the government in question is, to 176 00:11:57,240 --> 00:12:01,520 Speaker 1: at least some degree, authoritarian. Think about China and its 177 00:12:01,640 --> 00:12:06,960 Speaker 1: state-sponsored media outlets. NPR's Bobby Allyn was not about 178 00:12:07,040 --> 00:12:10,760 Speaker 1: to let this go, and so Allyn challenged Musk, pointing 179 00:12:10,800 --> 00:12:14,080 Speaker 1: out that the US government has no control over the 180 00:12:14,160 --> 00:12:19,000 Speaker 1: content or editorial voice of NPR, and Musk really had 181 00:12:19,120 --> 00:12:22,360 Speaker 1: nowhere to go on that. He could not disagree with 182 00:12:22,400 --> 00:12:25,679 Speaker 1: the statement, so instead he just had Twitter change the 183 00:12:25,760 --> 00:12:29,439 Speaker 1: label so that it then read that NPR is quote 184 00:12:29,520 --> 00:12:35,280 Speaker 1: unquote government funded, which, while less of an outright lie, 185 00:12:35,960 --> 00:12:40,240 Speaker 1: is still not entirely accurate. NPR receives only about one 186 00:12:40,360 --> 00:12:44,679 Speaker 1: percent of its funding directly from the government. The rest 187 00:12:44,720 --> 00:12:48,880 Speaker 1: comes from lots of other sources, including listeners like you, 188 00:12:49,400 --> 00:12:53,559 Speaker 1: as NPR often says in its messaging. Receiving a single 189 00:12:53,720 --> 00:12:58,000 Speaker 1: percent of funding from the government doesn't sound like the 190 00:12:58,160 --> 00:13:02,760 Speaker 1: term government funded should really apply to NPR. And Allyn 191 00:13:02,880 --> 00:13:07,959 Speaker 1: pointed out again that one of Musk's other companies, Tesla, 192 00:13:08,120 --> 00:13:13,800 Speaker 1: has received literally billions of dollars in government subsidies, and 193 00:13:13,920 --> 00:13:18,280 Speaker 1: yet the Tesla Twitter account does not include the government 194 00:13:18,320 --> 00:13:22,080 Speaker 1: funded label, when it seems like, considering the amount of 195 00:13:22,080 --> 00:13:25,599 Speaker 1: money that company has received in the form of subsidies 196 00:13:25,600 --> 00:13:30,120 Speaker 1: and such, it certainly should if we're being fair about labels.
197 00:13:30,120 --> 00:13:32,840 Speaker 1: So I think it's pretty clear that Musk's intent was 198 00:13:32,880 --> 00:13:37,079 Speaker 1: to try and discredit or dismiss NPR, because, well, the 199 00:13:37,200 --> 00:13:40,200 Speaker 1: organization has this irritating habit of calling Musk out on 200 00:13:40,240 --> 00:13:44,520 Speaker 1: his own BS. Okay, I've got a lot more stories, 201 00:13:44,520 --> 00:13:48,720 Speaker 1: including more Twitter ones, but first let's take a quick break. 202 00:13:58,120 --> 00:14:01,560 Speaker 1: We're back, and we're back with a couple more Twitter 203 00:14:02,640 --> 00:14:08,920 Speaker 1: news items. Next we have the Substack brouhaha. Brouhaha, 204 00:14:09,640 --> 00:14:13,199 Speaker 1: ha ha ha. Shout out to any Firesign Theatre fans 205 00:14:13,200 --> 00:14:17,640 Speaker 1: out there. Anyway, Substack is a platform that lets 206 00:14:17,679 --> 00:14:23,720 Speaker 1: people create and monetize subscription newsletters. So Platformer, the Casey 207 00:14:23,760 --> 00:14:26,040 Speaker 1: Newton newsletter that I mentioned at the beginning of all 208 00:14:26,040 --> 00:14:29,880 Speaker 1: this Twitter mess, is actually built on top of Substack. 209 00:14:30,040 --> 00:14:33,880 Speaker 1: Casey Newton builds Platformer using Substack. Well, one thing 210 00:14:33,880 --> 00:14:37,440 Speaker 1: that Substack is working on is a product called Notes, 211 00:14:37,920 --> 00:14:41,360 Speaker 1: which works a bit like... yeah, you guessed it. It 212 00:14:41,400 --> 00:14:44,320 Speaker 1: works a bit like Twitter does, and it at least 213 00:14:44,360 --> 00:14:49,280 Speaker 1: appears that Musk isn't very fond of such potential competition. 214 00:14:49,760 --> 00:14:52,240 Speaker 1: Though you could argue that some of the problems that 215 00:14:52,320 --> 00:14:56,240 Speaker 1: popped up, maybe those were coincidental. Maybe those problems were 216 00:14:56,280 --> 00:15:00,320 Speaker 1: more evidence that Twitter itself is breaking, as opposed 217 00:15:00,320 --> 00:15:05,840 Speaker 1: to an outright decision to try and penalize Substack. For 218 00:15:05,960 --> 00:15:10,640 Speaker 1: one thing, the ability to embed tweets into a post 219 00:15:10,680 --> 00:15:14,080 Speaker 1: on Substack appeared to be broken. In fact, if you 220 00:15:14,120 --> 00:15:18,480 Speaker 1: were to try and embed a tweet in a Substack 221 00:15:18,680 --> 00:15:22,320 Speaker 1: entry at that time, you would get the message quote 222 00:15:22,720 --> 00:15:27,440 Speaker 1: Twitter has unexpectedly restricted access to embedding tweets in Substack 223 00:15:28,000 --> 00:15:32,840 Speaker 1: posts end quote. I'm not sure how unexpected it was 224 00:15:33,120 --> 00:15:37,920 Speaker 1: on Twitter's side, honestly. Anyway, on top of that, it 225 00:15:38,000 --> 00:15:41,560 Speaker 1: seemed that Twitter was restricting visibility and the promotion of 226 00:15:41,640 --> 00:15:46,920 Speaker 1: tweets that contained links to posts on Substack, and users 227 00:15:46,920 --> 00:15:49,640 Speaker 1: who tried to like or retweet a post that linked 228 00:15:49,680 --> 00:15:54,880 Speaker 1: to Substack itself also received error messages. So it seemed 229 00:15:54,960 --> 00:15:57,640 Speaker 1: like Musk was trying to pull a similar tactic to the one 230 00:15:57,720 --> 00:16:02,120 Speaker 1: he used to head off people jumping ship from Twitter 231 00:16:02,200 --> 00:16:05,360 Speaker 1: to Mastodon a few months back.
You might remember he 232 00:16:05,520 --> 00:16:11,560 Speaker 1: began banning links to Mastodon and some other platforms, 233 00:16:11,760 --> 00:16:15,440 Speaker 1: saying all sorts of things to justify the decision, 234 00:16:15,560 --> 00:16:20,600 Speaker 1: none of which really held much water. Anyway, after numerous 235 00:16:20,640 --> 00:16:24,200 Speaker 1: folks raised a stink about Twitter's new direction with Substack, 236 00:16:24,840 --> 00:16:27,200 Speaker 1: the service appeared to do a one-eighty. So links 237 00:16:27,240 --> 00:16:30,440 Speaker 1: are now working again, and it just seems like Musk 238 00:16:30,480 --> 00:16:34,240 Speaker 1: has backed down a little bit on his stance on 239 00:16:34,280 --> 00:16:38,640 Speaker 1: how to handle this. But yeah, I mean, considering his history, 240 00:16:39,040 --> 00:16:44,280 Speaker 1: it's again not exactly shocking. Moving right along, Politico's Jessica 241 00:16:44,360 --> 00:16:48,160 Speaker 1: Piper reports that Twitter has failed to disclose political ads, 242 00:16:48,680 --> 00:16:52,520 Speaker 1: despite the service outlining a transparency policy saying it would 243 00:16:52,560 --> 00:16:56,040 Speaker 1: do that. So, Twitter, according to its own rules, is 244 00:16:56,080 --> 00:16:59,640 Speaker 1: supposed to label posts that are part of a paid 245 00:16:59,760 --> 00:17:03,120 Speaker 1: political campaign so that you can see as a user 246 00:17:03,680 --> 00:17:08,480 Speaker 1: that the post is a paid political advertisement. But it 247 00:17:08,480 --> 00:17:12,119 Speaker 1: turns out that several ads that were running on Twitter 248 00:17:12,880 --> 00:17:17,560 Speaker 1: in March, you know, just last month, were not labeled 249 00:17:17,840 --> 00:17:21,600 Speaker 1: that way, even though they were clearly actual paid-for 250 00:17:22,080 --> 00:17:29,199 Speaker 1: campaign ads. And so I'm not sure exactly what's happening here. 251 00:17:29,240 --> 00:17:33,080 Speaker 1: It turns out that, you know, Politico asked Twitter to 252 00:17:33,160 --> 00:17:36,080 Speaker 1: comment on this, and Twitter sent over a spreadsheet that's 253 00:17:36,119 --> 00:17:39,800 Speaker 1: supposed to track these things, like, supposed to track all 254 00:17:39,840 --> 00:17:43,199 Speaker 1: the instances of paid political ads, and they failed to 255 00:17:43,240 --> 00:17:47,160 Speaker 1: include at least three instances of different campaigns running ads 256 00:17:47,160 --> 00:17:52,119 Speaker 1: on the platform. So to me, that suggests that Twitter's 257 00:17:52,160 --> 00:17:56,600 Speaker 1: own internal systems may be failing here. You know, 258 00:17:56,600 --> 00:17:59,280 Speaker 1: maybe it's a case of the right hand being unaware 259 00:17:59,320 --> 00:18:03,320 Speaker 1: of what the left hand is doing, because obviously someone 260 00:18:03,440 --> 00:18:07,440 Speaker 1: had to sell that ad space, right? That had to 261 00:18:07,640 --> 00:18:12,919 Speaker 1: be an actual transaction that happened on Twitter's end. But 262 00:18:13,119 --> 00:18:15,680 Speaker 1: somehow that data didn't make it to the proper place 263 00:18:15,720 --> 00:18:19,640 Speaker 1: for disclosure. It didn't make it into this spreadsheet, and 264 00:18:19,680 --> 00:18:23,280 Speaker 1: therefore the tweets were never labeled. Now, to be clear, 265 00:18:24,359 --> 00:18:27,480 Speaker 1: there's not necessarily a law that's been broken here.
This 266 00:18:27,560 --> 00:18:30,680 Speaker 1: is Twitter failing to live up to its own policies. 267 00:18:31,359 --> 00:18:33,840 Speaker 1: But it is important for citizens to be able to 268 00:18:33,840 --> 00:18:37,680 Speaker 1: detect and track things like political ads and political spending. 269 00:18:37,720 --> 00:18:40,879 Speaker 1: For one thing, knowing that something's an ad helps you 270 00:18:40,960 --> 00:18:47,280 Speaker 1: separate it from being, say, an unbiased news source, or 271 00:18:47,320 --> 00:18:51,200 Speaker 1: even a purportedly unbiased news source. If it says ad 272 00:18:51,280 --> 00:18:53,560 Speaker 1: on there, you know, all right, well, this 273 00:18:53,640 --> 00:18:56,399 Speaker 1: is a message that has a specific agenda; otherwise it 274 00:18:56,400 --> 00:19:01,439 Speaker 1: wouldn't be an ad. For another, it helps different groups 275 00:19:01,520 --> 00:19:05,680 Speaker 1: keep tabs on campaigns and campaign spending, as well as 276 00:19:05,720 --> 00:19:09,199 Speaker 1: the activities of not-for-profit organizations that may not 277 00:19:09,400 --> 00:19:14,119 Speaker 1: be subject to really strict campaign laws. Actually, using the 278 00:19:14,200 --> 00:19:17,320 Speaker 1: term really strict is being far too generous about campaign laws in the 279 00:19:17,440 --> 00:19:20,680 Speaker 1: United States. But really, it all comes down 280 00:19:20,720 --> 00:19:23,560 Speaker 1: to tracking money and using that info to determine whom 281 00:19:23,560 --> 00:19:26,320 Speaker 1: you can trust. And it all gets really cynical from 282 00:19:26,359 --> 00:19:29,359 Speaker 1: that point forward. Honestly, I feel like this is an 283 00:19:29,400 --> 00:19:33,639 Speaker 1: indication that systems within Twitter are not working as intended, 284 00:19:34,200 --> 00:19:37,240 Speaker 1: rather than a potential indication that the company is actively 285 00:19:37,280 --> 00:19:40,560 Speaker 1: trying to hide that some political tweets are in fact 286 00:19:40,600 --> 00:19:44,400 Speaker 1: paid advertising. I don't think Twitter was trying to get 287 00:19:44,440 --> 00:19:47,359 Speaker 1: one over on users. I just feel like this was 288 00:19:47,400 --> 00:19:49,600 Speaker 1: a failure. So this feels more like a case where 289 00:19:49,640 --> 00:19:53,760 Speaker 1: it's incompetence rather than malevolence. That just is how it 290 00:19:53,800 --> 00:19:55,960 Speaker 1: feels to me. I admit, you know, I could be 291 00:19:55,960 --> 00:19:58,680 Speaker 1: totally wrong about that, but I don't see 292 00:19:58,680 --> 00:20:02,000 Speaker 1: where the gain is for Twitter to not be transparent 293 00:20:02,040 --> 00:20:06,240 Speaker 1: about this stuff. So I think, to me, this feels 294 00:20:06,280 --> 00:20:09,560 Speaker 1: like an indication that things at Twitter are breaking, and 295 00:20:09,600 --> 00:20:13,520 Speaker 1: Twitter no longer has the engineering staff on hand to 296 00:20:14,400 --> 00:20:19,240 Speaker 1: prevent or repair that stuff in a timely fashion. Something 297 00:20:19,280 --> 00:20:21,880 Speaker 1: else happened on Twitter last week: several Kremlin- 298 00:20:22,000 --> 00:20:28,280 Speaker 1: related Twitter accounts, including Vladimir Putin's presidential account, were reinstated 299 00:20:28,320 --> 00:20:31,879 Speaker 1: on the service.
So last year, last April, Twitter chose 300 00:20:31,960 --> 00:20:35,800 Speaker 1: to restrict the promotion and reach of Russian state media 301 00:20:35,840 --> 00:20:39,600 Speaker 1: accounts and Kremlin-linked accounts in the wake of Russia's 302 00:20:39,640 --> 00:20:44,720 Speaker 1: invasion of Ukraine. Obviously, Russia was leaning heavily on social 303 00:20:44,800 --> 00:20:50,160 Speaker 1: platforms to spread propaganda and misinformation, but The Telegraph reports 304 00:20:50,200 --> 00:20:52,960 Speaker 1: that those restrictions seem to no longer be in place, 305 00:20:53,040 --> 00:20:56,000 Speaker 1: and that when searching for certain topics, the Kremlin-linked 306 00:20:56,040 --> 00:20:59,600 Speaker 1: accounts were frequently in the results, sometimes at the tippy 307 00:20:59,680 --> 00:21:03,640 Speaker 1: top of search results. And The Telegraph also reported that 308 00:21:03,800 --> 00:21:07,520 Speaker 1: they created a brand new Twitter account, they didn't have 309 00:21:07,560 --> 00:21:11,320 Speaker 1: it following anyone in particular, and they noticed that when 310 00:21:11,359 --> 00:21:15,919 Speaker 1: they went to the For You tab, the curated tab, to 311 00:21:16,119 --> 00:21:20,280 Speaker 1: check and see what accounts were showing up there, some 312 00:21:20,359 --> 00:21:23,040 Speaker 1: of the Kremlin-linked accounts were showing up in the 313 00:21:23,119 --> 00:21:25,800 Speaker 1: For You tab, even though this brand new account had 314 00:21:25,840 --> 00:21:30,120 Speaker 1: not followed any of those. So hey, apparently Musk thinks 315 00:21:30,280 --> 00:21:34,760 Speaker 1: NPR should not be trusted, but actual state-backed accounts 316 00:21:34,800 --> 00:21:39,960 Speaker 1: from Russia and also from China are A-OK. Gee, Twitter 317 00:21:40,000 --> 00:21:42,520 Speaker 1: sounds more and more like a dystopian nightmare to me. 318 00:21:43,240 --> 00:21:45,800 Speaker 1: By the way, if you want to suggest topics, you 319 00:21:45,840 --> 00:21:48,959 Speaker 1: can use Twitter and send a message to TechStuff 320 00:21:49,080 --> 00:21:51,840 Speaker 1: HSW, though I think I need to come up with 321 00:21:51,880 --> 00:21:55,879 Speaker 1: an alternative for getting in touch with me. And wrapping 322 00:21:56,080 --> 00:22:00,600 Speaker 1: up the Twitter section of our news, finally, is a 323 00:22:00,640 --> 00:22:04,080 Speaker 1: report that Twitter appears to have acquiesced to the Government 324 00:22:04,119 --> 00:22:08,919 Speaker 1: of India's demands that objectionable messages, that is, messages that 325 00:22:09,000 --> 00:22:12,800 Speaker 1: the Government of India objects to, should be suppressed not 326 00:22:12,920 --> 00:22:17,680 Speaker 1: just in India, but around the entire world. And if 327 00:22:17,680 --> 00:22:21,520 Speaker 1: this is true, then it means that Twitter is actually 328 00:22:21,880 --> 00:22:30,080 Speaker 1: censoring itself globally at the direction of the Government of India. 329 00:22:31,040 --> 00:22:35,440 Speaker 1: You know, typically in the past, when Twitter would agree 330 00:22:35,480 --> 00:22:39,959 Speaker 1: to government demands in India, it would suppress a message, 331 00:22:39,960 --> 00:22:42,159 Speaker 1: but it would just be within India itself. If you 332 00:22:42,200 --> 00:22:45,680 Speaker 1: were outside of India, you could still see the tweets 333 00:22:45,720 --> 00:22:51,200 Speaker 1: in question.
But apparently journalist and activist Saurav Das posted 334 00:22:51,240 --> 00:22:55,159 Speaker 1: some messages, and when they went back to look over 335 00:22:55,160 --> 00:22:59,360 Speaker 1: their Twitter history, saw that the tweets had been tagged 336 00:22:59,359 --> 00:23:02,919 Speaker 1: with the phrase that the content was quote withheld 337 00:23:03,000 --> 00:23:07,359 Speaker 1: worldwide in response to a legal demand end quote. So 338 00:23:07,520 --> 00:23:11,679 Speaker 1: Das included screenshots of these tweets, which were made in 339 00:23:11,800 --> 00:23:14,760 Speaker 1: twenty twenty-two, and they said that they couldn't remember 340 00:23:14,840 --> 00:23:19,280 Speaker 1: the context of those messages or why they would be 341 00:23:19,320 --> 00:23:24,000 Speaker 1: suppressed worldwide. The website TheHindu.com attempted to 342 00:23:24,119 --> 00:23:28,600 Speaker 1: use a virtual private network, or VPN, to see if 343 00:23:28,640 --> 00:23:32,919 Speaker 1: those suppressed tweets could be viewed in the United States, because, again, 344 00:23:33,359 --> 00:23:37,120 Speaker 1: typically these sorts of suppressions would only happen within India itself. 345 00:23:37,600 --> 00:23:40,480 Speaker 1: But The Hindu discovered that even if they were looking 346 00:23:40,520 --> 00:23:45,720 Speaker 1: at Twitter from a United States server, they would not 347 00:23:45,800 --> 00:23:48,080 Speaker 1: be able to see those messages, which is a pretty 348 00:23:48,160 --> 00:23:51,840 Speaker 1: huge deal, particularly for a company led by someone who 349 00:23:51,840 --> 00:23:56,400 Speaker 1: proclaims to be all about free speech. It all just gets 350 00:23:56,400 --> 00:24:01,119 Speaker 1: worse and worse. But thankfully we are done with Twitter. 351 00:24:01,480 --> 00:24:04,480 Speaker 1: So when we come back from this break, we will 352 00:24:04,520 --> 00:24:17,560 Speaker 1: move on to something else. Okay, we're back. Yeah, we're 353 00:24:17,560 --> 00:24:22,520 Speaker 1: moving on to something else. And that something else is Tesla, 354 00:24:23,240 --> 00:24:25,560 Speaker 1: son of a... All right, let's get through this and 355 00:24:25,600 --> 00:24:27,480 Speaker 1: then we'll get on to something else. All right. So 356 00:24:27,640 --> 00:24:33,240 Speaker 1: last week Reuters reported that from twenty nineteen through mid 357 00:24:33,440 --> 00:24:38,800 Speaker 1: twenty twenty-two, and possibly beyond, some folks at Tesla, 358 00:24:39,119 --> 00:24:47,359 Speaker 1: including management, would occasionally pull images captured by Tesla car cameras. Remember, 359 00:24:47,520 --> 00:24:51,280 Speaker 1: Tesla has used optical systems, in fact, to the point 360 00:24:51,280 --> 00:24:55,000 Speaker 1: where they've started to remove things like ultrasonic systems and 361 00:24:55,119 --> 00:24:59,440 Speaker 1: just rely on optical systems for the purposes of collision detection, navigation, 362 00:24:59,480 --> 00:25:02,560 Speaker 1: that kind of stuff. Well, these cameras can also work 363 00:25:02,560 --> 00:25:05,679 Speaker 1: as part of the security system.
When you're charging your vehicle, 364 00:25:05,960 --> 00:25:08,879 Speaker 1: those cameras can be active, and it turns out that 365 00:25:09,000 --> 00:25:12,200 Speaker 1: Tesla employees were sometimes just peeking in on what these 366 00:25:12,240 --> 00:25:16,159 Speaker 1: camera systems were able to see, whether the car was 367 00:25:16,200 --> 00:25:19,280 Speaker 1: in operation or was in recharge mode, and then they 368 00:25:19,280 --> 00:25:24,280 Speaker 1: were sharing images and videos captured by those cameras. So 369 00:25:24,280 --> 00:25:28,520 Speaker 1: it's as if these Tesla employees had implanted cameras in 370 00:25:28,600 --> 00:25:33,760 Speaker 1: the homes of Tesla customers, essentially, and this includes stuff 371 00:25:33,760 --> 00:25:37,679 Speaker 1: that clearly the owners of those Tesla vehicles would not 372 00:25:38,320 --> 00:25:42,439 Speaker 1: want to share with the outside world. Now, some of 373 00:25:42,440 --> 00:25:45,679 Speaker 1: the photos were fairly mundane, like it might be a 374 00:25:45,720 --> 00:25:49,920 Speaker 1: funny road sign. So someone at Tesla, for some reason, 375 00:25:50,200 --> 00:25:53,280 Speaker 1: started to look at the video feed created by this 376 00:25:53,359 --> 00:25:57,040 Speaker 1: particular person's car, saw a funny road sign, and 377 00:25:57,080 --> 00:25:59,119 Speaker 1: they might clip that and share it with other people. 378 00:25:59,560 --> 00:26:02,040 Speaker 1: It still raises the question of why they accessed the 379 00:26:02,160 --> 00:26:06,040 Speaker 1: video feed in the first place. And also, there must 380 00:26:06,080 --> 00:26:09,640 Speaker 1: be policies at Tesla that say you don't do this, right? 381 00:26:09,680 --> 00:26:13,639 Speaker 1: You don't take images from someone's vehicle and just share 382 00:26:13,640 --> 00:26:17,920 Speaker 1: them within the company. Apparently a lot of the messages included, hey, 383 00:26:18,320 --> 00:26:20,760 Speaker 1: don't share this, or don't talk about this, or delete 384 00:26:20,760 --> 00:26:23,280 Speaker 1: after you see it. And it was just people who 385 00:26:23,280 --> 00:26:25,800 Speaker 1: just thought it was such a good picture or video 386 00:26:25,880 --> 00:26:28,320 Speaker 1: or funny idea that they had to share it. But 387 00:26:28,400 --> 00:26:30,760 Speaker 1: no one else should do that because you'll get in trouble. 388 00:26:31,240 --> 00:26:34,000 Speaker 1: So some of the stuff being captured was not 389 00:26:34,240 --> 00:26:38,280 Speaker 1: just simple funny road signs or a pet behaving in 390 00:26:38,320 --> 00:26:42,400 Speaker 1: a goofy way. There were cases of people being, say, 391 00:26:42,520 --> 00:26:45,719 Speaker 1: in the garage stark naked. And you know, I have 392 00:26:45,760 --> 00:26:48,359 Speaker 1: no clue why they were in their garage and they 393 00:26:48,359 --> 00:26:51,840 Speaker 1: were naked, but I don't have any business knowing that. 394 00:26:52,000 --> 00:26:54,840 Speaker 1: It's none of my business, and I'm pretty sure that 395 00:26:54,960 --> 00:26:57,360 Speaker 1: the owner in question wasn't thinking that they were at 396 00:26:57,440 --> 00:27:00,520 Speaker 1: risk of being in a Candid Camera-like situation at 397 00:27:00,520 --> 00:27:04,399 Speaker 1: the time.
Anyway, I think the whole sending photos and 398 00:27:04,480 --> 00:27:10,439 Speaker 1: videos around is an incredibly deep violation of trust, and 399 00:27:10,560 --> 00:27:15,240 Speaker 1: again, it has to be violating some Tesla policies. So 400 00:27:15,320 --> 00:27:18,200 Speaker 1: now there is a class-action lawsuit that's been brought 401 00:27:18,320 --> 00:27:22,920 Speaker 1: against Tesla about this very issue, to hold the organization 402 00:27:23,040 --> 00:27:27,680 Speaker 1: accountable for this behavior and to force a change. Now, 403 00:27:27,680 --> 00:27:29,480 Speaker 1: I'm sure we're going to hear a lot more about 404 00:27:29,520 --> 00:27:32,360 Speaker 1: this as the case continues. It would shock me if 405 00:27:32,400 --> 00:27:35,320 Speaker 1: Tesla does not settle out of court. I am certain 406 00:27:35,359 --> 00:27:37,560 Speaker 1: that that's going to be the ultimate end of this, 407 00:27:38,000 --> 00:27:39,919 Speaker 1: because I don't think the company has a leg to 408 00:27:39,960 --> 00:27:42,320 Speaker 1: stand on. Even if they argue that this is the 409 00:27:42,400 --> 00:27:46,280 Speaker 1: behavior of a few outlying bad apples, it appears that 410 00:27:46,359 --> 00:27:51,000 Speaker 1: it was widespread enough to really be an issue within 411 00:27:51,040 --> 00:27:54,480 Speaker 1: the company, and ultimately the company does have to be 412 00:27:54,600 --> 00:27:59,760 Speaker 1: held accountable. Plus, this particular lawsuit is 413 00:27:59,800 --> 00:28:02,720 Speaker 1: being filed in the state of California, and Tesla is not a 414 00:28:02,880 --> 00:28:06,960 Speaker 1: big hero in the state of California right now among 415 00:28:07,200 --> 00:28:11,000 Speaker 1: the government. And also, the state of California has some 416 00:28:11,240 --> 00:28:15,439 Speaker 1: really tough privacy laws, some of the toughest in the 417 00:28:15,520 --> 00:28:19,440 Speaker 1: United States. So I suspect that Tesla is going to 418 00:28:19,480 --> 00:28:23,080 Speaker 1: be trying to negotiate some sort of settlement and then 419 00:28:23,160 --> 00:28:28,200 Speaker 1: potentially create a policy where people who do this will 420 00:28:28,280 --> 00:28:34,080 Speaker 1: face some heavy consequences for behaving in this way in 421 00:28:34,080 --> 00:28:37,399 Speaker 1: the future. Or one can hope, anyway. Now, I do 422 00:28:37,480 --> 00:28:39,520 Speaker 1: have some other news to talk about that is not 423 00:28:39,760 --> 00:28:44,560 Speaker 1: Twitter or Tesla related. So, while this episode is longer 424 00:28:44,600 --> 00:28:47,040 Speaker 1: than I had hoped it would be, I'm gonna kind 425 00:28:47,040 --> 00:28:51,120 Speaker 1: of summarize some of the next few news items. So 426 00:28:51,200 --> 00:28:53,960 Speaker 1: first up, there's a growing movement in the US government 427 00:28:54,360 --> 00:28:57,160 Speaker 1: to look into ways to deal with the evolution and 428 00:28:57,240 --> 00:29:03,960 Speaker 1: proliferation of AI, largely driven by concerns about ChatGPT in particular.
Now, 429 00:29:04,120 --> 00:29:06,160 Speaker 1: I think I'm going to have to dedicate an entire 430 00:29:06,240 --> 00:29:10,240 Speaker 1: episode to this topic, because I think it's actually incredibly 431 00:29:10,280 --> 00:29:14,760 Speaker 1: complicated and it requires a lot more thought and analysis 432 00:29:14,760 --> 00:29:18,600 Speaker 1: than I could ever deliver in a news episode. So 433 00:29:18,720 --> 00:29:21,120 Speaker 1: be on the lookout for that in the future. I'll 434 00:29:21,160 --> 00:29:23,840 Speaker 1: just say that while the recent emergence of tools like 435 00:29:23,920 --> 00:29:28,920 Speaker 1: ChatGPT creates concerns, legitimate concerns, the matter of AI 436 00:29:29,240 --> 00:29:32,320 Speaker 1: is far more broad than a chatbot built on 437 00:29:32,320 --> 00:29:35,880 Speaker 1: a large language model, and so we really need to 438 00:29:35,960 --> 00:29:39,320 Speaker 1: have a full discussion. Also, I have very little hope 439 00:29:40,080 --> 00:29:43,920 Speaker 1: that the government will come up with anything really meaningful 440 00:29:44,040 --> 00:29:46,600 Speaker 1: or useful on this topic, not because I have a 441 00:29:46,600 --> 00:29:49,280 Speaker 1: total lack of faith in government. I do not. I 442 00:29:49,360 --> 00:29:53,160 Speaker 1: have faith in government. It's that a lot of the 443 00:29:53,200 --> 00:29:58,400 Speaker 1: people who are in government positions don't understand AI, not 444 00:29:58,400 --> 00:30:02,560 Speaker 1: on a level that would make it useful. So my 445 00:30:02,640 --> 00:30:06,160 Speaker 1: guess is that any legislation or regulation that was created 446 00:30:06,800 --> 00:30:10,000 Speaker 1: would be painted with a very broad brush and not 447 00:30:10,160 --> 00:30:13,240 Speaker 1: be as effective as it needs to be because of 448 00:30:13,280 --> 00:30:17,120 Speaker 1: a lack of understanding among government officials. But again, I'll 449 00:30:17,120 --> 00:30:19,000 Speaker 1: have to do a full episode about this to really 450 00:30:19,040 --> 00:30:22,480 Speaker 1: dive into it. Also, I want to recommend an amazing 451 00:30:22,600 --> 00:30:26,840 Speaker 1: article written by Benj Edwards. That's B-E-N-J Edwards, 452 00:30:27,120 --> 00:30:29,720 Speaker 1: and this is at Ars Technica. So if you go 453 00:30:29,760 --> 00:30:33,360 Speaker 1: to Ars Technica, you should look up Why ChatGPT 454 00:30:33,760 --> 00:30:37,600 Speaker 1: and Bing Chat are so good at making things up. 455 00:30:38,240 --> 00:30:41,760 Speaker 1: So this article is really great at explaining how ChatGPT 456 00:30:41,800 --> 00:30:45,200 Speaker 1: works at a very high level and how that 457 00:30:45,280 --> 00:30:48,040 Speaker 1: basic mode of operation results in stuff that you can't 458 00:30:48,080 --> 00:30:51,560 Speaker 1: always trust. Now, I have frequently said that one of 459 00:30:51,600 --> 00:30:54,840 Speaker 1: the problems you have is that chatbots like ChatGPT 460 00:30:55,520 --> 00:30:58,280 Speaker 1: don't necessarily know the difference between a good source of 461 00:30:58,320 --> 00:31:01,040 Speaker 1: information and a bad source of information, and that is part 462 00:31:01,080 --> 00:31:04,080 Speaker 1: of it. But Edwards points out that it actually goes 463 00:31:04,120 --> 00:31:07,280 Speaker 1: beyond this. It's a little more complicated than that, and 464 00:31:07,680 --> 00:31:12,160 Speaker 1: the explanation is fantastic.
The argument is very well done, 465 00:31:12,680 --> 00:31:15,880 Speaker 1: so definitely check out that article, Why ChatGPT and 466 00:31:15,960 --> 00:31:19,000 Speaker 1: Bing Chat are so good at making things up. I'll 467 00:31:19,000 --> 00:31:22,080 Speaker 1: probably have to do a full episode about that as well, 468 00:31:22,320 --> 00:31:29,400 Speaker 1: and talk about the phenomenon of chatbot quote unquote hallucinations. Now, 469 00:31:29,440 --> 00:31:34,120 Speaker 1: a quick note about Meta's verification process. So Meta rolled 470 00:31:34,120 --> 00:31:36,240 Speaker 1: this out a few weeks ago. Of course, Meta's the 471 00:31:36,320 --> 00:31:41,480 Speaker 1: parent company to Facebook and Instagram and WhatsApp, and this 472 00:31:41,720 --> 00:31:47,080 Speaker 1: new policy allows users to pay for verification. So they 473 00:31:47,320 --> 00:31:49,320 Speaker 1: have to not just pay a fee to do this, 474 00:31:49,360 --> 00:31:52,280 Speaker 1: they also have to submit information about themselves to prove 475 00:31:52,360 --> 00:31:55,320 Speaker 1: to Meta that they are who they claim to be, 476 00:31:55,880 --> 00:32:00,800 Speaker 1: and if their qualifications meet Meta's standards and the payment 477 00:32:00,840 --> 00:32:04,080 Speaker 1: goes through, they get a little check next to their 478 00:32:04,160 --> 00:32:07,960 Speaker 1: name showing that they are verified. And this ends up 479 00:32:08,080 --> 00:32:13,440 Speaker 1: being, one, a social status thing, but, two, verified accounts 480 00:32:13,480 --> 00:32:17,400 Speaker 1: get access to some other perks, including better customer service 481 00:32:17,920 --> 00:32:19,960 Speaker 1: and also some stuff that I think should just be 482 00:32:20,080 --> 00:32:26,160 Speaker 1: freaking standard for everyone on Meta's platforms. I mean, you know, 483 00:32:26,520 --> 00:32:31,240 Speaker 1: one of those things is like being protected against impersonators. 484 00:32:31,800 --> 00:32:33,840 Speaker 1: And if I'm told, hey, if you pay us, we'll 485 00:32:33,880 --> 00:32:36,680 Speaker 1: help make sure that folks pretending to be you are stopped, 486 00:32:37,600 --> 00:32:40,719 Speaker 1: that's a message that screams to me that I 487 00:32:40,760 --> 00:32:43,080 Speaker 1: don't want to be on that platform, if users have 488 00:32:43,160 --> 00:32:46,760 Speaker 1: to pay to receive that kind of baseline protection, or 489 00:32:46,800 --> 00:32:49,920 Speaker 1: what I think should be baseline protection. Like, to me, 490 00:32:50,000 --> 00:32:52,880 Speaker 1: that's something that should just go without saying, and 491 00:32:52,960 --> 00:32:55,320 Speaker 1: it shouldn't be, oh, well, you know, if you pay, 492 00:32:55,360 --> 00:32:58,080 Speaker 1: then we'll take care of you. But you know me, 493 00:32:58,280 --> 00:33:02,520 Speaker 1: I'm a grouch anyway. TechCrunch reports that a big issue 494 00:33:02,560 --> 00:33:06,160 Speaker 1: among some of Meta's users is that, as part 495 00:33:06,160 --> 00:33:10,240 Speaker 1: of the system, the company is requiring users to use 496 00:33:10,280 --> 00:33:14,640 Speaker 1: their real names as their user name in their profiles.
497 00:33:14,640 --> 00:33:17,840 Speaker 1: That is, the name of their profile has to reflect their 498 00:33:17,880 --> 00:33:20,959 Speaker 1: actual legal name, and that can be a real problem 499 00:33:21,000 --> 00:33:24,400 Speaker 1: for folks in, say, the sex work industry or the 500 00:33:24,440 --> 00:33:27,560 Speaker 1: trans community. These are people who may use names that 501 00:33:27,600 --> 00:33:32,880 Speaker 1: are not their actual legal name as their public persona. 502 00:33:33,040 --> 00:33:36,160 Speaker 1: And as some have said, this is like the company 503 00:33:36,280 --> 00:33:39,040 Speaker 1: is asking you to pay it fifteen dollars and, in return, 504 00:33:39,160 --> 00:33:42,600 Speaker 1: it will dox you. You might remember that Google 505 00:33:42,640 --> 00:33:45,440 Speaker 1: went through a very similar thing a few years ago, 506 00:33:45,920 --> 00:33:49,600 Speaker 1: when Google tried to align its former social platform of 507 00:33:49,680 --> 00:33:53,960 Speaker 1: Google Plus with YouTube and other Google products. As part 508 00:33:54,000 --> 00:33:57,640 Speaker 1: of that, Google was requiring people to use their legal name, 509 00:33:57,640 --> 00:34:00,360 Speaker 1: and the thought process behind it was stuff like, oh, 510 00:34:00,400 --> 00:34:03,480 Speaker 1: if people are using their legal names, especially on YouTube, 511 00:34:03,520 --> 00:34:05,920 Speaker 1: then that's going to cut back on abuse, because people 512 00:34:06,200 --> 00:34:10,400 Speaker 1: will feel that they're being accountable, right, because they're no 513 00:34:10,440 --> 00:34:12,960 Speaker 1: longer hiding behind a user name that's not their real name. 514 00:34:13,520 --> 00:34:16,720 Speaker 1: But Google also faced the same sort of objections. People 515 00:34:16,760 --> 00:34:19,880 Speaker 1: said having to use our legal names sometimes puts us 516 00:34:19,880 --> 00:34:23,080 Speaker 1: in danger, depending upon who we are and what we do, 517 00:34:23,600 --> 00:34:28,080 Speaker 1: and ultimately Google backed off of that policy. But so 518 00:34:28,120 --> 00:34:31,360 Speaker 1: far we haven't seen Meta budge. And while I understand 519 00:34:31,360 --> 00:34:33,520 Speaker 1: the need to verify with Meta that you are who 520 00:34:33,560 --> 00:34:36,360 Speaker 1: you say you are, I mean, that's the whole purpose 521 00:34:36,400 --> 00:34:39,560 Speaker 1: of verification in the first place, I don't actually see 522 00:34:39,560 --> 00:34:43,520 Speaker 1: how that should pertain to the user name itself, particularly 523 00:34:43,560 --> 00:34:46,399 Speaker 1: if people know you better by, like, a stage name 524 00:34:46,520 --> 00:34:51,520 Speaker 1: or something. So yeah, I don't see where that part 525 00:34:51,640 --> 00:34:54,440 Speaker 1: needs to come in. The only thing I 526 00:34:54,520 --> 00:34:57,000 Speaker 1: see is that to get that verification check, you have 527 00:34:57,040 --> 00:34:59,960 Speaker 1: to verify with the company who you are. But then 528 00:35:00,040 --> 00:35:03,640 Speaker 1: that information should remain secure and not have to be 529 00:35:03,680 --> 00:35:08,480 Speaker 1: publicly disclosed. Oh, and hey, a couple more quick things.
530 00:35:08,640 --> 00:35:11,719 Speaker 1: You know those public USB charging stations you can find 531 00:35:11,760 --> 00:35:14,520 Speaker 1: in some places, like an actual little USB port you 532 00:35:14,560 --> 00:35:16,800 Speaker 1: can plug your cable in and then charge your phone, 533 00:35:17,520 --> 00:35:22,920 Speaker 1: at airports or certain hotels or sometimes cafes? Well, the 534 00:35:23,000 --> 00:35:26,840 Speaker 1: FBI says you shouldn't use those, which honestly makes sense. 535 00:35:26,960 --> 00:35:31,120 Speaker 1: So apparently some hackers have compromised some of these stations 536 00:35:31,120 --> 00:35:34,279 Speaker 1: and they're using them to inject malware into stuff like 537 00:35:34,400 --> 00:35:37,680 Speaker 1: phones and tablets. So the malware might do your typical 538 00:35:37,760 --> 00:35:42,560 Speaker 1: identity theft stuff of logging passwords and user names. Maybe 539 00:35:42,560 --> 00:35:44,360 Speaker 1: it even does stuff where it can access things like 540 00:35:44,400 --> 00:35:47,920 Speaker 1: your microphone and your camera, and that is a pretty 541 00:35:48,200 --> 00:35:51,000 Speaker 1: darn steep trade-off for getting a few more minutes 542 00:35:51,000 --> 00:35:56,239 Speaker 1: of battery life. So rather than use those ports, the 543 00:35:56,320 --> 00:35:59,880 Speaker 1: FBI recommends that you use an actual, like, wall outlet 544 00:36:00,400 --> 00:36:03,600 Speaker 1: and a plug that plugs into the outlet and then, 545 00:36:03,760 --> 00:36:06,960 Speaker 1: you know, the cable to your device. Don't just plug 546 00:36:06,960 --> 00:36:09,480 Speaker 1: a USB cord into a USB jack and then hook 547 00:36:09,560 --> 00:36:12,480 Speaker 1: up your phone, which is fun times. Anyway, this advice 548 00:36:12,600 --> 00:36:16,000 Speaker 1: is good advice. It's healthy to think about those public 549 00:36:16,080 --> 00:36:20,440 Speaker 1: charging stations being similar to a USB drive that's been 550 00:36:20,520 --> 00:36:22,759 Speaker 1: left on the floor. You wouldn't want to pick up 551 00:36:22,760 --> 00:36:25,480 Speaker 1: that USB drive and just plug it into whatever device 552 00:36:25,520 --> 00:36:30,000 Speaker 1: you happen to own, because you're just playing 553 00:36:30,120 --> 00:36:33,560 Speaker 1: Russian roulette that that USB device doesn't have malware that's 554 00:36:33,600 --> 00:36:37,120 Speaker 1: immediately going to install itself on your device. You've got 555 00:36:37,120 --> 00:36:40,120 Speaker 1: to use that same sort of thought process when you're 556 00:36:40,120 --> 00:36:45,840 Speaker 1: talking about USB charging stations. Finally, YouTube continues to add 557 00:36:45,880 --> 00:36:52,360 Speaker 1: support for podcasts. There's now a podcast tab on channels. So 558 00:36:52,360 --> 00:36:53,920 Speaker 1: when you go into channels, you'll see things like 559 00:36:54,000 --> 00:36:57,200 Speaker 1: videos and playlists and stuff like that. Well, Podcasts is 560 00:36:57,239 --> 00:37:00,880 Speaker 1: a new tab that's going to be appearing on channel pages. 561 00:37:01,239 --> 00:37:04,920 Speaker 1: So this way creators can classify videos as podcast episodes 562 00:37:05,280 --> 00:37:08,600 Speaker 1: and use the podcast tab, which can really help with discovery.
563 00:37:08,880 --> 00:37:11,719 Speaker 1: If someone is interested in looking up a specific episode 564 00:37:12,600 --> 00:37:15,080 Speaker 1: and they don't want to filter through all the different 565 00:37:15,160 --> 00:37:17,400 Speaker 1: videos on a channel, then it makes it a lot easier. 566 00:37:17,680 --> 00:37:20,120 Speaker 1: I think that's pretty darn cool. I know there are 567 00:37:20,200 --> 00:37:23,960 Speaker 1: quite a few podcasts out there that release part or 568 00:37:24,120 --> 00:37:28,480 Speaker 1: all of their episodes as video on YouTube. I think some 569 00:37:28,560 --> 00:37:32,120 Speaker 1: of my colleagues are looking into doing something along those lines, 570 00:37:32,160 --> 00:37:35,359 Speaker 1: and I think that's super awesome. I don't think I'm 571 00:37:35,360 --> 00:37:38,440 Speaker 1: going to jump on that wagon myself, but only because 572 00:37:38,880 --> 00:37:41,480 Speaker 1: TechStuff is most of the time a solo show, 573 00:37:42,080 --> 00:37:44,799 Speaker 1: and I just can't imagine that any of you out 574 00:37:44,840 --> 00:37:47,200 Speaker 1: there would be interested in just seeing me talk to 575 00:37:47,239 --> 00:37:52,200 Speaker 1: a microphone by myself for forty minutes or whatever. I 576 00:37:52,280 --> 00:37:54,560 Speaker 1: just don't see how that would be at all appealing 577 00:37:54,640 --> 00:37:57,560 Speaker 1: to you. So I don't think I'm ever going to 578 00:37:57,600 --> 00:38:02,239 Speaker 1: be using this particular feature unless, you know, I get 579 00:38:02,239 --> 00:38:04,520 Speaker 1: a co-host or something. So for multi-host shows, 580 00:38:04,560 --> 00:38:07,880 Speaker 1: I think this is great. By the way, I'm also 581 00:38:07,880 --> 00:38:11,680 Speaker 1: of the opinion that podcasters should try to get on 582 00:38:11,719 --> 00:38:14,719 Speaker 1: all platforms in order to reach their audience. Like, you 583 00:38:14,760 --> 00:38:18,080 Speaker 1: shouldn't be platform specific. You should try and be on 584 00:38:18,120 --> 00:38:20,719 Speaker 1: all of them if you can, unless you're a superstar 585 00:38:20,719 --> 00:38:23,080 Speaker 1: who can put their podcasts behind a paywall and folks 586 00:38:23,080 --> 00:38:25,120 Speaker 1: will still flock to you. But you know, for the 587 00:38:25,120 --> 00:38:28,520 Speaker 1: rest of us, I think making your show as accessible 588 00:38:28,080 --> 00:38:31,359 Speaker 1: as possible is a great strategy. You're going to find 589 00:38:31,440 --> 00:38:34,520 Speaker 1: a lot more listeners that way, and you're not going 590 00:38:34,560 --> 00:38:37,040 Speaker 1: to prevent people from finding you, which is really good. 591 00:38:37,360 --> 00:38:40,800 Speaker 1: So yeah, I think this is a good thing overall. Again, 592 00:38:40,920 --> 00:38:44,120 Speaker 1: I don't have my show on YouTube. I don't have 593 00:38:44,160 --> 00:38:47,680 Speaker 1: any connection to YouTube. I just think that this is 594 00:38:47,760 --> 00:38:49,799 Speaker 1: a cool thing for people who do like 595 00:38:49,920 --> 00:38:54,879 Speaker 1: to consume podcasts through YouTube. I occasionally will look up 596 00:38:54,960 --> 00:38:58,960 Speaker 1: clips from podcasts I subscribe to on YouTube, like I'll 597 00:38:59,120 --> 00:39:03,200 Speaker 1: watch a clip of a recording, but typically I 598 00:39:03,280 --> 00:39:07,960 Speaker 1: listen to podcasts just in audio form.
I don't tend 599 00:39:08,680 --> 00:39:13,759 Speaker 1: to do that on YouTube exclusively. I usually rely on 600 00:39:13,800 --> 00:39:17,520 Speaker 1: my phone, but sometimes shows that I like will also 601 00:39:17,600 --> 00:39:20,440 Speaker 1: upload video clips, and I like to watch those just 602 00:39:20,480 --> 00:39:23,319 Speaker 1: to see the people that I listen to actually go 603 00:39:23,440 --> 00:39:27,719 Speaker 1: through the reactions they have. MBMBaM, or My Brother, 604 00:39:27,840 --> 00:39:30,080 Speaker 1: My Brother and Me, does this with the little clips 605 00:39:30,120 --> 00:39:32,200 Speaker 1: from their shows, and I always enjoy that, because they 606 00:39:32,320 --> 00:39:36,720 Speaker 1: tend to be pretty entertaining. But yeah, I think it's cool. 607 00:39:37,040 --> 00:39:40,640 Speaker 1: I love to see more support for podcasts across the board. 608 00:39:41,400 --> 00:39:46,120 Speaker 1: And that wraps up this news episode. Lots of Twitter 609 00:39:46,160 --> 00:39:48,200 Speaker 1: stuff in there, but like I said, it's been a 610 00:39:48,200 --> 00:39:50,800 Speaker 1: while since I really covered it, and a ton 611 00:39:50,880 --> 00:39:53,040 Speaker 1: of information had come out over the last week, so 612 00:39:53,080 --> 00:39:56,440 Speaker 1: I kind of wanted to barrel through it. Hopefully on Thursday, 613 00:39:56,480 --> 00:39:59,120 Speaker 1: we won't have nearly as much to say about that, 614 00:39:59,160 --> 00:40:01,520 Speaker 1: and we can look at other news in the tech 615 00:40:01,600 --> 00:40:04,239 Speaker 1: space. In the meantime, I hope you are all well, 616 00:40:04,680 --> 00:40:14,240 Speaker 1: and I'll talk to you again really soon. TechStuff 617 00:40:14,360 --> 00:40:18,879 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 618 00:40:18,920 --> 00:40:22,440 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 619 00:40:22,480 --> 00:40:23,400 Speaker 1: your favorite shows.