Welcome to Tech Stuff, a production from iHeartRadio.

Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for Tuesday, November twenty-first, twenty twenty-three. And first up, of course, we have some updates to the OpenAI situation. Now, if you want to hear what led OpenAI's board of directors to fire CEO Sam Altman last Friday, check out yesterday's Tech Stuff episode. I did a full episode about what led to that event and what is happening in the aftermath. But I do have a couple of new bits to add to that.

First up, according to The Verge, Sam Altman is still seeking reinstatement as OpenAI CEO. Yesterday I talked about how Altman claimed he would never wear a guest badge at OpenAI's headquarters again. He wore one once in order to negotiate his potential return to the company, but once the board of directors said no, those negotiations went nowhere. So then he reportedly joined Microsoft, where he would be named CEO of an advanced AI division, or maybe a spinoff company. But apparently that's just one likely scenario, and the ink has not yet been put to paper on that deal, so it's not a certainty that he will be part of Microsoft. There is another possibility: should the board members who voted Altman out step down, Altman would return to the company as CEO. That was the main sticking point in the negotiations over the weekend. However, since then, one board member, Ilya Sutskever, has come forward to express his regret for taking part in the whole thing, said it was a mistake, and said that he now sides with Altman. That means there are just two members left on the OpenAI board of directors who would have to change their opinion, and there's tremendous pressure on them to do so and to step down.
Not only do you have the PR nightmare of going through perhaps the worst CEO firing in recent history, but you've also got an angry corporate partner in the form of Microsoft, and you've got the threat of perhaps hundreds of employees leaving the company in solidarity with the ousted CEO. So refusing to step down could mean OpenAI would be set back years. But then, the reported reason this board fired Altman in the first place was that they were concerned he was pushing hard to develop and deploy AI tools without giving proper regard to safety, and that's the whole reason OpenAI exists in the first place. So this could be a case where those board members decide they have to go down with the ship: what does it mean to have an organization ostensibly dedicated to developing AI in an ethical and safe way if you don't do that? Or perhaps the threat to OpenAI has actually been overstated. Maybe it's not really in such a precarious position. Maybe the organization will be fine after passing through, you know, an unavoidable period of turmoil.

Meanwhile, Microsoft CEO Satya Nadella has said in interviews with CNBC and Bloomberg that it's clear OpenAI "has to change around the governance," perhaps hinting that someone from Microsoft should also sit on OpenAI's board of directors. After all, Microsoft has committed to investing ten billion dollars into OpenAI and has already sent a few billion of those dollars to the company.

The chaos at OpenAI has had some odd rippling effects as well. For example, when word got out that Microsoft was apparently hiring both Sam Altman and former OpenAI president Greg Brockman, the company saw its stock price increase. In fact, Microsoft's stock rose to three hundred seventy-seven dollars and forty-four cents per share yesterday. That is the highest closing stock price in Microsoft history.
It's pretty wild to me that a couple of executives coming over from another company could be enough to boost stock prices like that. Also, in a similar vein, when word first got out that OpenAI had fired Altman last Friday, Microsoft saw its stock price dip by one point seven percent. I think this really shows that people feel AI is a critically important technology. I'm still not convinced that very many of them actually understand artificial intelligence, but then I'm not getting rich off the stock market, so what do I know?

Then you have the various companies that are champing at the bit to get their hands on the talent at OpenAI. Hundreds of OpenAI staff, some of the most knowledgeable people in the discipline of artificial intelligence, have indicated a willingness to leave the company if Altman doesn't return as CEO. And one company that has expressed interest in hiring any talent that may feel a change of scenery is merited is Salesforce. CEO Marc Benioff posted on X, the platform formerly known as Twitter and a company we'll be talking a lot about later in this episode, that "Salesforce will match any OpenAI researcher who has tendered their resignation full cash and equity OTE to immediately join our Salesforce Einstein Trusted AI research team under Silvio Savarese." So that's another big threat facing OpenAI: there are other companies that recognize the value of the knowledge held by its staff. That's one of the big things that OpenAI has. Before it came out with ChatGPT, you could argue that OpenAI's most valued asset was its talent. There were people taking jobs at OpenAI who could have made more money somewhere else, but they took the job at OpenAI because it meant working with the best people on a very challenging goal.
So obviously there are a lot of companies that would really love to be able to bring those folks on board to their own teams, and they're already coming forward with offers should those people decide to leave OpenAI. So again, immense pressure on the board of directors over at OpenAI.

According to an article by Rob Thubron of TechSpot (I'm not certain how to say your last name, Rob, and I don't know you, but I like your work), more than half of tech workers view AI as being overrated and overhyped. This in turn comes from a survey that a company called Retool conducted. So Retool put out this survey and asked various tech workers to evaluate how AI is treated and how it's viewed in the tech sphere. Now, the sample size is not particularly large. Retool surveyed around fifteen hundred tech professionals across the industry, everyone from programmers to designers to executives; pretty much every kind of role in tech was represented in this survey. According to the survey, only twenty-three percent of the respondents felt as though AI is more or less fairly assessed in the industry as a whole, that the attitude and hype, or maybe hype is the wrong word because that has a negative connotation, the treatment of AI, is at the right level. However, fifty-one point six percent of them feel that AI is overrated and perhaps overhyped. This also means, by the way, that about twenty-five percent of the respondents actually feel that AI is underrated; the quick arithmetic sketch below shows where that figure comes from. So keep that in mind. And it seems like the type of job you have in tech has a big influence on how you view artificial intelligence. Executives seem more inclined to say AI was either rated fairly or was actually underrated.
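Here's a minimal sanity check on those shares, assuming every respondent picked exactly one of the three answers (the twenty-three percent and fifty-one point six percent figures are the ones quoted above; the "underrated" share is just the remainder):

```python
# Quick check of the Retool survey shares quoted above.
# Assumes each respondent chose exactly one of the three options,
# so the "underrated" share is whatever percentage is left over.
fairly_assessed = 23.0   # percent saying AI is treated about right
overrated = 51.6         # percent saying AI is overrated/overhyped

underrated = 100.0 - fairly_assessed - overrated
print(f"Implied 'underrated' share: {underrated:.1f}%")  # -> 25.4%
```

That remainder, roughly twenty-five percent, is the "underrated" camp mentioned above.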
And as Rob points out, well, yeah, of course executives think that, because they're looking at artificial intelligence largely as a cost-benefit analysis, and they're salivating over the thought of replacing all those expensive human drones with more economical AI-powered algorithms. So of course they think it's as important as, if not more important than, the way the industry treats it. Meanwhile, the folks who actually have to interact with AI on a daily basis in order to do their work, the people who are using AI to do stuff like write code, for example, were far more likely to call AI overrated. Presumably that's because so much of their time has to be spent checking for errors and correcting them and that sort of thing.

So with a sample size of fifteen hundred people, I'm not sure we can actually draw any sweeping conclusions from this survey. And again, it's a survey about people's opinions toward AI. It's not a metric that measures AI itself, but rather our perception of artificial intelligence. I would say the survey indicates it's probably a good idea to pay attention to the people who are working directly with artificial intelligence-powered tools, because if your staff is indicating that the AI is actually making their jobs more frustrating or inefficient, that is worth paying attention to. Otherwise, the leaders might find that they have committed to a solution that is actually a problem. And I don't think we're careening toward a world of automation that we'll never be able to turn back from, where once we turn that corner, it's all downhill from there. I don't think that's the case. I do think that a lot of companies need to be careful in how they implement AI if they don't want to rush into an expensive and complicated mess. I think there are ways to incorporate AI that can have a direct benefit both to the overall company and to the employees who are making use of the AI.
I think there are legitimate, good uses for this technology, but I also think it's very easy to implement it willy-nilly and to do so in a way that ends up being counterproductive. And that's what companies really need to be careful about.

All right, we're going to take a quick break. When we come back, I've got a whole lot of news about everybody's favorite social network platform, X. But first let's thank our sponsors.

Okay, we're back. And I mentioned before the break that we would be talking about X today, so here goes. First up, the company has filed a lawsuit against a media watchdog group called Media Matters. The group had previously published a report indicating that ads could appear next to objectionable material, such as extremist and racist messaging, including antisemitic messaging. That obviously is something that does not go over well with advertisers, as they are trying to promote and protect their brands. And last week, several major advertisers left X after Elon Musk indicated support for a conspiracy theory promoted by antisemitic groups and white supremacists. So clearly it must have been Media Matters that was the real problem, not the fact that the owner of the company was elevating this messaging. You know?

Anyway, some big-name companies had halted all advertising on X. They include media companies like Paramount, Disney, NBCUniversal, and Warner Bros. Discovery; telecommunications companies like Comcast; and tech companies like Apple, among many, many others. And the thought is that their departure is probably going to prompt other companies to follow suit.

Now, X's complaint against Media Matters is that Media Matters supposedly manufactured the images it showed in its report by creating a test account on X and then just refreshing the view over and over, like following some specific accounts that were posting problematic stuff and then just hitting refresh until ads started to appear next to some of these entries.
And so what X is saying is that this is not representative of the typical experience on X, that most people on X would never see this at all, and that Media Matters had to game the system in order to get these sorts of results. That's what X is arguing. Personally, I mean, that might be a legitimate argument, that Media Matters perhaps orchestrated this so it would appear this way. However, you still have the issue of the owner of the company promoting these messages, which, whether or not Media Matters orchestrated anything, is still an issue. Media Matters, by the way, says that it disagrees with X's accusations and that it is willing to go to court to prove that its methodology was legitimate. So far, no one is backing down from this fight. Musk continues to behave in a way that is really alarming to advertisers, and that means X is going to have a problem no matter what any watchdog group says. You don't even need watchdog groups to bring these matters to attention if you've got the owner of the company doing these sorts of things, because that is so newsworthy that you know it's going to be brought to attention and put in the spotlight, whether there's a watchdog group dedicated to it or not.

On a semi-related note, I deactivated my own account on X. I had not been checking it regularly for weeks, so I decided it was just time to walk away. Part of it is that I don't want to be on a platform where a fundamentally important person on there is promoting things that can cause direct harm to others. So my apologies to anyone who's trying to use X to get in touch with me. It no longer will work.

But let's talk about X some more, because of course we're not done yet. TechCrunch reports that the mass exodus of advertisers, which really happened last week, is likely to have a pretty darn hefty impact on X's bottom line.
You've got to remember, ninety percent of X's revenue came from advertising, and when a bunch of advertisers leave, it means the company is going to have a lot of trouble bringing in revenue. X had already seen some drastic drops in revenue in the wake of Elon Musk purchasing the company last year, so this was already an issue before last week. In fact, analysts from Insider Intelligence estimated that X was going to see a fifty-four point four percent decline in ad business year over year, and now that's likely to be an even more dramatic decline. (There's a quick back-of-the-envelope sketch of what that estimate means for total revenue at the end of this story.)

Linda Yaccarino, the CEO of X, has said that data will show X is actually dedicated to fighting misinformation and racism, including antisemitism, and that the company will prove it has been working hard on this, despite the fact that Elon Musk famously dismissed almost the entirety of the content moderation departments within Twitter when he purchased the company. Reportedly, several advertisers have actually urged Yaccarino to step down as CEO, going so far as to suggest that she is potentially causing irreparable damage to her own reputation by serving as CEO of X. They're essentially saying: you're better than this; you need to get away from this company; it is toxic and it is bringing you down with it. Yaccarino so far has rejected these calls. I think she's going to continue to find it challenging to champion X's policies while the owner continues to promote content that appears to contradict her. So that's a mess. Anyway, that's the update on X.
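To put those two figures together, here's a rough sketch of the implied hit to total revenue. The ninety percent ad share and the fifty-four point four percent decline are the numbers quoted above; holding all non-advertising revenue flat is a simplifying assumption of mine, made purely for illustration:

```python
# Back-of-the-envelope estimate of X's total revenue decline,
# using the figures quoted in this story. Assumes non-ad revenue
# (subscriptions, data licensing, etc.) stays flat, which is a
# simplification for illustration only.
ad_share = 0.90        # ~90% of revenue came from advertising
ad_decline = 0.544     # estimated 54.4% year-over-year drop in ad business

total_decline = ad_share * ad_decline
print(f"Implied total revenue decline: {total_decline:.1%}")  # -> ~49.0%
```

In other words, if nothing else changed, that estimate alone would cut X's overall revenue roughly in half, and that was before last week's exodus.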
Let's now switch over to YouTube for a moment. Aamir Siddiqui of Android Authority posted an article titled "YouTube is reportedly slowing down videos for Firefox users." Essentially, the story says that folks who are using non-Chrome browsers, primarily Firefox and Microsoft Edge, have reported that when they try to launch a video on YouTube, they're encountering a delay in playback, and it's a delay that lasts several seconds, like five seconds. But if they close out that browser, open up Chrome, go to YouTube, and try to load that same video, there's no delay.

Now, that is troubling. It could indicate that Google is purposefully throttling performance on non-Chrome browsers in an effort to coax folks to use Chrome instead. This is preferential treatment that could be seen as anti-competitive. In fact, I think, especially in the current climate in the United States, it would automatically be labeled anti-competitive behavior. And considering that Google is currently in the hot seat for that sort of thing, that's not good. It definitely is not in line with the concept of net neutrality.

But YouTube has responded to this report and has claimed that it really doesn't have anything to do with which browser someone is using; that is immaterial, according to Google. Instead, what's happening, YouTube says, is that it is putting in delays for people who are using browsers that have ad blocker extensions activated on them. So it's a way to fight ad blocking. The statement says, "Users who have ad blockers installed may experience suboptimal viewing, regardless of the browser they are using." I don't doubt that Google is using this approach to discourage ad blockers, as the company has taken a pretty hardline stance against ad blockers this year in particular. But I'm curious to see if that's as far as this goes, because the stories being shared seem to suggest that Chrome users were getting a better experience. Maybe it was that people had ad blockers installed on their non-Chrome browsers but, for whatever reason, had not installed ad blockers on Chrome. Or maybe Google has been rolling out this feature to non-Chrome browsers first, like it is designing YouTube so that it can detect the presence of ad blockers on each browser, the Chrome rollout just hasn't happened yet, and eventually, no matter what browser you're using, you're going to have this issue.
That might be true, but it still kind of smacks of being anti-competitive to me, because again, Chrome gets preferential treatment. If your perception is "oh well, as long as I'm on Chrome it's better, I might as well just use Chrome," that's an issue. If it's not legitimately a natural advantage of Chrome, if the system is being gamed on the back end, that's bad. So we'll have to wait and see if there are more developments in this story, because again, Google is currently in the hot seat with the US government when it comes to anti-competitive behaviors, and there's a real threat that the government will demand Google break apart into different pieces as a result. Whether it actually gets to that point, I don't know. I personally kind of doubt it, simply because the last time we saw something this big happen, the US government ultimately reversed its decision to break apart a big tech company; in that case, it was Microsoft. So not that it's impossible, but it hasn't happened in a very long time, so I would be surprised if it went that far. But we'll have to see.

All right, we're going to take another quick break. When we come back, I've got a couple more stories I want to talk about.

Okay, now let's have a quick story about Nothing. This is not a Seinfeld reference; it's nothing like that. We're talking about the Nothing company, the makers of the Nothing Phone. That's a smartphone. It runs on Android. It had a lot of hype when it was first launching. There were all these hopes that the Nothing Phone was going to be a really awesome flagship Android phone with really cool features, like these glyphs that light up on the back of the phone indicating whether you got a text message versus an incoming call or whatever. But so far, the Nothing Phone has, I think it's fair to say, fallen short of expectations since the original announcement for the project.
But this story specifically has to do with an app that Nothing put its name on. Nothing launched an app called Nothing Chats, and this app claimed it could, among other things, let Android users get a taste of the promised land. That is, if the promised land is made up of blue chat bubbles in Apple's iMessage.

All right, so I've talked about this a little bit recently, mostly because I find it all really exasperating. There's this perception among certain iPhone owners, particularly here in the United States, that people who use other types of phones, you know, non-iPhones, are the dirty, unwashed masses who lack sophistication and intelligence, and that they should be ostracized and ridiculed and shunned. And the dead giveaway, if it's not just looking at the phone they're holding, is that when you're messaging with them and you're using an iPhone, so you're in iMessage, their messages show up in little green bubbles rather than little blue bubbles. So iPhone users all get blue bubbles, but Android users are completely restricted to green, and they should be shunned, I guess. I don't know. I'm an Android user who doesn't really care about any of this stuff. I don't care if anyone makes fun of me for the type of phone I use, because if that's the kind of person they are, then they're not worthy of me being worried about them. They're beneath my concern, is what I'm saying. So yes, I have my own issues of superiority. But I do understand that other people care a lot about this stuff, and they care about it so much that they will go out of their way to make folks feel inferior about not having an iPhone. And that is just lame. I mean, you know, I get it. Especially with school kids and stuff, they'll always find ways to define a group and define outsiders to the group. That's just kind of how it goes. But it's lame, y'all. Anyway.
Nothing claimed that the Nothing Chats app would let Android users appear to be iPhone users in iMessage conversations, that their texts would appear in little blue bubbles. It turned out that Nothing Chats was actually a reskin of a different app called Sunbird, and installing the app would prompt Android users to submit their Apple username and password. Then the app would reportedly log into iMessage on behalf of the Android user and kind of act like a middleman. And supposedly it was going to include end-to-end encryption, except folks discovered that Sunbird was not protecting this information at all. This is a huge, huge security and privacy problem. So again, the app claimed that it was incorporating end-to-end encryption. It wasn't. Sunbird was logging and storing messages in plain text in multiple places, not just in one location, which makes it doubly vulnerable, right, because that means hackers have multiple targets they could hit in order to get access to plain text messaging. Researchers showed it was possible for a hacker to infiltrate a message server and read the messages sent via this app, which is actually the opposite of what end-to-end encryption is supposed to do. And the apps are no longer in the store; they disappeared after just a day. But the whole thing indicates that both Sunbird and Nothing have not taken proper steps to actually be serious about security and privacy, not even a little bit.

Now, if you want to learn more about this, I recommend Ron Amadeo's article in Ars Technica, titled "Nothing's iMessage app was a security catastrophe, taken down in 24 hours." And to make the contrast concrete, there's a quick sketch below of the guarantee end-to-end encryption is actually supposed to provide.
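This is a minimal illustration using the PyNaCl library; it is not Sunbird's or Nothing's actual code, and the names and message are made up. The point is that with real end-to-end encryption, only the two endpoints ever hold the private keys, so a relay server in the middle sees nothing but ciphertext:

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustrative only -- not Sunbird's or Nothing's actual implementation.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; the private keys never
# leave the devices, and only the public keys are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# Any server relaying `ciphertext` sees only random-looking bytes.
# Bob decrypts on his device with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at noon'
```

Sunbird's middleman setup inverted this: the service held the user's Apple credentials and handled decrypted messages server-side, so compromising its servers exposed everything, which is exactly what end-to-end encryption is designed to prevent.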
Now, our last story is that the test launch of the SpaceX Starship vehicle once again ended in an explosion. Back in April, SpaceX first held a test launch of the next-generation spacecraft, which includes the Starship spacecraft and the Super Heavy launch vehicle. A few minutes after takeoff, like four minutes after the first test mission took off, there was a big explosion, and then SpaceX chose to self-destruct the vehicle as a result. So four minutes after liftoff, everything went boom.

This time, this past weekend, the vehicle made it further than it had in April, but it still encountered issues. After the Super Heavy first-stage booster separated from the rest of the vehicle, the booster exploded over the Gulf of Mexico. Now, the Starship stage continued on its trajectory, initially anyway, but then SpaceX mission control lost contact with the vehicle. The best estimation was that the Starship stage initiated an automatic flight termination, which is another way of saying it self-destructed. Mission control was unsure why the vehicle had actually initiated that. The failure happened at around eight minutes into the test, so four minutes longer than what happened back in April, and the vehicle had also reached an altitude of ninety-one miles, which means by every definition it reached space. The US, by the way, defines the edge of space as an altitude of fifty miles, whereas the international standard is the Kármán line, which is at sixty-two miles of altitude. Either way, at ninety-one miles up, the Starship got all the way to space.

The test is not a total failure. It is easy to call things that when stuff blows up, but even when things go wrong, that means there's an opportunity to learn more, to learn what you need to do differently, and to design things so that they're more reliable. It does mean that there are going to have to be more tests, obviously, to reach a point where the technology can be demonstrated to be reliable and safe and consistently working in good order. So that's an issue.
It is kind of a setback. It's not a failure, but I'm sure it's something SpaceX was not super pleased to see, because, you know, they had already seen one of these explode back in April. So we'll have to see what happens moving forward. In the meantime, you've got NASA really kind of keeping an eye on what's going on with SpaceX as it readies for the Artemis program to continue, which ultimately is going to send astronauts back to the Moon. Clearly, for that to happen, there needs to be access to reliable launch vehicles capable of sending a spacecraft as far out as the Moon. So we'll see how things continue from here.

But that's it. That's the news for November twenty-first, twenty twenty-three. Just a heads-up for all of y'all: this week in the United States it's Thanksgiving, which means we'll be off for the second half of the week. I'll have a rerun to play in place of Thursday's episode, so I just want to give you a heads-up on that, but we should be back the following week. Coming up, I'm going to be going on vacation in early December, so we'll probably have some reruns then as well. And then of course we have the Christmas holidays at the end of the year, which will also have an impact on the show. But then everything will be back to what passes for normal on this wacky, zany show called Tech Stuff. I hope you are all well. I hope those of you in the US who are about to celebrate Thanksgiving have a wonderful holiday, and I will talk to you again really soon.

Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.