Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on Friday, June twenty-first, twenty twenty-four. And we've had a little shuffle this week for the title of world's most valuable company. And by that I mean how a company is valued based upon the number of shares that are available and the price per share. All the contenders are tech companies, and earlier this month, Apple toppled Microsoft after Apple held its Worldwide Developers Conference and got investors all excited, but that victory was short-lived. Microsoft regained the throne just two days later. This week, a new challenger pushed past Apple to take a swing at the king: that was Nvidia. Like Apple, Nvidia achieved the title of most valuable company in the world, only to relinquish it back to Microsoft a couple of days later. But on June eighteenth, Nvidia was on top of the world with a market value of three point three four trillion dollars. Geez Louise.
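The valuation measure mentioned here is just simple multiplication. A minimal sketch of the arithmetic, with share count and price chosen as illustrative round numbers rather than Nvidia's exact figures:

```python
# Market capitalization is shares outstanding multiplied by price per share.
# The figures below are illustrative approximations, not official numbers.

def market_cap(shares_outstanding: float, price_per_share: float) -> float:
    """Return a company's market capitalization in dollars."""
    return shares_outstanding * price_per_share

# Roughly 24.6 billion shares at roughly $135 per share lands in the
# neighborhood of the $3.34 trillion figure mentioned in the episode.
cap = market_cap(24.6e9, 135.58)
print(f"${cap / 1e12:.2f} trillion")
```

A roughly three-percent move in the share price therefore shifts the market cap by about a hundred billion dollars at this scale, which is why the "most valuable company" title changed hands so quickly.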
Speaker 1: Anyway, since then Nvidia's stock price dipped about three point four percent, and that was enough to give Microsoft the top spot again, even though Microsoft had also had a smaller dip in stock value in that same time. Now, in case you're not familiar with Nvidia, it's a microprocessor design company. Like, they make the designs for things like, you know, microprocessors, specifically graphics processing units or GPUs, and then other companies fabricate those designs. Recently, Nvidia's really big business has been in designing chips that are used by companies for the purposes of running AI implementations. On the tech and politics front, the US White House has banned the sale of products and services from the Russian cybersecurity company Kaspersky Lab. So Kaspersky Lab, it's a huge name in the cybersecurity space. They've played a key part in identifying and mitigating various threats across the Internet. Kaspersky also makes lots of different products, including antivirus software. So why would the US ban it?
Speaker 1: Well, the main concern is that the White House believes the Russian government could and would force Kaspersky Lab to harvest and weaponize data from other countries, you know, primarily the United States, obviously. But there's also a fear that the Russian government could force Kaspersky Lab to actually serve as a delivery system for malware rather than as a protection against it. Imagine being told that your antivirus software needs to be updated, and so you update it, but secretly it's actually delivering malware to you. That's part of the concern. Now, Kaspersky Lab has protested this ban, which makes sense, and has said that, well, this is a case where the US government is reacting to a perceived hypothetical threat as opposed to, you know, an actual threat, which is a sentiment that we may have also heard regarding the upcoming US ban on TikTok. I'll talk more about that in just a moment. Kaspersky Lab says it will pursue all legal avenues open to the company to push back against this ban.
Speaker 1: I don't know how I feel about this one, honestly, because, I mean, Kaspersky Lab undeniably has done some amazing work, and without their contributions, I do believe the Internet would be a much more dangerous place than it is already. On the flip side, I don't think there's any way you can deny that Russia has an authoritarian government in place, with various agencies that have shown in the past to be more than willing to put pressure on whatever levers they have to gain an advantage. But is that potential enough to justify a ban? If you don't have a specific instance to point to and say yes, this is happening, then are you justified in banning? I don't know. It doesn't feel good to me, but at the same time I can understand the concern. You could also argue that this is really just more political posturing, that it's the US attempting to kind of shut down an important business that's in Russia as a way to indirectly pressure Russia itself. I'm not smart enough to understand all the ins and outs, so I just look at it and think, wow, that's a big mess.
Speaker 1: Also, speaking of TikTok, Reuters reports that TikTok and its parent company ByteDance, the Chinese company, have requested essentially a dismissal of the upcoming national ban on TikTok in the US that's scheduled to happen on January nineteenth, twenty twenty-five. So the argument that TikTok is making is that despite several attempts to negotiate a solution to the various concerns the US government has raised regarding TikTok and its relationship to its parent company, the US government has refused to really engage on a serious level since twenty twenty-two. So TikTok is saying, we've come forward in good faith in an attempt to work this out, and the government has just roadblocked us and then gone to a ban, which doesn't seem fair. And then also, TikTok's arguing that this is a violation of First Amendment rights as well. TikTok reps have also claimed that divestiture from the Chinese parent company is quote "not possible technologically, commercially, or legally," end quote. I don't know.
Speaker 1: I feel like I need to call BS on at least the first two of those three, simply because we've also read reports that TikTok has quietly been working on an alternative recommendation algorithm, which sounds like it could be a contingency plan if this ban does go through. You know, the ban hinges on TikTok's relationship to ByteDance and, by extension, China itself. The law behind the ban says that if TikTok were to be divested from ByteDance, then the ban would not necessarily happen. So the fact that TikTok has apparently been working on this alternative recommendation algorithm, one that's distinctly independent of the version that ByteDance uses for, say, Douyin, the Chinese version of TikTok, or the Chinese, you know, close cousin of TikTok, that to me suggests that TikTok is quietly exploring those possibilities. But obviously the company wants to avoid having to do that at all, and it would be better for the company if they didn't have to go through the whole divestiture process.
Speaker 1: As for commercially, I mean, TikTok still operates rather independently from the other products of ByteDance, so I would imagine that divesting it commercially actually is entirely possible. I mean, no one at ByteDance would want to do it, because TikTok makes money, so why would you want to? But I still think it's possible. I don't think it's impossible. That's based on my own opinion. But yeah, the legal part, that's harder for me to answer. That's what the courts are for. I personally have doubts that the courts are going to reach a decision as to whether the ban is legal or not before the deadline actually gets here, but we'll see. Reuters also reports that Apple has recently switched gears and has stopped work on a successor to the Apple Vision Pro, which is Apple's mixed reality headset. That headset has a hefty price tag, and it's also had a fairly slow start, which I believe mostly comes down to two really big reasons. The first is obviously the base cost. It is huge. It's a whopping thirty-five hundred dollars at the beginning, before you've done any, like, upgrades or accessories or anything like that.
Speaker 1: And that's a lot more money than most people can plop down for a new tech gadget. But the other big barrier is that development for this platform is still in its very early days, which means there's really not that much you can do with the darn thing yet. Every review I have read of the Vision Pro says that it performs really well, it's really impressive, but that outside of the basic things you're able to do, there's just not much there yet. And for developers, I could see why there'd be a reluctance to jump on board, because if the installed base of potential customers is a very small one, you might not be able to capitalize on the invested time and energy it took to make something compelling, because there's just not enough people to pay that investment off. Anyway, Apple is reportedly backing away from making a true Apple Vision Pro two, or whatever they would call it, but the company is still working on a more affordable mixed reality headset that would be available before the end of next year.
Speaker 1: Presumably such an affordable model will have fewer features than the Vision Pro. What I find interesting is whether or not customers will flock to a cheaper version of this platform, or if the lack of those certain features is going to sap away any excitement about the technology, and people won't be willing to pay for it because, you know, it'll be a cheap imitation of the thing that came out a couple of years earlier. If I had to make a guess, I would say more folks would be willing to buy a less expensive headset made by Apple, because Apple's reality distortion field is still a thing. And maybe we would hear folks later complain that the headset isn't as impressive as they were led to believe, but who knows. I guess we'll find out before the end of next year. Okay, we've got more tech news to cover. Before we get to that, though, let's take a quick break to thank our sponsors.
Speaker 1: Ilya Sutskever, one of the co-founders of OpenAI, who later was forced to resign from the board of directors and then a couple months later resigned completely from the company itself, is now starting his own AI company, which is called Safe Superintelligence. And as the name indicates, the founding principles are similar to what inspired the initial creation of OpenAI, before that company began to embrace profit to a greater degree. The Safe Superintelligence website says the company will have offices in Palo Alto, California, and Tel Aviv. Sutskever posted on X that the plan is for the company to focus entirely on the safe development and deployment of AI, without regard to stuff like market forces and trends and things like that. How he's going to achieve that remains to be seen, because AI development is a really expensive thing. In fact, that's why OpenAI launched a for-profit arm in the first place. But I do think it's a lofty goal to aim for. I just don't know how achievable it is. AI company Anthropic has unveiled the Claude three point five Sonnet AI language model this week.
Speaker 1: In fact, I've already started seeing ads online for this thing. The model can do pretty much all the things we associate with chatbots built on generative AI. You know, it can respond to prompts, it can write code, it can analyze text. According to Ars Technica's Benj Edwards, it can be really difficult to compare different AI models to each other for various reasons. But that being said, Claude three point five Sonnet seems to perform pretty well, impressively so, in fact. Now, I haven't used it yet, but then I don't use any AI models on a regular basis. I've only dipped my toe in goofy ways to see what kind of response I can get for silly queries. But for those of y'all who are either researching AI or occasionally making use of AI, it might be something you want to check out, just to see if it gives you better responses than alternatives on the market. Back to Reuters, which reports that Amazon is looking to monetize its Alexa service soon.
Speaker 1: The plan, which is code-named Banyan, is to introduce a more advanced version of the Alexa assistant, capable of holding generative AI conversations with users. Access to this level of Alexa will actually require a monthly fee, somewhere in the neighborhood of five dollars a month. The report says that Amazon has given developers until August to get this version of Alexa ready, though that doesn't necessarily tell us when such a service would actually launch. I honestly don't know if this is going to work from a commercial standpoint. I'm not sure most folks care if their personal assistant is generating a conversation or just pulling something directly off the web to answer various questions. That's possibly due to the fact that as I get older, I see fewer reasons for me to use tools like this for anything beyond really basic stuff, like, you know, asking for an update on the weather or whatever. Which is like, ninety-nine times out of one hundred, if I'm asking my smart assistant speaker for something, it's to give me an update on the weather.
Speaker 1: Other times it's to play music so that my dog can listen to music while I'm leaving the house, the rare occasions I do that, or to ask if certain fruits are safe for dogs to eat. Those are like my three go-tos. But Amazon's trying to find a way to make Alexa a moneymaker, and maybe this will work out for them. I personally have my doubts. I think a lot of people will say, why am I going to pay for something that previously I got to use for free, even if the new version is, at least in theory, better. Meta has reorganized its Reality Labs division. A quick reminder: this is the part of Meta, formerly Facebook, that is tasked with bringing the metaverse into reality, whether people want it or not. It's the division that Meta has dedicated billions of dollars to in that effort, which is something that has concerned many investors in that company. But anyway, this reorg has divided Reality Labs into two major organizations, one that will focus on wearables and the other that will focus on the metaverse. So wearables are things like the Quest VR headsets and the Ray-Ban smart glasses.
Speaker 1: The Verge reports that the reorganization also meant the company laid off a quote unquote small number of employees, although I haven't seen any sources that give more detail as to how much a small number actually is. Bloomberg reports that Elon Musk is pushing hard to get an X payment service launched before the end of this year. You might recall that one thing Musk wanted to do with Twitter, apart from opening it up to folks who had previously been banned from the platform for violating various policies, was to convert it into sort of an everything app, an app where you could chat with friends, post to the general population, shop for goods and services, and pay for stuff, like transfer money, all from one app. The payment service thing is part of this overall puzzle, and one that Musk has aggressively pushed for this year. To that end, X has secured payment transmitter licenses with twenty-eight states here in the US, and the company's goal is to secure such licenses for every state by the end of the year.
Speaker 1: It sounds to me like this is really a matter of when, not if, and at some point we will see the X payment services roll out, possibly with many other features to follow. Musk's goal is to get users to rely on X for financial transactions and to store their money in high-yield savings accounts, which might work. It might actually entice people who have so far either stayed off of the platform for various reasons, or who left due to similar reasons, to come back and jump in in order not to lose out on an opportunity. However, I think it's safe to say that skeptics like myself, you know, folks who have seen Musk apply the ban hammer on X in a very mercurial way, are less eager to put our money into a platform where the owner has been shown to be unpredictable, temperamental, perhaps unreliable is another good word. I just, I think it would be very risky to do, personally. That's my own opinion, though, so you know, if you don't value my opinion.
Speaker 1: One, I don't know why you're listening, but thank you anyway. And two, yeah, you do you, make your own decisions, it's totally cool. If you happen to live near Dallas, Texas or King of Prussia, Pennsylvania, I have a homework assignment for you. Don't worry, it's not coming up anytime soon. It'll be next year, but it's to visit either of the two Netflix House venues that are opening up in the upcoming months. They're slated to open in twenty twenty-five. Netflix first announced the intent to debut some brick-and-mortar locations, and now we know where those first two are going to be: Dallas, Texas, and King of Prussia, Pennsylvania. They'll be taking over the spaces of former department stores and shopping centers, which, I don't know, maybe that'll breathe some new life into malls. I haven't been to a mall in years, but in fact, the place where I used to go when I was a kid is now essentially vacant. I think there's like only a handful of stores that are still open and everything else is closed down, which, as I understand it, is not that unusual today.
Speaker 1: But then I don't ever leave the house, so what do I know? Anyway, what's going to go on in these Netflix Houses? Beats me. Netflix has not shared a whole lot of information. They have said that there will be themed experiences as well as some special food and drink offerings, and that presumably these are going to draw inspiration from original Netflix films and series. And when I think back to some of those originals on Netflix, I'm not convinced I really want to experience that stuff myself in real life, but I am definitely curious. And heck, maybe I'll go to the King of Prussia one, because my partner has family up in that area. So maybe if we happen to be there, I'll be like, hey, let's go see what this is all about. The two astronauts who recently journeyed to the International Space Station aboard a Boeing Starliner spacecraft are staying a little bit longer than they had planned. This is because the Starliner experienced some technical issues that engineers really want to review further before bringing the astronauts back home in the spacecraft on June twenty-sixth.
Speaker 1: So, during the trip to the ISS, the Starliner had a few helium leaks, like five of them. Now, helium itself is non-toxic, it's non-flammable, so as long as the leaks weren't bad enough to displace oxygen and cause asphyxiation, then the astronauts were deemed to be safe. Then five of the twenty-eight reaction control system thrusters on the Starliner also failed as the spacecraft was nearing the ISS. It's not exactly the sort of thing you want to chance when you're bringing folks back home, as these thrusters play a part in orienting the spacecraft properly upon re-entry. NASA reps have said the agency wants to review the test spacecraft's issues before signing off on future missions, which makes perfect sense to me. Okay, some quick reading recommendations for y'all. Mickey Carroll has a piece for Sky News titled "Self-driving cars found to be safer, except at dawn, dusk, or when turning, according to a study." I recommend reading that. It's one of the biggest selling points for R and D in self-driving vehicles.
Speaker 1: This idea that, you know, ideally they would be far safer on the roads than vehicles that were driven by human beings. But as this study points out, there are still scenarios in which AI-powered vehicles are more prone to accidents, and obviously that shines a spotlight on areas where engineers need to improve safety. Next up is a piece by Mike Masnick at Techdirt titled "Five hundred thousand books have been deleted from the Internet Archive's lending library." I talked a little bit about this when I did an episode about the Internet Archive recently, but this piece dives into how publishers have leveraged their considerable power to deny the archive access to library copies of various works. And last up is a piece by Ashley Belanger of Ars Technica titled "Lawsuit: Meta engineer told to resign after calling out sexist hiring practices." This article details the claims made by a former Meta employee who says the company retaliated against him after he brought forward concerns about misogynistic practices in the company. All of those are well worth your time to read. That's it for this week.
Speaker 1: I hope you are all well, and I will talk to you again really soon. Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.