Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on Friday, May seventeenth, twenty twenty-four. And this was a pretty big week in tech news. There were a couple of large companies that made major announcements, and then there were tons of other things going on. So we're going to start off with OpenAI, the company that is behind the ChatGPT chatbot, among many other things. I mentioned last week that the rumor mill said OpenAI was on the verge of announcing an AI-powered search tool, which would pretty much be akin to putting Google dead in the center of OpenAI's crosshairs, because Google was just about to have its Google I/O developers conference, which we'll talk about in a second. And OpenAI did release an announcement for a new AI model that's called GPT-4o, and I think that's the letter O, not a zero. Anyway, this model is meant to allow for actual voice conversations between human and AI eventually, and OpenAI says it'll also work across image- and video-related tasks. So this is what they call a multimodal approach to AI. It's not just text-based, it's other stuff too. And the concept is kind of like, you could have an image of something or a video of something, you could be looking at things through, say, your camera app on your phone, and then talk to the AI assistant and try to get information about what it is you're seeing. We've seen stuff like this, or related to this, in the past, like holding your phone up to look at a sign that's written in another language and having real-time translation into whatever your native language is. This would be more advanced, much more advanced than that, you know.
Speaker 1: The example I would think of is, I would imagine, say you're starting up your car and the engine's making a weird noise, and so you turn your car off, you pop the hood, and you turn the car on and try to take a look and see what it is that's possibly going wrong. Well, if you're like me, you may only have a passing familiarity with that particular engine. So an app that could hear and see the same things that you do, but have access to an entire world of information, would be really useful. It could tell you, hey, this is something you don't need to worry about right now, or maybe it would tell you you need to get your vehicle checked right away, or maybe even give you a quick fix, if such a thing is even possible. Now, in some ways, you could argue OpenAI is doing kind of similar stuff to what we're seeing with other personal digital assistants like Google Assistant or Siri or Alexa, but just turbocharged. OpenAI did a real-time demonstration of the model's capabilities, something that would loom over Google a little bit for their I/O event, but we'll get to them in a second. Anyway, GPT-4o is rolling out now, but not all features are available right out of the gate, so, for example, it is not able to talk to you just yet, or at least not to me. I actually checked to see if I have access to GPT-4o. I have a ChatGPT account that I never use, and I think I logged onto it for the first time in maybe a year and found out that, sure enough, I've got access to it. But it doesn't talk to me yet. The rollout is also going to be happening in batches, so if you don't have access to it yet, even though you have a ChatGPT account, it is coming. You can actually sign in now by going to chatgpt dot com. They created that URL, so that's where the service lives now.
Speaker 1: I think that's kind of a sly indicator that they are transitioning from something that is largely talked about as a test product into something that's an actual product. Existing ChatGPT account holders will receive access to 4o capabilities over the next several days if they don't have it already. And I want to echo a warning that I've seen a lot of other people make, because I think it's wise to remember: be careful if, in your interactions with ChatGPT, it suggests links to stuff, because apparently at least some malicious folks have found ways to get ChatGPT to suggest links that ultimately lead to malicious websites and malicious software. So don't trust everything ChatGPT says, particularly when it comes to links. I think it's a really interesting interaction that you can have with this thing. I don't fully like it. I have concerns, and we'll talk more about that in just a second. But beyond the announcement that OpenAI made, they also managed to close a deal with Reddit, which will give OpenAI permission to train AI models on Reddit and subreddit posts. That's something that folks had suspected OpenAI was already doing, just, you know, without the part about having permission to do so. Reddit will in return get some access to AI-powered tools that can then be incorporated into the platform in various ways. So Reddit is saying, hey, don't worry, you're gonna have enhanced ways to interact with Reddit, though what those will actually be hasn't, you know, been fully disclosed. If you recall, OpenAI similarly signed a deal with Stack Overflow not too long ago, which caused a bit of a tiff in that community, because a lot of folks said they preferred their work not be used for training AI. But the terms of service for Stack Overflow are such that, essentially, once you post something to the platform, it doesn't belong to you anymore. It's the platform's.
Speaker 1: So while you might object to your work being used to train AI, you don't have any authority to deny it from happening. Some people tried to deface or delete old posts, but that's against the terms of service on Stack Overflow, so many of those people subsequently found themselves banned, at least temporarily, from that site. It'll be interesting to see if there will be a similar backlash in the Reddit community. That community is already kind of predisposed to not liking the corporate leadership of Reddit. There's already a pretty low opinion of the company's leaders, largely because of changes that Reddit made to its API like a year ago. Apparently that's not the case with investors, though. Investors, on the flip side, are pretty darn happy. Reuters reported that shares were trading fourteen percent higher today than they were yesterday in the wake of this news. Not all the news is good news for OpenAI, though. In between the announcements and partnerships, two prominent individuals in the organization resigned from the company this week. That would be Ilya Sutskever, who is a co-founder of OpenAI, one of the original members of the nonprofit version of OpenAI, and the other one is Jan Leike, an executive who gave a pretty straightforward announcement on X. Their announcement just read, quote, I resigned, end quote. Now, the alarming thing is both Leike and Sutskever worked for a department within OpenAI called Superalignment, and essentially they were assigned the task of making sure AI evolves in a way that is helpful but not harmful. This was kind of the central premise, the raison d'être of OpenAI when it was first founded as a nonprofit organization. The founders wanted to make certain that AI would be developed in a responsible and accountable and transparent way in order to maximize its benefits to humanity and minimize or eliminate any risks. But it turns out AI is really expensive.
Speaker 1: Like, it's expensive to develop, it's expensive to implement and to run and to maintain, and without a near-constant cash injection, OpenAI just wouldn't be around for very long, which is why the organization ultimately spun off a for-profit arm, which is the version of OpenAI most of us talk about today. Like, it's not the nonprofit version, it's the very aggressive for-profit version that makes the news. And that created conflict within the organization. It led up to the brief period in which the board of directors removed Sam Altman as CEO of OpenAI, before being pressured to turn around and put him right back in charge again, and then subsequently the board of directors essentially dissolved and reformed with new people. Sutskever was actually part of the effort to remove Altman, although he subsequently apologized for that, so I don't think it's really a big surprise that he has now left the company. But I think anyone who is concerned about ethical approaches to AI should really be aware that a lot of red flags are going up right now. Okay, with that fun note, let's take a quick break to thank our sponsors. We're back, and now let us shift to Google, which, as I mentioned earlier, held its annual I/O developer conference this week, and, to the surprise of literally no one, much of the conference touched on artificial intelligence. Now, there were plenty of great headlines out there, like Google AI-O or even Google AI-AI-O, et cetera. There are so many out there, I'm kind of sad that none are left for me. But enough about that. Google announced an update to its AI model Gemini. The company revealed that both the basic version, called Gemini 1.5 Flash, and the Pro version are available for public preview now. The primary function of these specific AI models is to help developers create stuff, so if a developer is working on an app, they can make use of these models to help boost those efforts, whether it's in coding or whatever.
Speaker 1: Google also revealed an updated version of its assistant called Project Astra. Google showed a pre-recorded demonstration of the assistant's abilities, and a Googler was taking her phone around an office and asking questions, with the camera app activated on her phone so she could point the camera at something and ask a question about the thing that's on screen, and Astra would answer. So at one point, for example, she asks the assistant to point out anything that makes sound, and she scans a desk, and when she brings the camera in view of a speaker, the assistant says, that's a speaker. Then she actually uses a markup feature to circle the tweeter on the speaker, and she asks a follow-up question, saying, what is this part of the speaker called, and the assistant says, that's the tweeter. Then it explains what the tweeter does, which is to play back, you know, higher-pitched sounds, higher-frequency sounds. So it was a cool demonstration. She pointed out the window and asked, you know, essentially, what's this neighborhood? And the assistant answered. And she at one point even said that she had misplaced her glasses and asked the assistant if it knew where her glasses were. And because she had been walking through the office with the camera on, the app was able to reference that information, and it was able to pick out a moment where the camera saw some glasses on a desk, and it directed her to those glasses, which, from what I understand, is what got the biggest reaction from the crowd, which was kind of cool. I mean, goodness knows, I misplace things all the time, so it'd be good to have, you know, an assistant that could tell me, yeah, no, that's in your hand, you doofus. Anyway.
Speaker 1: Later on, Googler Michael Chang somewhat cheekily shared on X, the platform that used to be known as Twitter, that he had watched the OpenAI announcement on Monday with Astra, which is kind of funny because, you know, the whole thing was that OpenAI was doing this sort of to undermine Google, and here was Michael Chang saying, yeah, I watched it along with Project Astra. And Project Astra made a real-time transcript of that presentation, complete with attributions for each speaker, so identifying who was speaking in the transcript as it was going on, which is pretty impressive, and also giving some commentary about the presentation as it went on. The demonstration of the AI interpreting images and giving relevant responses was really interesting, but again, it was pre-recorded. It wasn't like it was a live demo. Google also demonstrated an updated image generation AI model called Imagen 3. On top of images, Google showed off a Music AI Sandbox to let folks use AI to put musicians and composers out of a job. No, I'm sorry, that's not the messaging they wanted. I mean, to help artists elevate their creativity through the use of AI. I looked at these, well, I just got to see demos. I don't have access to the tools themselves because I'm not a notable musician. I'm not really a musician at all, so I don't have access to these things. But it's both fascinating and concerning, because I have a lot of respect for actual musical artists, and I don't like the idea of people who are really talented and creative potentially getting pushed aside for something that is perhaps easier to access and maybe cheaper. That's not great. I think people's talent and time is worth money. Like, I think it's worth paying for, and, you know, you can't just expect something for free, and anything that undercuts that really concerns me a lot.
Speaker 1: Google also announced VideoFX, which, you guessed it, uses generative AI to create video clips, which again raises lots of concerns. Like, the more you're seeing of AI-generated content, the more you're worried about how reliable the stuff you are seeing really is. Whether it's, you know, a listing for a real estate property, like, who's to say it wasn't a picture that was then enhanced by AI to make the house look way nicer than it really is? Or maybe it's a video clip that seems to put a public figure in a bad light, and you start wondering, okay, well, is this real, or is this something that was created by someone who just has an axe to grind for whatever reason? These are things that are getting more and more relevant. Arguably they are incredibly relevant right now. Now, the thing that is probably going to have the largest impact, at least the earliest, for most folks is Google implementing more AI into its search tool, which is of course what Google is primarily known for. Even though it's arguably really an advertising company, it's known for search. The company announced that AI Overviews in Search is going to be rolling out nationwide in the United States this week, so if you live in the United States, you probably have this incorporated in your search right now. Google also said that multi-step reasoning capabilities are coming soon, which means that you'll be able to ask more complex questions in your search query and get relevant answers. So, like, here's an example I just made up: how do I make a lasagna, and what cheese is a good substitute if I don't have ricotta? Right, because that's a two-part question. It's, one, asking what are the steps to make a lasagna, and, two, I already know I don't have an ingredient that's in a lot of lasagna recipes, so what is a good substitute? That's the sort of thing that you can expect to be able to access in the not too distant future.
Speaker 1: The various keynotes and presentations stretched across multiple days, so there was so much more brought up during the I/O event. But I think the real message for most of us, the people who aren't, like, actively developers, the one we should take home, is just that AI is going to be all up in your business, and soon. Google hasn't just been talking to developers, however; the company has also been pushing out updates to its browser, Google Chrome, and this is not part of its all-AI, all-the-time strategy. Instead, Google is addressing some zero-day vulnerabilities that have come to light over the last couple of weeks, three of them, in fact, just recently. These vulnerabilities are severe. Potentially, hackers would be able to, quote, perform an out of bounds memory write via a crafted HTML page, end quote. That's from Google. That's not good. It could mean that, with a vulnerability like that active, you visit a malicious website and then either get malware downloaded to your computer or worse. So this is a gentle reminder to all of y'all that it's never a good idea to just push off installing updates on stuff like web browsers and operating systems, because doing that puts you at risk for malware or worse. I know it's a hassle to have to update, especially if you get to your machine and you've got, like, a specific thing you need to do, and then the first thing you see is a notification saying that you should run an update. I know that's a hassle. It happens to me too, and I hate it too. But it's a good idea to go ahead and do the update ninety-nine times out of one hundred. One time out of one hundred, it ends up being the wrong thing to do, but when it's a ninety-nine percent success rate, right, you should probably do it. Also, hey, fun concept: AI can now help hackers discover and exploit vulnerabilities, so it's just gonna get worse. Brave new world and all that kind of stuff.
Speaker 1: Apple iOS beta testers say that iOS build 17.5 has a disturbing habit of bringing the dead back to life. And no, I'm not talking about reanimated ghouls here. I'm talking about deleted material, like deleted photos or deleted voicemails. According to a Reddit thread, some users have reported iOS digging up old and sometimes deleted photos and then showcasing them as if they are recent images. So you pull up your app and say, huh, according to this, this picture I took seven years ago is a new one. Apparently, in some cases this has included images that people once took but long since deleted, including nudes. Nude photos. That should be a massive concern, because Apple's stated practice is that once you delete a photo, you have thirty days in which you can recover the deleted photo, but after thirty days, that photo is supposed to be deleted for good. So if, as some people have claimed, some of these photos have reappeared, reportedly years after they were first deleted, that raises serious privacy questions. Right? Like, if you took a nude photo seven years ago, then, like, six and a half years ago you deleted it, you do not expect that seven-year-old photo to come bouncing back today. That would be really shocking, and it does again raise real questions about data retention and privacy and security. At least one Reddit user claimed that they had sold an old iPad to a friend of theirs, but they had gone through Apple's process to wipe their own personal data off of the iPad before selling it, so, in other words, making sure that you're reverting back to, like, factory settings and deleting all personal information, but that some of their old not-safe-for-work photographs have been popping back up on their friend's device. Now, surprise for that friend, who didn't expect to see their buddy in all their glory. This is not good news at all, if in fact it is true.
Speaker 1: So I have to keep on stressing, if it is true, because I haven't seen confirmation from anyone other than people commenting on Reddit. I don't know if you know this, but sometimes people on Reddit say things that are perhaps not one hundred percent true. A lot of subreddits are almost exclusively made up of fiction. Anyway, Wes Davis of The Verge extends some grace to Apple and points out that this might not be a case of Apple doing something insidious or being really negligent: deleted data is never really gone until some other information gets overwritten on top of what once was there. And that's true. If you delete something off your hard drive but you don't overwrite information on top of where that drive was, you know, storing the stuff, it can be possible to recover the deleted information. So maybe that's what's happened at Apple, just, you know, on a much, much larger scale. Now, personally, I find that a little hard to swallow, because the trend has been for more folks to use more Apple services over time. So you would think that if more people are using those services, that stuff would be overwritten all the time, because otherwise Apple would just have to keep on adding additional servers to handle incoming images and voicemails and all the other stuff. Because, you know, otherwise the implication is that, well, we haven't overwritten everything that, you know, has been freed up since people have deleted things. I can't even be sure that the claims that are being made are accurate, so it's possible that some of these quote-unquote deleted files weren't really deleted at all. Maybe they were just forgotten about, because goodness knows that happens too. But the story about the wiped iPad in particular is troubling if it is true, so just something to be aware of.
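To illustrate the point Wes Davis is making about deletion versus overwriting, here is a minimal, purely illustrative Python sketch of a toy photo store where "deleting" only removes the index entry while the raw bytes stay put until a later write happens to reuse that space. The ToyPhotoStore class and its methods are made up for this example and are not anything Apple or iOS actually uses; the sketch just shows why "deleted" data can remain recoverable.

# Toy illustration (not Apple's actual storage code): "deleting" an item
# only drops it from the index, while the raw bytes remain in the blob
# until some later write reuses that space.

class ToyPhotoStore:
    def __init__(self, size: int = 64):
        self.blob = bytearray(size)   # pretend this is raw disk space
        self.index = {}               # name -> (offset, length)
        self.free = []                # freed regions available for reuse

    def save(self, name: str, data: bytes) -> None:
        # Reuse a freed region if one is big enough, otherwise append.
        for i, (off, length) in enumerate(self.free):
            if length >= len(data):
                self.free.pop(i)
                break
        else:
            off = sum(l for _, l in self.index.values())
        self.blob[off:off + len(data)] = data
        self.index[name] = (off, len(data))

    def delete(self, name: str) -> None:
        # Only the index entry goes away; the bytes are untouched.
        off, length = self.index.pop(name)
        self.free.append((off, length))

    def recovery_scan(self) -> bytes:
        # A "recovery" pass can still read any bytes that were never overwritten.
        return bytes(self.blob)


store = ToyPhotoStore()
store.save("vacation.jpg", b"OLD_PHOTO_BYTES")
store.delete("vacation.jpg")
print(b"OLD_PHOTO_BYTES" in store.recovery_scan())  # True: still recoverable after "deletion"
store.save("new.jpg", b"NEW_PHOTO_BYTES")           # reuses the freed space
print(b"OLD_PHOTO_BYTES" in store.recovery_scan())  # False: now genuinely overwritten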
Speaker 1: Again, that's for iOS 17.5, and it's in beta, so if you're not beta testing, then this isn't likely to affect you if you're an iOS user, but I'm sure it's something that Apple is looking into to figure out before iOS 18 goes out. All right, we've got more stories to cover, but before we can get to all of those, let's take another quick break. We're back, and here's a story that hit like déjà vu this week. It centers around the company GameStop and its stock price. So you might remember that back in twenty twenty-one, a collection of folks decided to stick it to some hedge funds by investing in GameStop, because word got out that some massive hedge funds were short selling GameStop. By driving the price of the stock up, the investors could do two things. One, they could see a return on their own investment, right, because if the stock goes up, then you're making profit. And two, they could cause serious financial hardship for these massive hedge funds that typically hold a tremendous amount of power in the stock market. Now, if you're not familiar with how short selling goes, let me give you a very basic description, and I'm glossing over a lot of details here. Let's say that you believe a particular stock is going to decline in value. So you decide, hey, I'm going to make some money with this stock going down. I'm going to sell stock that I don't actually own, and I'm going to sell it at what it is currently trading for. So let's say that it's currently trading for ten dollars per share. So you sell one hundred shares at ten dollars per share, but you don't own those shares. So you're literally selling stock you do not own, but you have promised to deliver that stock in a certain amount of time. Let's say it's one month. Now, a month has gone by, and it's time for you to deliver the stock that you've promised.
Speaker 1: At this point, the stock has gone down to one dollar per share. Now, you previously sold it for ten dollars per share, so you get to buy up the amount of stock you agreed to hand over in the transaction, and you get to pocket the difference between the ten dollars that you accepted for the sale and the one dollar that you actually purchased everything for. So you make money by the stock going down. But if the stock price doesn't go down, if it goes up, that's bad news for you, because now you're gonna have to pay the difference out of pocket. So let's say that the stock actually rose to twenty dollars per share at the end of the month. Well, then you're gonna have to pay an extra ten dollars on every share of the stock that you sold, so you're gonna lose that money instead of gaining extra. Well, these investors that ended up driving up the price of GameStop were led by a guy named Keith Gill, who's better known by his online handle, which is Roaring Kitty, and they did a massive screw-you to major investment companies that were trying to short sell GameStop. This is what we call a short squeeze. This week, it happened again, briefly, because on Tuesday GameStop shares rose to a high of sixty-four dollars and eighty-three cents per share, and in fact there were companies that were putting on a short sale of GameStop stock. However, in this case, the rally didn't last long. It has subsequently dropped to around twenty-seven dollars per share. And this roller coaster followed some posts by the aforementioned Roaring Kitty, who seemed to be getting back into the game after being quiet for a couple of years. But again, unlike the first go-round when this happened in twenty twenty-one, this particular boost appeared to just be a blip, so we're not seeing a full repeat of the phenomenon.
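As an aside, the arithmetic in that short-selling walkthrough is easy to sanity-check in a few lines of Python. This is just a toy sketch of the hypothetical above; the short_sale_pnl function name is made up for this example (it is not a real trading API), the ten-dollar, one-dollar, and twenty-dollar figures mirror the episode's hypothetical, and fees, margin, and borrowing costs are ignored.

def short_sale_pnl(sell_price: float, buyback_price: float, shares: int) -> float:
    # Short sale: sell borrowed shares now, buy them back later to return them.
    # You profit if the price fell and lose money if it rose.
    return (sell_price - buyback_price) * shares

# The hypothetical from the episode: short one hundred shares at ten dollars each.
print(short_sale_pnl(10.00, 1.00, 100))   # 900.0 profit when the stock falls to one dollar
print(short_sale_pnl(10.00, 20.00, 100))  # -1000.0 loss when the stock rises to twenty dollars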
Speaker 1: So a lot of hedge funds have gotten wise to this and have measures in place to limit their liability, so they're not caught out quite as badly as they were the last time. Because, boy howdy, that was a doozy. Okay, let's talk about Meta. They are ending the Workplace app for external customers. At the very least, some elements of Workplace may remain as internal Facebook systems. So, Meta first launched Workplace in twenty sixteen. It's a business collaboration tool. Essentially, it lets companies make interconnecting page networks with their partners and helps with collaborative projects that span different departments or different companies entirely. They can have, like, dedicated message boards, that kind of stuff. It's a business-to-business product. Meta charged four dollars per user, and that adds up pretty quickly if you're talking about a really big company. But now Meta is going to sunset the Workplace app and shift assets to two other massive projects, which are AI and, of course, the metaverse, because Meta is still determined to make that happen. I think it's going to be like fetch: it's not gonna happen. But maybe it will. I mean, maybe people will love it. I still am skeptical, but I'm also not a genius and I'm not an entrepreneur. I'm just a grouchy guy who talks into a microphone. The company says it will work with current Workplace customers to transition them over to Workvivo, which is a product from Zoom and the one partner they work with that makes something similar that they actually recommend. Okay, here's an interesting story about suppressing tech. Seven hundred and fifty miles to the east of Australia is New Caledonia. In the mid-nineteenth century, France claimed New Caledonia and has remained the ruler of New Caledonia ever since; it is a French territory. But recently, changes enacted to New Caledonia's election laws really sparked protests across the territory, mostly among indigenous populations that have been fighting for independence.
471 00:28:01,720 --> 00:28:05,560 Speaker 1: For quite some time, these protests grew and became more chaotic. 472 00:28:06,040 --> 00:28:11,639 Speaker 1: Four deaths have resulted as the protests have grown more violent, 473 00:28:12,000 --> 00:28:14,680 Speaker 1: and since then the government has cracked down on the populace. 474 00:28:14,840 --> 00:28:19,679 Speaker 1: The government has banned public protests and demonstrations and also 475 00:28:19,880 --> 00:28:23,640 Speaker 1: blocked access to TikTok all in an effort to squash 476 00:28:24,000 --> 00:28:28,000 Speaker 1: protests online and off. The government says that people were 477 00:28:28,080 --> 00:28:32,720 Speaker 1: using TikTok in order to organize demonstrations, and since demonstrations 478 00:28:32,800 --> 00:28:37,440 Speaker 1: are now banned, so is TikTok, so people can't use 479 00:28:37,480 --> 00:28:40,200 Speaker 1: it to do that. That's really why I'm including the 480 00:28:40,240 --> 00:28:42,800 Speaker 1: story here, because it's a social and political event that 481 00:28:42,880 --> 00:28:47,040 Speaker 1: has spilled into suppressing access to online platforms. And as 482 00:28:47,080 --> 00:28:49,040 Speaker 1: to the heart of the matter, like what this is 483 00:28:49,080 --> 00:28:52,640 Speaker 1: all about again, it's largely about the question of independence 484 00:28:52,640 --> 00:28:55,920 Speaker 1: for New Caledonia and the recent election laws that were 485 00:28:56,000 --> 00:29:00,120 Speaker 1: changed now allow people who are more recent transplants to 486 00:29:00,160 --> 00:29:04,120 Speaker 1: New Caledonia to vote. Previously, in order to be eligible 487 00:29:04,160 --> 00:29:08,280 Speaker 1: to vote in matters of great import in New Caledonia, 488 00:29:08,400 --> 00:29:11,800 Speaker 1: you had to have a long and deep connection to 489 00:29:12,360 --> 00:29:15,280 Speaker 1: the territory. You couldn't just move there and then influence 490 00:29:15,320 --> 00:29:18,760 Speaker 1: the vote so a lot of people who support independence 491 00:29:18,800 --> 00:29:23,360 Speaker 1: say this is a blatant attempt to dilute their influence 492 00:29:23,600 --> 00:29:26,840 Speaker 1: in public matters, because the people who are moving to 493 00:29:26,920 --> 00:29:30,880 Speaker 1: New Caledonia are largely French citizens, who are of course 494 00:29:31,040 --> 00:29:36,080 Speaker 1: not voting in favor of independence. So that's what's really 495 00:29:36,080 --> 00:29:38,520 Speaker 1: the heart of the conflict. But yeah, like I said, 496 00:29:38,560 --> 00:29:41,720 Speaker 1: it's spilled out now with a ban on TikTok. And 497 00:29:42,480 --> 00:29:46,000 Speaker 1: while there is a twelve day state of emergency currently 498 00:29:46,000 --> 00:29:48,760 Speaker 1: in effect for New Caledonia, it is not yet clear 499 00:29:49,360 --> 00:29:52,680 Speaker 1: if the ban on TikTok will lift once that state 500 00:29:52,720 --> 00:29:56,960 Speaker 1: of emergency expires. A lawsuit targeting Tesla alleging that the 501 00:29:56,960 --> 00:30:00,320 Speaker 1: company made false claims regarding full self driving capability these 502 00:30:00,400 --> 00:30:03,880 Speaker 1: in its car may move forward. Tesla had attempted to 503 00:30:03,880 --> 00:30:06,760 Speaker 1: get the case thrown out. 
Speaker 1: A California resident by the name of Thomas LoSavio alleges the company engaged in fraud by claiming that its vehicles would have full self-driving capability, and that's why LoSavio purchased a Tesla vehicle, only to subsequently find out that those claims were perhaps exaggerated or false. The judge overseeing the matter has decided that at least some of LoSavio's case has merit and can go to trial, but some other parts have been dismissed because of the way they were presented. Now, LoSavio originally was going to be part of a class action lawsuit, but the other plaintiffs of that lawsuit had to drop out because they had signed an agreement which would require them to go through arbitration with Tesla before they could bring a case to court. LoSavio did not sign such an agreement, so he was able to move forward with his own lawsuit, and he has opened it up to be a class action, so other Tesla owners can join if they so wish. Now, LoSavio is going to have until June fifth to amend his complaint in order to get parts that were dismissed reincorporated into it. Tesla has until June nineteenth to respond. So there's still no guarantee that this is actually going to go to trial, but it is a step closer. Reuters has a piece titled "The inside story of Elon Musk's mass firings of Tesla's Supercharger staff." Obviously, this relates to a story I talked about a couple of weeks ago, in which the company laid off a five-hundred-person department responsible for the development and deployment of Supercharger stations, which seemed more than a little weird because Tesla has made great strides in establishing its Supercharger technology as a kind of de facto industry standard. You know, other car manufacturers have bought into that, and they've made their own vehicles compatible with it.
Speaker 1: The Reuters piece tells a pretty disturbing tale, in my opinion, because the former head of the Supercharger department, a woman named Rebecca Tinucci, had already gone through a round of layoffs. Like, she had already held layoffs within her division at the direction of Elon Musk, and Musk wanted to do more. She was meeting with Musk to present her vision, which involved the company expanding the Supercharger network. Instead, Musk demanded more layoffs. Tinucci reportedly said that if she were to lay off anyone else in her department, that would be counterproductive; it would hurt their business objectives. And that's when Musk just went nuclear. He pulled the trigger and he fired everyone, or practically everyone, in that department. And it's very hard for me not to get the feeling that this is a leader who, when he perceives a challenge to his absolute authority, just goes with that nuclear option. And now Tesla's having to deal with the consequences, because people from other departments, notably the solar and battery divisions, have been given the task of working with the various partners and contractors who had been engaged in building out Supercharger stations. You know, these are folks who already had a lot of work on their plate, and now they have to take on the work of an entirely different department. It seems like the general strategy right now for Tesla is to stall for time, which isn't exactly welcome news if you happen to be a contractor who's expecting to get paid. Musk has said the plan is still to grow the Supercharger network, but that things are gonna have to happen at a more methodical pace, which seems like a given considering how many people were fired. There have also been reports that Tesla has subsequently tried to hire back at least some of the staff who were previously laid off. I'm sure for some of those folks, coming back is a pretty hard decision. I mean, you do need to make a living, so there is that element.
But if 569 00:33:37,760 --> 00:33:42,600 Speaker 1: you've already been unceremoniously laid off for arguably irrational reasons, 570 00:33:42,800 --> 00:33:45,360 Speaker 1: it's not really great to think about coming back to 571 00:33:45,400 --> 00:33:47,920 Speaker 1: that company. It's also not a great development for electric 572 00:33:48,000 --> 00:33:50,640 Speaker 1: vehicles in general, because if Tesla fizzles out, then some 573 00:33:50,680 --> 00:33:53,800 Speaker 1: other entity or multiple entities will have to step forward 574 00:33:54,120 --> 00:33:57,440 Speaker 1: to provide the infrastructure necessary to support a migration from 575 00:33:57,480 --> 00:34:01,040 Speaker 1: internal combustion engine vehicles to EVs. That's one of 576 00:34:01,040 --> 00:34:05,200 Speaker 1: the big reasons why people say they're reluctant to 577 00:34:05,400 --> 00:34:10,000 Speaker 1: adopt EVs: they fear being stranded someplace because 578 00:34:10,000 --> 00:34:13,200 Speaker 1: they weren't able to find a charging station and recharge 579 00:34:13,200 --> 00:34:17,360 Speaker 1: in a reasonable amount of time and at a reasonable cost. So, yeah, this is 580 00:34:17,400 --> 00:34:20,319 Speaker 1: not great news for the EV industry as a whole. Now, 581 00:34:20,320 --> 00:34:22,920 Speaker 1: when he's not gutting his EV company, Elon Musk is 582 00:34:22,920 --> 00:34:26,680 Speaker 1: sparring with his old nemesis, the US Securities and Exchange Commission, 583 00:34:26,760 --> 00:34:29,880 Speaker 1: or SEC. He's had a few run ins with the SEC, 584 00:34:30,400 --> 00:34:33,799 Speaker 1: one stemming from his inability to not post stuff that 585 00:34:33,840 --> 00:34:36,880 Speaker 1: could be seen as market manipulation, like when he intimated that 586 00:34:36,920 --> 00:34:40,480 Speaker 1: he was going to take Tesla private. Anyway, now the 587 00:34:40,560 --> 00:34:43,640 Speaker 1: SEC wants to chat with Musk about his purchase of Twitter, 588 00:34:43,760 --> 00:34:46,880 Speaker 1: of course, now known as X. The SEC did this 589 00:34:46,960 --> 00:34:50,480 Speaker 1: once before. Musk initially told them to pound sand and 590 00:34:50,520 --> 00:34:52,400 Speaker 1: said that he wasn't going to listen to all their subpoenas 591 00:34:52,400 --> 00:34:54,600 Speaker 1: because they were just trying to bully him. After that, 592 00:34:54,760 --> 00:34:57,479 Speaker 1: some judges weighed in to say, no, no, no, that's 593 00:34:57,480 --> 00:35:00,200 Speaker 1: not how the law works, and Musk tried again to get 594 00:35:00,200 --> 00:35:02,680 Speaker 1: the whole thing swept away because, you know, it's such 595 00:35:02,719 --> 00:35:05,239 Speaker 1: a hassle to have to answer for your actions. It 596 00:35:05,280 --> 00:35:07,600 Speaker 1: turns out that's not a good enough reason for judges 597 00:35:07,640 --> 00:35:11,080 Speaker 1: to throw things out, even for billionaires. So it looks 598 00:35:11,120 --> 00:35:12,799 Speaker 1: like Musk is going to have to have a chat 599 00:35:12,840 --> 00:35:16,400 Speaker 1: with the SEC, which says that it came into possession 600 00:35:16,400 --> 00:35:19,279 Speaker 1: of new documents about the purchase of X, which 601 00:35:19,400 --> 00:35:22,760 Speaker 1: raise questions about Musk's involvement with Twitter in the months 602 00:35:22,840 --> 00:35:25,360 Speaker 1: leading up to his announced plans to acquire the company. 603 00:35:25,680 --> 00:35:29,279 Speaker 1: So we'll have to see where that goes.
Considering the 604 00:35:29,520 --> 00:35:34,040 Speaker 1: decline of Twitter since Musk's announcement, this feels like it's 605 00:35:34,200 --> 00:35:38,040 Speaker 1: largely a lot of fuss over a rotting corpse, honestly. 606 00:35:38,600 --> 00:35:40,880 Speaker 1: One story that I don't think I covered before is 607 00:35:40,920 --> 00:35:44,080 Speaker 1: that the first human recipient of a Neuralink implant, which 608 00:35:44,160 --> 00:35:48,080 Speaker 1: is the brain computer interface chip that's made by Neuralink, 609 00:35:48,120 --> 00:35:51,000 Speaker 1: another company owned by Elon Musk, has experienced some 610 00:35:51,160 --> 00:35:54,600 Speaker 1: issues, as several of the very thin threads that connect 611 00:35:54,640 --> 00:35:57,920 Speaker 1: to the device have lost contact with the neurons that 612 00:35:57,960 --> 00:36:01,000 Speaker 1: they were previously attached to in this human subject. So 613 00:36:01,080 --> 00:36:03,680 Speaker 1: this affected the chip's ability to serve as a conduit 614 00:36:03,800 --> 00:36:07,279 Speaker 1: between the recipient's brain and the computer systems he was 615 00:36:07,800 --> 00:36:11,160 Speaker 1: interacting with. Engineers addressed the problem by making tweaks to 616 00:36:11,200 --> 00:36:14,400 Speaker 1: the algorithms that run on the chip, and that 617 00:36:14,520 --> 00:36:18,360 Speaker 1: helped compensate for this issue. But now some unnamed sources 618 00:36:18,400 --> 00:36:21,440 Speaker 1: who are familiar with the testing say that this was 619 00:36:21,480 --> 00:36:25,799 Speaker 1: a known issue, that the Neuralink implant had been observed to 620 00:36:25,880 --> 00:36:30,120 Speaker 1: have wires retracting in animal test subjects as well. Considering 621 00:36:30,160 --> 00:36:34,200 Speaker 1: the fact that this implant requires invasive and potentially dangerous surgery, 622 00:36:34,600 --> 00:36:37,760 Speaker 1: and that this retraction problem was already known, some are now questioning the ethical implications of moving forward 623 00:36:37,800 --> 00:36:41,520 Speaker 1: with a human test subject. There's 624 00:36:41,560 --> 00:36:44,480 Speaker 1: a concern that, should more 625 00:36:44,520 --> 00:36:49,400 Speaker 1: wires retract, Neuralink will hit a wall with however 626 00:36:49,520 --> 00:36:52,040 Speaker 1: much adjustment it can do to the algorithms to deal 627 00:36:52,080 --> 00:36:55,319 Speaker 1: with it, and that in turn will mean the chip 628 00:36:55,320 --> 00:37:00,600 Speaker 1: will gradually lose functionality and potentially become largely useless, which 629 00:37:00,680 --> 00:37:03,280 Speaker 1: could lead to an eventuality in which it becomes necessary 630 00:37:03,280 --> 00:37:06,640 Speaker 1: to remove the implant entirely. As for finding a way 631 00:37:06,640 --> 00:37:09,680 Speaker 1: to fix this problem in future procedures, that's pretty challenging. 632 00:37:09,719 --> 00:37:12,800 Speaker 1: You could create a system where you have more wires 633 00:37:12,880 --> 00:37:16,120 Speaker 1: or they're more firmly anchored, which would mean they 634 00:37:16,120 --> 00:37:19,080 Speaker 1: would be harder to dislodge by definition, but if you 635 00:37:19,200 --> 00:37:21,640 Speaker 1: wanted to do something like an upgrade, or if you 636 00:37:21,680 --> 00:37:25,160 Speaker 1: needed to remove the implant, it could cause complications. So 637 00:37:25,280 --> 00:37:28,359 Speaker 1: this is not an easy fix one way or the other.
638 00:37:28,920 --> 00:37:32,960 Speaker 1: And in my mind, it suggests to me that perhaps 639 00:37:33,000 --> 00:37:35,759 Speaker 1: this was a bit too aggressive a move for Neuralink, but we 640 00:37:35,800 --> 00:37:38,200 Speaker 1: don't know that yet. Like, it may turn out that 641 00:37:38,440 --> 00:37:41,919 Speaker 1: some wires retracting is just a normal course of events 642 00:37:42,040 --> 00:37:44,000 Speaker 1: and that the rest of it is fine. And there's 643 00:37:44,000 --> 00:37:49,240 Speaker 1: no denying that the technology does provide an incredible quality 644 00:37:49,239 --> 00:37:53,359 Speaker 1: of life boost to people who otherwise don't have use 645 00:37:53,400 --> 00:37:56,480 Speaker 1: of their limbs. That is something that is really important. 646 00:37:56,560 --> 00:38:00,759 Speaker 1: So if in fact we don't see further degradation of 647 00:38:00,880 --> 00:38:04,439 Speaker 1: the implant, then while this issue does 648 00:38:04,520 --> 00:38:07,319 Speaker 1: raise concerns and questions about why the FDA would 649 00:38:07,320 --> 00:38:10,120 Speaker 1: allow things to move forward, and those are questions that I 650 00:38:10,160 --> 00:38:12,600 Speaker 1: think need to be answered, I think ultimately we could 651 00:38:12,640 --> 00:38:16,879 Speaker 1: say it's for the best if the device actually does 652 00:38:16,920 --> 00:38:19,160 Speaker 1: work and doesn't degrade to a point where it must 653 00:38:19,200 --> 00:38:22,640 Speaker 1: be removed. I mean, it's inarguably an improvement to 654 00:38:22,680 --> 00:38:25,000 Speaker 1: the life of the people who have it. Touching back 655 00:38:25,000 --> 00:38:27,360 Speaker 1: on AI for a moment, Sony Music Group has joined 656 00:38:27,360 --> 00:38:31,000 Speaker 1: countless artists, writers, and other creators in expressing concern over 657 00:38:31,040 --> 00:38:35,040 Speaker 1: AI using existing human created works as training material, and 658 00:38:35,120 --> 00:38:37,799 Speaker 1: Sony Music is saying, don't do that. We don't want 659 00:38:37,840 --> 00:38:40,200 Speaker 1: you to do that. So the company has reportedly sent 660 00:38:40,280 --> 00:38:43,919 Speaker 1: out several hundred letters to various AI startups and companies saying, hey, 661 00:38:44,000 --> 00:38:46,440 Speaker 1: don't you dare use our catalog of music without our 662 00:38:46,480 --> 00:38:49,759 Speaker 1: express permission. And at least for some of those recipients, 663 00:38:49,800 --> 00:38:51,600 Speaker 1: the company went on to say, you know, we know 664 00:38:51,719 --> 00:38:53,359 Speaker 1: that you're doing it now, so you've got to knock 665 00:38:53,400 --> 00:38:56,200 Speaker 1: it off, Buster, except, of course, they worded it differently. 666 00:38:56,280 --> 00:39:00,560 Speaker 1: Sony specifically said, quote, due to the nature of your 667 00:39:00,640 --> 00:39:03,719 Speaker 1: operations and published information about your AI systems, we have 668 00:39:03,800 --> 00:39:06,799 Speaker 1: reason to believe that you and or your affiliates may 669 00:39:06,840 --> 00:39:12,000 Speaker 1: have already made unauthorized uses, including TDM, of SMG content, 670 00:39:12,280 --> 00:39:16,240 Speaker 1: in relation to the training, development, or commercialization of AI systems. 671 00:39:16,360 --> 00:39:19,719 Speaker 1: End quote. For the record, the initialism TDM stands for 672 00:39:19,920 --> 00:39:24,320 Speaker 1: Text and Data Mining. SMG is Sony Music Group.
According 673 00:39:24,360 --> 00:39:28,040 Speaker 1: to Jem Aswad of Variety, a recent EU law called 674 00:39:28,040 --> 00:39:31,440 Speaker 1: the AI Act prompted Sony to send these letters, as 675 00:39:31,480 --> 00:39:35,600 Speaker 1: the Act compels AI companies to reveal what their training 676 00:39:35,680 --> 00:39:38,160 Speaker 1: data was, like, where did they get the data to 677 00:39:38,200 --> 00:39:41,360 Speaker 1: train their AI models? So Sony, at least in part, 678 00:39:41,400 --> 00:39:45,960 Speaker 1: is saying, hey, you used our information without our permission. 679 00:39:46,960 --> 00:39:50,720 Speaker 1: Don't do that. Now, I usually hold off on reading 680 00:39:50,760 --> 00:39:52,680 Speaker 1: recommendations to the very end of the episode, but I 681 00:39:52,719 --> 00:39:56,240 Speaker 1: want to mention Ashley Belanger's incredible piece for Ars Technica 682 00:39:56,360 --> 00:40:00,440 Speaker 1: titled MIT students stole twenty five million dollars in seconds by 683 00:40:00,480 --> 00:40:05,760 Speaker 1: exploiting ETH blockchain bug, DOJ says. Now, as that title implies, 684 00:40:05,840 --> 00:40:07,440 Speaker 1: this is a heck of a story. It's about a 685 00:40:07,480 --> 00:40:10,720 Speaker 1: pair of brothers who learned how blockchains work at MIT. 686 00:40:11,360 --> 00:40:14,399 Speaker 1: They arguably learned a little too well how they work, 687 00:40:14,840 --> 00:40:17,560 Speaker 1: and they took this incredible education that they received at 688 00:40:17,680 --> 00:40:20,080 Speaker 1: MIT and they turned it into kind of an Ocean's 689 00:40:20,080 --> 00:40:23,920 Speaker 1: Eleven style heist. Except this heist took less than fifteen seconds, 690 00:40:24,000 --> 00:40:27,279 Speaker 1: and it involved finding a way to access pending transactions 691 00:40:27,320 --> 00:40:30,160 Speaker 1: on the blockchain. So these were transactions that had not 692 00:40:30,280 --> 00:40:34,640 Speaker 1: yet actually been cemented in the Ethereum blockchain, and they 693 00:40:34,640 --> 00:40:38,360 Speaker 1: were able to divert cryptocurrency meant for these transactions to 694 00:40:38,400 --> 00:40:41,919 Speaker 1: go into their own accounts. So the way blockchains work 695 00:40:42,520 --> 00:40:46,200 Speaker 1: is that once a transaction gets added to a block 696 00:40:46,239 --> 00:40:49,280 Speaker 1: and the block is added onto the chain, it's secure, 697 00:40:49,640 --> 00:40:52,640 Speaker 1: because going into the history of transactions and making a 698 00:40:52,719 --> 00:40:57,280 Speaker 1: change would cause a ripple effect for every block that follows, 699 00:40:57,800 --> 00:41:01,480 Speaker 1: and so any tampering immediately becomes clear and is reversed, 700 00:41:01,840 --> 00:41:05,240 Speaker 1: because the entire network is able to see that someone 701 00:41:05,600 --> 00:41:08,640 Speaker 1: is messing around.
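To make that tamper-evidence idea concrete, here is a minimal Python sketch of a hash-chained ledger. It is a toy illustration with hypothetical names (block_hash, make_chain, verify), not how Ethereum itself is implemented, but it shows why editing an old transaction is immediately visible to anyone who re-checks the chain, and why the brothers instead went after transactions that had not yet been committed to a block.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents; because prev_hash is part of the
    # contents, every hash depends on the whole history before it.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_chain(transactions):
    # Build a toy chain: one transaction per block, each block storing
    # the hash of the block that came before it.
    chain = []
    prev = "0" * 64  # placeholder for the genesis block
    for tx in transactions:
        block = {"tx": tx, "prev_hash": prev}
        prev = block_hash(block)
        chain.append((block, prev))  # keep the hash the network "recorded"
    return chain

def verify(chain):
    # Recompute every hash. Editing any old block changes its hash, so it
    # no longer matches what was recorded, and every later block's
    # prev_hash link is broken too.
    prev = "0" * 64
    for block, recorded in chain:
        if block["prev_hash"] != prev or block_hash(block) != recorded:
            return False
        prev = recorded
    return True

chain = make_chain(["alice pays bob 5", "bob pays carol 2", "carol pays dave 1"])
print(verify(chain))                        # True: untouched history checks out
chain[0][0]["tx"] = "alice pays mallory 5"  # tamper with an old transaction
print(verify(chain))                        # False: the tampering is obvious
```

Run that and the second verify call fails, which is exactly the ripple effect described above: change one old block and every hash after it stops matching, so the network can spot retroactive edits.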
The brothers, however, figured out how to 702 00:41:08,680 --> 00:41:12,040 Speaker 1: tap into transactions that had been initiated but not yet 703 00:41:12,080 --> 00:41:15,479 Speaker 1: assimilated into the blockchain, and they would have gotten away 704 00:41:15,480 --> 00:41:17,319 Speaker 1: with it, too, if it hadn't been for, you know, 705 00:41:17,560 --> 00:41:20,200 Speaker 1: the fact that their victims made a really big fuss 706 00:41:20,239 --> 00:41:23,640 Speaker 1: about this, and it prompted an investigation from the IRS, 707 00:41:23,719 --> 00:41:28,200 Speaker 1: which said the investigation was incredibly easy to conduct because 708 00:41:28,200 --> 00:41:30,880 Speaker 1: the brothers were not particularly good at covering their tracks. 709 00:41:31,120 --> 00:41:33,520 Speaker 1: They tried to cover their tracks, but they didn't try 710 00:41:33,560 --> 00:41:36,320 Speaker 1: to cover their tracks in a smart way. Apparently the 711 00:41:36,360 --> 00:41:38,360 Speaker 1: brothers spent a lot of time searching for stuff that 712 00:41:38,480 --> 00:41:42,279 Speaker 1: was a little bit suss, like how to create shell companies, 713 00:41:42,520 --> 00:41:45,399 Speaker 1: how to launder money, and, you know, who's a good 714 00:41:45,400 --> 00:41:48,920 Speaker 1: crypto lawyer, which is all, I guess, arguably circumstantial, but 715 00:41:48,960 --> 00:41:51,280 Speaker 1: it's not a good look, as the kids used to say. 716 00:41:51,480 --> 00:41:53,600 Speaker 1: This is just a gentle reminder for all of y'all 717 00:41:53,600 --> 00:41:56,120 Speaker 1: out there. If you're going to commit a crime, and 718 00:41:56,160 --> 00:41:59,919 Speaker 1: I strongly urge you not to do so, maybe don't 719 00:42:00,239 --> 00:42:02,840 Speaker 1: search for stuff while you're logged into a browser on a 720 00:42:02,840 --> 00:42:06,279 Speaker 1: personal device, you know, or logged in to 721 00:42:06,400 --> 00:42:09,520 Speaker 1: a browser at all. You want as many layers of obfuscation between 722 00:42:09,520 --> 00:42:12,200 Speaker 1: what you're doing and you as you can manage, and 723 00:42:12,239 --> 00:42:14,799 Speaker 1: your browser history shouldn't be a treasure map that leads 724 00:42:14,800 --> 00:42:20,480 Speaker 1: authorities to your criminal behavior, because you just brought them there, man. Now, 725 00:42:20,480 --> 00:42:24,000 Speaker 1: I just did some episodes about live service games this week, 726 00:42:24,040 --> 00:42:26,600 Speaker 1: and I mentioned the company Rockstar in those shows. 727 00:42:26,760 --> 00:42:29,280 Speaker 1: Rockstar is the video game company behind the Grand Theft 728 00:42:29,320 --> 00:42:34,720 Speaker 1: Auto series. Jennifer Maas of Variety says that Rockstar 729 00:42:34,840 --> 00:42:38,680 Speaker 1: had a two point nine billion dollar loss last quarter. 730 00:42:39,080 --> 00:42:42,200 Speaker 1: That is a heck of a princely sum to have 731 00:42:42,280 --> 00:42:45,680 Speaker 1: as a loss, and it's a fair bit more money 732 00:42:45,719 --> 00:42:49,480 Speaker 1: than what was previously estimated, because the original estimated shortfall 733 00:42:49,520 --> 00:42:51,840 Speaker 1: was going to be one hundred and seventy million dollars. So, 734 00:42:51,960 --> 00:42:56,040 Speaker 1: you know, just two point seven billion dollars or so off. Anyway, 735 00:42:56,400 --> 00:43:00,000 Speaker 1: about two billion of that was a quote unquote goodwill charge.
736 00:43:00,600 --> 00:43:02,920 Speaker 1: That seems like an oxymoron. It's kind of like the 737 00:43:03,040 --> 00:43:06,960 Speaker 1: convenience fees you see on services like Ticketmaster, and you're like, 738 00:43:07,000 --> 00:43:11,440 Speaker 1: who is this convenient for? But no, a goodwill charge just 739 00:43:11,480 --> 00:43:14,320 Speaker 1: refers to cases in which a company spent more money 740 00:43:14,680 --> 00:43:18,760 Speaker 1: acquiring some other entity than that entity's 741 00:43:18,800 --> 00:43:22,160 Speaker 1: assets were actually worth. So it's like saying, this money represents 742 00:43:22,200 --> 00:43:24,640 Speaker 1: us paying more than what stuff was worth, and now we're admitting on the books that the extra value isn't there anymore. But in 743 00:43:24,680 --> 00:43:27,479 Speaker 1: good news, gamers now know they can expect the long 744 00:43:27,520 --> 00:43:30,920 Speaker 1: awaited next game in the GTA franchise in the fall 745 00:43:31,040 --> 00:43:34,759 Speaker 1: of twenty twenty five. Okay, I got a couple of 746 00:43:34,800 --> 00:43:37,359 Speaker 1: recommended articles for y'all before I sign off. First up 747 00:43:37,400 --> 00:43:40,719 Speaker 1: is Jacob Stern's piece in The Atlantic. It's titled The 748 00:43:40,840 --> 00:43:44,000 Speaker 1: Dream of Streaming is Dead, and it goes into the 749 00:43:44,080 --> 00:43:47,680 Speaker 1: recent trend of bundling various streaming services together and how 750 00:43:47,719 --> 00:43:50,640 Speaker 1: this is essentially reverting to the older model of cable 751 00:43:50,680 --> 00:43:54,000 Speaker 1: television packages rather than freeing people from 752 00:43:54,040 --> 00:43:57,080 Speaker 1: that experience, which was the original promise of streaming. Next up is 753 00:43:57,120 --> 00:44:00,800 Speaker 1: Alex Heath's article in The Verge titled Google and Open 754 00:44:00,840 --> 00:44:04,239 Speaker 1: AI are racing to rewire the Internet. It goes into 755 00:44:04,239 --> 00:44:07,880 Speaker 1: more detail about how the implementation of AI into search 756 00:44:08,000 --> 00:44:10,640 Speaker 1: can and will have a huge impact on the Web. 757 00:44:10,719 --> 00:44:13,120 Speaker 1: In particular, a lot of content sites out there are 758 00:44:13,120 --> 00:44:15,600 Speaker 1: wondering what the heck is going to happen because of 759 00:44:15,640 --> 00:44:18,960 Speaker 1: all that. So check those out. I hope you are 760 00:44:19,000 --> 00:44:22,680 Speaker 1: all well, and I'll talk to you again really soon. 761 00:44:28,920 --> 00:44:33,600 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 762 00:44:33,920 --> 00:44:37,640 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 763 00:44:37,640 --> 00:44:42,280 Speaker 1: to your favorite shows.