Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on Friday, October eighteenth, twenty twenty-four.

Speaker 1: Over at X, the platform formerly known as Twitter (I think I'll always call it that, maybe just out of spite), there was another change in how the platform works, and it appears to have prompted yet another exodus among a subset of users. This time it all has to do with the block feature. Now, in ye olden days, if you chose to block someone on Twitter, not only would they no longer be able to comment on, quote, or repost any of your tweets, they wouldn't be able to see any of those tweets in the first place. So to the blocked person, you would seem to have disappeared off the platform. But earlier this week, for reasons I don't fully understand, X revealed that it was going to change the block feature. The block feature will still stop someone from commenting on or retweeting your posts, but they will be able to read everything that you have tweeted. So now, if I were still on X slash Twitter and you were irritating me and I blocked you, you would still be able to see everything I posted; you just couldn't comment on it or retweet it. Now, a lot of people have balked at this since it was revealed, and they've pointed out that this can create really dangerous situations for some users. Let's say that someone's being stalked, and blocking accounts was one way to limit their visibility to the stalker. Well, now the stalker could continue to read updates and potentially escalate matters, make things worse, because the other person isn't aware of what's going on. They're just posting, but they don't realize that the person they thought was blocked from seeing them can actually read everything.
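To make the change concrete, here is a minimal sketch of the before-and-after block semantics as described in the reporting. The names and structure are hypothetical, not X's actual code:

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    handle: str
    blocked: set = field(default_factory=set)

def can_interact(viewer: Account, author: Account) -> bool:
    # In both the old and new models, a blocked user cannot
    # comment on, quote, or repost the author's posts.
    return viewer.handle not in author.blocked

def can_view_old(viewer: Account, author: Account) -> bool:
    # Old model: blocking also hid the author's posts entirely,
    # so the author seemed to vanish from the platform.
    return viewer.handle not in author.blocked

def can_view_new(viewer: Account, author: Account) -> bool:
    # New model: public posts stay readable even to blocked users.
    return True

me, stalker = Account("me"), Account("stalker")
me.blocked.add("stalker")
assert not can_interact(stalker, me)
assert not can_view_old(stalker, me)
assert can_view_new(stalker, me)  # the change people are objecting to
```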
Speaker 1: As Matt Binder of Mashable noted, it appears that some folks on X have felt the need to stretch their metaphorical legs and seek greener pastures, or perhaps bluer skies, because X competitor Bluesky announced that in the twenty-four hours following X's announced changes to the block feature, Bluesky saw half a million users join the service. Now, Bluesky is behind both X and Meta's Threads in users, and all of those platforms are similar; they're kind of like that microblogging sort of thing. There's also Mastodon out there. I wouldn't be surprised if Mastodon also saw a surge of new folks signing on these days. I'm only kinda sorta on Threads; even that is a bit much for me. The issues I have with Threads are the same as my issues with Facebook and really Meta in general. So I don't feel great about posting there, but I have done it a couple of times. It does help scratch the itch that Twitter used to satisfy for me back in the day. But it's not great. Maybe I should switch to Bluesky or Mastodon, or just accept the fact that that part of my life is over. Anyway, I don't understand why X made this choice to change the block feature.

Speaker 1: But I should also mention that Meta's Threads announced a change to its service that's being rolled out gradually, which is that users will be able to turn on a feature called activity status. Now, I say turn on; it may be that the status is turned on by default and you have to go in to opt out of it. But activity status tells you which users are currently online. So if this is on (it's not active for me yet; I checked before I recorded today), then for other people who have it on, you'll see a little green dot on their profile icon that indicates they're online at that moment. And if the green dot's not there, it either means that they aren't online or they have turned off the activity status feature.
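As a rough model of how presence indicators like this usually behave, here's a minimal sketch. The reciprocity rule and the on-by-default setting are assumptions on my part; Meta hasn't confirmed either:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    online: bool = False
    activity_status: bool = True  # assumption: on by default; not confirmed

def shows_green_dot(viewer: User, target: User) -> bool:
    # Presence indicators are typically reciprocal: you only see others'
    # status if you share your own. That reciprocity is an assumption;
    # the reporting only confirms the dot requires the target's setting.
    return target.online and target.activity_status and viewer.activity_status

alice = User("alice", online=True)
bob = User("bob", online=True, activity_status=False)
print(shows_green_dot(alice, bob))  # False: bob has opted out
print(shows_green_dot(bob, alice))  # False too, if status-sharing is reciprocal
```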
Speaker 1: To me, it sounds like this is a bad idea too. Like, I don't know anyone who is begging for this. Maybe it's just that I use Threads in a different way. I always viewed Threads, just as I viewed Twitter, as kind of an asynchronous communications tool, where you post but you're not expecting an immediate response, right? People respond when they get a chance to respond, and then you respond when you have a chance. It's not happening in real time. But Meta appears to be trying to move Threads into that real-time space a little bit. And I don't know, maybe that is something a lot of people have been asking for. But if you're like me and you aren't keen on everybody being aware of when you're on the service, you'll probably want to turn activity status off, if you are in fact using Threads. Once it is rolled out, that is. So I'll be curious to see how this rollout happens. Like I said, I don't have it yet, so I don't know if this is opt out or opt in. I would much prefer it to be opt in and have it off by default, but I suspect that will not be how it turns out. We'll have to see.

Speaker 1: Sarah Perez of TechCrunch has an article this week titled "Elon Musk's X is changing its privacy policy to allow third parties to train AI on your posts." So maybe some of those folks headed to Bluesky are more concerned about their posts being used to fuel our future robotic overlords and less concerned about the block feature. I don't know. Perez notes that X changed its privacy policy, and it now includes the option for third-party collaborators to slurp up all that tasty, tasty data that you have generated over the years, so that the next generation of trollbot or whatever can lean on the collective wisdom of X, and I do use all of those terms sarcastically. Users will apparently be able to opt out of this feature.
Speaker 1: The new section of the policy states, quote, "Depending on your settings, or if you decide to share your data, we may share or disclose your information with third parties if you do not opt out. In some instances, the recipients of the information may use it for their own independent purposes in addition to those stated in X's privacy policy, including, for example, to train their artificial intelligence models, whether generative or otherwise," end quote. Perez notes that as of the writing of her article, there was no clear setting that would relate to this policy. So if you went into your settings, you wouldn't see something clearly marked as allowing you to opt out of this third-party collaborator stuff. But the policy itself won't go into effect until November fifteenth, so it is possible that that setting will arrive before or when that happens.

Speaker 1: Getting back to Meta, the company has apparently been making some staff cuts, and they sound like they're not quite as sweeping as earlier rounds of layoffs at the company, where more than ten thousand people were let go at a time. Alex Heath and Jay Peters of The Verge report that the layoffs have affected multiple divisions within Meta, including Instagram, WhatsApp, and the company's all-things-metaverse department, Reality Labs. As Maxwell Zeff of TechCrunch has put it, the layoffs are meant to quote "reallocate resources within the company" end quote. So that's your standard reorganization slash restructuring language you hear from corporate entities. Sometimes these moves reflect an organization realizing that it has overstaffed certain departments, so operations have become inefficient and wasteful, and the layoffs are an effort to realign that. In other cases, it's more like company leaders have decided they want to try and accomplish more with less, and say, let's try and do the same thing we're doing now but with fewer people, so we're not spending as much money.
Speaker 1: It's hard to say which of those this particular instance really qualifies as. Zeff at TechCrunch also mentioned that the company declined to answer questions regarding how many employees in total were let go, but the layoffs definitely included some prominent folks who have taken to social media to make it known that they are currently in the job market. So I'm sure the cuts are deeply felt within the departments where they happened; it's just unclear how extensive those cuts actually have been.

Speaker 1: Instagram has instituted some features to help protect users, particularly teens, from sextortion attempts, as reported by Aisha Malik of TechCrunch. Now, previously it was possible for someone to use screen capture tools to copy images that were sent through direct messages. While the sender would receive a notification that the image they had sent had been saved, they couldn't really do anything about it. And if the recipient of the message chose to blackmail the sender, you know, threatening to share the images that were sent to them unless the sender followed instructions, well, that's where the sextortion stuff comes in, which is pretty damn horrifying. Manipulating someone into sending compromising images and then threatening to share those, potentially with friends and family or whatever, unless they do whatever it is you tell them to do. It's disgusting. Anyway, now Instagram prevents screen captures of those kinds of images when they are sent as "view once" or "allow replay" messages. If it's sent through DMs like that, you can no longer take screen captures. Plus, they'll only display in the mobile version of the app; you cannot access these through a desktop version of Instagram. It will not display the images at all, in an effort to prevent abuse. This new process complements Instagram's recent rollout of teen accounts, which includes a suite of features meant to give younger users more protection while they're on the platform.
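Instagram hasn't said how the blocking works under the hood (on Android, apps typically block captures with the system's FLAG_SECURE window flag), but the behavior described above amounts to gating on the message type. A minimal sketch, with all names hypothetical:

```python
# Illustrative sketch of the gating behavior described above.
# All names are hypothetical; this is not Instagram's implementation.

EPHEMERAL_TYPES = {"view_once", "allow_replay"}

def screenshots_allowed(message_type: str) -> bool:
    # Screen capture is now blocked for ephemeral photo/video DMs.
    return message_type not in EPHEMERAL_TYPES

def renders_on(message_type: str, client: str) -> bool:
    # Ephemeral DMs display only in the mobile app, never on desktop/web.
    if message_type in EPHEMERAL_TYPES:
        return client == "mobile"
    return True

assert not screenshots_allowed("view_once")
assert renders_on("view_once", "mobile")
assert not renders_on("allow_replay", "desktop")
```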
Speaker 1: Meta has a lot of ground to make up in this area, because the company has long been criticized for failing to ensure the safety of younger users while simultaneously trying to court them over to use the platform. That was a big part of the whistleblower brouhaha from a couple of years ago: the revelation was, one, that Meta (at the time it was Facebook) was well aware of the potential harm it could cause, and two, that while it did very little to address that harm, it was actively trying to get more young people to join the platform. So yeah, this is an important step, but clearly it's just one small step on a longer journey. Okay, we've got more journey ahead of us too, but before we get to that, let's take a quick break to thank our sponsors.

Speaker 1: We're back. Meta's Oversight Board, which, I'll remind you, is an organization that's independent of Meta (it advises Meta on content moderation decisions), is now seeking public comments regarding immigration-related content that potentially could be harmful to immigrants. The board has expressed concern that Meta's current policy only shields vulnerable populations like immigrants, migrants, and asylum seekers from the most harmful forms of hate speech, but leaves those people open to perhaps less overt, but no less dangerous, attacks. And the board has shown two examples of posts containing hate speech, or things bordering on hate speech, that Meta did not remove even after the items were brought to the company's attention for human review. One of them happened in Poland and contained a derogatory term for Black people. The other happened in Germany and featured a picture of a white, blonde-haired, blue-eyed woman holding up her hand, with a message saying outsiders should stop coming into Germany. It got more explicit and hateful from there, but I don't even want to repeat it because it's gross.
Speaker 1: Anyway, Meta left both of those messages up even after human review, and it seems pretty clear to me that the messages included speech that was meant to incite people and to direct harm at vulnerable populations, and that therefore they probably should have been taken down. The board suggested that Meta reverse its decision and take those messages down, but Meta declined. And this is a good time to remind you that while this Oversight Board can make content moderation recommendations for Meta, Meta is in no way obligated to actually follow those recommendations; they're non-binding. But now the board is looking for public comment about these issues, potentially in order to pressure Meta to make these changes. Because it's one thing for Meta to kind of ignore its Oversight Board; it's another thing if there's a big public campaign pressuring Meta to take more action. That's bad for optics, and I think Meta is far more sensitive to that than it is to the guidelines of its own Oversight Board. But that's my own personal opinion.

Speaker 1: And we're not done yet with Meta. Reuters reports that Meta and Blumhouse Productions have created a project in which some filmmakers, including Casey Affleck, the Spurlock sisters, and Aneesh Chaganty, experiment with Meta's generative AI video tool. Chaganty's piece is already up. I watched it, and in it Chaganty showed how he used the tool to change the background or other elements of videos he shot when he was a child. It was kind of interesting. There was one that shows someone walking down their street in California but that was supposed to be set in Manhattan, so he had the AI tool change the background to look like Manhattan. It did not look like Manhattan. It did look like a big city, but it looked a little weird; I mean, it's AI generative stuff. But his whole point was that this was a way to augment the filmmaking experience, and he stresses in it that he still needed to make the movie.
Speaker 1: He still needed to write everything; this wasn't a tool that replaced all of that, it was a tool that augmented it. I remain somewhat unconvinced. Not that it couldn't be a tool used to augment; I think it could be. I think generative AI could be used in ways to augment work that are not necessarily harmful to creatives. The problem I see is that a lot of the production companies that are ultimately in charge of paying creatives would just use generative AI as a shortcut and skip the whole artistic process. Because we've seen that. Frankly, we have seen companies fire creative departments and rely on generative AI, to varying degrees of failure. Really, it's not success; they're pretty awful at this stage. But anyway, that's what's going on. It's the Movie Gen tool, or project. I'll be curious to see what Casey Affleck and the Spurlock sisters create. I haven't seen their output yet, only Aneesh Chaganty's, but it is interesting, and I'm sure it will propel the conversation forward. I remain somewhat skeptical, largely because, I mean, with any project that is heavily supported by Meta, there's obviously a narrative that's trying to be promoted there.

Speaker 1: Cade Metz, Mike Isaac, and Erin Griffith have a piece in The New York Times with the headline "Microsoft and OpenAI's close partnership shows signs of fraying." It's well worth reading if you can get hold of it. The article explains that there are some interesting clauses in the agreement between the two companies that suggest the relationship isn't as cozy as was previously thought, which is notable considering Microsoft has dedicated more than ten billion (with a B) dollars of investment into OpenAI so far. I mean, that's a huge amount of money.
Speaker 1: So the article details how OpenAI has grown kind of frustrated over stuff like access to money and access to compute power, because, as I've mentioned before, AI is incredibly expensive, both from a purely financial standpoint and in the energy required to power all that compute. And you have companies like OpenAI that are trying to scale up and ramp up ever more ambitious projects that require even more computational power, and yeah, that's incredibly expensive. One of the things that blew my mind in this article is that there are estimates that by twenty twenty-nine, the annual computational bill for OpenAI is going to be somewhere around thirty-seven and a half billion dollars per year. Like, think how much money you have to make if your expenses are thirty-seven point five billion dollars. OpenAI isn't making enough money to cover its expenses now; they were looking at spending more than five billion dollars on compute power this year. So no wonder there were a lot of analysts out there predicting that OpenAI was going to go bankrupt before the end of the year, except that they then got a big influx of cash from another investment round. Yeah, it's pretty crazy. Also, OpenAI apparently has a clause that says if it gets to artificial general intelligence, or AGI, then that severs the partnership between the two. And meanwhile, Microsoft is apparently worried that it's depending too heavily upon OpenAI and so wants to diversify its approach to artificial intelligence beyond OpenAI. It's a really complicated thing, so I recommend reading that article. It's very informative.
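For a sense of scale, here's the back-of-the-envelope arithmetic implied by those two figures; the growth-rate framing is mine, not from the Times piece:

```python
# Back-of-the-envelope arithmetic on the compute figures cited above.
# The implied-growth framing is illustrative, not from the NYT article.

compute_2024 = 5.0    # billions of dollars, roughly, this year
compute_2029 = 37.5   # billions of dollars, estimated annual bill by 2029

multiple = compute_2029 / compute_2024       # 7.5x in five years
annual_growth = multiple ** (1 / 5) - 1      # implied compound rate

print(f"{multiple:.1f}x over five years")            # 7.5x
print(f"~{annual_growth:.0%} per year, compounded")  # ~50% per year
```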
Speaker 1: Okay, a couple of space stories. NASA's Artemis program to return to the Moon continues to hit some snags. A lot of outlets, including Ars Technica, have published plenty of articles listing numerous reasons why we're not likely to see the Artemis II mission happen next year as it was scheduled to. But work continues to prepare for our return to the lunar surface, and one such element is the development of new spacesuits. This week, Axiom Space and Prada, as in the luxury fashion company, unveiled a new spacesuit design. Now, it's not exactly chic, but then aesthetics aren't really as important as, you know, not dying, and as I've mentioned many times on this and other shows, space is trying to kill you. The suits have thermal protection built in that the companies say will keep astronauts safe from the dangers of extreme cold, even in shadowed regions at the lunar south pole, for up to two hours at a time. Now, I'm not sure when this design is going to get fitted to an actual astronaut for use in space, but it's pretty cool to see the next evolution of spacesuits.

Speaker 1: Finally, if you are aware of this news, I'm not surprised; it was spectacular. SpaceX accomplished an incredible achievement when a Super Heavy booster returned to its launch site after propelling a payload high into the atmosphere, and as it did so, an enormous mechanical claw on the launch tower caught the booster as it returned under precise control. As Elon Musk wrote on X, the tower caught the rocket, and yeah, seeing the video of this is spectacular. It's hard for me to fathom how complicated this was from an engineering standpoint. Having such precise control of the descent, and the perfect timing for the tower to grasp the booster with its arms, which are called chopsticks, that is just amazing stuff, really worth watching.

Speaker 1: And before I leave, one more reading recommendation for all of y'all. Lila MacLellan has a piece on Fortune dot com titled "23andMe's entire board resigned on the same day. Founder Anne Wojcicki still thinks the startup is saveable."
Speaker 1: And it's a really thoughtful and, I think, balanced analysis of the troubled company's challenges in recent years, and a complicated portrait of an assertive and controversial founder. It's well worth a read; it covers a lot of territory. That's it for this week. I hope all of you out there are doing well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.