1 00:00:13,640 --> 00:00:16,920 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. 2 00:00:17,040 --> 00:00:18,880 Speaker 2: I'm Oz Woloshyn and I'm Cara Price. 3 00:00:19,160 --> 00:00:21,920 Speaker 1: Today we're going to get into the thing no one 4 00:00:22,000 --> 00:00:26,120 Speaker 1: asked for: a social media app dedicated to AI video, 5 00:00:26,360 --> 00:00:30,960 Speaker 1: and then a study asking, somewhat surprisingly, whether social media 6 00:00:31,080 --> 00:00:34,400 Speaker 1: is actually on the decline. Then some advice for parents 7 00:00:34,440 --> 00:00:37,040 Speaker 1: about how to raise your kids in the age of AI. 8 00:00:37,640 --> 00:00:44,800 Speaker 1: All of that on the Week in Tech. It's Friday, October tenth. 9 00:00:47,200 --> 00:00:50,440 Speaker 2: Hello Cara. Hi Oz, how are you? Good. 10 00:00:50,520 --> 00:00:53,640 Speaker 1: I thought we'd start with Show and Tell today. My favorite. 11 00:00:53,920 --> 00:00:54,440 Speaker 1: So here we go. 12 00:00:54,640 --> 00:00:58,880 Speaker 2: Show me. He's taking out YouTube, guys. It's loading and 13 00:00:58,920 --> 00:00:59,880 Speaker 2: I'm seeing Andrew Cuomo. 14 00:01:00,480 --> 00:01:03,240 Speaker 3: I'm Andrew Cuomo and I can pretend to do a 15 00:01:03,240 --> 00:01:06,360 Speaker 3: lot of jobs. But I know what I know, and 16 00:01:06,440 --> 00:01:09,640 Speaker 3: I know what I don't know, and I do know 17 00:01:09,720 --> 00:01:12,600 Speaker 3: how to make government work. There are a lot of 18 00:01:12,680 --> 00:01:16,000 Speaker 3: jobs I can't do, but I'm ready to be your 19 00:01:16,080 --> 00:01:17,479 Speaker 3: mayor on day one. 20 00:01:18,200 --> 00:01:19,800 Speaker 1: There are a lot of things I'm not good at, 21 00:01:20,000 --> 00:01:24,080 Speaker 1: including being a mayor, being a mayor, and using AI 22 00:01:24,160 --> 00:01:28,039 Speaker 1: to make campaign ads. Unbelievable. Just for the benefit of 23 00:01:28,080 --> 00:01:31,120 Speaker 1: our listeners, the premise here is that Andrew Cuomo can't 24 00:01:31,120 --> 00:01:34,039 Speaker 1: do a lot of jobs. His AI version is shown looking 25 00:01:34,080 --> 00:01:37,560 Speaker 1: like a struggling subway conductor, stressed out as a trader 26 00:01:37,560 --> 00:01:40,080 Speaker 1: on the floor of the New York Stock Exchange, and 27 00:01:40,640 --> 00:01:45,319 Speaker 1: as a window washer. However, there are two Empire State Buildings 28 00:01:45,319 --> 00:01:46,720 Speaker 1: behind him, which was a bit of a miss. 29 00:01:47,960 --> 00:02:52,440 Speaker 2: I really love that video because, one, it really makes 30 00:01:52,520 --> 00:01:53,400 Speaker 2: him look like a boomer. 31 00:01:53,480 --> 00:01:54,680 Speaker 1: It really, really, really does. 32 00:01:54,760 --> 00:01:57,360 Speaker 2: And obviously the person that he's going up against has 33 00:01:57,680 --> 00:02:00,800 Speaker 2: swept up millennials in droves: Mamdani. 34 00:02:01,080 --> 00:02:03,920 Speaker 1: I know, it's interesting. In the YouTube comments, somebody actually 35 00:02:04,000 --> 00:02:09,200 Speaker 1: commented "boomers love their AI slop," which I thought was funny. 36 00:02:09,400 --> 00:02:11,240 Speaker 1: There are some other great YouTube comments I like, like 37 00:02:11,800 --> 00:02:15,680 Speaker 1: "bro is getting funded by billionaires but can't afford actors," with 38 00:02:15,840 --> 00:02:21,079 Speaker 1: three skull emojis.
Mamdani also dinged Cuomo, obviously. 39 00:02:21,080 --> 00:02:22,520 Speaker 1: I mean, when you present someone with an 40 00:02:22,520 --> 00:02:25,120 Speaker 1: open goal, they tend to strike the ball into it. 41 00:02:25,360 --> 00:02:28,600 Speaker 1: He made fun of Cuomo because, remember, earlier in the campaign 42 00:02:28,680 --> 00:02:31,520 Speaker 1: Cuomo got called out for using AI to generate his housing plan. 43 00:02:31,720 --> 00:02:33,080 Speaker 1: Yes, that was back in April. 44 00:02:33,200 --> 00:02:35,800 Speaker 2: Yes, he uses AI for his campaign videos and for 45 00:02:35,840 --> 00:02:36,880 Speaker 2: the housing plan, exactly. 46 00:02:37,080 --> 00:02:39,400 Speaker 1: But Mamdani also said, which I thought was very clever: in 47 00:02:39,440 --> 00:02:42,960 Speaker 1: a city of world-class artists and production crews hunting 48 00:02:43,000 --> 00:02:45,920 Speaker 1: for their next gig, Andrew Cuomo made a TV ad 49 00:02:46,120 --> 00:02:48,520 Speaker 1: the same way he wrote his housing policy: with AI. 50 00:02:48,919 --> 00:02:50,919 Speaker 2: That was a sweet dig. That was a sweet dig. 51 00:02:51,040 --> 00:02:53,600 Speaker 1: And you know, you mentioned boomers, and with all due 52 00:02:53,639 --> 00:02:56,120 Speaker 1: respect to our, you know, boomers out there, it is 53 00:02:56,160 --> 00:02:59,720 Speaker 1: interesting how boomerish this use of AI is. 54 00:03:00,080 --> 00:03:02,760 Speaker 2: I'm sure that Cuomo jumped at the opportunity to put 55 00:03:02,760 --> 00:03:04,240 Speaker 2: AI in a campaign, of 56 00:03:04,160 --> 00:03:06,360 Speaker 1: course, because it feels fresh, and that's what I'm going 57 00:03:06,400 --> 00:03:08,160 Speaker 1: to do, we're gonna use AI. Do you remember the 58 00:03:08,200 --> 00:03:11,399 Speaker 1: original Mamdani campaign from the Democratic primary? Yeah, of course, 59 00:03:11,440 --> 00:03:14,720 Speaker 1: it was very visually iconic. Extremely. Scott Galloway made this 60 00:03:14,760 --> 00:03:18,840 Speaker 1: point in his newsletter, which is that, very surprisingly, despite 61 00:03:18,880 --> 00:03:21,120 Speaker 1: all the headlines about how AI is going to run 62 00:03:21,160 --> 00:03:24,959 Speaker 1: designers out of business, the ratio of designers to engineers 63 00:03:25,000 --> 00:03:29,399 Speaker 1: in Silicon Valley firms is actually increasing, because the deluge 64 00:03:29,400 --> 00:03:31,720 Speaker 1: of AI slop is putting a new premium on really, 65 00:03:31,760 --> 00:03:33,960 Speaker 1: really good human designers. Right. And 66 00:03:33,919 --> 00:03:36,640 Speaker 2: so design has to matter more, like, matter. 67 00:03:36,800 --> 00:03:40,000 Speaker 1: You know the old axiom: new tools don't always equal 68 00:03:40,040 --> 00:03:41,080 Speaker 1: better results, right. 69 00:03:41,160 --> 00:03:44,160 Speaker 2: Well, you know, speaking of AI-generated videos, I want 70 00:03:44,160 --> 00:03:47,720 Speaker 2: to talk a little bit about something that I can't 71 00:03:48,680 --> 00:03:54,880 Speaker 2: stop seeing online, and that is Sora 2 videos. Sora 72 00:03:55,000 --> 00:03:55,600 Speaker 2: 2 videos. 73 00:03:55,680 --> 00:03:57,880 Speaker 1: I'm glad. These are from OpenAI, right? Yes. You know, 74 00:03:57,960 --> 00:03:59,400 Speaker 1: I want to hear what you have to tell me about Sora. 75 00:03:59,480 --> 00:04:00,840 Speaker 1: But before we get there, I do want to just 76 00:04:00,840 --> 00:04:02,680 Speaker 1: talk about what a big week it's been for OpenAI.
77 00:04:03,000 --> 00:04:06,080 Speaker 1: There's also a multibillion-dollar chip deal and 78 00:04:06,160 --> 00:04:09,240 Speaker 1: the release of something that people are calling the everything app. 79 00:04:09,320 --> 00:04:10,920 Speaker 1: So we're going to get to those other two big 80 00:04:11,240 --> 00:04:13,160 Speaker 1: OpenAI headlines, but first tell me about Sora. 81 00:04:13,640 --> 00:04:16,560 Speaker 2: So had you, you've heard of Sora before? I had. 82 00:04:16,760 --> 00:04:18,960 Speaker 2: It refers to a few different things. One of them 83 00:04:19,160 --> 00:04:21,960 Speaker 2: is Sora 1, which was OpenAI's text 84 00:04:22,000 --> 00:04:26,000 Speaker 2: to video generator; that was the original Sora. Last week, 85 00:04:26,320 --> 00:04:28,000 Speaker 2: Sora gets an update. I don't know why I want 86 00:04:28,000 --> 00:04:31,920 Speaker 2: to call her "her." Her name is Sora 2, and Sora 87 00:04:31,960 --> 00:04:34,880 Speaker 2: 2 is what powers this new app called Sora, which 88 00:04:34,920 --> 00:04:36,480 Speaker 2: is the Sora app. 89 00:04:36,880 --> 00:04:40,440 Speaker 1: Yeah, very good, very, very good. Where was it in 90 00:04:40,480 --> 00:04:42,679 Speaker 1: the app charts, like top three? I actually 91 00:04:42,680 --> 00:04:44,280 Speaker 1: know the answer. Number 92 00:04:44,080 --> 00:04:45,799 Speaker 2: one. Number one in the total app 93 00:04:45,920 --> 00:04:48,280 Speaker 1: charts, even though it's in beta. Or bay-ta. 94 00:04:48,080 --> 00:04:51,479 Speaker 2: Beta, bay-ta, either. Yeah. 95 00:04:52,200 --> 00:04:54,320 Speaker 1: I think one thing worth mentioning here is I think 96 00:04:54,400 --> 00:04:56,800 Speaker 1: this is the first time OpenAI has ever released 97 00:04:56,839 --> 00:04:59,600 Speaker 1: an app other than ChatGPT. That's correct. Which, if you're 98 00:04:59,640 --> 00:05:02,719 Speaker 1: a big company, must mean you're taking this pretty 99 00:05:03,279 --> 00:05:07,239 Speaker 1: damn seriously, to release only your second ever consumer-facing app. 100 00:05:07,440 --> 00:05:10,360 Speaker 2: Yeah, and you know, ChatGPT, we now know, has 101 00:05:10,440 --> 00:05:14,200 Speaker 2: Google kind of running scared in the search category, but 102 00:05:14,440 --> 00:05:18,080 Speaker 2: now OpenAI seems to be coming for social media too. 103 00:05:18,440 --> 00:05:21,560 Speaker 2: And what's really interesting is that Meta also launched an 104 00:05:21,600 --> 00:05:24,640 Speaker 2: AI video feed in their Meta AI app just five days 105 00:05:24,680 --> 00:05:27,480 Speaker 2: before the Sora app dropped. The name of it is... 106 00:05:29,279 --> 00:05:31,400 Speaker 2: it makes you think that Zuckerberg must name these things. 107 00:05:31,880 --> 00:05:35,000 Speaker 1: Vibes. Oh man, vibe coding, Meta 108 00:05:34,760 --> 00:05:38,159 Speaker 2: Vibes. It's Pineapple Express. 109 00:05:38,960 --> 00:05:40,840 Speaker 1: I saw this. You couldn't miss the Sora news 110 00:05:40,920 --> 00:05:43,120 Speaker 1: unless you were living under a rock. I didn't see a 111 00:05:43,240 --> 00:05:45,600 Speaker 1: peep or hear a peep about Vibes. 112 00:05:45,880 --> 00:05:48,040 Speaker 2: I think why you didn't hear a peep about Vibes 113 00:05:48,080 --> 00:05:52,839 Speaker 2: is because Vibes sucks.
You know, there's like a huge 114 00:05:52,839 --> 00:05:56,719 Speaker 2: difference between what you can do on Vibes, in terms 115 00:05:56,760 --> 00:05:58,479 Speaker 2: of Meta AI, and what you can do on Sora. 116 00:05:59,440 --> 00:06:00,919 Speaker 2: And what you can do on Sora is that you 117 00:06:00,920 --> 00:06:03,320 Speaker 2: can be a character in your own videos, and you 118 00:06:03,360 --> 00:06:06,120 Speaker 2: can also put other characters in. I saw Pikachu have a 119 00:06:06,200 --> 00:06:07,400 Speaker 2: DNA test on Maury. 120 00:06:08,160 --> 00:06:08,760 Speaker 1: You're serious? 121 00:06:08,960 --> 00:06:13,520 Speaker 2: It was like, "Pikachu, you are not the father." So for those 122 00:06:13,520 --> 00:06:16,040 Speaker 2: who don't know, here's how the app works. You can 123 00:06:16,120 --> 00:06:18,440 Speaker 2: upload a short video of yourself to the app, and 124 00:06:18,480 --> 00:06:22,320 Speaker 2: then the model can seamlessly insert your likeness into this 125 00:06:22,440 --> 00:06:24,520 Speaker 2: hyper-realistic video that includes sound. 126 00:06:25,120 --> 00:06:27,479 Speaker 1: Do you remember when the text-to-image feature on 127 00:06:27,680 --> 00:06:31,240 Speaker 1: ChatGPT was released and the whole Internet caught fire, and 128 00:06:31,240 --> 00:06:34,000 Speaker 1: there was this moment where the Internet was completely flooded 129 00:06:34,000 --> 00:06:38,640 Speaker 1: by Studio Ghibli characters and, you know, Pixar-style characters of, 130 00:06:38,680 --> 00:06:41,160 Speaker 1: like, you and your friends going about your life? When 131 00:06:41,200 --> 00:06:43,000 Speaker 1: was the last time you saw one of those in circulation? 132 00:06:43,120 --> 00:06:45,720 Speaker 2: Yeah, it's done, dead. Like, we haven't seen one since 133 00:06:45,800 --> 00:06:48,280 Speaker 2: whenever that was. Yeah. So what do you think 134 00:06:48,640 --> 00:06:49,520 Speaker 2: in terms of Sora? 135 00:06:49,680 --> 00:06:49,920 Speaker 1: Yeah? 136 00:06:49,960 --> 00:06:51,000 Speaker 2: Like, will it last? 137 00:06:50,960 --> 00:06:53,240 Speaker 1: It's obviously so thrilling to put you and your 138 00:06:53,279 --> 00:06:56,880 Speaker 1: friends in Spider-Man with Pikachu and Maury. Like, but 139 00:06:57,200 --> 00:06:58,279 Speaker 1: is it durable? 140 00:06:58,880 --> 00:07:01,960 Speaker 2: I think this will be more durable than the kind 141 00:07:01,960 --> 00:07:05,360 Speaker 2: of quick hijinks of Studio Ghibli. Like, I think 142 00:07:05,400 --> 00:07:08,520 Speaker 2: that this is really fun to do over and over again. 143 00:07:09,000 --> 00:07:11,120 Speaker 2: Do I think Sora as an app is going to 144 00:07:11,200 --> 00:07:14,600 Speaker 2: soar in the App Store, you know, in perpetuity, the 145 00:07:14,640 --> 00:07:19,320 Speaker 2: way that ChatGPT has? For sure, no, I don't 146 00:07:19,360 --> 00:07:21,200 Speaker 2: think it will. I definitely don't think it will. 147 00:07:21,240 --> 00:07:25,600 Speaker 1: So tell me, you mentioned Maury and Pikachu, which is undeniable. 148 00:07:25,840 --> 00:07:27,480 Speaker 1: Were you confronted with just a flood of this 149 00:07:27,480 --> 00:07:28,400 Speaker 1: stuff? I 150 00:07:28,440 --> 00:07:30,320 Speaker 2: was, and the way that you know it: when you 151 00:07:30,360 --> 00:07:32,800 Speaker 2: watch a Sora video, if you're on TikTok or Instagram, 152 00:07:33,120 --> 00:07:34,360 Speaker 2: there's a Sora watermark. 153 00:07:34,560 --> 00:07:36,280 Speaker 1: Wow.
But those are kind of a flex, right, because 154 00:07:36,280 --> 00:07:38,240 Speaker 1: it's still in beta. 155 00:07:37,760 --> 00:07:39,760 Speaker 2: Right. And what it means is that you got invited 156 00:07:39,800 --> 00:07:41,840 Speaker 2: to Sora, which is very cool, which we, you know, 157 00:07:42,200 --> 00:07:42,720 Speaker 2: one day... 158 00:07:42,640 --> 00:07:44,480 Speaker 1: Which is why I said in our production meeting on Monday, wait, 159 00:07:44,480 --> 00:07:45,640 Speaker 1: why aren't we going to get invited too? 160 00:07:46,080 --> 00:07:48,360 Speaker 4: So I don't know. 161 00:07:49,480 --> 00:07:51,960 Speaker 2: The ones that I actually like the most were of 162 00:07:52,080 --> 00:07:58,960 Speaker 2: Sam Altman himself, who is very smart about centering himself 163 00:07:59,120 --> 00:08:00,520 Speaker 2: as the face of OpenAI. 164 00:08:01,040 --> 00:08:03,200 Speaker 1: Now, I just wanted to pause on this, because one 165 00:08:03,240 --> 00:08:05,720 Speaker 1: of the concerns has been: can I take your face 166 00:08:06,000 --> 00:08:09,440 Speaker 1: and Sora you? And the answer, technically, is no, right? 167 00:08:09,520 --> 00:08:13,880 Speaker 1: So Sam... That's right, Sam's my dear friend. He decided 168 00:08:13,920 --> 00:08:17,080 Speaker 1: to basically make his face available to anyone who wanted 169 00:08:17,120 --> 00:08:18,880 Speaker 1: to use him as a character. That's right. 170 00:08:18,920 --> 00:08:26,080 Speaker 2: And there's two really disturbing videos. One is him shoplifting 171 00:08:26,160 --> 00:08:26,680 Speaker 2: at Target. 172 00:08:27,040 --> 00:08:29,600 Speaker 1: Oh my god. That he made himself, or it was made of him? 173 00:08:29,760 --> 00:08:31,960 Speaker 2: I think someone made it, put it on X, and 174 00:08:32,080 --> 00:08:35,280 Speaker 2: was bragging that their video went viral. The other one 175 00:08:35,520 --> 00:08:38,640 Speaker 2: is really disturbing. Again, another user made it, of Sam 176 00:08:38,679 --> 00:08:42,840 Speaker 2: Altman being a sort of crust punk who had, seemingly, 177 00:08:42,840 --> 00:08:43,920 Speaker 2: a crystal meth problem. 178 00:08:44,040 --> 00:08:47,360 Speaker 1: My god. So these are not always in the best taste. 179 00:08:48,160 --> 00:08:50,360 Speaker 2: Not always in the best taste. But I think, again, they 180 00:08:50,880 --> 00:08:57,040 Speaker 2: show that Sam Altman is the Zuckerberg or the Bezos 181 00:08:57,320 --> 00:09:00,360 Speaker 2: that OpenAI needs. More, even. 182 00:09:00,480 --> 00:09:03,520 Speaker 1: Yeah, I mean, he is the greatest hype beast the 183 00:09:03,520 --> 00:09:04,360 Speaker 1: world's ever seen. 184 00:09:04,600 --> 00:09:08,439 Speaker 2: Oh, absolutely. I just want to go back and talk 185 00:09:08,520 --> 00:09:10,959 Speaker 2: a little bit about some of the darker questions that 186 00:09:11,080 --> 00:09:15,440 Speaker 2: Sora raises. To any normal person, you would imagine this 187 00:09:15,480 --> 00:09:18,320 Speaker 2: is like a huge content moderation issue, because now 188 00:09:18,440 --> 00:09:21,640 Speaker 2: the Internet is just flooded with fake videos that are 189 00:09:21,640 --> 00:09:24,600 Speaker 2: fake on purpose, and everyone knows they're fake.
The company 190 00:09:24,640 --> 00:09:29,440 Speaker 2: does state that it quote "bans impersonation, scams, and fraud," 191 00:09:30,040 --> 00:09:32,840 Speaker 2: and that there are quote "extra guardrails" in the app 192 00:09:32,920 --> 00:09:35,600 Speaker 2: when real people are featured in videos, meant to block 193 00:09:35,679 --> 00:09:39,680 Speaker 2: nudity and graphic violence. Okay, so technically you can only 194 00:09:39,720 --> 00:09:42,800 Speaker 2: create videos of yourself, your friends, or people who choose 195 00:09:42,840 --> 00:09:46,920 Speaker 2: to let their likeness be public. But as one would expect, 196 00:09:46,960 --> 00:09:48,960 Speaker 2: there has already been at least one video of a 197 00:09:48,960 --> 00:09:52,360 Speaker 2: woman being covered in a mysterious substance. So that sort 198 00:09:52,360 --> 00:09:54,600 Speaker 2: of tells you how well this is going. 199 00:09:55,160 --> 00:09:58,240 Speaker 1: I'm curious. I mean, is there really a 200 00:09:58,720 --> 00:10:01,440 Speaker 1: belief or desire here that Sora the app becomes 201 00:10:01,480 --> 00:10:05,280 Speaker 1: a destination to consume, to just gorge on AI slop? 202 00:10:05,320 --> 00:10:06,480 Speaker 1: Is that the vision here? 203 00:10:07,280 --> 00:10:11,880 Speaker 2: I think a lot of people would say yes. It's 204 00:10:11,920 --> 00:10:15,480 Speaker 2: like a, it's a new social media destination for crap. 205 00:10:15,600 --> 00:10:17,680 Speaker 1: It's a place to hang out rather than just to create. 206 00:10:17,800 --> 00:10:20,440 Speaker 2: That's right. Okay, that's right. And there are two crucial 207 00:10:20,440 --> 00:10:25,240 Speaker 2: things that make this app different from, say, Instagram or TikTok. 208 00:10:25,720 --> 00:10:30,240 Speaker 2: The first is that the algorithm is not reigning supreme 209 00:10:30,440 --> 00:10:33,400 Speaker 2: in the Sora feed. It's in, as you said, beta, 210 00:10:33,440 --> 00:10:36,880 Speaker 2: but apparently the app allows you to customize your feed 211 00:10:36,920 --> 00:10:39,640 Speaker 2: a little bit, so you can pick a mood, which 212 00:10:39,720 --> 00:10:41,800 Speaker 2: means that basically you tell the app what you want 213 00:10:41,840 --> 00:10:43,360 Speaker 2: to see more of and it feeds it to you. 214 00:10:44,080 --> 00:10:45,760 Speaker 2: The second thing is, we all know how much we 215 00:10:45,800 --> 00:10:50,840 Speaker 2: divulge to ChatGPT. Sora quote "may consider your ChatGPT 216 00:10:50,880 --> 00:10:56,280 Speaker 2: history" while creating video, which is a bit strange. Like, 217 00:10:56,720 --> 00:11:00,760 Speaker 2: to what extent is it going to impact, I guess, 218 00:11:00,760 --> 00:11:01,600 Speaker 2: my Sora videos? 219 00:11:01,720 --> 00:11:03,560 Speaker 1: You said something to me yesterday when we were talking 220 00:11:03,559 --> 00:11:06,240 Speaker 1: about this that really stayed with me. You said, this 221 00:11:06,320 --> 00:11:08,600 Speaker 1: may be the day the Internet changed. 222 00:11:09,880 --> 00:11:12,760 Speaker 2: I did say that, and you know, I talked to 223 00:11:12,800 --> 00:11:15,920 Speaker 2: our producer about it too. And I was thinking, was 224 00:11:15,920 --> 00:11:20,920 Speaker 2: I being a little bit Pollyanna-ish? I just, I 225 00:11:20,960 --> 00:11:24,920 Speaker 2: think we are already, like, treading such thin ice in 226 00:11:25,000 --> 00:11:28,160 Speaker 2: terms of, like, what is true that we see online.
227 00:11:29,360 --> 00:11:35,240 Speaker 2: There's now an app that is dedicated intentionally to propagating 228 00:11:35,520 --> 00:11:38,840 Speaker 2: fake video, and it's being run by probably the most 229 00:11:38,880 --> 00:11:43,720 Speaker 2: important company of this new generation. And so I do 230 00:11:43,800 --> 00:11:48,000 Speaker 2: think there is a pre-Sora and post-Sora internet. Do 231 00:11:48,080 --> 00:11:51,240 Speaker 2: I think Sora is going to change the world? Probably not. 232 00:11:52,280 --> 00:11:56,800 Speaker 2: Do I think that having more realistic fake video, at 233 00:11:56,880 --> 00:12:01,320 Speaker 2: a time when our president posts Sora-like videos on 234 00:12:01,400 --> 00:12:07,559 Speaker 2: Truth Social, is incredibly damaging? Yes, I think so, you know. 235 00:12:07,920 --> 00:12:09,800 Speaker 1: Yeah, I mean, I come back to the thought 236 00:12:09,840 --> 00:12:12,000 Speaker 1: I had at the beginning: is this like a flash 237 00:12:12,040 --> 00:12:12,520 Speaker 1: in the pan? 238 00:12:12,679 --> 00:12:12,800 Speaker 3: Like? 239 00:12:13,160 --> 00:12:15,720 Speaker 1: So, Scott Galloway, who I mentioned up top, he's the 240 00:12:15,760 --> 00:12:19,320 Speaker 1: podcast host of Pivot and a marketing professor, had an interesting 241 00:12:19,400 --> 00:12:21,960 Speaker 1: take here. He's very much of the opinion that this is 242 00:12:22,000 --> 00:12:24,040 Speaker 1: a flash in the pan. That's based on his 243 00:12:24,160 --> 00:12:27,080 Speaker 1: thesis that most people don't want to be creators, they 244 00:12:27,120 --> 00:12:29,160 Speaker 1: want to be consumers. And here are the stats that 245 00:12:29,200 --> 00:12:32,760 Speaker 1: he shared to support that: four percent of YouTube videos 246 00:12:32,800 --> 00:12:35,160 Speaker 1: account for ninety-four percent of views on the platform, 247 00:12:35,880 --> 00:12:38,840 Speaker 1: five percent of TikTok videos generate eighty-nine percent of 248 00:12:38,840 --> 00:12:42,360 Speaker 1: the views, and on Instagram, three percent of videos account for eighty 249 00:12:42,400 --> 00:12:46,000 Speaker 1: four percent of all views. Scott's hypothesis, or his take, 250 00:12:46,480 --> 00:12:49,520 Speaker 1: was that actually Sora is really a B2B product. 251 00:12:49,520 --> 00:12:53,719 Speaker 1: This is designed to put Hollywood and advertising agencies out of business, effectively 252 00:12:53,760 --> 00:12:56,760 Speaker 1: to make many, many people who work there redundant. Yeah, 253 00:12:56,840 --> 00:13:02,600 Speaker 1: but that's not a very culturally favorable sell. And so the 254 00:13:02,679 --> 00:13:05,120 Speaker 1: idea was to release it as a consumer app, even though 255 00:13:05,120 --> 00:13:07,800 Speaker 1: that's not really what it's for. Interesting. It's a good take, 256 00:13:07,920 --> 00:13:10,600 Speaker 1: but I also disagree. I disagree because of what you 257 00:13:10,679 --> 00:13:15,000 Speaker 1: said about Sam Altman catapulting himself into being the most iconic 258 00:13:15,320 --> 00:13:19,760 Speaker 1: tech CEO of our generation. And I think that the 259 00:13:19,800 --> 00:13:23,080 Speaker 1: more that most people in America and the world experience 260 00:13:23,320 --> 00:13:28,920 Speaker 1: firsthand the absolute mind-bending magic of generative AI, the 261 00:13:29,040 --> 00:13:30,560 Speaker 1: richer and more powerful Sam Altman gets. 262 00:13:30,679 --> 00:13:35,200 Speaker 2: Yeah. I think that's right.
And I think that the 263 00:13:35,280 --> 00:13:38,680 Speaker 2: last thing I'll say about Sora is, like, I don't 264 00:13:38,679 --> 00:13:41,080 Speaker 2: think it really matters that not that many people are 265 00:13:41,160 --> 00:13:43,720 Speaker 2: using it versus consuming it. Isn't the problem that people 266 00:13:43,760 --> 00:13:45,280 Speaker 2: are consuming it in such volume? 267 00:13:45,640 --> 00:13:51,600 Speaker 1: Ha ha. We're onto our next story. Okay. There was 268 00:13:51,920 --> 00:13:54,720 Speaker 1: a piece in the Financial Times with the headline "Have 269 00:13:54,760 --> 00:13:58,720 Speaker 1: we passed peak social media?" Have we? Per this article, 270 00:13:58,800 --> 00:14:01,920 Speaker 1: and the data that underlies it, social media use 271 00:14:02,080 --> 00:14:06,440 Speaker 1: peaked three years ago, in twenty twenty two. What? So 272 00:14:06,520 --> 00:14:10,840 Speaker 1: the FT partnered with this digital audience insights company called 273 00:14:11,200 --> 00:14:14,400 Speaker 1: GWI to analyze the online habits of two hundred and 274 00:14:14,440 --> 00:14:18,199 Speaker 1: fifty thousand adults in fifty countries. What they found was 275 00:14:18,240 --> 00:14:20,400 Speaker 1: that by the end of twenty twenty four, people on 276 00:14:20,440 --> 00:14:23,360 Speaker 1: average spent two hours and twenty minutes per day on 277 00:14:23,400 --> 00:14:26,480 Speaker 1: social media, which is down almost ten percent from twenty 278 00:14:26,520 --> 00:14:28,240 Speaker 1: twenty two. This is a trend. 279 00:14:28,400 --> 00:14:29,040 Speaker 2: This is a trend. 280 00:14:29,160 --> 00:14:32,440 Speaker 1: You know, per this data, which generation is 281 00:14:33,960 --> 00:14:35,560 Speaker 1: unsubscribing fastest now? 282 00:14:35,720 --> 00:14:37,560 Speaker 2: Gen Z. Fascinating. 283 00:14:38,160 --> 00:14:40,520 Speaker 1: So the columnist posits that the reason this may be 284 00:14:40,600 --> 00:14:44,200 Speaker 1: happening is in part because of the degradation of content 285 00:14:44,360 --> 00:14:49,480 Speaker 1: on social media platforms overall. This is also known as enshittification, 286 00:14:49,640 --> 00:14:52,800 Speaker 1: the term coined by Cory Doctorow. But the columnist 287 00:14:52,840 --> 00:14:55,880 Speaker 1: used this great analogy to describe this flood of new content. 288 00:14:56,600 --> 00:15:00,640 Speaker 1: He calls it quote "ultra-processed content: dopamine with at 289 00:15:00,760 --> 00:15:05,040 Speaker 1: best negligible informational value and at worst corrosively negative." And 290 00:15:05,080 --> 00:15:07,040 Speaker 1: he basically posits people are waking up to this. 291 00:15:07,200 --> 00:15:09,720 Speaker 2: You know, because I am an 292 00:15:09,840 --> 00:15:13,040 Speaker 2: Instagram super user and I post a lot on Instagram Stories, 293 00:15:13,360 --> 00:15:15,360 Speaker 2: what I have noticed is, the way that the 294 00:15:15,360 --> 00:15:19,720 Speaker 2: Instagram algorithm works, I can't find things as easily as 295 00:15:19,720 --> 00:15:22,560 Speaker 2: I used to, because everything is algorithmically driven. So I'm 296 00:15:22,600 --> 00:15:24,920 Speaker 2: seeing the same shit my friends are seeing. And what 297 00:15:24,960 --> 00:15:27,440 Speaker 2: it used to be is that, like, there was actually, 298 00:15:28,480 --> 00:15:30,080 Speaker 2: say two years ago... 299 00:15:29,760 --> 00:15:32,240 Speaker 1: Before your feed would surface true value.
300 00:15:32,360 --> 00:15:34,640 Speaker 2: Yeah, because my feed actually had things in it that 301 00:15:34,680 --> 00:15:36,640 Speaker 2: I wanted to see. Now my feed just has things 302 00:15:36,680 --> 00:15:37,960 Speaker 2: in it that my friends are watching. 303 00:15:38,080 --> 00:15:40,760 Speaker 1: Well, that's again part of what this article is about, 304 00:15:40,760 --> 00:15:43,320 Speaker 1: which is, the thrill of social media was really about 305 00:15:43,320 --> 00:15:46,440 Speaker 1: connecting with your friends, right? But now these apps basically 306 00:15:46,480 --> 00:15:50,080 Speaker 1: promote just time spent with the never-ending feed of 307 00:15:50,400 --> 00:15:54,040 Speaker 1: sensational content, and so it's become more and more passive 308 00:15:54,080 --> 00:15:57,360 Speaker 1: and therefore, listen, less engaging. I mean, the snake is 309 00:15:57,400 --> 00:15:59,960 Speaker 1: eating its own tail, at least according to this article 310 00:16:00,080 --> 00:16:03,360 Speaker 1: and the data. There is a twist here. Guess which 311 00:16:03,400 --> 00:16:05,960 Speaker 1: continent social media is not declining in? 312 00:16:06,320 --> 00:16:08,120 Speaker 2: Not declining... North America. 313 00:16:08,240 --> 00:16:11,960 Speaker 1: Yeah, the trend line is down everywhere apart from here. 314 00:16:12,240 --> 00:16:14,480 Speaker 1: Daily usage is still rising year after year in this 315 00:16:14,520 --> 00:16:17,040 Speaker 1: part of the world, and last year North American social 316 00:16:17,080 --> 00:16:19,640 Speaker 1: media consumption was fifteen percent higher than Europe's. 317 00:16:19,720 --> 00:16:23,200 Speaker 2: It just scares me, because we now have an entirely 318 00:16:23,280 --> 00:16:26,040 Speaker 2: new app that's number one on the App Store that 319 00:16:26,160 --> 00:16:28,000 Speaker 2: is literally, I mean, you want to talk about enshittification. 320 00:16:28,320 --> 00:16:30,480 Speaker 2: You want to have an AI, I mean, a Pikachu 321 00:16:31,320 --> 00:16:35,280 Speaker 2: DJing at Boiler Room, which is a real Sora video 322 00:16:35,320 --> 00:16:35,840 Speaker 2: that came out. 323 00:16:35,880 --> 00:16:37,600 Speaker 1: Are you looking for Pikachu, or are you just... I 324 00:16:37,600 --> 00:16:42,320 Speaker 2: mean, I'm always looking for Pikachu, isn't that the point? But no, 325 00:16:42,400 --> 00:16:45,160 Speaker 2: I just, I think that the Galloway argument gets me 326 00:16:45,160 --> 00:16:47,840 Speaker 2: a little frustrated, because it's like keeping candy in the 327 00:16:47,840 --> 00:16:50,720 Speaker 2: house all year as opposed to on Halloween. Because it's like, 328 00:16:50,840 --> 00:16:52,880 Speaker 2: if we only have it on Halloween, at least there 329 00:16:52,920 --> 00:16:54,560 Speaker 2: is a day that's designated to eating it. If we 330 00:16:54,600 --> 00:16:56,520 Speaker 2: have it in the house all the time, kids are 331 00:16:56,520 --> 00:16:58,120 Speaker 2: just gonna be like, I want to eat candy every night. 332 00:16:58,440 --> 00:17:01,520 Speaker 2: And I think that with Sora, uh, it might not 333 00:17:02,880 --> 00:17:06,959 Speaker 2: impact the American social media landscape in the way that, 334 00:17:07,080 --> 00:17:10,480 Speaker 2: you know, TikTok or Instagram did, but it does now 335 00:17:10,640 --> 00:17:14,639 Speaker 2: put more shit into the world that obviously, based on 336 00:17:14,680 --> 00:17:16,560 Speaker 2: what you're saying, Americans can't handle.
337 00:17:16,960 --> 00:17:18,840 Speaker 1: And I think, but by the way, it's engaging. I mean, 338 00:17:18,920 --> 00:17:20,120 Speaker 1: I like seeing this stuff. 339 00:17:20,440 --> 00:17:23,720 Speaker 2: I enjoy watching a giraffe and a zebra walk through 340 00:17:23,720 --> 00:17:27,080 Speaker 2: suburban America. Absolutely. 341 00:17:26,080 --> 00:17:29,080 Speaker 1: A giraffe and zebra blended together. I promised more about 342 00:17:29,080 --> 00:17:32,200 Speaker 1: OpenAI. We've talked about the first, you know, triumph 343 00:17:32,240 --> 00:17:35,080 Speaker 1: of the week from OpenAI's perspective, which was breaking 344 00:17:35,119 --> 00:17:38,480 Speaker 1: the Internet with a new social video generation app. But 345 00:17:38,520 --> 00:17:41,240 Speaker 1: there are also two other things. Announcing a deal with 346 00:17:41,320 --> 00:17:44,960 Speaker 1: the chip maker AMD, which added one hundred billion dollars 347 00:17:45,000 --> 00:17:47,840 Speaker 1: to the market cap of AMD almost immediately. I mean, 348 00:17:47,920 --> 00:17:49,960 Speaker 1: get this, just announcing they've done a deal with the 349 00:17:49,960 --> 00:17:52,679 Speaker 1: company makes that company one hundred billion dollars more valuable. 350 00:17:52,680 --> 00:17:53,560 Speaker 1: I mean, talk about power. 351 00:17:54,480 --> 00:17:56,400 Speaker 2: Can you talk a little more about AMD, because I'm 352 00:17:56,440 --> 00:17:57,240 Speaker 2: not so familiar. 353 00:17:57,400 --> 00:18:00,280 Speaker 1: Yeah, so AMD is, you know... Nvidia is the 354 00:18:00,320 --> 00:18:01,560 Speaker 1: world's number one chip maker. 355 00:18:01,760 --> 00:18:05,240 Speaker 2: Yes, I know that one. And... 356 00:18:05,320 --> 00:18:07,880 Speaker 1: OpenAI also did a huge deal with Nvidia recently, 357 00:18:07,960 --> 00:18:11,040 Speaker 1: and also did a huge deal with Oracle for data centers. 358 00:18:11,240 --> 00:18:14,399 Speaker 1: But this was an agreement for OpenAI to start 359 00:18:14,480 --> 00:18:19,320 Speaker 1: building data centers using AMD chips. And this was a 360 00:18:19,440 --> 00:18:23,000 Speaker 1: huge thing for AMD, because it basically was OpenAI 361 00:18:23,520 --> 00:18:26,879 Speaker 1: laying down a marker, saying no, no, we believe that 362 00:18:26,920 --> 00:18:29,560 Speaker 1: these guys can manufacture chips good enough to power the 363 00:18:29,640 --> 00:18:33,240 Speaker 1: data centers we need to do the next generation of AI. 364 00:18:34,600 --> 00:18:38,080 Speaker 1: There was a sweetener. OpenAI got ten percent 365 00:18:38,119 --> 00:18:41,240 Speaker 1: of AMD practically for free to do this deal. Well, for 366 00:18:41,280 --> 00:18:43,640 Speaker 1: one cent a share: options to buy shares at one cent a share. 367 00:18:43,760 --> 00:18:46,320 Speaker 1: That is crazy. So they're basically, I mean, they're out 368 00:18:46,320 --> 00:18:49,680 Speaker 1: there saying, we know that working with us will increase 369 00:18:49,680 --> 00:18:50,639 Speaker 1: the value of your business. 370 00:18:50,760 --> 00:18:53,399 Speaker 2: So here you go, here's ten percent of it, right. 371 00:18:53,480 --> 00:18:54,400 Speaker 2: And that's what happened. 372 00:18:54,440 --> 00:18:57,639 Speaker 1: So the leverage of this company right now is absolutely extraordinary.
373 00:18:58,080 --> 00:19:02,000 Speaker 1: Bloomberg pointed out that the deals with AMD, Oracle, and 374 00:19:02,119 --> 00:19:04,000 Speaker 1: Nvidia that OpenAI has announced in the last few 375 00:19:04,000 --> 00:19:07,840 Speaker 1: weeks could top a trillion dollars. A trillion dollars. I 376 00:19:07,840 --> 00:19:10,120 Speaker 1: mean, even the president rarely says that word. 377 00:19:10,200 --> 00:19:11,159 Speaker 2: He doesn't say trillion. 378 00:19:12,560 --> 00:19:16,440 Speaker 1: One stock analyst told Bloomberg that Sam Altman quote "has 379 00:19:16,480 --> 00:19:19,200 Speaker 1: the power to crash the global economy for a decade 380 00:19:20,119 --> 00:19:22,639 Speaker 1: or take us all to the promised land." 381 00:19:23,200 --> 00:19:24,880 Speaker 2: What does taking us to the promised land 382 00:19:24,880 --> 00:19:27,000 Speaker 1: look like? That's what I'm wondering; they didn't say. 383 00:19:27,359 --> 00:19:27,920 Speaker 1: Very good question. 384 00:19:27,920 --> 00:19:32,200 Speaker 2: They didn't go there. Everyone getting rich? Yeah, that's rich, 385 00:19:32,200 --> 00:19:35,840 Speaker 2: if the valuation pays off, right, right, right. I mean, 386 00:19:35,920 --> 00:19:37,960 Speaker 2: if there's anybody who is going to make the AI 387 00:19:38,000 --> 00:19:41,320 Speaker 2: revolution pay off, it's the founder of OpenAI. 388 00:19:42,200 --> 00:19:44,040 Speaker 1: It has to be. And the point the article makes is 389 00:19:44,119 --> 00:19:46,200 Speaker 1: now all these companies are so much in each other's 390 00:19:46,240 --> 00:19:48,560 Speaker 1: business, with OpenAI on the software side and AMD 391 00:19:48,680 --> 00:19:50,959 Speaker 1: and Nvidia on the hardware side, that this is 392 00:19:51,000 --> 00:19:52,960 Speaker 1: why some people are worried there may be a bubble 393 00:19:53,000 --> 00:19:55,320 Speaker 1: emerging, because just by doing business with each other and 394 00:19:55,520 --> 00:19:58,600 Speaker 1: swishing the cash around, they can inflate each other's stock prices, 395 00:19:58,800 --> 00:20:01,520 Speaker 1: borrow more money, do more deals. But of course, that's 396 00:20:01,560 --> 00:20:03,399 Speaker 1: why, if we don't get to the promised land, we'll 397 00:20:03,440 --> 00:20:04,120 Speaker 1: go somewhere else. 398 00:20:04,280 --> 00:20:07,560 Speaker 2: There was also one other little piece of information about 399 00:20:07,600 --> 00:20:09,919 Speaker 2: OpenAI this week. Really big, OpenAI. 400 00:20:10,040 --> 00:20:12,920 Speaker 1: It's a huge week for OpenAI, yeah. So we've got Sora, we've 401 00:20:12,960 --> 00:20:15,760 Speaker 1: got AMD chips. But there was also a developer conference 402 00:20:15,800 --> 00:20:18,600 Speaker 1: and this idea of the everything app. So there was 403 00:20:18,640 --> 00:20:22,920 Speaker 1: all this concern about how generative AI was basically rendering 404 00:20:22,920 --> 00:20:27,280 Speaker 1: the Internet useless, because all of the Internet's information was 405 00:20:27,600 --> 00:20:32,680 Speaker 1: served up through a chatbot rather than, you know, in search.
Now, interestingly, 406 00:20:33,160 --> 00:20:38,960 Speaker 1: companies like Zillow, Booking.com, and Spotify have voluntarily struck 407 00:20:39,000 --> 00:20:43,439 Speaker 1: partnerships whereby you can use their products through ChatGPT. 408 00:20:43,880 --> 00:20:46,320 Speaker 1: So you can say find me a house in X, 409 00:20:46,440 --> 00:20:49,040 Speaker 1: book me a flight to Y, and without leaving 410 00:20:49,040 --> 00:20:51,120 Speaker 1: the ChatGPT app, that app will interact with these 411 00:20:51,119 --> 00:20:54,399 Speaker 1: other companies. Now, of course you get the exciting bump 412 00:20:54,600 --> 00:20:56,400 Speaker 1: of "we're working with OpenAI." 413 00:20:56,480 --> 00:21:00,040 Speaker 2: Who was first? Etsy was very excited about it; it 414 00:21:00,240 --> 00:21:00,960 Speaker 2: might have been one of the first. 415 00:21:01,119 --> 00:21:03,440 Speaker 1: On the other hand, it kind of cannibalizes people 416 00:21:03,600 --> 00:21:05,919 Speaker 1: using your app or your website. I mean, it's an interesting, 417 00:21:05,920 --> 00:21:06,720 Speaker 1: it's an interesting trade-off. 418 00:21:06,920 --> 00:21:08,800 Speaker 2: It is an interesting trade-off, but I think it goes 419 00:21:08,840 --> 00:21:11,120 Speaker 2: to show... It's kind of like when people started selling 420 00:21:11,119 --> 00:21:13,960 Speaker 2: products on Amazon: like, are you going to destroy 421 00:21:14,000 --> 00:21:15,880 Speaker 2: your storefront, or are you going to go with where 422 00:21:15,880 --> 00:21:18,480 Speaker 2: people are buying the product? Which is very interesting. So 423 00:21:18,560 --> 00:21:20,800 Speaker 2: I was thinking about this idea of the everything app, 424 00:21:20,960 --> 00:21:24,840 Speaker 2: and our producer Eliza reminded me that China has a 425 00:21:24,880 --> 00:21:27,320 Speaker 2: few everything apps, Alipay and WeChat. Yeah. 426 00:21:27,359 --> 00:21:29,840 Speaker 1: I've actually been to China, and so you open 427 00:21:29,880 --> 00:21:32,439 Speaker 1: WeChat and essentially in it you have the equivalent 428 00:21:32,520 --> 00:21:34,520 Speaker 1: of Uber, you have the equivalent of Amazon, like you 429 00:21:34,520 --> 00:21:36,959 Speaker 1: can do your banking, you can do everything through one 430 00:21:37,000 --> 00:21:39,000 Speaker 1: app. That's obviously 431 00:21:39,040 --> 00:21:40,719 Speaker 1: why it's the promised land, right, because if 432 00:21:40,760 --> 00:21:44,000 Speaker 1: you own the everything app, you're pretty powerful. And exactly, Elon, 433 00:21:44,200 --> 00:21:47,040 Speaker 1: of course, the reason he renamed Twitter X, one of 434 00:21:47,040 --> 00:21:50,160 Speaker 1: the reasons, was he wanted X to become the everything app, which... 435 00:21:50,000 --> 00:21:54,639 Speaker 2: does not look promising. But ChatGPT, yeah, I mean, in 436 00:21:54,720 --> 00:21:59,359 Speaker 2: terms of a contender, not Meta, not TikTok, ChatGPT seems 437 00:21:59,400 --> 00:22:01,880 Speaker 2: to be the closest thing we have to a contender 438 00:22:01,920 --> 00:22:03,639 Speaker 2: for the everything app. And I'm curious. I mean, I 439 00:22:03,680 --> 00:22:05,640 Speaker 2: certainly think the phone is the everything app. 440 00:22:05,720 --> 00:22:07,399 Speaker 1: The phone is the everything app.
And will it 441 00:22:07,400 --> 00:22:10,720 Speaker 1: be displaced by them building a new physical interface to use 442 00:22:10,720 --> 00:22:12,720 Speaker 1: their everything app? That I doubt more. 443 00:22:13,200 --> 00:22:15,600 Speaker 2: Yeah, I think that's probably unlikely. But I did have 444 00:22:15,640 --> 00:22:19,040 Speaker 2: this little thought that I can reflect on for the 445 00:22:19,040 --> 00:22:21,840 Speaker 2: next year, which is, like, to what extent will the 446 00:22:21,920 --> 00:22:25,320 Speaker 2: apps on my phone dwindle and dwindle and dwindle until 447 00:22:25,359 --> 00:22:26,480 Speaker 2: there's, like, three 448 00:22:26,359 --> 00:22:30,359 Speaker 1: left? After the break: mini brains might be the future 449 00:22:30,359 --> 00:22:34,800 Speaker 1: of computing, robots are helping to create babies, then a 450 00:22:34,840 --> 00:22:44,600 Speaker 1: look at raising kids in the age of AI. And 451 00:22:44,680 --> 00:22:47,400 Speaker 1: we're back. Cara, I have a story that I think 452 00:22:47,840 --> 00:22:50,280 Speaker 1: might blow your mind. What do you think of when 453 00:22:50,320 --> 00:22:53,600 Speaker 1: I say wetware? Remember, this is a family 454 00:22:53,600 --> 00:22:54,200 Speaker 1: friendly show. 455 00:22:55,280 --> 00:22:58,800 Speaker 2: Oh well, then I think of a wetsuit, because you surf. 456 00:23:00,760 --> 00:23:05,560 Speaker 1: So wetware is not hardware and it's not software. It's 457 00:23:05,720 --> 00:23:07,080 Speaker 1: organic computing. 458 00:23:07,400 --> 00:23:10,000 Speaker 2: And I literally don't understand what you're talking about. 459 00:23:10,040 --> 00:23:13,560 Speaker 1: Okay, so I don't really get it, to be honest, but 460 00:23:13,640 --> 00:23:17,840 Speaker 1: I did my best, following a BBC headline: scientists grow 461 00:23:18,200 --> 00:23:20,920 Speaker 1: mini human brains to power computers. 462 00:23:21,480 --> 00:23:23,920 Speaker 2: What do you mean by mini brains? 463 00:23:24,160 --> 00:23:26,720 Speaker 1: Scientists in Switzerland have found a way to create stem 464 00:23:26,800 --> 00:23:30,320 Speaker 1: cells out of human skin cells, which are developed into 465 00:23:30,320 --> 00:23:33,760 Speaker 1: clusters of neurons that function as mini brains, also known 466 00:23:33,800 --> 00:23:34,639 Speaker 1: as organoids. 467 00:23:35,080 --> 00:23:37,199 Speaker 2: This sounds like science fiction. This is like Revenge of 468 00:23:37,240 --> 00:23:38,600 Speaker 2: the Organoids or something. 469 00:23:38,400 --> 00:23:41,280 Speaker 1: Well, funny you should mention it. One of the scientists leading this 470 00:23:41,359 --> 00:23:44,439 Speaker 1: initiative is a huge science fiction fan. He said how 471 00:23:44,800 --> 00:23:47,399 Speaker 1: thrilling it is for him to make science fiction become 472 00:23:47,680 --> 00:23:51,800 Speaker 1: science fact. But this particular field of study is called biocomputing, 473 00:23:52,280 --> 00:23:55,400 Speaker 1: and the goal is to make these organoids powerful enough 474 00:23:55,440 --> 00:24:00,520 Speaker 1: to function as quote "living servers." Essentially, with the organoids 475 00:24:00,600 --> 00:24:03,800 Speaker 1: attached to electrodes, the idea is that they will be mini 476 00:24:03,880 --> 00:24:07,360 Speaker 1: computers that can also learn, much like AI does. 477 00:24:07,960 --> 00:24:12,840 Speaker 2: So essentially they're creating organic AI.
Is there any fear 478 00:24:12,880 --> 00:24:15,040 Speaker 2: of sentience, like we have with gen AI? 479 00:24:15,560 --> 00:24:18,399 Speaker 1: Perhaps even more so. This quote from the BBC 480 00:24:18,600 --> 00:24:24,120 Speaker 1: really haunted me. Sometimes scientists observe a flurry of activity 481 00:24:24,200 --> 00:24:27,960 Speaker 1: from the organoids before they die, similar to the increased 482 00:24:28,040 --> 00:24:30,879 Speaker 1: heart rate and brain activity which has been observed in 483 00:24:30,920 --> 00:24:33,439 Speaker 1: some humans at the end of life. Now you could say, well, 484 00:24:33,440 --> 00:24:36,080 Speaker 1: this is like a computer overheating before it finally collapses. 485 00:24:36,480 --> 00:24:39,560 Speaker 1: Or you could say this is the symphonic moment where, 486 00:24:39,880 --> 00:24:41,920 Speaker 1: you know, all the cells come together for one final 487 00:24:41,920 --> 00:24:44,760 Speaker 1: burst of energy before they pass into the next world. 488 00:24:44,840 --> 00:24:47,280 Speaker 1: But to me, it's fascinating. It's unbelievable. 489 00:24:48,000 --> 00:24:50,200 Speaker 2: I guess my question is, like, what is the upside 490 00:24:50,200 --> 00:24:51,160 Speaker 2: of doing this? 491 00:24:51,760 --> 00:24:54,280 Speaker 1: Well, I mean, the big story about the AI revolution 492 00:24:54,440 --> 00:24:59,320 Speaker 1: is how much electrical power it requires to continue, versus 493 00:24:59,600 --> 00:25:04,160 Speaker 1: our brains, which require hardly any electricity to function. 494 00:25:04,240 --> 00:25:06,239 Speaker 1: So the idea is basically that, you know, we can 495 00:25:06,320 --> 00:25:08,600 Speaker 1: do computing with much better energy efficiency. 496 00:25:08,920 --> 00:25:12,439 Speaker 2: You know, I have many, many questions about this, you know, 497 00:25:12,480 --> 00:25:16,000 Speaker 2: mostly around the ethics of having living cells as computers. 498 00:25:16,000 --> 00:25:19,040 Speaker 2: But my first question is, like, how the hell do 499 00:25:19,119 --> 00:25:19,840 Speaker 2: these things work? 500 00:25:20,520 --> 00:25:22,680 Speaker 1: Well, it's still early days for this type of work, 501 00:25:22,800 --> 00:25:25,439 Speaker 1: and it's honestly pretty hard to understand. But the Swiss 502 00:25:25,480 --> 00:25:28,120 Speaker 1: scientists are testing the organoids to see if they can 503 00:25:28,160 --> 00:25:31,760 Speaker 1: obey simple key commands. They're not really up to snuff yet, 504 00:25:31,800 --> 00:25:34,360 Speaker 1: and they only live for about four months, but interest 505 00:25:34,440 --> 00:25:37,920 Speaker 1: is spreading. Back in twenty twenty two, an Australian firm 506 00:25:37,960 --> 00:25:41,720 Speaker 1: called Cortical Labs was able to get quote "artificial neurons" 507 00:25:41,720 --> 00:25:44,719 Speaker 1: to play the early computer game Pong. But some of 508 00:25:44,720 --> 00:25:47,760 Speaker 1: the most promising stuff around this research is, in my view, 509 00:25:48,040 --> 00:25:51,240 Speaker 1: the scientists at Johns Hopkins, who are studying how these 510 00:25:51,280 --> 00:25:55,959 Speaker 1: mini brains process information and how that in turn may 511 00:25:56,000 --> 00:25:58,640 Speaker 1: be applied to treating diseases like Alzheimer's.
512 00:25:58,840 --> 00:26:02,159 Speaker 2: I actually think my next story could be similarly unsettling 513 00:26:02,200 --> 00:26:04,520 Speaker 2: for some people, maybe even more people, because this is 514 00:26:04,640 --> 00:26:07,080 Speaker 2: more relevant to people's lives. It's from the Washington Post, 515 00:26:07,119 --> 00:26:10,480 Speaker 2: and the headline reads, robots are learning to make human babies. 516 00:26:10,720 --> 00:26:13,560 Speaker 2: Twenty have already been born. Where are you going with this? 517 00:26:14,280 --> 00:26:16,080 Speaker 2: It's actually, you know, it's not as sci-fi as 518 00:26:16,119 --> 00:26:19,119 Speaker 2: it sounds, unlike your story. There are a few companies 519 00:26:19,119 --> 00:26:22,240 Speaker 2: that are running trials and using AI and robots to 520 00:26:22,359 --> 00:26:24,160 Speaker 2: help with the IVF process. 521 00:26:24,320 --> 00:26:28,600 Speaker 1: Okay, so this is basically automating the IVF process. Exactly. 522 00:26:28,920 --> 00:26:31,640 Speaker 2: One of the companies highlighted in this article is called 523 00:26:31,720 --> 00:26:36,040 Speaker 2: Conceivable Life Sciences. That's a pun. Yeah, you're right, 524 00:26:36,080 --> 00:26:38,880 Speaker 2: it is a pun. What else are we gonna call it? Conceivable. 525 00:26:39,040 --> 00:26:43,800 Speaker 2: So Conceivable Life Sciences uses AI to identify the healthiest 526 00:26:43,800 --> 00:26:46,560 Speaker 2: eggs and sperm, which is often a hard thing to do, 527 00:26:46,800 --> 00:26:50,320 Speaker 2: and uses robots to extract the eggs and fertilize them. 528 00:26:50,720 --> 00:26:53,679 Speaker 2: They started testing their system on mice and are currently 529 00:26:53,720 --> 00:26:56,920 Speaker 2: conducting a one-hundred-person clinical trial in Mexico City. 530 00:26:57,359 --> 00:26:59,960 Speaker 2: There's also another company that I want to mention called Overture, 531 00:27:00,520 --> 00:27:03,400 Speaker 2: which ran a three-person clinical trial in twenty twenty three, 532 00:27:03,800 --> 00:27:06,879 Speaker 2: and The Washington Post interviewed one of the participants, a 533 00:27:06,920 --> 00:27:09,520 Speaker 2: woman who is the mother of a completely healthy and 534 00:27:09,600 --> 00:27:11,120 Speaker 2: normal two-and-a-half-year-old. 535 00:27:11,440 --> 00:27:14,520 Speaker 1: So, we talk a lot about replacing doctors, but 536 00:27:14,600 --> 00:27:17,240 Speaker 1: it tends to be more in the realm of diagnosis. 537 00:27:17,280 --> 00:27:21,520 Speaker 1: But what this story is about is inserting robots into 538 00:27:22,080 --> 00:27:25,120 Speaker 1: the medical process. Presumably, if it worked, it would be 539 00:27:25,840 --> 00:27:27,800 Speaker 1: much cheaper than the human version. A lot cheaper. 540 00:27:27,840 --> 00:27:29,720 Speaker 2: That's a very good point, because there are a few 541 00:27:29,760 --> 00:27:33,480 Speaker 2: advantages to robots and AI. Robots don't get tired, they 542 00:27:33,480 --> 00:27:36,639 Speaker 2: can see better than a human being can, meaning, in theory, 543 00:27:36,960 --> 00:27:41,000 Speaker 2: they can select better sperm and insert it more reliably into eggs. 544 00:27:41,440 --> 00:27:44,159 Speaker 2: So the hope is that the IVF process can get faster, 545 00:27:44,359 --> 00:27:46,879 Speaker 2: as you said, cheaper, and better with the help of 546 00:27:46,920 --> 00:27:56,280 Speaker 2: this technology.
547 00:27:57,200 --> 00:27:59,560 Speaker 1: So normally this is where we do our Chat and Me segment, 548 00:27:59,560 --> 00:28:01,720 Speaker 1: where we hear from our listeners about how they're really 549 00:28:01,840 --> 00:28:04,800 Speaker 1: using chatbots. But this week I had the chance to 550 00:28:04,840 --> 00:28:07,399 Speaker 1: talk to the hosts of a new podcast called Raising 551 00:28:07,480 --> 00:28:08,640 Speaker 1: Kids in the Age of AI. 552 00:28:09,359 --> 00:28:12,440 Speaker 2: Every single day I wake up and I go, oh yeah, 553 00:28:12,520 --> 00:28:13,280 Speaker 2: I don't have kids. 554 00:28:13,320 --> 00:28:15,080 Speaker 1: Cool, you don't have to raise kids. 555 00:28:16,160 --> 00:28:17,439 Speaker 2: I don't have to raise kids in the age of 556 00:28:17,440 --> 00:28:19,880 Speaker 2: a phone or AI. Yep, maybe soon. 557 00:28:20,359 --> 00:28:23,439 Speaker 1: No, just surviving it yourself. So whether we have kids or not, 558 00:28:23,640 --> 00:28:26,439 Speaker 1: we are living through this sea change about the future 559 00:28:26,440 --> 00:28:29,120 Speaker 1: of how we work, the future of how we interact, and 560 00:28:29,119 --> 00:28:31,480 Speaker 1: how we move through the world in general. And this 561 00:28:31,520 --> 00:28:34,919 Speaker 1: podcast provides actionable advice about how to help young people 562 00:28:35,520 --> 00:28:39,160 Speaker 1: use and harness AI. Kaleidoscope is actually one of the 563 00:28:39,200 --> 00:28:42,800 Speaker 1: producers on the show, so I'm biased, conflict of interest noted, but I 564 00:28:42,800 --> 00:28:44,600 Speaker 1: do think it's very interesting. 565 00:28:44,280 --> 00:28:45,400 Speaker 2: Who are the hosts of the show? 566 00:28:45,960 --> 00:28:49,720 Speaker 1: So, Alex Kotran is the co-founder and CEO of 567 00:28:49,760 --> 00:28:53,720 Speaker 1: the AI Education Project, and the AI Education Project partnered 568 00:28:53,720 --> 00:28:56,920 Speaker 1: with Google to create this podcast with us at Kaleidoscope. 569 00:28:57,320 --> 00:29:00,400 Speaker 1: Alex got his start in politics and policy and found 570 00:29:00,480 --> 00:29:04,720 Speaker 1: himself attending all these flashy AI conferences, and Alex became 571 00:29:05,000 --> 00:29:06,680 Speaker 1: consumed by this idea. 572 00:29:06,840 --> 00:29:08,560 Speaker 3: How do you make sure that every kid is having 573 00:29:08,600 --> 00:29:11,440 Speaker 3: the same conversations with the future that the futurists are having? 574 00:29:11,800 --> 00:29:12,840 Speaker 2: And who's the other host? 575 00:29:13,240 --> 00:29:16,840 Speaker 1: That would be doctor Aliza Pressman, who's a developmental psychologist 576 00:29:16,840 --> 00:29:20,320 Speaker 1: and a podcast host, and also a parent who's constantly 577 00:29:20,360 --> 00:29:23,520 Speaker 1: talking to other parents about raising kids in the age 578 00:29:23,520 --> 00:29:26,400 Speaker 1: of AI, and said she felt like she needed better answers. 579 00:29:26,680 --> 00:29:28,720 Speaker 4: I'm in a position where parents are asking me about 580 00:29:28,720 --> 00:29:31,720 Speaker 4: this constantly, but I don't, you know, sometimes you 581 00:29:31,720 --> 00:29:32,800 Speaker 2: don't know what you don't know. 582 00:29:33,160 --> 00:29:36,520 Speaker 4: So there have been things that I didn't even know 583 00:29:36,560 --> 00:29:39,000 Speaker 4: I needed to be either worried or excited about.
584 00:29:39,440 --> 00:29:42,479 Speaker 2: I think worried and excited is the balance that we 585 00:29:42,520 --> 00:29:45,320 Speaker 2: strike on this podcast, especially today's episode. 586 00:29:44,960 --> 00:29:52,080 Speaker 1: Worried and excited about unknown unknowns, unknown knowns, mini brains. Yeah, 587 00:29:52,120 --> 00:29:54,520 Speaker 1: I mean, so that's the tone they strike on Raising 588 00:29:54,600 --> 00:29:56,760 Speaker 1: Kids in the Age of AI. Alex and Aliza have 589 00:29:56,840 --> 00:30:00,200 Speaker 1: the goal of reminding parents and teachers and caregivers, 590 00:30:00,200 --> 00:30:03,520 Speaker 1: anyone who interacts with children, there's no time like the present. 591 00:30:03,800 --> 00:30:05,480 Speaker 3: If you think about this as, like, a learning journey, it really 592 00:30:06,120 --> 00:30:08,680 Speaker 3: will, it will essentially never end. And the good news is, 593 00:30:09,120 --> 00:30:11,360 Speaker 3: you know, pretty much all of us are very early 594 00:30:11,440 --> 00:30:13,640 Speaker 3: on in that learning journey. There's almost, like, there's very, 595 00:30:13,720 --> 00:30:17,440 Speaker 3: very few people who have any expertise in AI prior 596 00:30:17,480 --> 00:30:20,400 Speaker 3: to, you know, November thirtieth, twenty twenty two. And so 597 00:30:20,440 --> 00:30:23,480 Speaker 3: really my message to all teachers, to all parents is, like, 598 00:30:23,520 --> 00:30:25,040 Speaker 3: if you feel out of your element, if you feel 599 00:30:25,080 --> 00:30:26,720 Speaker 3: like you're in over your head, it's 600 00:30:26,720 --> 00:30:27,640 Speaker 3: like, you're in good company. 601 00:30:27,880 --> 00:30:29,720 Speaker 1: The episode of the show that dropped this week is 602 00:30:29,760 --> 00:30:32,240 Speaker 1: actually all about AI in the classroom, and that's something 603 00:30:32,680 --> 00:30:35,240 Speaker 1: you and I have been very interested in, Cara. Alex 604 00:30:35,240 --> 00:30:37,840 Speaker 1: and Aliza talked to a few teachers who are using 605 00:30:37,880 --> 00:30:41,719 Speaker 1: AI and facing its implications with the kids they teach, 606 00:30:42,160 --> 00:30:43,959 Speaker 1: and are wrestling with the question of how do we 607 00:30:44,040 --> 00:30:47,040 Speaker 1: motivate students to learn when there's a tool that can 608 00:30:47,120 --> 00:30:48,160 Speaker 1: just give you the answer. 609 00:30:48,440 --> 00:30:50,480 Speaker 2: Yeah. So how do you make sure students are learning 610 00:30:50,560 --> 00:30:53,960 Speaker 2: to use AI without giving in to cognitive offloading, which we've 611 00:30:53,960 --> 00:30:54,840 Speaker 2: talked about quite a bit? 612 00:30:55,040 --> 00:30:58,520 Speaker 1: Well, doctor Aliza mentioned one program in particular that gave 613 00:30:58,520 --> 00:31:02,400 Speaker 1: me some hope for how education might be powered by, 614 00:31:02,960 --> 00:31:04,960 Speaker 1: rather than corroded by, AI. 615 00:31:05,640 --> 00:31:07,800 Speaker 4: There is a course being taught at, I believe, Tulane 616 00:31:07,920 --> 00:31:10,840 Speaker 4: University, which was leaning into the fact that the students, 617 00:31:10,880 --> 00:31:14,480 Speaker 4: of course, are using AI. So let's show them how 618 00:31:14,520 --> 00:31:17,080 Speaker 4: they can use AI and then question it and question 619 00:31:17,160 --> 00:31:19,800 Speaker 4: themselves and question each other to show that they understand 620 00:31:19,840 --> 00:31:20,440 Speaker 4: the material.
621 00:31:21,200 --> 00:31:23,719 Speaker 2: So what I think she's saying is it's better to 622 00:31:23,800 --> 00:31:28,000 Speaker 2: not be punitive. It's better to actually engage in a 623 00:31:28,040 --> 00:31:30,360 Speaker 2: real conversation with students who are sort of at the 624 00:31:30,400 --> 00:31:32,000 Speaker 2: forefront of all this technology. 625 00:31:32,160 --> 00:31:36,920 Speaker 1: Absolutely. And I think, you know, it's interesting. The education system, 626 00:31:37,520 --> 00:31:39,400 Speaker 1: certainly in Britain, I'm not sure about the US, was 627 00:31:39,600 --> 00:31:43,520 Speaker 1: built around oral exams, and so, you know, using AI as 628 00:31:43,520 --> 00:31:47,280 Speaker 1: a discussion prompt and discussion partner rather than a work 629 00:31:47,360 --> 00:31:50,880 Speaker 1: replacer obviously, you know, holds real promise. If you want 630 00:31:50,880 --> 00:31:53,880 Speaker 1: to hear more from Alex and Aliza and learn about 631 00:31:53,920 --> 00:31:56,680 Speaker 1: how AI is shaping the future of learning and education, 632 00:31:57,440 --> 00:31:59,800 Speaker 1: make sure to check out Raising Kids in the Age 633 00:31:59,800 --> 00:32:04,400 Speaker 1: of AI, available on YouTube, Apple Podcasts, Spotify, wherever you 634 00:32:04,440 --> 00:32:05,400 Speaker 1: listen or watch. 635 00:32:05,920 --> 00:32:08,280 Speaker 2: And even though you didn't get one today, we still 636 00:32:08,280 --> 00:32:10,120 Speaker 2: want to hear those Chat and Me submissions. You can 637 00:32:10,160 --> 00:32:13,280 Speaker 2: tell us your crazy ChatGPT stories by emailing tech 638 00:32:13,360 --> 00:32:24,520 Speaker 2: stuff podcast at gmail dot com. That's it for this 639 00:32:24,560 --> 00:32:25,520 Speaker 2: week for Tech Stuff. 640 00:32:25,560 --> 00:32:28,480 Speaker 1: I'm Cara Price and I'm Oz Woloshyn. This episode was 641 00:32:28,480 --> 00:32:31,960 Speaker 1: produced by Eliza Dennis, Tyler Hill, and Melissa Slaughter. It 642 00:32:32,040 --> 00:32:35,120 Speaker 1: was executive produced by me, Cara Price, Julian Nutter, 643 00:32:35,160 --> 00:32:39,520 Speaker 1: and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. 644 00:32:39,840 --> 00:32:43,200 Speaker 1: The engineer is Beheth Fraser, and Jack Insley mixed this episode. 645 00:32:43,600 --> 00:32:45,280 Speaker 1: Kyle Murdoch wrote our theme song. 646 00:32:45,600 --> 00:32:48,120 Speaker 2: Join us next Wednesday for a conversation on the ins 647 00:32:48,120 --> 00:32:51,160 Speaker 2: and outs of AI video generation and what Sam Altman 648 00:32:51,200 --> 00:32:54,000 Speaker 2: gains from plastering his face across his new platform. 649 00:32:54,200 --> 00:32:56,120 Speaker 1: Please do rate and review the show wherever you listen 650 00:32:56,120 --> 00:32:58,800 Speaker 1: to podcasts, and reach out to us at tech stuff 651 00:32:58,840 --> 00:33:01,600 Speaker 1: podcast at gmail dot com. We really do love to 652 00:33:01,600 --> 00:33:02,160 Speaker 1: hear from you.