1 00:00:00,520 --> 00:00:04,320 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 2 00:00:04,720 --> 00:00:07,160 Speaker 1: I'm Oz Woloshyn, and today Cara Price and I will 3 00:00:07,160 --> 00:00:10,680 Speaker 1: bring you the headlines this week, including the fashion models 4 00:00:10,720 --> 00:00:14,320 Speaker 1: getting AI twins. Then on Tech Support, we'll talk to 5 00:00:14,320 --> 00:00:17,720 Speaker 1: Gerrit De Vynck, a reporter at the Washington Post, about 6 00:00:17,760 --> 00:00:20,520 Speaker 1: all the news in AI, from OpenAI's huge 7 00:00:20,520 --> 00:00:25,920 Speaker 1: new fundraise, to xAI's acquisition of X, formerly Twitter, all 8 00:00:25,960 --> 00:00:27,320 Speaker 1: of that on the Week in 9 00:00:27,000 --> 00:00:37,400 Speaker 2: Tech. It's Friday, April fourth, Cara Price. 10 00:00:37,440 --> 00:00:40,560 Speaker 3: Oz Woloshyn, you're looking like me today. 11 00:00:40,680 --> 00:00:43,400 Speaker 1: Yeah, we are matching. 12 00:00:43,720 --> 00:00:46,600 Speaker 3: We are very matchy matchy. I was at. 13 00:00:46,560 --> 00:00:49,600 Speaker 1: McLaren F1's headquarters, as you know, earlier this week, 14 00:00:49,720 --> 00:00:51,519 Speaker 1: and I sent you a picture that I was very 15 00:00:51,520 --> 00:00:55,760 Speaker 1: proud of, me and Zak Brown, the CEO of McLaren Racing, 16 00:00:56,200 --> 00:00:59,280 Speaker 1: and your only comment was, oh, black trainers. 17 00:00:58,920 --> 00:01:02,280 Speaker 3: Always black ASICS sneakers. You can sponsor us, but yes, 18 00:01:02,360 --> 00:01:04,680 Speaker 3: always black ASICS sneakers, even if he's wearing a tie. 19 00:01:05,800 --> 00:01:08,120 Speaker 3: But it's, you know, for a tech podcast, it's very 20 00:01:08,160 --> 00:01:09,880 Speaker 3: funny how much we talk about fashion. And it's not 21 00:01:09,920 --> 00:01:10,680 Speaker 3: because I'm a woman. 22 00:01:10,920 --> 00:01:13,120 Speaker 1: I mean, it is also ironic that we both have 23 00:01:13,240 --> 00:01:16,960 Speaker 1: our own uniforms. We're wearing blue shirts, dark blue pants, blue shirt, 24 00:01:17,000 --> 00:01:20,520 Speaker 1: bluey pants, and a cool hat. 25 00:01:20,560 --> 00:01:23,280 Speaker 3: Though I am wearing a cool hat, I'm always wearing, well, 26 00:01:23,319 --> 00:01:24,760 Speaker 3: I wouldn't call it a cool hat, but if you 27 00:01:24,760 --> 00:01:27,600 Speaker 3: want to call it a cool hat, it's an appropriated 28 00:01:27,840 --> 00:01:31,080 Speaker 3: Yankees city hat that is completely illegal and was 29 00:01:31,080 --> 00:01:33,280 Speaker 3: taken off the internet the minute it went up, so 30 00:01:33,480 --> 00:01:34,960 Speaker 3: I'm very proud to be wearing it now. 31 00:01:35,000 --> 00:01:36,959 Speaker 1: The reason we're talking about this is not just because 32 00:01:37,000 --> 00:01:40,520 Speaker 1: we're totally self indulgent, but because the topic of tech 33 00:01:40,640 --> 00:01:44,680 Speaker 1: folks and their uniforms is pretty fascinating. Earlier this year, 34 00:01:44,680 --> 00:01:48,160 Speaker 1: we talked about Mark Zuckerberg, who had this famous, weirdly 35 00:01:48,240 --> 00:01:51,880 Speaker 1: shaped T shirt with the Latin phrase aut Zuck aut 36 00:01:52,000 --> 00:01:54,880 Speaker 1: nihil printed on it, which means either Zuck or 37 00:01:54,920 --> 00:01:58,400 Speaker 1: nothing, and is a reference to Julius Caesar. 38 00:01:58,520 --> 00:01:59,680 Speaker 1: Of course I can't.
39 00:02:00,080 --> 00:02:04,960 Speaker 3: It's just insane to me. The level of... It's not narcissism. 40 00:02:05,080 --> 00:02:09,520 Speaker 3: It's this sense of like an obsession with being cool that, 41 00:02:09,680 --> 00:02:14,359 Speaker 3: while chased, is the antithesis of anything cool and fashionable. 42 00:02:14,639 --> 00:02:17,679 Speaker 1: I thought that one of the genuinely coolest tech swipes 43 00:02:17,720 --> 00:02:20,760 Speaker 1: of the year was when Jay Graber, who's the CEO 44 00:02:20,919 --> 00:02:24,200 Speaker 1: of Bluesky, the kind of left leaning social media platform, 45 00:02:24,520 --> 00:02:28,320 Speaker 1: wore a T shirt at South by Southwest that said, 46 00:02:28,680 --> 00:02:32,480 Speaker 1: in Latin, no more Caesars, and it was a real 47 00:02:32,639 --> 00:02:35,880 Speaker 1: sort of shots fired at Zuck, I think, and. 48 00:02:35,800 --> 00:02:38,080 Speaker 3: Also, just, again, in Latin. I mean, these are the 49 00:02:38,120 --> 00:02:39,720 Speaker 3: nerdiest people in human history. 50 00:02:39,919 --> 00:02:42,280 Speaker 1: This was a huge best seller for Bluesky. I 51 00:02:42,320 --> 00:02:43,359 Speaker 1: saw that almost immediately. 52 00:02:43,440 --> 00:02:46,720 Speaker 3: I bet it was. You know, it's tech titans, as 53 00:02:46,760 --> 00:02:48,840 Speaker 3: I was saying, just seem to want to have a 54 00:02:48,880 --> 00:02:51,160 Speaker 3: signature look. And I don't know if part of it 55 00:02:51,240 --> 00:02:58,240 Speaker 3: is pathology, psychological pathology, I mean, certainly mine is psychological pathology. 56 00:02:58,280 --> 00:03:00,040 Speaker 3: I like to wear the same thing every day. I 57 00:03:00,040 --> 00:03:02,600 Speaker 3: don't have to think about it. Like our friend Elizabeth Holmes, 58 00:03:02,600 --> 00:03:04,720 Speaker 3: who also rocked the black turtleneck. 59 00:03:04,560 --> 00:03:05,960 Speaker 1: She flexed hard on the Steve Jobs. 60 00:03:06,200 --> 00:03:09,800 Speaker 3: She flexed and copped. She copped something from Steve Jobs. 61 00:03:09,840 --> 00:03:11,360 Speaker 3: And I guess it's sort of worked for her, because 62 00:03:11,360 --> 00:03:12,440 Speaker 3: we're still talking about it. 63 00:03:12,480 --> 00:03:14,600 Speaker 1: But I think the most sort of talked about tech 64 00:03:14,680 --> 00:03:17,160 Speaker 1: CEO of the moment is none other than Jensen Huang 65 00:03:17,320 --> 00:03:18,840 Speaker 1: of Nvidia. 66 00:03:18,520 --> 00:03:20,040 Speaker 3: The man who made a lot of people a lot 67 00:03:20,040 --> 00:03:21,839 Speaker 3: of money and now is making a lot of people 68 00:03:21,880 --> 00:03:22,800 Speaker 3: a little less money. 69 00:03:22,720 --> 00:03:25,600 Speaker 1: And still, still a lot, but a little less, exactly. He 70 00:03:25,760 --> 00:03:29,079 Speaker 1: has a signature look. Can you describe it? 71 00:03:29,080 --> 00:03:32,120 Speaker 3: It's sort of a Marlon Brando, sort of On the 72 00:03:32,160 --> 00:03:36,480 Speaker 3: Waterfront black leather jacket that is, I think, Tom Ford. 73 00:03:36,680 --> 00:03:38,440 Speaker 1: I think it is Tom Ford, which is like a 74 00:03:38,480 --> 00:03:39,320 Speaker 1: ten thousand dollars. 75 00:03:39,320 --> 00:03:41,440 Speaker 3: That's an extremely expensive jacket. 76 00:03:41,680 --> 00:03:43,960 Speaker 1: But there are a lot of Nvidia stans.
It turns out, 77 00:03:44,080 --> 00:03:46,880 Speaker 1: and if you build it, they will come to buy 78 00:03:46,920 --> 00:03:48,120 Speaker 1: a fake black leather jacket. 79 00:03:48,360 --> 00:03:49,640 Speaker 3: If you build it, they will come and buy it 80 00:03:49,640 --> 00:03:50,960 Speaker 3: a lot cheaper, exactly. 81 00:03:51,080 --> 00:03:53,800 Speaker 1: So our friends at 404 Media reported that 82 00:03:53,840 --> 00:03:56,920 Speaker 1: there are all these websites popping up with knockoff Jensen 83 00:03:56,960 --> 00:04:01,240 Speaker 1: Huang Nvidia jackets, which have creative titles like 84 00:04:01,320 --> 00:04:05,960 Speaker 1: Jensen Huang Nvidia Jacket, totally optimized for SEO. 85 00:04:06,080 --> 00:04:07,960 Speaker 3: People, they're going to find my phone and be like, 86 00:04:08,160 --> 00:04:09,960 Speaker 3: that was the last thing that she was looking at. 87 00:04:10,960 --> 00:04:12,920 Speaker 1: But I'm not sure if you saw it. 88 00:04:12,920 --> 00:04:15,600 Speaker 1: But this week Jensen Huang went to visit a company 89 00:04:15,720 --> 00:04:19,120 Speaker 1: called 1X Robotics, and they had one of their 90 00:04:19,200 --> 00:04:22,279 Speaker 1: humanoid robots come up to Jensen Huang and present 91 00:04:22,360 --> 00:04:25,120 Speaker 1: him with a new black leather jacket, not a Tom 92 00:04:25,240 --> 00:04:29,000 Speaker 1: Ford one. This one was bedazzled with the Nvidia stock 93 00:04:29,080 --> 00:04:32,159 Speaker 1: ticker on the front left pocket and the logo in 94 00:04:32,240 --> 00:04:33,920 Speaker 1: kind of Swarovski style crystals on the back. 95 00:04:33,839 --> 00:04:36,839 Speaker 3: Imagine if the humanoid robot is also sentient, 96 00:04:37,080 --> 00:04:39,120 Speaker 3: if the robot was like, I don't want to do this, 97 00:04:40,640 --> 00:04:43,040 Speaker 3: like, to the people programming me, please give me something better 98 00:04:43,080 --> 00:04:43,360 Speaker 3: to do. 99 00:04:44,320 --> 00:04:46,599 Speaker 1: But there's a more serious story about fashion and AI here, 100 00:04:46,680 --> 00:04:49,240 Speaker 1: and it comes from The Guardian. So according to their story, 101 00:04:49,320 --> 00:04:51,520 Speaker 1: the clothing retailer H and M announced that they're going 102 00:04:51,560 --> 00:04:55,200 Speaker 1: to create so-called AI twins of thirty models, which 103 00:04:55,200 --> 00:04:57,400 Speaker 1: they're going to use for social media marketing posts. 104 00:04:57,440 --> 00:04:59,320 Speaker 3: By the way, I wish I had this for the podcast. 105 00:04:59,480 --> 00:05:02,800 Speaker 3: As much as I enjoy doing it with you, I wish. 106 00:05:02,680 --> 00:05:06,760 Speaker 1: You could just have a... I'd have a replica. Would 107 00:05:06,760 --> 00:05:08,560 Speaker 1: you rather use your replica or my replica? 108 00:05:09,080 --> 00:05:10,480 Speaker 3: Well, if we wanted me to be on time, I 109 00:05:10,480 --> 00:05:13,919 Speaker 3: would use your replica. But sorry, keep going, because I 110 00:05:13,920 --> 00:05:14,600 Speaker 3: do love the story. 111 00:05:14,720 --> 00:05:17,960 Speaker 1: No, so these models, the real models, have given their 112 00:05:17,960 --> 00:05:20,719 Speaker 1: permission to H and M to use their likeness with AI. 113 00:05:21,600 --> 00:05:23,359 Speaker 1: You know, again, as you said, sounds good, you know, 114 00:05:23,400 --> 00:05:25,880 Speaker 1: why schlep yourself when you can send your digital twin?
115 00:05:26,160 --> 00:05:28,880 Speaker 3: Well that, and also I think there's something that's very 116 00:05:28,920 --> 00:05:31,520 Speaker 3: interesting here that reminds me of the Atlantic sort of 117 00:05:31,520 --> 00:05:34,400 Speaker 3: selling its data, which is, these are people that, like 118 00:05:34,480 --> 00:05:37,839 Speaker 3: in the film industry, are cannibalizing their own job. 119 00:05:39,560 --> 00:05:42,680 Speaker 1: Well, of course, just like with the Atlantic and OpenAI, 120 00:05:43,200 --> 00:05:46,320 Speaker 1: where OpenAI actually compensates the Atlantic for using its archive, 121 00:05:46,880 --> 00:05:50,280 Speaker 1: these models are also being compensated for their image. They 122 00:05:50,279 --> 00:05:52,920 Speaker 1: own the rights to their twins, and their twins can 123 00:05:52,920 --> 00:05:55,880 Speaker 1: work for other brands, not just H and M, and 124 00:05:56,160 --> 00:05:59,440 Speaker 1: they'll get paid. One of the thirty models said, quote, 125 00:05:59,600 --> 00:06:01,320 Speaker 1: she's like me without the jet lag. 126 00:06:04,320 --> 00:06:06,400 Speaker 3: So the other thing is, and it's worth saying, the 127 00:06:06,440 --> 00:06:10,320 Speaker 3: images in the Business of Fashion write-up look amazingly similar. 128 00:06:10,360 --> 00:06:11,479 Speaker 1: Like it's not distinguishable. 129 00:06:11,640 --> 00:06:13,520 Speaker 3: Yeah, it's not like, who was that person that we 130 00:06:13,600 --> 00:06:16,880 Speaker 3: used to cover on Sleepwalkers? She was a model, Lil Miquela. 131 00:06:17,000 --> 00:06:20,240 Speaker 3: Oh yeah, like Lil Miquela looked like AI. I think, 132 00:06:20,279 --> 00:06:22,760 Speaker 3: you know, after four years, we have now gotten to 133 00:06:22,800 --> 00:06:27,000 Speaker 3: a place where it's just an AI replica, which is incredible. 134 00:06:27,200 --> 00:06:30,480 Speaker 1: Yeah, and there's no sixth finger. As you pointed out, 135 00:06:30,640 --> 00:06:32,640 Speaker 1: it does raise concerns about the future of modeling in 136 00:06:32,640 --> 00:06:36,479 Speaker 1: the fashion industry, because although these models who are participating 137 00:06:36,520 --> 00:06:38,840 Speaker 1: in the partnership will get paid every time their digital 138 00:06:38,839 --> 00:06:41,279 Speaker 1: twin is used, you know, just like in the film industry, it 139 00:06:41,320 --> 00:06:43,920 Speaker 1: also raises questions about what happens to all the people 140 00:06:43,960 --> 00:06:45,840 Speaker 1: who work on sets. I mean the hairstylists, the 141 00:06:45,880 --> 00:06:47,400 Speaker 1: makeup artists, the lighting designers. 142 00:06:47,560 --> 00:06:49,840 Speaker 3: Yeah, you know, I was actually talking to our producer Eliza. 143 00:06:49,920 --> 00:06:52,240 Speaker 3: Shout out to Eliza, I like to keep her in 144 00:06:52,279 --> 00:06:55,360 Speaker 3: the mix. And she actually knows a hairstylist who works 145 00:06:55,360 --> 00:06:58,360 Speaker 3: with a company that's testing out replacing many of their 146 00:06:58,520 --> 00:07:02,640 Speaker 3: e-commerce models with AI, and you know, work has 147 00:07:02,680 --> 00:07:04,080 Speaker 3: slowed way down for her. 148 00:07:04,480 --> 00:07:05,239 Speaker 1: So this is happening. 149 00:07:05,560 --> 00:07:07,560 Speaker 3: I mean, think about it. You don't need to be 150 00:07:07,640 --> 00:07:09,239 Speaker 3: doing a blowout on a digital avatar.
151 00:07:09,720 --> 00:07:11,800 Speaker 1: Yeah, and I mean even for models themselves, I 152 00:07:11,800 --> 00:07:13,640 Speaker 1: guess if you're, you know, one of these thirty who have 153 00:07:13,960 --> 00:07:16,360 Speaker 1: gotten in first through the door, it's one thing, but this 154 00:07:16,440 --> 00:07:20,000 Speaker 1: may affect the kind of future job landscape for models 155 00:07:20,040 --> 00:07:21,680 Speaker 1: as well as people who work around modeling. 156 00:07:22,520 --> 00:07:25,360 Speaker 3: But this is like a... This is Zoolander 3, where 157 00:07:25,400 --> 00:07:30,800 Speaker 3: Derek realizes we're in jeopardy. But in all seriousness, you know, 158 00:07:30,880 --> 00:07:34,120 Speaker 3: what does it look like for models who are 159 00:07:34,120 --> 00:07:36,120 Speaker 3: no longer in demand? And it might not seem important 160 00:07:36,160 --> 00:07:39,720 Speaker 3: to, like, a lay person, but I do think it's 161 00:07:39,760 --> 00:07:42,440 Speaker 3: a harbinger of things to come, which is like, if 162 00:07:42,480 --> 00:07:45,800 Speaker 3: you can replace something that is so sort of ubiquitous 163 00:07:45,840 --> 00:07:49,160 Speaker 3: for, you know, a century, what does that look like? 164 00:07:49,600 --> 00:07:51,840 Speaker 1: Yeah, I mean what I find most interesting about these 165 00:07:51,880 --> 00:07:54,520 Speaker 1: like AI digital twins or AI actors or whatever, or 166 00:07:55,040 --> 00:07:58,360 Speaker 1: you know, chatbots of famous people from history, is their 167 00:07:58,400 --> 00:08:01,440 Speaker 1: effect is kind of to lock in the very few 168 00:08:01,520 --> 00:08:04,480 Speaker 1: most famous people in the world as the only characters 169 00:08:04,520 --> 00:08:07,480 Speaker 1: who are worth interacting with. I mean, you know, if 170 00:08:07,480 --> 00:08:10,280 Speaker 1: you think about an action movie, why would you not 171 00:08:10,280 --> 00:08:12,120 Speaker 1: make it with Tom Cruise? Or if you're talking about doing a 172 00:08:12,120 --> 00:08:14,240 Speaker 1: fashion shoot, why would you not make it with Elle Macpherson? 173 00:08:14,280 --> 00:08:16,600 Speaker 1: So kind of, I think there's a longer term kind 174 00:08:16,600 --> 00:08:19,080 Speaker 1: of chilling effect on the pipeline of new talent in 175 00:08:19,160 --> 00:08:22,480 Speaker 1: creative industries, which, you know, will be pretty interesting 176 00:08:22,560 --> 00:08:24,960 Speaker 1: and somewhat disturbing. To be fair to H and M, 177 00:08:24,960 --> 00:08:27,200 Speaker 1: they're being very upfront about their use of AI. They're 178 00:08:27,200 --> 00:08:29,600 Speaker 1: going to watermark the images of the AI twins in 179 00:08:29,640 --> 00:08:32,600 Speaker 1: their ads so people will know they're the AI versions. 180 00:08:33,120 --> 00:08:35,160 Speaker 1: And by doing this, the company will also be abiding 181 00:08:35,240 --> 00:08:37,959 Speaker 1: by the EU's AI Act, which is coming into effect 182 00:08:38,000 --> 00:08:41,000 Speaker 1: in twenty twenty six, and it will require all AI 183 00:08:41,160 --> 00:08:44,000 Speaker 1: images to be labeled as AI images. 184 00:08:44,000 --> 00:08:47,000 Speaker 3: To which I say, who's looking for that? Like, you 185 00:08:47,040 --> 00:08:48,480 Speaker 3: know what I mean?
Like the role of AI in 186 00:08:48,559 --> 00:08:50,800 Speaker 3: creative industries, which, you know, is something I'm obsessed with, 187 00:08:50,800 --> 00:08:53,040 Speaker 3: and sort of how to regulate it, is something that's 188 00:08:53,080 --> 00:08:56,160 Speaker 3: obviously going to keep coming up, and I'm curious to 189 00:08:56,160 --> 00:08:58,560 Speaker 3: see where it goes next. I wonder to what extent 190 00:08:58,720 --> 00:09:02,680 Speaker 3: the lay person cares whether, like, where they're buying a shirt, 191 00:09:03,120 --> 00:09:06,240 Speaker 3: it's either a fake twin modeling that shirt or 192 00:09:06,280 --> 00:09:07,440 Speaker 3: the real person. 193 00:09:07,320 --> 00:09:10,880 Speaker 1: Probably not. Probably not. The next headline comes from the 194 00:09:10,880 --> 00:09:14,439 Speaker 1: stuff of nightmares for anyone who's a frequent traveler, and 195 00:09:14,520 --> 00:09:16,520 Speaker 1: it has to do 196 00:09:16,600 --> 00:09:18,720 Speaker 1: with an airport being shut down for twenty four hours. 197 00:09:18,720 --> 00:09:19,520 Speaker 3: I heard it was Heathrow. 198 00:09:19,640 --> 00:09:21,880 Speaker 1: It was London Heathrow Airport, where I flew out 199 00:09:21,880 --> 00:09:24,559 Speaker 1: of just this week. It shut down, leading to over 200 00:09:24,600 --> 00:09:28,559 Speaker 1: a thousand cancelled flights, after a fire caused a power outage, 201 00:09:28,880 --> 00:09:31,400 Speaker 1: and Bloomberg reported that the outage could be traced back 202 00:09:31,440 --> 00:09:35,559 Speaker 1: to a single point of failure, a burned transformer and 203 00:09:35,640 --> 00:09:39,800 Speaker 1: twenty five thousand liters of transformer cooling oil that was 204 00:09:39,840 --> 00:09:42,760 Speaker 1: ablaze for several hours. It's a fascinating story, and 205 00:09:42,800 --> 00:09:45,640 Speaker 1: Bloomberg had such a great headline, which was the device 206 00:09:45,920 --> 00:09:50,000 Speaker 1: throttling the world's electrified future. But Cara, do you know 207 00:09:50,040 --> 00:09:51,120 Speaker 1: what a transformer is? 208 00:09:52,120 --> 00:09:54,439 Speaker 3: I know what the movie Transformers is. I know it's 209 00:09:54,440 --> 00:09:55,880 Speaker 3: a car that turns into a man. 210 00:09:56,480 --> 00:10:00,000 Speaker 1: So, most simply put, a transformer is something that changes 211 00:10:00,320 --> 00:10:04,000 Speaker 1: voltage. So when you create electricity in a power plant, 212 00:10:04,320 --> 00:10:06,600 Speaker 1: you actually want to increase the voltage as much as 213 00:10:06,640 --> 00:10:09,040 Speaker 1: you can, because that allows it to travel far and 214 00:10:09,200 --> 00:10:12,480 Speaker 1: fast with less loss of electricity along the way. So 215 00:10:12,920 --> 00:10:15,360 Speaker 1: you use a transformer on the way out of the power plant. 216 00:10:15,640 --> 00:10:17,240 Speaker 1: But then when it gets to your home or the 217 00:10:17,280 --> 00:10:19,400 Speaker 1: local electric grid or whatever it may be, you actually 218 00:10:19,400 --> 00:10:21,880 Speaker 1: want to use a transformer to turn the voltage back 219 00:10:21,920 --> 00:10:24,200 Speaker 1: down, because otherwise it blows up your stuff.
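A back-of-the-envelope sketch of the voltage trade-off described here. The numbers (50 MW of load, 2 ohms of line resistance, 33 kV versus 275 kV) are illustrative assumptions, not figures from the Heathrow reporting; the point is simply that for the same power, higher voltage means lower current, and resistive loss scales with the square of the current.

```python
# Why grids step voltage up for transmission and back down for delivery:
# for a fixed power P, current I = P / V, and the line loss is I^2 * R.
# All numbers below are made up for illustration.

def line_loss_watts(power_w: float, voltage_v: float, line_resistance_ohm: float) -> float:
    """Resistive loss I^2 * R when delivering a given power at a given voltage."""
    current_a = power_w / voltage_v          # I = P / V
    return current_a ** 2 * line_resistance_ohm

POWER = 50e6          # 50 MW to deliver (assumed)
RESISTANCE = 2.0      # ohms of line resistance (assumed)

for label, volts in [("33 kV (no step-up)", 33e3), ("275 kV (stepped up)", 275e3)]:
    loss = line_loss_watts(POWER, volts, RESISTANCE)
    print(f"{label}: {loss / 1e6:.2f} MW lost ({100 * loss / POWER:.2f}% of the load)")
```

Run as written, the stepped-up case loses a tiny fraction of a percent of the load, while the low-voltage case wastes megawatts, which is why the transformer at each end of the line matters so much.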
220 00:10:24,559 --> 00:10:27,720 Speaker 3: So ostensibly, it's sort of like when I'm staying at 221 00:10:28,600 --> 00:10:33,280 Speaker 3: not the greatest hotel and I turn a blow dryer 222 00:10:33,360 --> 00:10:36,079 Speaker 3: on and all of a sudden, the entire room short 223 00:10:36,120 --> 00:10:38,640 Speaker 3: circuits and all of the electricity goes off because the 224 00:10:38,720 --> 00:10:39,760 Speaker 3: voltage is too high. 225 00:10:39,960 --> 00:10:42,080 Speaker 1: Yeah, exactly. I mean, that is a short circuit where there's 226 00:10:42,200 --> 00:10:45,360 Speaker 1: likely been a problem with the transformer being bypassed or 227 00:10:45,400 --> 00:10:46,319 Speaker 1: not functioning correctly. 228 00:10:46,480 --> 00:10:48,000 Speaker 3: Right. And I think that in a storm or a 229 00:10:48,080 --> 00:10:51,840 Speaker 3: natural disaster, they can sometimes explode and it's very loud. 230 00:10:51,920 --> 00:10:54,160 Speaker 1: Yeah, it sounds like fireworks or a bomb going off. 231 00:10:54,200 --> 00:10:56,880 Speaker 1: And I mean during hurricanes and other kinds of natural disasters, 232 00:10:56,920 --> 00:10:59,160 Speaker 1: these things come under a lot of pressure, and they 233 00:10:59,160 --> 00:11:01,800 Speaker 1: do go out. And so what happened at Heathrow 234 00:11:01,840 --> 00:11:04,760 Speaker 1: Airport was this fire broke out in a substation, which 235 00:11:04,760 --> 00:11:08,240 Speaker 1: houses transformers, and it took the firefighters seven hours to 236 00:11:08,240 --> 00:11:10,439 Speaker 1: get it under control. The airport, in order to come 237 00:11:10,440 --> 00:11:13,280 Speaker 1: back online, was able to accept power from other substations, 238 00:11:13,760 --> 00:11:15,199 Speaker 1: but even in the time it took them to do that, 239 00:11:15,360 --> 00:11:18,800 Speaker 1: many, many flights were canceled. It was chaos. 240 00:11:19,840 --> 00:11:22,080 Speaker 3: You made it, but it was not a pretty scene. 241 00:11:22,160 --> 00:11:25,200 Speaker 1: Yeah. Back in twenty thirteen, there was actually a 242 00:11:25,240 --> 00:11:29,400 Speaker 1: sniper attack on a substation in California which caused a 243 00:11:29,440 --> 00:11:32,800 Speaker 1: fire that burned seventeen transformers and almost knocked out all 244 00:11:32,800 --> 00:11:35,480 Speaker 1: of the power to Silicon Valley. Now this actually led 245 00:11:35,520 --> 00:11:38,760 Speaker 1: to a lot more security around substations and even stockpiling 246 00:11:38,800 --> 00:11:43,360 Speaker 1: of transformers, which of course sparked another problem: a 247 00:11:43,360 --> 00:11:46,679 Speaker 1: transformer shortage. And obviously with the supply chain 248 00:11:46,720 --> 00:11:51,160 Speaker 1: issues recently, the lead time for delivering a new large 249 00:11:51,200 --> 00:11:55,240 Speaker 1: transformer is now about three to five years for a 250 00:11:55,280 --> 00:11:58,520 Speaker 1: single transformer, and bear in mind these can be huge.
Nonetheless, 251 00:11:58,559 --> 00:12:00,800 Speaker 1: at the scale of transformer that was at Heathrow, it takes a 252 00:12:00,840 --> 00:12:03,679 Speaker 1: long time to replace, and they've gotten much more expensive. 253 00:12:03,760 --> 00:12:05,520 Speaker 1: So you know, now we're living in the era of 254 00:12:05,559 --> 00:12:09,120 Speaker 1: EVs and the AI boom powered by data centers, and 255 00:12:09,280 --> 00:12:12,720 Speaker 1: transformers are also required to bring renewable power onto the grid. 256 00:12:13,080 --> 00:12:15,040 Speaker 1: And so, you know, one of the interesting implications of 257 00:12:15,080 --> 00:12:18,480 Speaker 1: this story, as Imaza, who's been on the show, recently told 258 00:12:18,520 --> 00:12:21,960 Speaker 1: me, is that actually the US's struggles to onboard new power 259 00:12:21,960 --> 00:12:24,880 Speaker 1: onto the grid may be the reason why the US ultimately 260 00:12:24,880 --> 00:12:26,280 Speaker 1: falls behind China in AI. 261 00:12:26,840 --> 00:12:29,840 Speaker 3: So it's ironic, actually, that the increase in transformer prices 262 00:12:29,880 --> 00:12:33,160 Speaker 3: could be further increased by President Trump's on again, off 263 00:12:33,200 --> 00:12:36,400 Speaker 3: again relationship with imposing tariffs on Canada and Mexico, which 264 00:12:36,440 --> 00:12:39,520 Speaker 3: is where we import a lot of our large transformers from. 265 00:12:39,520 --> 00:12:41,320 Speaker 1: And that's why I love this Bloomberg story, because it's 266 00:12:41,320 --> 00:12:43,680 Speaker 1: fun to think about all the sexy stuff like new chips 267 00:12:43,679 --> 00:12:46,640 Speaker 1: and data center construction, you know, but there's still this 268 00:12:46,679 --> 00:12:49,920 Speaker 1: one hundred year old technology that hasn't changed and that 269 00:12:50,040 --> 00:12:53,960 Speaker 1: has to be imported and which is absolutely critical for infrastructure, 270 00:12:54,040 --> 00:12:55,760 Speaker 1: digital and otherwise, you know. 271 00:12:55,720 --> 00:12:58,840 Speaker 3: Speaking of the AI boom and things that use a lot, 272 00:12:58,880 --> 00:13:01,000 Speaker 3: a lot, a lot of energy. This next story that 273 00:13:01,000 --> 00:13:02,920 Speaker 3: I want to tell you is for those people who've 274 00:13:02,920 --> 00:13:06,679 Speaker 3: been sitting here thinking about LLMs and not really understanding 275 00:13:06,679 --> 00:13:11,040 Speaker 3: how they work. I have news for you. Much like 276 00:13:11,080 --> 00:13:17,040 Speaker 3: Trump's decisions on tariffs, nobody knows how they work. And Anthropic, 277 00:13:17,120 --> 00:13:19,680 Speaker 3: which is the same company that makes the AI model Claude, 278 00:13:20,000 --> 00:13:22,840 Speaker 3: has been trying to figure out how the hell these things work. 279 00:13:22,840 --> 00:13:25,040 Speaker 1: This is to solve the so-called black box... 280 00:13:24,720 --> 00:13:27,040 Speaker 3: The black box problem. So Anthropic, which is the same 281 00:13:27,080 --> 00:13:29,679 Speaker 3: company that makes the AI model Claude, has been trying 282 00:13:29,720 --> 00:13:32,880 Speaker 3: to figure out sort of what's under the hood, and 283 00:13:32,960 --> 00:13:36,679 Speaker 3: they recently released two reports on how LLMs do things 284 00:13:36,720 --> 00:13:42,800 Speaker 3: like complete sentences, solve math problems, and suppress hallucinations.
And 285 00:13:42,840 --> 00:13:45,840 Speaker 3: they use a technique called circuit tracing, which let them 286 00:13:45,920 --> 00:13:49,520 Speaker 3: track an LLM's decision-making process for ten different tasks 287 00:13:49,800 --> 00:13:52,680 Speaker 3: by working back from the solution to the query. 288 00:13:53,040 --> 00:13:56,280 Speaker 1: Huh, okay, that makes sense. But I just think, more broadly, 289 00:13:56,480 --> 00:13:59,360 Speaker 1: as more and more decisions are taken for us by 290 00:13:59,679 --> 00:14:03,640 Speaker 1: AI, or outcomes kind of determined by AI, it's kind 291 00:14:03,640 --> 00:14:06,600 Speaker 1: of remarkable that this huge elephant is still in the 292 00:14:06,640 --> 00:14:09,560 Speaker 1: corner of the room, which is, we can't understand how 293 00:14:09,559 --> 00:14:10,520 Speaker 1: they make their decisions. 294 00:14:10,640 --> 00:14:12,560 Speaker 3: But it also makes you think of, like, I don't 295 00:14:12,600 --> 00:14:14,840 Speaker 3: really know. I mean, I know how a car works, 296 00:14:15,000 --> 00:14:19,640 Speaker 3: but, like, I'm not an engineer, and yet increasingly, 297 00:14:19,880 --> 00:14:22,600 Speaker 3: I mean, you know, since whenever, the nineteen twenties, people 298 00:14:22,640 --> 00:14:24,480 Speaker 3: have used cars more and more to get around, and 299 00:14:24,480 --> 00:14:26,760 Speaker 3: we're just kind of like, well, this thing's gonna work 300 00:14:26,800 --> 00:14:29,160 Speaker 3: until it blows up, you know. That's how I feel, whatever. 301 00:14:29,400 --> 00:14:33,680 Speaker 3: But it's important to note that LLMs, like Claude, which 302 00:14:33,760 --> 00:14:37,040 Speaker 3: was the focus of this study, are trained, which I 303 00:14:37,080 --> 00:14:40,400 Speaker 3: thought was really interesting. They're trained, not programmed, on 304 00:14:40,440 --> 00:14:43,520 Speaker 3: a bunch of data. They create their own rules based 305 00:14:43,560 --> 00:14:46,600 Speaker 3: on the data they ingest. But up until now we 306 00:14:46,760 --> 00:14:49,560 Speaker 3: haven't been able to see into the models to know 307 00:14:49,600 --> 00:14:52,320 Speaker 3: what those rules actually are, let alone how the models 308 00:14:52,400 --> 00:14:53,080 Speaker 3: generate them. 309 00:14:53,240 --> 00:14:55,800 Speaker 1: Yeah, and I think the work that Anthropic is 310 00:14:55,840 --> 00:14:59,880 Speaker 1: doing is all about basically understanding decision making. So they're 311 00:15:00,120 --> 00:15:02,040 Speaker 1: not yet at the stage of being able to understand 312 00:15:02,040 --> 00:15:04,480 Speaker 1: how the models generate their own rules. But I think 313 00:15:04,520 --> 00:15:07,720 Speaker 1: this story is all about how it's starting to become 314 00:15:08,520 --> 00:15:12,480 Speaker 1: easier, or possible, to basically work backward and figure out how a model 315 00:15:12,480 --> 00:15:13,240 Speaker 1: has made a decision. 316 00:15:13,400 --> 00:15:18,520 Speaker 3: Yes, and it's not that simple. I think the researchers 317 00:15:18,560 --> 00:15:21,880 Speaker 3: in this particular case were inspired by brain scan techniques 318 00:15:22,240 --> 00:15:25,360 Speaker 3: that are used in neuroscience.
They found that LLMs, which, 319 00:15:25,400 --> 00:15:28,520 Speaker 3: again, I'm so interested in how we anthropomorphize LLMs, but 320 00:15:28,840 --> 00:15:33,080 Speaker 3: they found that LLMs store different constellations of knowledge in 321 00:15:33,120 --> 00:15:35,400 Speaker 3: different parts of their model. So, for example, the concept 322 00:15:35,440 --> 00:15:39,800 Speaker 3: of smallness or the idea of a rabbit. Anthropic was 323 00:15:39,840 --> 00:15:43,560 Speaker 3: actually able to identify and then turn off certain parts 324 00:15:43,600 --> 00:15:47,120 Speaker 3: of the model. So, like, the idea of rabbit, and tune 325 00:15:47,120 --> 00:15:49,160 Speaker 3: it down so that it couldn't be part of a 326 00:15:49,240 --> 00:15:50,160 Speaker 3: query result. 327 00:15:50,360 --> 00:15:52,760 Speaker 1: And for example, like, what eats carrots? It would be like... 328 00:15:54,080 --> 00:15:58,080 Speaker 3: People, right, a dog, right, exactly. And so the same 329 00:15:58,200 --> 00:16:01,480 Speaker 3: query would have a different answer if the rabbit part 330 00:16:01,520 --> 00:16:03,280 Speaker 3: of the model was dialed up or down, which you. 331 00:16:03,280 --> 00:16:06,720 Speaker 1: Say, by contrast, you might say, you know, 332 00:16:07,000 --> 00:16:09,000 Speaker 1: what's a mammal? It always answers rabbit if you dial 333 00:16:09,000 --> 00:16:10,240 Speaker 1: it up, and if you dial it down, 334 00:16:10,240 --> 00:16:11,120 Speaker 1: it would never answer rabbit. 335 00:16:11,280 --> 00:16:11,640 Speaker 4: Correct. 336 00:16:11,720 --> 00:16:13,680 Speaker 3: So, I mean, and it's similar. I mean it's similar 337 00:16:13,680 --> 00:16:15,960 Speaker 3: in human beings, which is like, I don't drink anymore, 338 00:16:15,960 --> 00:16:17,560 Speaker 3: so I go to a bar and I'm like, water. 339 00:16:19,520 --> 00:16:20,680 Speaker 3: It's a similar kind of thing. 340 00:16:21,120 --> 00:16:23,080 Speaker 1: I mean, that actually clarifies it for me, and I 341 00:16:23,240 --> 00:16:25,520 Speaker 1: kind of... We've done a bunch of coverage of AI 342 00:16:25,600 --> 00:16:28,000 Speaker 1: and spoken to Geoffrey Hinton and others, but like that 343 00:16:28,120 --> 00:16:31,640 Speaker 1: idea of a neural network is to me really clarified 344 00:16:31,640 --> 00:16:32,280 Speaker 1: by this study. 345 00:16:32,400 --> 00:16:34,080 Speaker 3: Well, and how could it not be? It's something that 346 00:16:34,080 --> 00:16:36,000 Speaker 3: we've created, you know what I mean, it's going to 347 00:16:36,040 --> 00:16:40,400 Speaker 3: reflect a sort of human centric way of working. I 348 00:16:40,440 --> 00:16:44,480 Speaker 3: think this story actually came from the MIT Technology Review 349 00:16:44,480 --> 00:16:47,800 Speaker 3: with the headline Anthropic can now track the bizarre inner 350 00:16:47,840 --> 00:16:51,280 Speaker 3: workings of a large language model, to which I said, now, 351 00:16:52,480 --> 00:16:54,280 Speaker 3: what really blew my mind, though, is the way that 352 00:16:54,360 --> 00:16:57,280 Speaker 3: this thing solves math problems, because it's not the way 353 00:16:57,280 --> 00:16:58,200 Speaker 3: that humans do math. 354 00:16:58,320 --> 00:16:59,400 Speaker 1: Okay, tell me more.
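A toy illustration of the "dial a concept up or down" idea described above. This is not Anthropic's circuit-tracing code; the features, weights, and query below are invented for the sketch, which just shows that if an answer is built from internal feature activations, scaling one feature changes which answer wins.

```python
# Hypothetical feature steering on a made-up "model": suppress the rabbit
# feature and the same query lands on a different answer.
import numpy as np

answers = ["rabbit", "dog", "horse"]

# Assumed feature activations for the query "what eats carrots?"
features = {"rabbit-concept": 1.0, "pet-concept": 0.6, "farm-concept": 0.4}

# Assumed weights: how strongly each feature votes for each answer.
weights = {
    "rabbit-concept": np.array([2.0, 0.1, 0.2]),
    "pet-concept":    np.array([0.3, 1.5, 0.1]),
    "farm-concept":   np.array([0.2, 0.2, 1.2]),
}

def answer(query_features, steer=None):
    """Sum feature contributions into logits, optionally scaling chosen features."""
    steer = steer or {}
    logits = np.zeros(len(answers))
    for name, activation in query_features.items():
        logits += activation * steer.get(name, 1.0) * weights[name]
    return answers[int(np.argmax(logits))]

print(answer(features))                                 # rabbit
print(answer(features, steer={"rabbit-concept": 0.0}))  # dog: rabbit feature dialed down
```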
355 00:16:59,360 --> 00:17:04,400 Speaker 3: So, researchers asked it to solve the equation thirty six 356 00:17:04,480 --> 00:17:07,640 Speaker 3: plus fifty nine, and Claude came up with, well, 357 00:17:07,760 --> 00:17:13,880 Speaker 3: what's the correct answer? Well, Claude took a very 358 00:17:13,920 --> 00:17:17,440 Speaker 3: circuitous route to get that answer. Anthropic found that Claude 359 00:17:17,520 --> 00:17:22,280 Speaker 3: used multiple computation paths in parallel to get its final answer, 360 00:17:22,359 --> 00:17:25,280 Speaker 3: unlike you, who just used your brain. So one path 361 00:17:25,359 --> 00:17:28,720 Speaker 3: added a bunch of numbers, I love this, this is 362 00:17:28,760 --> 00:17:31,600 Speaker 3: like nerd alert, close to thirty six and fifty nine 363 00:17:31,720 --> 00:17:36,000 Speaker 3: to approximate the total, like thirty five and sixty, while 364 00:17:36,160 --> 00:17:40,920 Speaker 3: another path focused on determining the last digit of the sum, 365 00:17:41,040 --> 00:17:43,879 Speaker 3: so it actually added the last digits of thirty 366 00:17:43,880 --> 00:17:47,040 Speaker 3: six and fifty nine, six and nine, to know that 367 00:17:47,119 --> 00:17:50,200 Speaker 3: the answer had to end in five. So 368 00:17:50,240 --> 00:17:52,480 Speaker 3: Claude used these two paths to come up with the 369 00:17:52,520 --> 00:17:54,800 Speaker 3: correct answer, which is, as you said. 370 00:17:54,800 --> 00:17:58,320 Speaker 1: Ninety five. You know what I find particularly fascinating about this? 371 00:17:58,760 --> 00:18:01,320 Speaker 1: I always struggled with math. My dad, 372 00:18:01,359 --> 00:18:04,679 Speaker 1: on the other hand, is a crazy math nerd. He 373 00:18:04,800 --> 00:18:07,880 Speaker 1: was like the under thirteen chess champion of Britain, etc. 374 00:18:08,840 --> 00:18:10,560 Speaker 1: You know, he said to me that the most important 375 00:18:10,640 --> 00:18:14,200 Speaker 1: thing about how to do math well is to approximate 376 00:18:14,240 --> 00:18:16,480 Speaker 1: the answer before you work it out. He basically told 377 00:18:16,560 --> 00:18:20,000 Speaker 1: me to do exactly what this model does. Basically break 378 00:18:20,000 --> 00:18:22,320 Speaker 1: it down to a much simpler calculation, and then when you 379 00:18:22,359 --> 00:18:24,560 Speaker 1: do the actual work, you will know whether or not 380 00:18:24,600 --> 00:18:25,359 Speaker 1: you're in the range. 381 00:18:25,440 --> 00:18:25,640 Speaker 4: Right. 382 00:18:25,760 --> 00:18:28,919 Speaker 3: Well, Claude does math like your dad. When asked by 383 00:18:28,960 --> 00:18:31,639 Speaker 3: a user how it got the answer ninety five, it 384 00:18:31,680 --> 00:18:34,840 Speaker 3: claims to do it by the book, for example, simple addition, 385 00:18:35,000 --> 00:18:38,760 Speaker 3: carrying the one. And one of the explanations Anthropic posited... 386 00:18:38,440 --> 00:18:41,200 Speaker 1: Posited for the fact that, when asked how it got 387 00:18:41,200 --> 00:18:44,280 Speaker 1: to the answer, it shared a response that was not, 388 00:18:44,320 --> 00:18:45,840 Speaker 1: in fact, how it got to the answer. 389 00:18:45,760 --> 00:18:49,560 Speaker 3: Right, because Claude's written answer, and I quote, may reflect 390 00:18:49,560 --> 00:18:52,920 Speaker 3: the fact that the model learns to explain math by 391 00:18:53,000 --> 00:18:56,800 Speaker 3: simulating explanations written by people.
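A rough sketch of the two parallel paths just described for 36 + 59. This is illustrative only, not Anthropic's published analysis: the exact rounding and the way the paths are combined are my own assumptions, but the shape matches the description above, one path approximating the total and another nailing down the last digit.

```python
# Two parallel "paths" for 36 + 59: a rough-magnitude estimate plus a precise
# last digit, combined into the final answer. Details are assumed for the sketch.

def approximate_path(a: int, b: int) -> range:
    """Rough-magnitude path: round each operand, giving a plausible range."""
    estimate = round(a, -1) + round(b, -1)      # 40 + 60 = 100-ish
    return range(estimate - 10, estimate + 10)  # "somewhere around there"

def last_digit_path(a: int, b: int) -> int:
    """Precise path: only track the final digit of the sum."""
    return (a % 10 + b % 10) % 10               # 6 + 9 -> 5

def combine(a: int, b: int) -> int:
    """Pick the first number in the estimated range whose last digit matches."""
    digit = last_digit_path(a, b)
    return next(n for n in approximate_path(a, b) if n % 10 == digit)

print(combine(36, 59))  # 95
```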
So when asked to do 392 00:18:56,880 --> 00:19:00,120 Speaker 3: math without being taught how to do it, it may 393 00:19:00,119 --> 00:19:04,600 Speaker 3: develop its own internal strategies to do so. Remember, it's 394 00:19:04,640 --> 00:19:05,720 Speaker 3: trained, not programmed. 395 00:19:12,280 --> 00:19:14,359 Speaker 1: We're going to take a quick break, but stick around 396 00:19:14,520 --> 00:19:17,439 Speaker 1: and we'll be back with this week's Tech Support, all 397 00:19:17,480 --> 00:19:34,240 Speaker 1: about AI competition among tech giants. For our next segment, 398 00:19:34,240 --> 00:19:37,719 Speaker 1: we're going to be talking about all things AI. Surprise, surprise, 399 00:19:37,760 --> 00:19:41,280 Speaker 1: on Tech Stuff. But in all seriousness, 400 00:19:41,400 --> 00:19:43,960 Speaker 1: there's just so much happening in AI all the time. 401 00:19:44,040 --> 00:19:45,760 Speaker 1: I find it hard to keep up with, even though it's 402 00:19:45,800 --> 00:19:46,439 Speaker 1: our job. 403 00:19:46,880 --> 00:19:50,120 Speaker 3: We know. I literally work on a podcast about technology, 404 00:19:50,160 --> 00:19:52,320 Speaker 3: and half the time other people are telling me what's 405 00:19:52,359 --> 00:19:54,240 Speaker 3: going on in the world of technology, because 406 00:19:54,640 --> 00:19:57,640 Speaker 3: I've become kind of like a sieve for technology news 407 00:19:57,680 --> 00:19:58,440 Speaker 3: from my friends. 408 00:19:58,720 --> 00:20:01,280 Speaker 1: Absolutely. So this week we're going to talk to somebody 409 00:20:01,359 --> 00:20:04,719 Speaker 1: who can help navigate a whirlwind of headlines. And you've 410 00:20:04,720 --> 00:20:08,760 Speaker 1: got the big OpenAI fundraise, you've got Elon integrating 411 00:20:09,119 --> 00:20:12,640 Speaker 1: X into xAI, and then you've got the struggles at 412 00:20:12,680 --> 00:20:13,360 Speaker 1: Apple with 413 00:20:13,400 --> 00:20:16,560 Speaker 2: their AI products, or lack thereof. 414 00:20:17,000 --> 00:20:20,040 Speaker 1: Here to help us decode the current AI landscape is 415 00:20:20,119 --> 00:20:23,800 Speaker 1: Gerrit De Vynck, who's a technology reporter for The Washington Post. Gerrit, 416 00:20:23,840 --> 00:20:24,679 Speaker 1: welcome to Tech Stuff. 417 00:20:24,800 --> 00:20:25,520 Speaker 4: Happy to be here. 418 00:20:25,680 --> 00:20:28,560 Speaker 1: Let's start with a big one. The volume of news 419 00:20:28,680 --> 00:20:32,480 Speaker 1: around AI today feels quite overwhelming. I mean, also, it 420 00:20:32,520 --> 00:20:37,800 Speaker 1: comes from so many different companies: OpenAI, Google, xAI, Microsoft, Amazon, Apple. 421 00:20:38,119 --> 00:20:40,480 Speaker 1: The list goes on and on. How do you keep 422 00:20:40,600 --> 00:20:42,840 Speaker 1: up with all of this and also figure out how 423 00:20:42,840 --> 00:20:44,280 Speaker 1: to sort the signal from the noise? 424 00:20:44,560 --> 00:20:46,879 Speaker 5: Yeah, I mean, I think this is kind of a 425 00:20:46,920 --> 00:20:50,080 Speaker 5: key question I'm sure a lot of people are asking themselves, 426 00:20:50,080 --> 00:20:52,520 Speaker 5: and I think the first thing is, just take a deep breath, 427 00:20:52,600 --> 00:20:55,639 Speaker 5: calm down. AI is not coming for your job.
AI 428 00:20:55,760 --> 00:20:58,560 Speaker 5: is not going to take over the world tomorrow, even 429 00:20:58,640 --> 00:21:02,040 Speaker 5: if really smart people or powerful people or rich companies 430 00:21:02,040 --> 00:21:05,919 Speaker 5: are saying that. Essentially, what we're seeing here is the 431 00:21:05,960 --> 00:21:09,320 Speaker 5: tech industry is this huge conglomeration of a bunch of 432 00:21:09,359 --> 00:21:12,440 Speaker 5: powerful people and billions and billions of dollars that are 433 00:21:12,480 --> 00:21:14,679 Speaker 5: always looking for the next thing, always looking for the 434 00:21:14,680 --> 00:21:17,640 Speaker 5: next way to make money. And they look back at 435 00:21:17,720 --> 00:21:22,440 Speaker 5: tech trends over the years, the Internet, cloud computing, moving 436 00:21:22,520 --> 00:21:25,399 Speaker 5: to mobile phones, and the tech industry has sort of 437 00:21:25,440 --> 00:21:30,360 Speaker 5: collectively decided that AI is the next one of those stages, right, 438 00:21:30,400 --> 00:21:32,760 Speaker 5: And so when the mobile phone came out, a bunch 439 00:21:32,760 --> 00:21:34,120 Speaker 5: of new people were able to make money. 440 00:21:34,200 --> 00:21:34,360 Speaker 1: Right. 441 00:21:34,400 --> 00:21:36,320 Speaker 4: We didn't have Uber, we didn't have. 442 00:21:36,200 --> 00:21:39,040 Speaker 5: Those kind of mobile first door Dash, those kinds of 443 00:21:39,080 --> 00:21:42,120 Speaker 5: companies before the mobile phone came out, and people made 444 00:21:42,280 --> 00:21:45,439 Speaker 5: huge amounts of money during that tech transition. And so 445 00:21:45,840 --> 00:21:49,320 Speaker 5: what's happening now is the tech industry believes and is 446 00:21:49,400 --> 00:21:52,439 Speaker 5: convincing themselves and trying to convince all of us that 447 00:21:52,560 --> 00:21:55,240 Speaker 5: AI is that next step, and so they're pouring money 448 00:21:55,240 --> 00:21:58,200 Speaker 5: into it, they're pouring marketing dollars into it, but at 449 00:21:58,200 --> 00:22:01,440 Speaker 5: the same time, there's still trying to build the plane 450 00:22:01,560 --> 00:22:04,480 Speaker 5: as it's taking off. And so that's why you might 451 00:22:04,520 --> 00:22:06,680 Speaker 5: see a lot of products that maybe. 452 00:22:06,440 --> 00:22:07,440 Speaker 4: Don't work very well. 453 00:22:07,560 --> 00:22:09,879 Speaker 5: You don't know really how they fit into your life, 454 00:22:10,200 --> 00:22:11,840 Speaker 5: and you're not sure whether you should be paying for 455 00:22:11,880 --> 00:22:13,600 Speaker 5: them yet. And so I think the first thing is 456 00:22:13,640 --> 00:22:15,639 Speaker 5: to say, you know, don't worry. You're not missing the 457 00:22:15,640 --> 00:22:18,040 Speaker 5: boat if you're not using a million AI apps right now. 458 00:22:18,119 --> 00:22:20,520 Speaker 5: And so, yes, this is happening. There's a lot of 459 00:22:20,720 --> 00:22:23,440 Speaker 5: hype and interest here, but it's not as if AI 460 00:22:23,640 --> 00:22:26,600 Speaker 5: is just sort of gonna change everything immediately. 461 00:22:26,880 --> 00:22:28,800 Speaker 1: And one of the things that Karen and I sometimes 462 00:22:28,800 --> 00:22:31,159 Speaker 1: talk about is like, is it speeding up or is 463 00:22:31,200 --> 00:22:33,880 Speaker 1: it slowing down? 
Because, like, November, December, all the headlines 464 00:22:33,920 --> 00:22:36,880 Speaker 1: were, AI is slowing down, ChatGPT-5 is not coming, and now 465 00:22:36,880 --> 00:22:39,160 Speaker 1: it feels like we're in a big speeding up moment again. 466 00:22:39,760 --> 00:22:41,679 Speaker 1: Is it even a relevant question? And where do you 467 00:22:41,680 --> 00:22:42,199 Speaker 1: fall on it? 468 00:22:42,359 --> 00:22:42,560 Speaker 4: Yeah. 469 00:22:42,560 --> 00:22:43,879 Speaker 5: I mean, I think it's a great question, and I 470 00:22:43,960 --> 00:22:46,720 Speaker 5: think that analysis is correct, right. I mean, everyone is 471 00:22:46,760 --> 00:22:49,240 Speaker 5: trying to either build something up or tear it down. 472 00:22:49,280 --> 00:22:50,640 Speaker 4: That's how these things work. 473 00:22:50,720 --> 00:22:53,920 Speaker 5: And so the reason we had this huge boost in 474 00:22:54,000 --> 00:22:57,720 Speaker 5: interest was because ChatGPT came out. It was definitely 475 00:22:57,840 --> 00:23:00,639 Speaker 5: better than what most people had used before and 476 00:23:00,640 --> 00:23:02,560 Speaker 5: been able to experience directly, and they were able to 477 00:23:02,560 --> 00:23:05,240 Speaker 5: put it in a format that regular people could actually 478 00:23:05,359 --> 00:23:08,840 Speaker 5: understand and have a conversation with. And so that's 479 00:23:08,840 --> 00:23:11,960 Speaker 5: sort of what fired the starting gun. And then they said, okay, 480 00:23:12,000 --> 00:23:14,320 Speaker 5: how do we make that better? And then the techniques 481 00:23:14,320 --> 00:23:17,520 Speaker 5: they were using, which was essentially to use way more data, 482 00:23:17,840 --> 00:23:20,560 Speaker 5: to just shove more data into these AI models and 483 00:23:20,720 --> 00:23:23,399 Speaker 5: hope that they get smarter, that had been working up 484 00:23:23,400 --> 00:23:25,840 Speaker 5: until a certain point, and then they kind of ran 485 00:23:25,920 --> 00:23:28,520 Speaker 5: out of data and that method slowed down, and so 486 00:23:28,640 --> 00:23:32,720 Speaker 5: they've now pivoted to using different techniques where they're actually 487 00:23:32,720 --> 00:23:35,640 Speaker 5: spending a lot more time training the model to 488 00:23:35,600 --> 00:23:36,399 Speaker 4: do different things. 489 00:23:36,400 --> 00:23:38,800 Speaker 5: They're sort of doing more coding so that it can 490 00:23:39,080 --> 00:23:42,120 Speaker 5: be a bit more efficient and strategic, and now they're 491 00:23:42,119 --> 00:23:44,959 Speaker 5: seeing a boost in capability 492 00:23:44,280 --> 00:23:45,320 Speaker 4: from that technique. 493 00:23:45,320 --> 00:23:49,080 Speaker 5: And if people remember the Chinese AI model called DeepSeek, 494 00:23:49,440 --> 00:23:51,440 Speaker 5: they really had a huge breakthrough where they were able 495 00:23:51,480 --> 00:23:54,000 Speaker 5: to use a little bit less data and less computing 496 00:23:54,080 --> 00:23:56,440 Speaker 5: power to come up with a model that was really 497 00:23:56,520 --> 00:23:59,439 Speaker 5: quite capable. And so now everyone is saying, oh, we're 498 00:23:59,480 --> 00:24:02,520 Speaker 5: speeding up again because we found new ways of increasing 499 00:24:02,560 --> 00:24:04,200 Speaker 5: the capabilities.
500 00:24:03,960 --> 00:24:07,680 Speaker 3: Which is, I mean, ultimately a more long term effective 501 00:24:07,720 --> 00:24:10,040 Speaker 3: way to get these things to work than having to 502 00:24:10,119 --> 00:24:14,520 Speaker 3: rely on humongous data sets that might not be replenishable. 503 00:24:14,600 --> 00:24:16,080 Speaker 4: Yeah, yeah, absolutely. 504 00:24:16,080 --> 00:24:18,320 Speaker 5: And I mean the other aspect here is that AI 505 00:24:18,440 --> 00:24:21,800 Speaker 5: is very compute intensive, which is essentially just a way 506 00:24:21,840 --> 00:24:23,840 Speaker 5: of saying they need a lot of computers and a 507 00:24:23,840 --> 00:24:26,919 Speaker 5: lot of computer chips in order to do AI in 508 00:24:26,960 --> 00:24:29,919 Speaker 5: the first place. It takes a lot of energy, and so 509 00:24:30,000 --> 00:24:33,159 Speaker 5: there are a lot of potential environmental concerns. Even here in 510 00:24:33,200 --> 00:24:35,840 Speaker 5: the United States, coal power plants that were slated to 511 00:24:35,880 --> 00:24:38,879 Speaker 5: be shut down have actually been ramped back up in 512 00:24:39,080 --> 00:24:41,880 Speaker 5: order to serve all those AI data centers. So there 513 00:24:41,960 --> 00:24:44,919 Speaker 5: is a lot of interest and pressure in making AI 514 00:24:45,040 --> 00:24:48,240 Speaker 5: more efficient so that it's cheaper and more environmentally friendly. 515 00:24:48,960 --> 00:24:51,280 Speaker 3: One of the biggest names in the industry is OpenAI, 516 00:24:51,720 --> 00:24:55,919 Speaker 3: and for anyone who is living under a rock and 517 00:24:56,040 --> 00:24:58,679 Speaker 3: wasn't on social media this week, why is everybody talking 518 00:24:58,720 --> 00:25:00,960 Speaker 3: so much about Studio Ghibli? 519 00:25:01,720 --> 00:25:06,400 Speaker 5: Yeah, so, OpenAI released a new image generator, right. 520 00:25:06,480 --> 00:25:08,400 Speaker 4: So one of the things that people have been able 521 00:25:08,200 --> 00:25:10,240 Speaker 5: to use AI for over the last couple of years 522 00:25:10,320 --> 00:25:12,520 Speaker 5: is you make a short description and it spits out 523 00:25:12,560 --> 00:25:14,760 Speaker 5: an image, and you know, it's been getting better and 524 00:25:14,800 --> 00:25:18,440 Speaker 5: better over the months, and essentially they released a big 525 00:25:18,520 --> 00:25:23,440 Speaker 5: update to theirs, and people realized that they could use photos, 526 00:25:23,560 --> 00:25:27,119 Speaker 5: upload photos, and get the model to recreate that image 527 00:25:27,119 --> 00:25:31,119 Speaker 5: in sort of the same design styles from iconic animators, 528 00:25:31,119 --> 00:25:33,840 Speaker 5: like you mentioned Studio Ghibli, or even like the movie 529 00:25:33,920 --> 00:25:36,520 Speaker 5: Wallace and Gromit or the Muppets, and... 530 00:25:37,359 --> 00:25:40,200 Speaker 3: I like the Lego family, the Lego family, exactly. 531 00:25:41,080 --> 00:25:43,600 Speaker 5: It was this moment where the technology allowed people to 532 00:25:43,640 --> 00:25:45,560 Speaker 5: apply their creativity 533 00:25:45,320 --> 00:25:47,639 Speaker 4: in a new way, and that's why it went viral. 534 00:25:48,359 --> 00:25:52,040 Speaker 5: And OpenAI, I think they did hope that this 535 00:25:52,080 --> 00:25:55,720 Speaker 5: would happen. They themselves were using some of these Studio 536 00:25:55,760 --> 00:26:00,639 Speaker 5: Ghibli examples early on.
But also they can't really predict 537 00:26:00,720 --> 00:26:02,520 Speaker 5: or control how this is going to go, and some 538 00:26:02,600 --> 00:26:04,760 Speaker 5: of their releases have just kind of fallen flat. Everyone's 539 00:26:04,800 --> 00:26:06,600 Speaker 5: been like, boring, we don't care. But 540 00:26:06,640 --> 00:26:07,720 Speaker 4: this one, a lot of people, 541 00:26:07,840 --> 00:26:09,800 Speaker 5: they found it really fun, they found it really interesting, 542 00:26:09,800 --> 00:26:12,320 Speaker 5: and it definitely went viral, and it actually brought a 543 00:26:12,359 --> 00:26:14,000 Speaker 5: lot more people who 544 00:26:13,920 --> 00:26:16,240 Speaker 4: hadn't been using ChatGPT before. 545 00:26:16,680 --> 00:26:19,000 Speaker 5: But now, of course, this raises all sorts of questions 546 00:26:19,000 --> 00:26:22,800 Speaker 5: about art and copyright and big problems like that. 547 00:26:23,600 --> 00:26:29,640 Speaker 3: What's interesting, though, is while they're trying to grow very quickly, 548 00:26:30,080 --> 00:26:32,800 Speaker 3: we have someone like Sam Altman come out and say, 549 00:26:33,080 --> 00:26:35,320 Speaker 3: please slow down with this image generation. 550 00:26:35,640 --> 00:26:38,679 Speaker 1: He said that the GPUs are being melted by the demand, right. 551 00:26:38,840 --> 00:26:41,840 Speaker 5: Yeah, I mean, it's possible some GPUs actually did melt 552 00:26:41,840 --> 00:26:43,360 Speaker 5: a little bit. I mean, you know, when you're using 553 00:26:43,400 --> 00:26:45,920 Speaker 5: your laptop, you've got two hundred Chrome tabs open and 554 00:26:45,960 --> 00:26:48,480 Speaker 5: you're trying to listen to YouTube, it gets hot, right, 555 00:26:48,520 --> 00:26:50,360 Speaker 5: and so that's exactly what happens. 556 00:26:50,560 --> 00:26:52,080 Speaker 3: My phone does get quite hot. 557 00:26:52,280 --> 00:26:56,280 Speaker 5: Yeah, you know, that same thing happens with AI, right. 558 00:26:56,280 --> 00:26:57,920 Speaker 5: I mean, so this is still a very physical thing. 559 00:26:57,960 --> 00:26:59,879 Speaker 5: Every single time everyone in the world says, hey, make 560 00:27:00,119 --> 00:27:02,440 Speaker 5: an image of this, hey, write me a resume for this, 561 00:27:02,520 --> 00:27:06,080 Speaker 5: hey, answer my test questions for me, that needs to 562 00:27:06,119 --> 00:27:08,960 Speaker 5: go to a data center, it needs to be computed 563 00:27:09,160 --> 00:27:12,720 Speaker 5: on these computer chips, on GPUs, which is the technical 564 00:27:12,800 --> 00:27:15,840 Speaker 5: term for the computer chips, and that heats them up, 565 00:27:15,920 --> 00:27:18,040 Speaker 5: and so I don't know if they were actually melting. 566 00:27:18,040 --> 00:27:20,560 Speaker 5: But essentially what Sam Altman was referring to there is 567 00:27:20,600 --> 00:27:22,840 Speaker 5: just that so many people wanted to use this that 568 00:27:22,920 --> 00:27:27,199 Speaker 5: it was becoming very expensive for OpenAI to run it.
569 00:27:27,200 --> 00:27:29,040 Speaker 5: And this kind of gets to a central problem for 570 00:27:29,080 --> 00:27:31,639 Speaker 5: them, because the more people use it, the more it 571 00:27:31,760 --> 00:27:34,920 Speaker 5: costs them in computer chip costs, and so they want 572 00:27:34,960 --> 00:27:37,200 Speaker 5: people to use it, but at the same time they 573 00:27:37,240 --> 00:27:39,199 Speaker 5: need to figure out, you know, how can we convince 574 00:27:39,240 --> 00:27:42,280 Speaker 5: people, force people, to pay for these things so that 575 00:27:42,320 --> 00:27:44,320 Speaker 5: we can actually grow as a business. And this is 576 00:27:44,400 --> 00:27:47,960 Speaker 5: a huge question mark around OpenAI and other AI companies. 577 00:27:48,359 --> 00:27:51,480 Speaker 1: As fun as it is making a Studio Ghibli portrait of yourself, 578 00:27:51,520 --> 00:27:54,280 Speaker 1: it's hard to see it being a huge, huge business 579 00:27:54,320 --> 00:27:56,520 Speaker 1: driver of people buying tokens to do that. 580 00:27:56,520 --> 00:27:58,479 Speaker 5: That's sort of a question for OpenAI, right? I mean, 581 00:27:58,520 --> 00:28:01,520 Speaker 5: they've been able to have these viral moments. ChatGPT 582 00:28:01,920 --> 00:28:04,800 Speaker 5: itself was a viral moment, and I do think people 583 00:28:05,000 --> 00:28:07,880 Speaker 5: are using these technologies. I mean, in some ways OpenAI's 584 00:28:07,880 --> 00:28:10,800 Speaker 5: ChatGPT is one of the fastest growing, if 585 00:28:10,800 --> 00:28:16,040 Speaker 5: not the fastest growing, consumer Internet products ever. But we're 586 00:28:16,080 --> 00:28:18,120 Speaker 5: not quite at that point yet where I think regular 587 00:28:18,200 --> 00:28:20,600 Speaker 5: people are saying, oh, like, this is so important to 588 00:28:20,680 --> 00:28:24,320 Speaker 5: my life, I need it so badly to write emails 589 00:28:24,560 --> 00:28:28,240 Speaker 5: or to have fun, like generating these images, that I'm 590 00:28:28,240 --> 00:28:31,159 Speaker 5: willing to pay two hundred dollars a month for it. 591 00:28:31,200 --> 00:28:33,640 Speaker 5: I mean, I do think that the company is still 592 00:28:33,640 --> 00:28:36,200 Speaker 5: in this world where they're trying to figure out, how 593 00:28:36,200 --> 00:28:38,760 Speaker 5: do we convince people that they need this so badly 594 00:28:39,160 --> 00:28:41,400 Speaker 5: that they're willing to spend hundreds of dollars a year 595 00:28:41,440 --> 00:28:41,760 Speaker 5: on it. 596 00:28:42,320 --> 00:28:45,600 Speaker 1: So you may be somewhat skeptical, Gerrit, but the market 597 00:28:45,640 --> 00:28:47,800 Speaker 1: is not, or at least SoftBank is not. Talk 598 00:28:47,840 --> 00:28:48,600 Speaker 1: a bit about that. 599 00:28:49,000 --> 00:28:53,640 Speaker 5: Yeah, I mean OpenAI raised forty billion dollars 600 00:28:54,000 --> 00:28:57,520 Speaker 5: at a three hundred billion dollar valuation, and I mean it's 601 00:28:57,360 --> 00:29:00,640 Speaker 1: the largest ever private financing of a US company. 602 00:29:01,200 --> 00:29:03,760 Speaker 3: A company that was not a private company a little while ago. 603 00:29:03,920 --> 00:29:07,440 Speaker 5: Yeah. And the only private company that is actually worth 604 00:29:07,520 --> 00:29:10,880 Speaker 5: more than three hundred billion dollars is SpaceX. So that's 605 00:29:10,960 --> 00:29:14,720 Speaker 5: Elon Musk's space company.
But they build big physical rockets 606 00:29:14,760 --> 00:29:17,800 Speaker 5: that cost hundreds of millions of dollars, so that company's 607 00:29:17,800 --> 00:29:20,240 Speaker 5: worth three hundred fifty billion. OpenAI is now, according 608 00:29:20,240 --> 00:29:23,320 Speaker 5: to its investors, worth three hundred billion dollars, right? And 609 00:29:23,400 --> 00:29:26,200 Speaker 5: so I think this mostly goes to that 610 00:29:26,360 --> 00:29:29,720 Speaker 5: question about how expensive it is to do AI, right, 611 00:29:29,760 --> 00:29:33,480 Speaker 5: and so they need that money in order to buy 612 00:29:33,560 --> 00:29:37,240 Speaker 5: data centers, buy computer chips, to keep helping people make 613 00:29:37,320 --> 00:29:39,600 Speaker 5: these Studio Ghibli images and all the other things 614 00:29:39,640 --> 00:29:41,200 Speaker 5: that OpenAI is working on. 615 00:29:41,240 --> 00:29:42,120 Speaker 4: And so it's 616 00:29:41,960 --> 00:29:44,920 Speaker 1: just the Uber model, where you basically subsidize users and 617 00:29:45,000 --> 00:29:47,040 Speaker 1: then once you get them hooked, you start to charge them, 618 00:29:47,080 --> 00:29:48,000 Speaker 1: but at huge scale. 619 00:29:48,160 --> 00:29:48,360 Speaker 4: Yeah. 620 00:29:48,360 --> 00:29:50,680 Speaker 5: I mean, that's a playbook that tech companies have used 621 00:29:50,680 --> 00:29:53,480 Speaker 5: for years now, right? Get people hooked on something 622 00:29:53,520 --> 00:29:56,720 Speaker 5: that is fun, cheap, easy, free, work it into their lives 623 00:29:56,720 --> 00:29:59,240 Speaker 5: so that they feel like they need it every single day, 624 00:29:59,560 --> 00:30:02,040 Speaker 5: and then start to increase the costs. I mean, Google 625 00:30:02,120 --> 00:30:04,840 Speaker 5: has done this. I'm paying for my Gmail storage. I 626 00:30:04,880 --> 00:30:05,800 Speaker 5: don't know if you guys are. 627 00:30:05,840 --> 00:30:11,080 Speaker 3: I mean, yes, it's a question of when does something 628 00:30:11,120 --> 00:30:14,200 Speaker 3: go from the gimmick phase to the business phase. I think, 629 00:30:14,200 --> 00:30:16,160 Speaker 3: at least in terms of ChatGPT. 630 00:30:16,440 --> 00:30:18,800 Speaker 5: Yeah. And I think the other thing to point out 631 00:30:18,840 --> 00:30:21,960 Speaker 5: here is that OpenAI does have a business where 632 00:30:22,000 --> 00:30:25,320 Speaker 5: they sell access to their AI to other businesses, right? 633 00:30:25,360 --> 00:30:28,239 Speaker 5: And so there's the consumer question, which is exactly what 634 00:30:28,240 --> 00:30:30,960 Speaker 5: we're talking about. Then they are actually selling to businesses 635 00:30:30,960 --> 00:30:34,120 Speaker 5: who want to put AI technology into their own apps 636 00:30:34,200 --> 00:30:38,000 Speaker 5: and into their own technology, and that's also a big 637 00:30:38,040 --> 00:30:39,960 Speaker 5: part of what OpenAI is trying to do here. 638 00:30:40,440 --> 00:30:43,320 Speaker 5: But that's also, you know, a big question mark, because 639 00:30:43,320 --> 00:30:47,440 Speaker 5: we have these open source AI models as well. DeepSeek, 640 00:30:47,520 --> 00:30:50,280 Speaker 5: the Chinese one we mentioned earlier, is an example of that.
641 00:30:50,680 --> 00:30:53,360 Speaker 5: Facebook also provides these tools, where they just essentially put 642 00:30:53,400 --> 00:30:56,080 Speaker 5: the AI out there for free for other businesses to 643 00:30:56,120 --> 00:30:58,560 Speaker 5: take and use in their own ways. And so 644 00:30:58,600 --> 00:31:02,960 Speaker 5: OpenAI is a very strong business. They have incredible technology, 645 00:31:02,960 --> 00:31:04,680 Speaker 5: they have some of the smartest people in the world 646 00:31:04,760 --> 00:31:07,840 Speaker 5: on AI, and they have huge funding and backing 647 00:31:07,520 --> 00:31:08,360 Speaker 4: from their investors. 648 00:31:08,360 --> 00:31:11,280 Speaker 5: But at the same time, that doesn't guarantee that they're 649 00:31:11,320 --> 00:31:13,720 Speaker 5: going to continue to grow or even be around in 650 00:31:13,800 --> 00:31:18,840 Speaker 5: five years. 651 00:31:21,080 --> 00:31:23,280 Speaker 1: When we come back, we'll hear about how other big 652 00:31:23,280 --> 00:31:26,600 Speaker 1: tech companies like Amazon and Apple are faring in the 653 00:31:26,680 --> 00:31:44,680 Speaker 1: scramble to develop AI products. Yeah, we want to ask 654 00:31:44,680 --> 00:31:46,520 Speaker 1: you a little bit about what Amazon and Apple are 655 00:31:46,520 --> 00:31:48,640 Speaker 1: doing in the realm of AI. But just before we 656 00:31:48,720 --> 00:31:51,720 Speaker 1: get there, there's another huge deal this week, also with 657 00:31:51,840 --> 00:31:53,840 Speaker 1: a potentially dubious price tag. 658 00:31:54,280 --> 00:31:57,920 Speaker 5: Yes, so I think you're referring to Elon Musk's merger 659 00:31:57,960 --> 00:31:59,960 Speaker 5: of two of his companies. A little confusing. 660 00:32:00,080 --> 00:32:02,000 Speaker 4: They're both kind of called X. 661 00:32:02,360 --> 00:32:04,040 Speaker 1: I couldn't even get through the read at the beginning 662 00:32:04,080 --> 00:32:06,560 Speaker 1: of the show. That's a tongue twister. Yeah, exactly. 663 00:32:06,640 --> 00:32:10,880 Speaker 4: I mean, I think that sort of speaks to Elon Musk. 664 00:32:11,040 --> 00:32:13,720 Speaker 5: He has many companies at this point, and he's been 665 00:32:13,760 --> 00:32:16,600 Speaker 5: known to sort of move assets around a little bit. 666 00:32:16,680 --> 00:32:20,200 Speaker 5: And so when it comes to X, which is formerly Twitter, 667 00:32:20,280 --> 00:32:22,320 Speaker 5: so that's the social media platform that he bought a 668 00:32:22,320 --> 00:32:25,040 Speaker 5: couple of years ago for forty four billion dollars, he 669 00:32:25,360 --> 00:32:30,000 Speaker 5: has now sold that company to his AI company, which 670 00:32:30,040 --> 00:32:34,200 Speaker 5: is called xAI. And sort of formally, the social media 671 00:32:34,200 --> 00:32:37,240 Speaker 5: company is now owned by the AI company. And so 672 00:32:37,440 --> 00:32:39,600 Speaker 5: I say formally, because these companies had kind of been 673 00:32:39,600 --> 00:32:42,720 Speaker 5: working together in a lot of ways. User data from 674 00:32:42,840 --> 00:32:45,920 Speaker 5: the social media company was already being used to train 675 00:32:46,400 --> 00:32:50,120 Speaker 5: the AI at the AI company.
The AI company's main product, 676 00:32:50,120 --> 00:32:53,160 Speaker 5: which is called Grok, which is a ChatGPT competitor, 677 00:32:53,720 --> 00:32:57,520 Speaker 5: was available through the social media company, and so in 678 00:32:57,560 --> 00:33:00,520 Speaker 5: a lot of ways, these companies were already the same thing. 679 00:33:00,640 --> 00:33:02,920 Speaker 5: And what he did here is he said, look, the 680 00:33:03,000 --> 00:33:04,920 Speaker 5: AI company is able to raise a lot of money 681 00:33:04,920 --> 00:33:08,120 Speaker 5: because everyone loves AI, everyone wants to boost AI. So 682 00:33:08,200 --> 00:33:10,240 Speaker 5: I'm going to use that money that the AI company 683 00:33:10,280 --> 00:33:12,840 Speaker 5: is able to raise to bail out and kind of 684 00:33:13,120 --> 00:33:15,480 Speaker 5: give me more time when it comes to the social 685 00:33:15,520 --> 00:33:19,840 Speaker 5: media company, which is very influential and maybe Elon Musk's 686 00:33:19,880 --> 00:33:22,960 Speaker 5: most important company right now because of the political influence 687 00:33:23,000 --> 00:33:26,200 Speaker 5: it gives him. But from a business perspective, the social 688 00:33:26,200 --> 00:33:29,080 Speaker 5: media company has struggled and sort of been, you know, 689 00:33:29,240 --> 00:33:31,840 Speaker 5: going through the wilderness a little bit since Elon Musk 690 00:33:31,960 --> 00:33:34,360 Speaker 5: bought it, because most of its users left, a whole 691 00:33:34,400 --> 00:33:36,720 Speaker 5: bunch of new users came in, advertisers left. 692 00:33:36,760 --> 00:33:37,920 Speaker 4: Maybe the advertisers are going 693 00:33:37,880 --> 00:33:39,880 Speaker 5: to come back. And so it's a way for Elon 694 00:33:39,960 --> 00:33:42,840 Speaker 5: Musk to sort of use the AI hype to kind 695 00:33:42,840 --> 00:33:46,360 Speaker 5: of help shore up the finances of his social media company. 696 00:33:47,280 --> 00:33:50,840 Speaker 3: You know, another major tech company that isn't mentioned as 697 00:33:50,920 --> 00:33:53,240 Speaker 3: much in the AI race, and that is Amazon. They 698 00:33:53,280 --> 00:33:58,880 Speaker 3: recently unveiled Amazon Nova Act, their AI agent. So what 699 00:33:58,920 --> 00:34:01,280 Speaker 3: does the lay person need to know about this? 700 00:34:01,600 --> 00:34:04,200 Speaker 4: So okay, a couple of things. I think AI agent 701 00:34:04,480 --> 00:34:05,240 Speaker 4: is a term 702 00:34:05,080 --> 00:34:07,920 Speaker 5: that people are probably already hearing, and I guarantee you 703 00:34:07,960 --> 00:34:09,680 Speaker 5: they're going to be hearing more about it in the 704 00:34:09,719 --> 00:34:12,600 Speaker 5: coming months and years. An AI agent, all that is 705 00:34:12,600 --> 00:34:15,080 Speaker 5: is just, you know, you can say, okay, well, if 706 00:34:15,120 --> 00:34:17,120 Speaker 5: I can have a conversation with ChatGPT, I can 707 00:34:17,160 --> 00:34:19,600 Speaker 5: ask it things. Can I ask it to then go 708 00:34:20,080 --> 00:34:22,360 Speaker 5: and read the internet for me? Can I ask it 709 00:34:22,440 --> 00:34:24,319 Speaker 5: to go do things on the internet for me? 710 00:34:24,440 --> 00:34:24,600 Speaker 1: Right?
711 00:34:24,680 --> 00:34:28,400 Speaker 5: If ChatGPT is able to read an e-commerce website, 712 00:34:28,719 --> 00:34:31,759 Speaker 5: can't I just tell ChatGPT, hey, go buy me 713 00:34:31,920 --> 00:34:34,600 Speaker 5: the cheapest sofa that you can 714 00:34:34,680 --> 00:34:38,240 Speaker 5: find for my new apartment that's green and seats three people? 715 00:34:38,320 --> 00:34:38,480 Speaker 4: Right. 716 00:34:38,560 --> 00:34:41,800 Speaker 5: And so that's what an AI agent is: essentially using 717 00:34:41,840 --> 00:34:45,360 Speaker 5: AI to go and help people do things on 718 00:34:45,480 --> 00:34:48,080 Speaker 5: the internet for them. And so this is something a 719 00:34:48,120 --> 00:34:50,279 Speaker 5: lot of AI companies are talking about. Obviously, there are a 720 00:34:50,280 --> 00:34:52,719 Speaker 5: lot of problems. The technology is not quite there. The 721 00:34:52,800 --> 00:34:55,560 Speaker 5: last thing you want is to say to your AI agent, 722 00:34:55,640 --> 00:34:57,960 Speaker 5: go buy me a sofa for under one thousand dollars, 723 00:34:58,400 --> 00:35:01,040 Speaker 5: and then suddenly eight sofas that cost ten thousand dollars 724 00:35:01,040 --> 00:35:03,000 Speaker 5: each show up at your door a week later, right? 725 00:35:03,040 --> 00:35:04,960 Speaker 5: I mean, we need to be really careful. 726 00:35:04,960 --> 00:35:06,439 Speaker 3: Is that the phase we're in right now? 727 00:35:06,760 --> 00:35:09,239 Speaker 5: I mean, a colleague of mine ran an experiment where 728 00:35:09,239 --> 00:35:11,319 Speaker 5: he asked some of these AI agents to go and 729 00:35:11,400 --> 00:35:13,440 Speaker 5: find him the cheapest eggs. 730 00:35:13,440 --> 00:35:15,720 Speaker 4: This was sort of at the height of the egg panic, 731 00:35:15,760 --> 00:35:16,400 Speaker 4: and eggs 732 00:35:16,200 --> 00:35:20,160 Speaker 5: were selling for twenty dollars a dozen, and the agent 733 00:35:20,239 --> 00:35:23,480 Speaker 5: actually went and bought, I think it was like thirty 734 00:35:23,520 --> 00:35:26,880 Speaker 5: dollars' worth of eggs, and had them delivered to his home before 735 00:35:26,960 --> 00:35:29,640 Speaker 5: he even had the chance to say, like, yes, make 736 00:35:29,680 --> 00:35:30,240 Speaker 5: that purchase. 737 00:35:30,280 --> 00:35:32,560 Speaker 4: And so it's definitely in the experimental phase. 738 00:35:32,880 --> 00:35:36,720 Speaker 5: But Amazon, they sort of see that this is maybe 739 00:35:36,760 --> 00:35:38,400 Speaker 5: the next frontier of the technology. 740 00:35:38,520 --> 00:35:41,200 Speaker 1: Is that what the key interest is, to basically do 741 00:35:41,280 --> 00:35:42,239 Speaker 1: your shopping for you? 742 00:35:42,560 --> 00:35:44,520 Speaker 4: I mean, I think that would make sense. 743 00:35:44,560 --> 00:35:47,719 Speaker 5: I mean, they are the shopping company, and we all 744 00:35:47,760 --> 00:35:49,480 Speaker 5: know Amazon Alexa 745 00:35:49,239 --> 00:35:50,520 Speaker 4: was an early version of this. 746 00:35:50,600 --> 00:35:53,160 Speaker 5: I mean, you could ask Amazon Alexa to buy things 747 00:35:53,160 --> 00:35:55,680 Speaker 5: for you on Amazon. You could ask it obviously to 748 00:35:55,840 --> 00:35:58,359 Speaker 5: remind you about the weather, and people never really use 749 00:35:58,400 --> 00:36:00,799 Speaker 5: it for more than that.
And so people have been 750 00:36:00,840 --> 00:36:03,920 Speaker 5: saying, why was Amazon not ahead of this trend? Why 751 00:36:04,120 --> 00:36:07,440 Speaker 5: is Amazon Alexa not smarter? Why can ChatGPT do 752 00:36:07,640 --> 00:36:10,959 Speaker 5: things that Amazon Alexa can't? So Amazon has been under 753 00:36:11,000 --> 00:36:13,799 Speaker 5: a huge amount of pressure from its investors, from its 754 00:36:13,800 --> 00:36:16,400 Speaker 5: own employees, from other people in the tech industry to 755 00:36:16,560 --> 00:36:18,960 Speaker 5: show that they are riding this AI wave just like 756 00:36:19,000 --> 00:36:21,880 Speaker 5: these other companies. And I think this Nova Act product, 757 00:36:21,880 --> 00:36:24,280 Speaker 5: which is very new, it's still in the experimental phase, 758 00:36:24,760 --> 00:36:26,640 Speaker 5: is a sign that they're trying to do that. 759 00:36:26,880 --> 00:36:29,680 Speaker 1: All of this, of course, brings us to Siri. I mean, 760 00:36:29,719 --> 00:36:32,799 Speaker 1: the irony being that Amazon Alexa and Apple Siri were 761 00:36:32,840 --> 00:36:35,800 Speaker 1: kind of ahead of the curve in terms of voice- 762 00:36:35,880 --> 00:36:41,200 Speaker 1: driven assistants, essentially agentic-type features, and now Amazon feels 763 00:36:41,200 --> 00:36:43,920 Speaker 1: like it's behind the curve a little bit, and certainly 764 00:36:44,000 --> 00:36:46,919 Speaker 1: Apple does too when it comes to AI. What's going 765 00:36:46,960 --> 00:36:47,400 Speaker 1: on there? 766 00:36:47,680 --> 00:36:50,400 Speaker 5: Yeah, I mean, I think there's this dynamic that you're 767 00:36:50,840 --> 00:36:53,879 Speaker 5: putting your finger on, where these companies are really, really 768 00:36:53,880 --> 00:36:56,280 Speaker 5: invested in their own way of doing things, and then 769 00:36:56,600 --> 00:36:58,920 Speaker 5: there's a new way that kind of comes out of 770 00:36:58,960 --> 00:37:01,880 Speaker 5: left field and it's hard to adjust, right? This is the classic 771 00:37:02,360 --> 00:37:05,520 Speaker 5: innovator's dilemma. And so I would say, for both companies, 772 00:37:05,560 --> 00:37:10,080 Speaker 5: don't count them out. Amazon, Apple, they are both massive companies. 773 00:37:10,120 --> 00:37:13,319 Speaker 5: They are bigger than pretty much anything we've seen in 774 00:37:13,440 --> 00:37:18,360 Speaker 5: the history of business. They are in people's lives every minute, 775 00:37:18,400 --> 00:37:20,279 Speaker 5: every day, and so I think, first of all, we 776 00:37:20,320 --> 00:37:22,520 Speaker 5: should be careful not to just write them off and say, oh, 777 00:37:22,520 --> 00:37:25,919 Speaker 5: they're behind, therefore they will fail. But I do think 778 00:37:25,960 --> 00:37:28,919 Speaker 5: that this is really a five-alarm-fire moment for them. 779 00:37:28,960 --> 00:37:31,319 Speaker 1: So, I mean, they've been having internal sort of 780 00:37:31,360 --> 00:37:33,759 Speaker 1: all-hands meetings. What's the kind of internal drumbeat at 781 00:37:33,760 --> 00:37:34,279 Speaker 1: Apple on this? 782 00:37:34,719 --> 00:37:38,280 Speaker 5: They are saying, wow, we need to get on board 783 00:37:38,320 --> 00:37:40,439 Speaker 5: with this. And Google had the same thing when 784 00:37:40,480 --> 00:37:41,640 Speaker 5: ChatGPT eventually came out. 785 00:37:41,680 --> 00:37:43,720 Speaker 4: Right, Google is the AI company.
786 00:37:43,760 --> 00:37:46,360 Speaker 5: A lot of this technology we're talking about was actually 787 00:37:46,360 --> 00:37:48,719 Speaker 5: developed by Google and then just shared with the world 788 00:37:48,760 --> 00:37:50,719 Speaker 5: because they didn't quite know what to do with it. 789 00:37:51,239 --> 00:37:54,400 Speaker 5: And so for Apple, people have said, okay, well, Siri's 790 00:37:54,440 --> 00:37:56,560 Speaker 5: been around, are you going to make Siri smarter? This 791 00:37:56,600 --> 00:38:00,279 Speaker 5: seems like an obvious application. They stayed quiet for about 792 00:38:00,280 --> 00:38:03,040 Speaker 5: a year after ChatGPT came out, and then they said, 793 00:38:03,040 --> 00:38:05,000 Speaker 4: yes, we're doing it. We're going all in. 794 00:38:05,160 --> 00:38:08,360 Speaker 5: We're an AI company like everyone else. And now they've 795 00:38:08,360 --> 00:38:11,200 Speaker 5: delayed some of those releases, right? They've said, oh, actually, 796 00:38:11,320 --> 00:38:13,600 Speaker 5: we might need a bit more time to get it 797 00:38:13,640 --> 00:38:15,680 Speaker 5: to the level where we want it. And we can 798 00:38:15,719 --> 00:38:19,680 Speaker 5: see some of Apple's AI experiments have already failed pretty spectacularly. 799 00:38:19,719 --> 00:38:22,680 Speaker 5: They had a bot that summarized some of the messages 800 00:38:22,719 --> 00:38:24,879 Speaker 5: that were coming into your phone. Well, I remember that, yeah, 801 00:38:25,239 --> 00:38:28,040 Speaker 5: last year, Apple Intelligence. And what they would do is 802 00:38:28,080 --> 00:38:30,160 Speaker 5: they would see news alerts and say, oh, 803 00:38:30,239 --> 00:38:32,200 Speaker 5: let me be helpful, let me summarize the three or 804 00:38:32,200 --> 00:38:35,520 Speaker 5: four news alerts you have. And the summaries were incorrect, right? 805 00:38:35,560 --> 00:38:38,759 Speaker 5: And so that is upsetting to us as journalists. It's upsetting as 806 00:38:38,760 --> 00:38:42,120 Speaker 5: a user, because you want accurate information and Apple is 807 00:38:42,280 --> 00:38:43,240 Speaker 5: muddying the waters. 808 00:38:43,280 --> 00:38:44,879 Speaker 4: And so they had to pull that back. 809 00:38:44,920 --> 00:38:47,239 Speaker 5: And people are starting to seriously ask the question, is 810 00:38:47,280 --> 00:38:50,360 Speaker 5: this the moment where a company like Apple kind of 811 00:38:50,400 --> 00:38:53,840 Speaker 5: falls from grace and loses its steam, loses its power? 812 00:38:54,560 --> 00:38:57,200 Speaker 1: But here's the thing. Fifty percent of Apple's revenues come 813 00:38:57,239 --> 00:39:01,680 Speaker 1: from selling the iPhone. And I can't imagine why they 814 00:39:01,719 --> 00:39:03,120 Speaker 1: need to be a market leader in AI. Why 815 00:39:03,120 --> 00:39:04,840 Speaker 1: can't they license other people's AI products? 816 00:39:04,920 --> 00:39:07,239 Speaker 5: Yeah, I mean, Apple doesn't really like doing that. They 817 00:39:07,560 --> 00:39:10,360 Speaker 5: like to do everything on their own. They now build 818 00:39:10,360 --> 00:39:13,160 Speaker 5: their own computer chips, they build their own software. 819 00:39:13,160 --> 00:39:15,560 Speaker 4: And the whole point of Apple was 820 00:39:15,520 --> 00:39:18,040 Speaker 5: that it wasn't a PC, right? They had their own 821 00:39:18,160 --> 00:39:21,520 Speaker 5: operating system.
And so that is a huge part of 822 00:39:21,640 --> 00:39:25,520 Speaker 5: Apple's value, where they say, come into our walled garden. 823 00:39:25,600 --> 00:39:27,719 Speaker 5: You're going to pay a lot of money, but you're 824 00:39:27,800 --> 00:39:30,200 Speaker 5: going to be more secure, it's going to be cool, 825 00:39:30,400 --> 00:39:33,040 Speaker 5: it's going to be more intuitive. We're doing things our way, 826 00:39:33,120 --> 00:39:36,280 Speaker 5: and so that is their whole pitch to the consumer. 827 00:39:36,600 --> 00:39:38,879 Speaker 5: And if they suddenly have to start saying, come into 828 00:39:38,880 --> 00:39:42,200 Speaker 5: our walled garden and use Google's AI, come into our 829 00:39:42,200 --> 00:39:43,520 Speaker 5: walled garden and use OpenAI's. 830 00:39:43,760 --> 00:39:45,640 Speaker 3: Which I do now, by the way. Yeah, I mean, 831 00:39:45,680 --> 00:39:48,319 Speaker 3: I use my Apple products to use ChatGPT. But it's 832 00:39:48,360 --> 00:39:50,960 Speaker 3: also like, can you get everything from one guy? I 833 00:39:50,960 --> 00:39:51,319 Speaker 3: don't know. 834 00:39:51,600 --> 00:39:53,479 Speaker 5: Yeah. And I mean, this is another big question, because 835 00:39:53,480 --> 00:39:55,680 Speaker 5: that's been the struggle over the last ten years between 836 00:39:55,680 --> 00:39:58,040 Speaker 5: these giant tech companies to sort of box each other 837 00:39:58,080 --> 00:40:02,160 Speaker 5: out and try to corral people within their own ecosystems. 838 00:40:02,600 --> 00:40:05,880 Speaker 5: And maybe AI is the technology that kind of blows 839 00:40:05,920 --> 00:40:06,680 Speaker 5: that up in a way. 840 00:40:07,800 --> 00:40:09,120 Speaker 1: Garrett, what a great place to end. 841 00:40:09,280 --> 00:40:11,160 Speaker 4: Thank you so much. Thank you so much, of course, 842 00:40:11,160 --> 00:40:11,880 Speaker 4: it is my pleasure. 843 00:40:17,320 --> 00:40:20,400 Speaker 2: That's it for this week for Tech Stuff. I'm Oz Valashan. 844 00:40:20,000 --> 00:40:23,360 Speaker 3: And I'm Kara Price. This episode was produced by Eliza Dennis, 845 00:40:23,440 --> 00:40:27,440 Speaker 3: Victoria Dominguez, and Adriana Tapia. It was executive produced by 846 00:40:27,480 --> 00:40:31,440 Speaker 3: me, Oz Valashan, and Kate Osbourne for Kaleidoscope, and Katrina 847 00:40:31,480 --> 00:40:35,160 Speaker 3: Norvel for iHeart Podcasts. Jess Crinchich is the engineer and 848 00:40:35,280 --> 00:40:38,759 Speaker 3: Jack Insley mixed the episode. Kyle Murdoch wrote our theme song. 849 00:40:39,239 --> 00:40:42,600 Speaker 1: Join us next Wednesday for Tech Stuff: The Story, when 850 00:40:42,640 --> 00:40:46,360 Speaker 1: we'll share an in-depth conversation with Reid Hoffman, legendary 851 00:40:46,360 --> 00:40:50,040 Speaker 1: founder of LinkedIn, venture capitalist at Greylock, and author 852 00:40:50,080 --> 00:40:53,200 Speaker 1: of the new book Superagency. What could possibly go 853 00:40:53,320 --> 00:40:54,800 Speaker 1: right with our AI future? 854 00:40:55,120 --> 00:40:57,600 Speaker 3: Please rate, review, and reach out to us at tech 855 00:40:57,600 --> 00:41:00,719 Speaker 3: Stuff podcast at gmail dot com. We want to hear 856 00:41:00,719 --> 00:41:01,640 Speaker 3: from you.