1 00:00:12,280 --> 00:00:15,200 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and 2 00:00:15,280 --> 00:00:19,080 Speaker 1: Kaleidoscope. I'm Oz Woloshyn, and Karah's out again this week. So 3 00:00:19,120 --> 00:00:23,200 Speaker 1: today I'll bring you the headlines, including what Google Search 4 00:00:23,480 --> 00:00:27,360 Speaker 1: sounds like and why your out-of-stock cereal may 5 00:00:27,360 --> 00:00:30,440 Speaker 1: be the result of a cyber attack. On today's Tech 6 00:00:30,480 --> 00:00:36,320 Speaker 1: Support segment, Semafor's Reed Albergotti casts AI companies in The 7 00:00:36,360 --> 00:00:37,159 Speaker 1: Wizard of Oz. 8 00:00:37,840 --> 00:00:40,720 Speaker 2: The funniest part of this whole thing, honestly, was reaching 9 00:00:40,760 --> 00:00:43,640 Speaker 2: out to all of these companies and explaining to them 10 00:00:43,680 --> 00:00:47,400 Speaker 2: why they are the Munchkins or the Wicked Witch of 11 00:00:47,440 --> 00:00:48,600 Speaker 2: the East or the West. 12 00:00:49,920 --> 00:00:53,400 Speaker 1: All of that in the Week in Tech. It's Friday, June twentieth. 13 00:00:55,760 --> 00:01:01,160 Speaker 1: In summary, today's headlines are about summaries, perhaps the most 14 00:01:01,240 --> 00:01:04,840 Speaker 1: ubiquitous artifact of the age of AI, and today we 15 00:01:04,880 --> 00:01:08,040 Speaker 1: have two stories about AI summaries, one about a new 16 00:01:08,120 --> 00:01:11,680 Speaker 1: AI feature on Google Search and the other about Wikipedia's 17 00:01:11,760 --> 00:01:15,720 Speaker 1: turn away from AI. We'll start with Google. As you 18 00:01:15,800 --> 00:01:19,160 Speaker 1: might know, Google has this product called Notebook LM, which 19 00:01:19,200 --> 00:01:24,319 Speaker 1: allows users to feed in their own research sources, PDFs, websites, spreadsheets, 20 00:01:24,360 --> 00:01:27,560 Speaker 1: whatever it may be, and Notebook LM will analyze the 21 00:01:27,600 --> 00:01:30,160 Speaker 1: data in all kinds of different ways. One of these 22 00:01:30,160 --> 00:01:33,840 Speaker 1: ways is a quote audio overview, where users can generate 23 00:01:33,959 --> 00:01:37,120 Speaker 1: this two-hander, deep-dive podcast where a male voice 24 00:01:37,120 --> 00:01:40,080 Speaker 1: and a female voice talk each other through whatever the 25 00:01:40,080 --> 00:01:43,759 Speaker 1: topic at hand is. Now, Google is bringing this feature 26 00:01:44,000 --> 00:01:47,600 Speaker 1: to its search results. According to Ars Technica, the feature 27 00:01:47,640 --> 00:01:50,440 Speaker 1: hasn't been fully rolled out yet. Users have to opt 28 00:01:50,440 --> 00:01:53,320 Speaker 1: in by going to labs dot Google dot com slash 29 00:01:53,360 --> 00:01:57,040 Speaker 1: search and turning on audio overviews. There seems to be 30 00:01:57,120 --> 00:01:59,280 Speaker 1: a wait list, but one of our producers was able 31 00:01:59,320 --> 00:02:02,640 Speaker 1: to turn on the feature before news spread too far, and 32 00:02:02,880 --> 00:02:06,840 Speaker 1: here's what she found. In the search results, underneath the 33 00:02:06,880 --> 00:02:10,720 Speaker 1: section that says quote, people also ask, there's an oval 34 00:02:10,760 --> 00:02:15,840 Speaker 1: button that says generate audio overview. When you click it, 35 00:02:15,840 --> 00:02:19,079 Speaker 1: it starts working, because even AI podcasts take a little 36 00:02:19,120 --> 00:02:22,040 Speaker 1: bit of time to cook.
So while you wait, Google 37 00:02:22,080 --> 00:02:24,679 Speaker 1: tells you exactly what it's doing, which is quote, understanding 38 00:02:24,680 --> 00:02:29,840 Speaker 1: the query intent, researching websites, generating transcript. And then, in 39 00:02:29,919 --> 00:02:33,880 Speaker 1: less than a minute, an audio player appears. So our 40 00:02:33,919 --> 00:02:37,840 Speaker 1: producer searched in Google, what is two plus two? The 41 00:02:37,880 --> 00:02:41,760 Speaker 1: resulting audio overview was two and a half minutes long, 42 00:02:42,200 --> 00:02:46,600 Speaker 1: which already tickled her spidey senses. But before we listened 43 00:02:46,639 --> 00:02:49,120 Speaker 1: to the audio, we checked out all the websites listed 44 00:02:49,200 --> 00:02:53,079 Speaker 1: underneath the clip as sources. They were a Reddit thread 45 00:02:53,280 --> 00:02:56,320 Speaker 1: asking why some people think two plus two equals five, 46 00:02:57,040 --> 00:03:00,639 Speaker 1: a link to study dot com, a Wikipedia page for 47 00:03:00,919 --> 00:03:03,400 Speaker 1: two plus two equals five, as well as a link 48 00:03:03,440 --> 00:03:06,760 Speaker 1: to a TikTok meme around the question of two plus two, 49 00:03:07,320 --> 00:03:11,160 Speaker 1: which I couldn't really understand, let alone explain. And underneath 50 00:03:11,200 --> 00:03:17,000 Speaker 1: all these links is another message, quote, generative AI is experimental. 51 00:03:17,880 --> 00:03:20,280 Speaker 1: So here's the beginning of that two and a half 52 00:03:20,360 --> 00:03:23,639 Speaker 1: minute audio overview for what's two plus two? 53 00:03:24,120 --> 00:03:24,240 Speaker 2: So... 54 00:03:24,280 --> 00:03:26,760 Speaker 3: What's two plus two? Seems like a no-brainer, right? Four? 55 00:03:27,000 --> 00:03:27,600 Speaker 3: End of story. 56 00:03:27,720 --> 00:03:29,400 Speaker 4: Well, that's what we learn in grade school at least. 57 00:03:29,639 --> 00:03:32,239 Speaker 4: But the idea that two plus two could equal something 58 00:03:32,240 --> 00:03:35,200 Speaker 4: other than four pops up in some pretty interesting places. 59 00:03:35,360 --> 00:03:36,280 Speaker 4: Like where? 60 00:03:36,240 --> 00:03:38,440 Speaker 3: I'm picturing some kind of crazy math equation. 61 00:03:38,600 --> 00:03:41,480 Speaker 4: Now it's less about complex math and more about, shall 62 00:03:41,520 --> 00:03:43,240 Speaker 4: we say, bending reality a little. 63 00:03:43,640 --> 00:03:46,440 Speaker 1: The podcast then launched into a discussion of instances 64 00:03:46,480 --> 00:03:49,000 Speaker 1: where some people might say two plus two equals five, 65 00:03:49,600 --> 00:03:53,120 Speaker 1: like in Orwell's dystopian novel nineteen eighty four, in 66 00:03:53,120 --> 00:03:56,720 Speaker 1: which he ponders a totalitarian government that might insist two 67 00:03:56,760 --> 00:03:59,880 Speaker 1: plus two equals five. It also includes a joke about 68 00:03:59,920 --> 00:04:03,160 Speaker 1: a mathematician, an engineer, and a lawyer answering the two plus 69 00:04:03,200 --> 00:04:06,880 Speaker 1: two question; two plus two will always be four. After that, the 70 00:04:06,920 --> 00:04:11,320 Speaker 1: AI podcast hosts had this to say in conclusion. 71 00:04:11,240 --> 00:04:12,960 Speaker 3: It's kind of mind-bending when you think about it. This 72 00:04:13,160 --> 00:04:16,160 Speaker 3: simple little equation can represent so much more than just 73 00:04:16,360 --> 00:04:17,080 Speaker 3: basic addition.
74 00:04:17,279 --> 00:04:20,000 Speaker 4: It really does. It's a reminder to question things, to 75 00:04:20,040 --> 00:04:23,480 Speaker 4: consider different viewpoints, and to be aware of how easily 76 00:04:23,720 --> 00:04:26,279 Speaker 4: facts can be twisted or manipulated. 77 00:04:26,640 --> 00:04:29,360 Speaker 1: My personal view is that the Notebook LM product can 78 00:04:29,400 --> 00:04:31,400 Speaker 1: be pretty great. If you want to bring a bunch 79 00:04:31,400 --> 00:04:33,560 Speaker 1: of your own narrowly bounded data or research 80 00:04:33,560 --> 00:04:35,560 Speaker 1: into one place and then hear it played back to 81 00:04:35,600 --> 00:04:38,960 Speaker 1: you in podcast form, more power to you. The last 82 00:04:38,960 --> 00:04:41,840 Speaker 1: time I used it myself was over the holidays, when 83 00:04:41,920 --> 00:04:44,560 Speaker 1: I was going to go and see the Nutcracker ballet, 84 00:04:45,240 --> 00:04:47,719 Speaker 1: and I kind of wanted to know what to expect 85 00:04:47,760 --> 00:04:49,720 Speaker 1: so that I'd be able to enjoy it more. And 86 00:04:49,839 --> 00:04:53,080 Speaker 1: I fed Notebook LM the Wikipedia page about The Nutcracker, 87 00:04:53,520 --> 00:04:57,240 Speaker 1: some reviews, some other information, and it summarized all of 88 00:04:57,279 --> 00:04:59,640 Speaker 1: that and turned it into a ten minute podcast, which 89 00:04:59,640 --> 00:05:01,920 Speaker 1: I listened to on the way to the ballet. And 90 00:05:01,960 --> 00:05:05,080 Speaker 1: I honestly think that gave me a better enjoyment of 91 00:05:05,080 --> 00:05:07,680 Speaker 1: the ballet, because I knew the context, but I wouldn't 92 00:05:07,680 --> 00:05:09,960 Speaker 1: have actually done the work myself of reading all the materials. 93 00:05:10,000 --> 00:05:13,160 Speaker 1: So that was kind of a cool use case. That said, 94 00:05:13,440 --> 00:05:15,719 Speaker 1: I don't think when I search for something like what's 95 00:05:15,760 --> 00:05:20,280 Speaker 1: two plus two, I need an audio conversational summary of 96 00:05:20,640 --> 00:05:23,560 Speaker 1: the randomness of the Internet. It just doesn't really seem 97 00:05:24,000 --> 00:05:26,960 Speaker 1: very relevant, and it seems more than anything like a 98 00:05:27,120 --> 00:05:30,560 Speaker 1: solution in search of a problem. Which brings us to 99 00:05:30,600 --> 00:05:33,920 Speaker 1: our next story. There's another Internet titan that tried out 100 00:05:33,920 --> 00:05:39,040 Speaker 1: an AI feature recently: Wikipedia. On June second, the organization 101 00:05:39,120 --> 00:05:44,680 Speaker 1: that hosts and maintains Wikipedia, called the Wikimedia Foundation, announced 102 00:05:44,720 --> 00:05:48,120 Speaker 1: it would run an experiment with AI. The project, called 103 00:05:48,240 --> 00:05:52,240 Speaker 1: Simple Article Summaries, would provide AI writeups at the top 104 00:05:52,400 --> 00:05:55,400 Speaker 1: of Wikipedia entries. It was an attempt to make the 105 00:05:55,440 --> 00:05:59,279 Speaker 1: denser Wikipedia articles more accessible, and the experiment was set 106 00:05:59,279 --> 00:06:02,360 Speaker 1: to last two weeks, with summaries generated by an LLM 107 00:06:02,400 --> 00:06:08,320 Speaker 1: called Aya, by Cohere. Then came the criticism.
According to 108 00:06:08,480 --> 00:06:11,240 Speaker 1: 404 Media, the announcement was met with replies from 109 00:06:11,240 --> 00:06:14,520 Speaker 1: Wikipedia editors who warned that AI summaries were a very 110 00:06:14,560 --> 00:06:18,040 Speaker 1: bad idea. One editor pointed out that AI's ability to 111 00:06:18,080 --> 00:06:21,919 Speaker 1: get things wrong could damage the platform's reputation, which is 112 00:06:21,960 --> 00:06:24,520 Speaker 1: something that editors and the organization have worked very hard 113 00:06:24,560 --> 00:06:27,560 Speaker 1: on for a very long time. Another editor said quote, 114 00:06:28,000 --> 00:06:31,159 Speaker 1: just because Google has rolled out its AI summaries doesn't 115 00:06:31,160 --> 00:06:33,760 Speaker 1: mean we need to one-up them. Some even threatened 116 00:06:33,760 --> 00:06:37,160 Speaker 1: to quit if Wikipedia didn't roll back the summaries. And 117 00:06:37,600 --> 00:06:41,200 Speaker 1: here's the thing. Wikipedia relies on its editors' passion for 118 00:06:41,240 --> 00:06:46,360 Speaker 1: the website to keep this participatory encyclopedia running. So after 119 00:06:46,360 --> 00:06:48,920 Speaker 1: the editor uproar, and just a day after the announcement 120 00:06:48,960 --> 00:06:52,880 Speaker 1: of the experiment, Wikimedia said they would pause it, but 121 00:06:52,960 --> 00:06:56,080 Speaker 1: that they were also still interested in AI summaries. It's 122 00:06:56,080 --> 00:06:58,640 Speaker 1: not clear when they might try them out again, but 123 00:06:58,680 --> 00:07:02,480 Speaker 1: it is clear how human editors feel about them. I'm 124 00:07:02,520 --> 00:07:05,839 Speaker 1: not somebody who's implacably opposed to AI summaries. In fact, 125 00:07:06,040 --> 00:07:08,600 Speaker 1: I use them myself. But what's really important to 126 00:07:08,600 --> 00:07:11,800 Speaker 1: remember is that AI summaries rely on the input of 127 00:07:11,920 --> 00:07:15,040 Speaker 1: human work and ingenuity. And with that in mind, I 128 00:07:15,080 --> 00:07:17,200 Speaker 1: read this article in the Financial Times that I wanted 129 00:07:17,200 --> 00:07:20,840 Speaker 1: to share. It had the headline AI alone cannot solve 130 00:07:20,880 --> 00:07:23,960 Speaker 1: the productivity problem, and it talks about something I had 131 00:07:23,960 --> 00:07:28,000 Speaker 1: no idea about, which is that the average scientist today produces 132 00:07:28,280 --> 00:07:32,120 Speaker 1: fewer breakthrough ideas per dollar spent than their counterpart in 133 00:07:32,120 --> 00:07:35,440 Speaker 1: the sixties, and that despite all the headlines about progress, 134 00:07:35,760 --> 00:07:39,960 Speaker 1: progress is actually becoming more incremental. And here's the part 135 00:07:39,960 --> 00:07:43,320 Speaker 1: of the article that really got me thinking. Quote, had 136 00:07:43,320 --> 00:07:47,040 Speaker 1: the nineteenth century focused solely on better looms and plows, 137 00:07:47,400 --> 00:07:50,920 Speaker 1: we would enjoy cheap cloth and abundant grain, but there 138 00:07:50,960 --> 00:07:55,440 Speaker 1: would be no antibiotics, jet engines, or rockets. Economic miracles 139 00:07:55,480 --> 00:08:00,200 Speaker 1: stem from discovery, not repeating tasks at greater speed. The 140 00:08:00,280 --> 00:08:02,360 Speaker 1: article went on to make the point that large language 141 00:08:02,400 --> 00:08:09,280 Speaker 1: models gravitate towards statistical consensus. So think about it.
If 142 00:08:09,320 --> 00:08:12,240 Speaker 1: there were LLMs back before the Wright Brothers, all the 143 00:08:12,320 --> 00:08:14,880 Speaker 1: data that the model was trained on would have suggested 144 00:08:14,880 --> 00:08:19,480 Speaker 1: that human flight was absolutely impossible. Wikipedia is not going 145 00:08:19,520 --> 00:08:23,160 Speaker 1: to solve the larger problem of fostering human ingenuity in 146 00:08:23,240 --> 00:08:26,960 Speaker 1: an age of statistical consensus. But at least for now, 147 00:08:27,160 --> 00:08:29,160 Speaker 1: you can rest assured that the articles you read on 148 00:08:29,200 --> 00:08:40,840 Speaker 1: it will be written and driven and created by humans. 149 00:08:41,880 --> 00:08:44,880 Speaker 1: I've got a couple more headlines for you, including why 150 00:08:44,920 --> 00:08:47,520 Speaker 1: you might be struggling to get your favorite box of cereal. 151 00:08:48,559 --> 00:08:51,760 Speaker 1: TechCrunch reports that some grocery stores are reeling from 152 00:08:51,760 --> 00:08:56,760 Speaker 1: the effects of a cyber attack. United Natural Foods, or UNFI, 153 00:08:57,400 --> 00:09:00,800 Speaker 1: is a large food distribution company that provides fresh 154 00:09:00,840 --> 00:09:04,760 Speaker 1: produce and products to over thirty thousand stores across the US. 155 00:09:05,200 --> 00:09:09,320 Speaker 1: Whole Foods uses the company as its primary distributor. Earlier 156 00:09:09,320 --> 00:09:13,040 Speaker 1: this month, UNFI experienced a cyber attack that caused them 157 00:09:13,080 --> 00:09:16,679 Speaker 1: to shut down their electronic ordering systems. The system is 158 00:09:16,720 --> 00:09:19,920 Speaker 1: now back online after more than ten days, but grocery 159 00:09:19,960 --> 00:09:22,840 Speaker 1: stores are still waiting on some of their stock. And 160 00:09:22,880 --> 00:09:25,800 Speaker 1: this isn't just an issue in the US. Cyber attacks 161 00:09:25,800 --> 00:09:30,360 Speaker 1: are also roiling UK supermarkets. The thing is, grocery chains 162 00:09:30,440 --> 00:09:33,200 Speaker 1: are a prime target for hackers because they have these large, 163 00:09:33,480 --> 00:09:37,920 Speaker 1: interconnected supply chains, which by definition have multiple failure points. 164 00:09:38,360 --> 00:09:41,880 Speaker 1: They also have treasure troves of customer data. And of course, 165 00:09:42,240 --> 00:09:44,360 Speaker 1: when shoppers can't get what they want at the store, 166 00:09:44,880 --> 00:09:49,559 Speaker 1: angst and headlines are sure to follow. Finally, for today's headlines, 167 00:09:50,160 --> 00:09:53,880 Speaker 1: ads are a huge business for social media platforms, and 168 00:09:54,000 --> 00:09:56,520 Speaker 1: TikTok is aiming to lower the cost of production for 169 00:09:56,559 --> 00:10:00,400 Speaker 1: its partners. According to The Verge, TikTok is adding new 170 00:10:00,480 --> 00:10:04,280 Speaker 1: capabilities to give advertisers the ability to upload images and 171 00:10:04,320 --> 00:10:06,959 Speaker 1: give text prompts that can then be used to generate 172 00:10:07,160 --> 00:10:11,959 Speaker 1: video advertisements. The videos can include virtual avatars holding or 173 00:10:12,040 --> 00:10:15,920 Speaker 1: modeling different products, just like an influencer might for brands 174 00:10:15,920 --> 00:10:18,600 Speaker 1: that they partner with.
TikTok has said that all this 175 00:10:18,760 --> 00:10:22,440 Speaker 1: content generated using AI will be labeled as such and 176 00:10:22,520 --> 00:10:26,040 Speaker 1: will go through quote multiple rounds of safety review. The 177 00:10:26,080 --> 00:10:29,920 Speaker 1: bigger story is that influencer marketing transformed the ad industry 178 00:10:29,960 --> 00:10:32,400 Speaker 1: over the last decade, and it will be interesting to 179 00:10:32,440 --> 00:10:36,640 Speaker 1: see what AI influencers can actually do, because, unlike, say, 180 00:10:36,679 --> 00:10:39,600 Speaker 1: fashion models, whose job it is to show what products 181 00:10:39,640 --> 00:10:43,160 Speaker 1: will look like when they're worn, influencers have thrived when 182 00:10:43,200 --> 00:10:45,079 Speaker 1: they make a case for a product based on their 183 00:10:45,120 --> 00:10:54,959 Speaker 1: personal experience. After the break, we bring you a production 184 00:10:55,000 --> 00:10:58,440 Speaker 1: of The Wizard of Oz starring your favorite tech giants. 185 00:10:58,880 --> 00:11:12,080 Speaker 1: Stay with us. On this show, we've talked about tech 186 00:11:12,160 --> 00:11:17,040 Speaker 1: upstarts like OpenAI and its model ChatGPT, and Anthropic and 187 00:11:17,120 --> 00:11:20,920 Speaker 1: its model Claude, at length. And we've also talked about 188 00:11:20,920 --> 00:11:25,480 Speaker 1: the numerous tech giants racing to gain or defend market 189 00:11:25,480 --> 00:11:29,520 Speaker 1: share with AI products of their own, including Meta, Google, Apple, Amazon, 190 00:11:29,600 --> 00:11:33,439 Speaker 1: and Microsoft. It's a crowded field, but also an open 191 00:11:33,480 --> 00:11:36,640 Speaker 1: one in terms of knowing who will ultimately prevail, and 192 00:11:36,720 --> 00:11:40,719 Speaker 1: it can get quite dramatic. There are acquisitions, partnerships that get 193 00:11:40,760 --> 00:11:44,880 Speaker 1: contentious, legal fights, and heads rolling at big tech companies 194 00:11:44,920 --> 00:11:47,559 Speaker 1: as they fight to stay in the game. And recently 195 00:11:47,679 --> 00:11:51,040 Speaker 1: there have been some big AI power moves. Meta has 196 00:11:51,080 --> 00:11:55,000 Speaker 1: just made its largest investment since acquiring WhatsApp, putting fourteen 197 00:11:55,040 --> 00:11:58,640 Speaker 1: billion dollars into a company called Scale AI. So we 198 00:11:58,640 --> 00:12:01,200 Speaker 1: thought it'd be helpful to outline the many players, their 199 00:12:01,240 --> 00:12:04,320 Speaker 1: current positions, and what might be on the horizon. And 200 00:12:04,400 --> 00:12:06,760 Speaker 1: here to walk us all through it is Reed Albergotti, 201 00:12:06,800 --> 00:12:09,960 Speaker 1: who's the tech editor at Semafor. Reed, welcome to Tech Stuff. 202 00:12:09,800 --> 00:12:11,160 Speaker 2: Thanks for having me. Good to be here. 203 00:12:11,400 --> 00:12:15,199 Speaker 1: You came up with a very creative paradigm for helping 204 00:12:15,720 --> 00:12:19,080 Speaker 1: your readers make sense of all the different players in 205 00:12:19,200 --> 00:12:21,920 Speaker 1: the AI race. Tell us a little bit about it.
206 00:12:22,760 --> 00:12:24,960 Speaker 2: Well, it all started because, like you said, I mean, 207 00:12:25,000 --> 00:12:27,880 Speaker 2: there's this huge race, and I was sort of 208 00:12:27,920 --> 00:12:31,760 Speaker 2: going back and forth with my colleague Rachel Jones, talking 209 00:12:31,760 --> 00:12:34,320 Speaker 2: about, okay, so, like, what are the lanes that these 210 00:12:34,360 --> 00:12:36,640 Speaker 2: companies are starting to take, and how do we sort 211 00:12:36,679 --> 00:12:39,160 Speaker 2: of define that? And I'm sort of writing this thing, 212 00:12:39,200 --> 00:12:41,319 Speaker 2: and it's kind of, it's a little dry, it's a 213 00:12:41,360 --> 00:12:44,520 Speaker 2: little heavy on the analysis, and Rachel's like, why don't 214 00:12:44,559 --> 00:12:47,280 Speaker 2: we just make them Wizard of Oz characters? And I 215 00:12:47,440 --> 00:12:53,120 Speaker 2: was like, that's a terrible idea. But then I thought 216 00:12:53,120 --> 00:12:54,720 Speaker 2: more about it, and I thought, that's actually kind of 217 00:12:54,720 --> 00:12:58,520 Speaker 2: a great idea. So, you know, we decided to try 218 00:12:58,520 --> 00:13:01,960 Speaker 2: to fit, you know, each of the biggest players that 219 00:13:02,000 --> 00:13:04,880 Speaker 2: we see in the field into these characters in The 220 00:13:04,880 --> 00:13:08,480 Speaker 2: Wizard of Oz and sort of explain why. And I think, look, 221 00:13:08,520 --> 00:13:11,000 Speaker 2: I mean, the funniest part of this whole thing, honestly, 222 00:13:11,280 --> 00:13:13,839 Speaker 2: was reaching out to all of these companies and 223 00:13:13,960 --> 00:13:17,000 Speaker 2: explaining to them why they are the Munchkins or the 224 00:13:17,040 --> 00:13:21,720 Speaker 2: Wicked Witch of the East or the West, and, you know, 225 00:13:21,760 --> 00:13:24,679 Speaker 2: getting their responses. And I thought, you know, actually, 226 00:13:25,480 --> 00:13:27,480 Speaker 2: I kind of like this, because the best part was 227 00:13:27,520 --> 00:13:29,720 Speaker 2: when I got pushback from these companies, right, like, 228 00:13:29,800 --> 00:13:32,559 Speaker 2: we are not this, we are this character. 229 00:13:32,840 --> 00:13:34,840 Speaker 2: And I'm like, oh, actually, I'm getting an insight now into 230 00:13:34,880 --> 00:13:37,480 Speaker 2: how you see yourself as a company. And so 231 00:13:38,040 --> 00:13:39,439 Speaker 2: what I kind of hoped was, like, you know, that 232 00:13:39,480 --> 00:13:42,079 Speaker 2: people would write in and say, 233 00:13:42,080 --> 00:13:45,160 Speaker 2: you're a complete idiot, like, OpenAI is not Dorothy, 234 00:13:45,240 --> 00:13:47,360 Speaker 2: they're, you know, the Tin Man or something like that. 235 00:13:47,880 --> 00:13:50,640 Speaker 2: Just to get people talking about what are the lanes here 236 00:13:50,679 --> 00:13:53,440 Speaker 2: and how should we think about these players. 237 00:13:53,840 --> 00:13:56,200 Speaker 1: But that said, you did cast OpenAI as Dorothy. 238 00:13:56,600 --> 00:13:57,199 Speaker 1: Why was that? 239 00:13:57,800 --> 00:14:01,240 Speaker 2: Well, you know, Dorothy is kind of the main character, right? 240 00:14:01,320 --> 00:14:04,720 Speaker 2: They are the Kleenex of chatbots, so to speak. 241 00:14:04,920 --> 00:14:08,200 Speaker 2: They're the winner-take-all sort of consumer chatbot.
242 00:14:08,640 --> 00:14:10,880 Speaker 2: They get all the press, but, you know, at the 243 00:14:10,920 --> 00:14:14,280 Speaker 2: same time, they require a lot of other, you know, 244 00:14:14,360 --> 00:14:16,800 Speaker 2: players in this space to make it possible, right? So, 245 00:14:17,520 --> 00:14:21,520 Speaker 2: you know, they've borrowed technology from Google, right? You know, 246 00:14:21,560 --> 00:14:25,200 Speaker 2: they're using the data centers at Microsoft, which were such 247 00:14:25,240 --> 00:14:27,800 Speaker 2: a huge part of that. So it's this supporting cast 248 00:14:27,880 --> 00:14:31,320 Speaker 2: that kind of really makes them who they are, I think. 249 00:14:31,840 --> 00:14:34,080 Speaker 2: So that was the justification there. Do you 250 00:14:34,120 --> 00:14:36,080 Speaker 2: agree, or do you want to, do you want to 251 00:14:36,080 --> 00:14:38,560 Speaker 2: fight me on that? 252 00:14:38,680 --> 00:14:42,520 Speaker 1: Well, I think, well, you've thought about this more than 253 00:14:42,520 --> 00:14:44,720 Speaker 1: I have. But I think I agree with you that, 254 00:14:45,560 --> 00:14:47,760 Speaker 1: despite my name, I agree with you 255 00:14:47,840 --> 00:14:50,200 Speaker 1: that OpenAI definitely has the main character 256 00:14:50,320 --> 00:14:53,360 Speaker 1: energy in this story. But I have to ask you 257 00:14:53,400 --> 00:14:56,760 Speaker 1: about her cast of supporting characters: the Tin Man, the 258 00:14:56,840 --> 00:14:59,320 Speaker 1: Cowardly Lion, and the Scarecrow. How did you assign each 259 00:14:59,320 --> 00:15:00,720 Speaker 1: of these crucial parts? 260 00:15:01,880 --> 00:15:05,720 Speaker 2: Well, okay, so let's start with, let's start with Google, 261 00:15:05,920 --> 00:15:09,720 Speaker 2: the Cowardly Lion. Google has, you know, all this power, right? 262 00:15:09,760 --> 00:15:14,040 Speaker 2: They've developed this technology for years. You know, the Attention 263 00:15:14,240 --> 00:15:16,360 Speaker 2: Is All You Need paper, right? Like, that's what 264 00:15:16,520 --> 00:15:20,480 Speaker 2: really laid the groundwork for ChatGPT. But they kind of 265 00:15:20,520 --> 00:15:22,440 Speaker 2: sat on it. And I think a lot of that 266 00:15:22,600 --> 00:15:25,120 Speaker 2: was out of fear. You know, these chatbots go off 267 00:15:25,120 --> 00:15:27,200 Speaker 2: the rails, they're crazy. Like, what are people going to 268 00:15:27,280 --> 00:15:30,240 Speaker 2: say if we just start releasing this stuff into the wild? 269 00:15:30,840 --> 00:15:34,680 Speaker 2: And so they didn't, and they got beat, right? ChatGPT 270 00:15:34,880 --> 00:15:38,880 Speaker 2: came out. Google has had to completely reorient itself around 271 00:15:38,920 --> 00:15:39,840 Speaker 2: this race now. 272 00:15:39,960 --> 00:15:42,480 Speaker 1: So what you're saying is Google could have had 273 00:15:42,520 --> 00:15:46,440 Speaker 1: market leadership here, and because of fears about cultural blowback 274 00:15:46,560 --> 00:15:49,160 Speaker 1: or erosion of their core business, they basically sat on 275 00:15:49,200 --> 00:15:50,400 Speaker 1: the technology. 276 00:15:49,920 --> 00:15:53,200 Speaker 2: Yeah, that's very well said. They're a big public company.
277 00:15:53,280 --> 00:15:55,680 Speaker 2: They have to be sort of cautious, and, you know, 278 00:15:55,720 --> 00:15:58,080 Speaker 2: I think that's somewhat, that has become part of 279 00:15:58,120 --> 00:16:00,240 Speaker 2: their DNA. But they've had to get really bold, right? 280 00:16:00,240 --> 00:16:04,080 Speaker 2: And they've started to release new products. Notebook LM 281 00:16:04,200 --> 00:16:07,479 Speaker 2: is one of the big successes. They're throwing more spaghetti 282 00:16:07,520 --> 00:16:10,400 Speaker 2: against the wall. They're finding that they actually do have courage, right? 283 00:16:10,440 --> 00:16:13,240 Speaker 2: Which, you know, of course, if you've watched The Wizard 284 00:16:13,280 --> 00:16:15,600 Speaker 2: of Oz, you know in the end they all kind 285 00:16:15,640 --> 00:16:18,400 Speaker 2: of have the thing they thought they were missing. So, 286 00:16:19,000 --> 00:16:23,920 Speaker 2: you know, the Tin Man is Microsoft, this well-oiled machine. 287 00:16:24,240 --> 00:16:26,880 Speaker 2: If you remember, the Tin Man was always putting on oil, 288 00:16:26,960 --> 00:16:31,680 Speaker 2: had an oil can. They seem a bit rigid, right? But 289 00:16:31,720 --> 00:16:33,920 Speaker 2: in the end, they kind of do have a heart. 290 00:16:33,960 --> 00:16:36,520 Speaker 2: I mean, I had an interview with 291 00:16:36,920 --> 00:16:40,400 Speaker 2: Satya Nadella recently, and Kevin Scott, the CTO, and 292 00:16:40,760 --> 00:16:43,120 Speaker 2: it's really clear they really do want to sort of, 293 00:16:43,120 --> 00:16:46,000 Speaker 2: like, enable these developers. I think they have sort of, 294 00:16:46,040 --> 00:16:49,120 Speaker 2: I think, less ego, a bit more of 295 00:16:49,160 --> 00:16:52,720 Speaker 2: a selfless kind of attitude around this stuff. They're building 296 00:16:52,720 --> 00:16:55,640 Speaker 2: the infrastructure for this industry and letting others kind of 297 00:16:55,920 --> 00:17:00,600 Speaker 2: use their creativity to benefit, so they really do have 298 00:17:00,640 --> 00:17:03,960 Speaker 2: a heart in the end. And then I think Amazon, 299 00:17:04,840 --> 00:17:09,520 Speaker 2: you know, is the Scarecrow, right? No brain. When 300 00:17:09,560 --> 00:17:11,639 Speaker 2: I think about the beginning of this race, you know, 301 00:17:11,720 --> 00:17:15,000 Speaker 2: much like Google, people were really putting Amazon down, and 302 00:17:15,080 --> 00:17:17,960 Speaker 2: they thought, oh my god, they've completely missed this, like, 303 00:17:18,040 --> 00:17:22,439 Speaker 2: Amazon's got nothing here, and totally ignored the fact that 304 00:17:22,560 --> 00:17:26,639 Speaker 2: actually they had been thinking about this for years. And 305 00:17:26,720 --> 00:17:31,280 Speaker 2: even though, you know, maybe they're sort of fighting for 306 00:17:31,400 --> 00:17:34,520 Speaker 2: market share in this new AI race, I think they're 307 00:17:34,680 --> 00:17:38,679 Speaker 2: pretty well positioned to profit from the growth and, you know, 308 00:17:38,800 --> 00:17:42,560 Speaker 2: just the market demand for tokens, for compute. 309 00:17:42,640 --> 00:17:45,320 Speaker 1: Are they the market leaders in data centers, Amazon, today? 310 00:17:45,560 --> 00:17:48,240 Speaker 2: Oh yeah, I mean, for sure in the, in the cloud, 311 00:17:48,359 --> 00:17:51,119 Speaker 2: in the cloud business.
And I think the question, 312 00:17:51,280 --> 00:17:54,800 Speaker 2: though, is, you know, can they remain the market leader 313 00:17:54,880 --> 00:17:57,920 Speaker 2: in the AI, you know, era, right? They've been the 314 00:17:57,960 --> 00:18:00,359 Speaker 2: market leader in cloud in terms of just, you know, 315 00:18:00,480 --> 00:18:05,040 Speaker 2: normal compute. AI data centers, it's a whole new ballgame. 316 00:18:05,320 --> 00:18:08,360 Speaker 2: AI runs on what we would call a GPU, right? 317 00:18:08,480 --> 00:18:13,040 Speaker 2: And so everyone is now, you know, racing to add 318 00:18:13,080 --> 00:18:16,120 Speaker 2: these GPUs or build new data centers that are built 319 00:18:16,119 --> 00:18:19,320 Speaker 2: around these GPUs, and it's a totally different business model 320 00:18:19,320 --> 00:18:23,399 Speaker 2: and concept, so it kind of restarts the race in 321 00:18:23,720 --> 00:18:24,240 Speaker 2: some ways. 322 00:18:24,800 --> 00:18:27,679 Speaker 1: Okay, so we've got OpenAI as Dorothy, Microsoft as 323 00:18:27,720 --> 00:18:31,760 Speaker 1: the Tin Man, Google as the Cowardly Lion, Amazon as the Scarecrow. 324 00:18:32,119 --> 00:18:35,399 Speaker 1: There are also some wizards and witches who need to 325 00:18:35,440 --> 00:18:38,720 Speaker 1: be accounted for: Glinda, the Wicked Witch of the West, 326 00:18:38,760 --> 00:18:41,280 Speaker 1: and of course my namesake, the Wizard of Oz. Who's 327 00:18:41,320 --> 00:18:43,400 Speaker 1: who in this cast? 328 00:18:43,440 --> 00:18:47,439 Speaker 2: So let's start with Perplexity being, you know, the Wicked Witch. 329 00:18:48,000 --> 00:18:51,439 Speaker 2: As we found in Wicked, you know, this witch 330 00:18:51,520 --> 00:18:54,040 Speaker 2: was kind of misunderstood, and I think that's sort of 331 00:18:54,080 --> 00:18:58,080 Speaker 2: how Perplexity became the absolute villain, especially in the media industry, 332 00:18:58,160 --> 00:19:03,120 Speaker 2: because they, you know, were, I guess, republishing almost 333 00:19:03,160 --> 00:19:07,720 Speaker 2: whole articles without proper citation. I think they very quickly 334 00:19:08,320 --> 00:19:10,199 Speaker 2: changed. I mean, it wasn't a policy, I think it 335 00:19:10,240 --> 00:19:12,159 Speaker 2: was a mistake, and they very quickly changed it. And 336 00:19:12,160 --> 00:19:14,880 Speaker 2: they actually have all these media deals where there are 337 00:19:14,920 --> 00:19:19,360 Speaker 2: revenue-sharing agreements between Perplexity and the media companies. And 338 00:19:20,119 --> 00:19:24,080 Speaker 2: I think that wicked image is starting to fade a 339 00:19:24,160 --> 00:19:27,480 Speaker 2: little bit from Perplexity. So I put them in the 340 00:19:27,560 --> 00:19:32,280 Speaker 2: role of the Wicked Witch, and Aravind Srinivas, the CEO of Perplexity, 341 00:19:32,359 --> 00:19:36,760 Speaker 2: responded to me on Twitter with an, I think, 342 00:19:37,200 --> 00:19:39,919 Speaker 2: AI-generated repurposing of the song from Wicked. 343 00:19:40,160 --> 00:19:43,840 Speaker 1: I've got to close my eyes and leap. It's 344 00:19:43,880 --> 00:19:46,480 Speaker 1: time to try defying Google. I think I'll try 345 00:19:46,520 --> 00:19:48,960 Speaker 1: defying Google, and you can't pull me down. 346 00:19:50,160 --> 00:19:55,480 Speaker 2: Right. So they're disrupting Google, therefore they're really the 347 00:19:55,600 --> 00:19:57,840 Speaker 2: good guys.
So, you know, that was, I thought that 348 00:19:57,920 --> 00:20:01,080 Speaker 2: was fun. That was a fun response from Aravind. I think, 349 00:20:01,280 --> 00:20:03,640 Speaker 2: you know, we put Apple in the role of Glinda, 350 00:20:03,720 --> 00:20:06,440 Speaker 2: the Good Witch, who, you know, sort of 351 00:20:06,560 --> 00:20:11,240 Speaker 2: overpromises and underdelivers. I think that's sort of, you know, 352 00:20:11,560 --> 00:20:13,520 Speaker 2: using a little bit of the old and the new 353 00:20:13,760 --> 00:20:18,960 Speaker 2: Wicked version of that character. Obviously, you know, that's 354 00:20:19,000 --> 00:20:21,440 Speaker 2: based on the fact that they, you know, they had 355 00:20:21,440 --> 00:20:25,359 Speaker 2: this huge event where they promised the world around, you know, AI. 356 00:20:25,480 --> 00:20:27,600 Speaker 2: They were going to do these amazing things with Siri 357 00:20:27,680 --> 00:20:29,800 Speaker 2: and your phone, and it turns out, oh, they hadn't 358 00:20:29,800 --> 00:20:32,000 Speaker 2: actually really built any of this, and it's not clear 359 00:20:32,040 --> 00:20:32,520 Speaker 2: that they can. 360 00:20:33,040 --> 00:20:35,199 Speaker 1: And what does your reporting suggest? Why has it been 361 00:20:35,240 --> 00:20:35,919 Speaker 1: such a bust? 362 00:20:36,400 --> 00:20:39,560 Speaker 2: I think that was a train wreck that anybody who's 363 00:20:39,560 --> 00:20:41,560 Speaker 2: been following Apple has kind of seen coming for a long 364 00:20:41,600 --> 00:20:45,360 Speaker 2: time, because they just have ignored AI. And, you know, 365 00:20:45,400 --> 00:20:48,359 Speaker 2: the Apple people are not really, like, mixing with 366 00:20:48,400 --> 00:20:50,679 Speaker 2: the rest of Silicon Valley, and I think they just 367 00:20:50,720 --> 00:20:54,880 Speaker 2: didn't quite understand this whole AI thing. And I've 368 00:20:54,920 --> 00:20:56,919 Speaker 2: also written that, you know, and this is a bit, 369 00:20:57,080 --> 00:20:59,159 Speaker 2: this is kind of my view, and I don't know, 370 00:20:59,240 --> 00:21:01,320 Speaker 2: I haven't heard that many people agree with me on this, 371 00:21:01,440 --> 00:21:04,840 Speaker 2: but I think they also, you know, because they position 372 00:21:05,000 --> 00:21:09,800 Speaker 2: themselves as this privacy-focused company, that they also 373 00:21:09,840 --> 00:21:13,399 Speaker 2: shied away from AI there, because it does require gathering 374 00:21:13,480 --> 00:21:17,560 Speaker 2: a lot of data, right, on people, and personalizing things 375 00:21:17,560 --> 00:21:19,800 Speaker 2: in ways that make people maybe feel like, oh, I'm 376 00:21:19,840 --> 00:21:23,200 Speaker 2: being tracked, or what have you. And it turns out 377 00:21:23,240 --> 00:21:27,000 Speaker 2: that that's, you know, that's now everything in the tech industry, 378 00:21:27,080 --> 00:21:28,919 Speaker 2: and they're way behind. 379 00:21:29,560 --> 00:21:32,359 Speaker 1: And then of course the Wizard, just by having a 380 00:21:32,400 --> 00:21:35,760 Speaker 1: movie named after him, is clearly a very important character. 381 00:21:36,600 --> 00:21:41,679 Speaker 2: Right, exactly. So I cast xAI in, in 382 00:21:41,680 --> 00:21:45,159 Speaker 2: the role of the Wizard.
You know, I think 383 00:21:45,200 --> 00:21:50,040 Speaker 2: xAI has this huge, you know, Colossus data center, right? 384 00:21:50,119 --> 00:21:52,760 Speaker 2: And they've promised that... what really kind 385 00:21:52,760 --> 00:21:54,760 Speaker 2: of made me think of them as that is the 386 00:21:54,840 --> 00:21:57,000 Speaker 2: fact that they're going to understand the secrets of the 387 00:21:57,080 --> 00:21:59,840 Speaker 2: universe with AI, that theirs is the... Is that literal? 388 00:22:01,160 --> 00:22:01,400 Speaker 4: Yeah? 389 00:22:01,400 --> 00:22:03,520 Speaker 2: I mean, this is, like, literal, you know. I mean, 390 00:22:03,560 --> 00:22:05,800 Speaker 2: I saw Elon Musk in a recent interview saying that 391 00:22:06,240 --> 00:22:10,600 Speaker 2: their model Grok is based on basic physics, and that's 392 00:22:10,640 --> 00:22:13,480 Speaker 2: where it gets its ground truth from. And so, you know, 393 00:22:13,560 --> 00:22:16,840 Speaker 2: their effort is to actually ultimately figure out, like, you know, 394 00:22:16,880 --> 00:22:21,320 Speaker 2: a unifying theory of the universe. You know, I think 395 00:22:21,359 --> 00:22:23,640 Speaker 2: that's a lot of it. It remains to be seen 396 00:22:23,680 --> 00:22:25,000 Speaker 2: what's behind the curtain there. 397 00:22:25,240 --> 00:22:27,800 Speaker 1: Two other companies we haven't spoken about yet are Meta 398 00:22:28,040 --> 00:22:29,720 Speaker 1: and Anthropic. Where do they fit in? 399 00:22:30,200 --> 00:22:34,840 Speaker 2: Yeah, well, so Anthropic I cast in the role 400 00:22:34,880 --> 00:22:38,160 Speaker 2: of the Munchkins. I felt a little bad doing that 401 00:22:38,240 --> 00:22:40,960 Speaker 2: because, you know, I think the Munchkins have this 402 00:22:41,040 --> 00:22:44,359 Speaker 2: negative connotation. But actually, I was reading about it when 403 00:22:44,720 --> 00:22:47,080 Speaker 2: researching this article, reading about The Wizard of Oz and 404 00:22:47,119 --> 00:22:50,640 Speaker 2: trying to understand the different characters on a more nuanced level. 405 00:22:50,680 --> 00:22:53,760 Speaker 2: And without the Munchkins, there is no Wizard of Oz, right? 406 00:22:53,800 --> 00:22:57,480 Speaker 2: They're the people. They're the ones who populate 407 00:22:57,520 --> 00:23:02,200 Speaker 2: this universe, right? And I think, I think what Anthropic 408 00:23:02,359 --> 00:23:03,800 Speaker 2: is doing, and I think a lot of people don't 409 00:23:03,840 --> 00:23:07,240 Speaker 2: really realize this, is they have become the de facto 410 00:23:07,960 --> 00:23:12,359 Speaker 2: coding model for AI, and that is really what's driving 411 00:23:12,400 --> 00:23:15,440 Speaker 2: all this, you know, the excitement in AI right now. 412 00:23:16,240 --> 00:23:19,200 Speaker 2: It's not so much ChatGPT, even though I think 413 00:23:19,240 --> 00:23:23,320 Speaker 2: that's what the broad population sort of thinks of. But 414 00:23:23,400 --> 00:23:25,920 Speaker 2: in the tech industry, what I think really excites people 415 00:23:26,080 --> 00:23:29,639 Speaker 2: is that this stuff, even if it never advances beyond 416 00:23:29,720 --> 00:23:33,280 Speaker 2: just being able to generate code with AI, that's going 417 00:23:33,280 --> 00:23:37,000 Speaker 2: to completely change the world, because, you know, code is 418 00:23:37,040 --> 00:23:40,800 Speaker 2: already so powerful and can automate so many things.
The 419 00:23:41,000 --> 00:23:43,919 Speaker 2: reason that the whole world isn't running on code is 420 00:23:43,960 --> 00:23:47,240 Speaker 2: that it's too expensive, and so Anthropic is kind of, 421 00:23:47,240 --> 00:23:50,080 Speaker 2: like, bringing that power to the people. 422 00:23:50,280 --> 00:23:52,920 Speaker 1: So when I read about vibe coding, though, I read 423 00:23:52,960 --> 00:23:55,879 Speaker 1: about Cursor AI and stuff. But you're saying that 424 00:23:56,560 --> 00:23:59,200 Speaker 1: Anthropic is a real big player in the world of 425 00:23:59,280 --> 00:23:59,840 Speaker 1: vibe coding. 426 00:24:00,280 --> 00:24:04,040 Speaker 2: Well, yeah, I mean, Cursor is running on Anthropic's code 427 00:24:04,119 --> 00:24:08,600 Speaker 2: generation model, and so is Replit, and, you know, a 428 00:24:08,640 --> 00:24:09,280 Speaker 2: lot of others. 429 00:24:09,359 --> 00:24:12,199 Speaker 1: I mean, it's Anthropic that's really underpinning this vibe coding 430 00:24:12,320 --> 00:24:16,560 Speaker 1: revolution, exactly. What about Meta? It's a big week for them. 431 00:24:16,640 --> 00:24:18,600 Speaker 1: I think you cast them in the movie 432 00:24:18,720 --> 00:24:21,720 Speaker 1: before their big news announcement. But who are they in 433 00:24:21,760 --> 00:24:23,320 Speaker 1: The Wizard of Oz? And did the news 434 00:24:23,359 --> 00:24:25,080 Speaker 1: announcement change your thinking at all? 435 00:24:26,680 --> 00:24:28,919 Speaker 2: Well, of course, of course I'm going to make an 436 00:24:28,960 --> 00:24:30,760 Speaker 2: argument that it didn't change my thinking. 437 00:24:30,800 --> 00:24:32,680 Speaker 1: But I would, I would... 438 00:24:32,600 --> 00:24:35,199 Speaker 2: I'd encourage people to totally disagree with me and argue with 439 00:24:35,200 --> 00:24:37,760 Speaker 2: me on this. But no, I mean, Meta I cast 440 00:24:37,760 --> 00:24:42,280 Speaker 2: as the Yellow Brick Road. They're open sourcing these 441 00:24:42,320 --> 00:24:47,720 Speaker 2: AI models, and, similar to Anthropic, it's like that, 442 00:24:48,480 --> 00:24:54,160 Speaker 2: it's an amplified effect. Many companies are using Meta's models, 443 00:24:54,680 --> 00:24:59,040 Speaker 2: this Llama family of models, to run their own custom 444 00:24:59,160 --> 00:25:01,880 Speaker 2: software because, you know, it's free, and it can also 445 00:25:01,920 --> 00:25:04,880 Speaker 2: be fine-tuned, it can be run locally. So it's 446 00:25:04,920 --> 00:25:08,160 Speaker 2: actually, you know, having this big impact that a lot 447 00:25:08,160 --> 00:25:10,960 Speaker 2: of people don't really see. And then of course this 448 00:25:11,040 --> 00:25:12,639 Speaker 2: recent acquisition of Scale AI. 449 00:25:14,040 --> 00:25:16,440 Speaker 1: You know, we'll see whether, or how does that... You 450 00:25:16,560 --> 00:25:17,400 Speaker 1: don't want to call it the... 451 00:25:17,440 --> 00:25:19,720 Speaker 2: Yeah, you're right, I called it an acquisition, and it's not. 452 00:25:19,840 --> 00:25:22,840 Speaker 2: I mean, it's a, it's a forty-nine percent of 453 00:25:22,880 --> 00:25:26,760 Speaker 2: the company takeover, but not an actual takeover. And, 454 00:25:27,920 --> 00:25:32,160 Speaker 2: yes, it's not an acquisition, but it really kind of puts 455 00:25:32,200 --> 00:25:36,560 Speaker 2: them in.
The co-founder and CEO of 456 00:25:36,600 --> 00:25:39,439 Speaker 2: Scale AI is now going to go work at Meta, 457 00:25:40,800 --> 00:25:44,000 Speaker 2: but Scale will still continue as an independent company. What 458 00:25:44,240 --> 00:25:46,680 Speaker 2: I think Meta is interested in them for is their 459 00:25:46,760 --> 00:25:49,679 Speaker 2: ability to generate a lot of data, you know, to 460 00:25:49,720 --> 00:25:54,640 Speaker 2: train these models. So, you know, these AI models trained 461 00:25:54,720 --> 00:25:57,400 Speaker 2: on the entire Internet. I'm sure you've kind of heard 462 00:25:57,480 --> 00:26:00,399 Speaker 2: these people say that they sucked in all the data 463 00:26:00,440 --> 00:26:02,879 Speaker 2: on the entire Internet and trained on it. Well, that's 464 00:26:03,160 --> 00:26:06,800 Speaker 2: somewhat true. There are these huge databases that are free, 465 00:26:06,880 --> 00:26:09,879 Speaker 2: that anybody can train on, and they've sort of tapped 466 00:26:09,880 --> 00:26:12,480 Speaker 2: that out. Like, using that data, 467 00:26:12,520 --> 00:26:14,320 Speaker 2: they've gotten as far as they can with that, and 468 00:26:14,359 --> 00:26:17,520 Speaker 2: now they have to generate more data, and it's getting 469 00:26:17,600 --> 00:26:21,040 Speaker 2: very expensive. And so companies like Scale have these huge 470 00:26:21,080 --> 00:26:26,399 Speaker 2: networks of contractors who can generate a ton of data. 471 00:26:26,480 --> 00:26:30,400 Speaker 2: And it's actually, like, PhDs and, like, you know, high- 472 00:26:30,480 --> 00:26:34,920 Speaker 2: level business executives who are doing this, I think, kind 473 00:26:34,960 --> 00:26:37,280 Speaker 2: of for fun. I mean, they're getting paid, but they 474 00:26:37,320 --> 00:26:41,359 Speaker 2: sit there and they do, like, very complex stuff that 475 00:26:41,440 --> 00:26:44,280 Speaker 2: can then be used by these models to learn and 476 00:26:44,320 --> 00:26:48,360 Speaker 2: become much more, you know, specialized in certain areas. And 477 00:26:48,960 --> 00:26:53,000 Speaker 2: part of the deal with Meta is that Meta has 478 00:26:53,040 --> 00:26:56,119 Speaker 2: sort of prepaid for a bunch of Scale's services 479 00:26:56,119 --> 00:26:57,040 Speaker 2: to get this data. 480 00:26:57,560 --> 00:27:01,280 Speaker 1: So this is really Meta giving itself a competitive edge. The 481 00:27:01,280 --> 00:27:03,280 Speaker 1: New York Times ran a story, though, saying that this 482 00:27:03,480 --> 00:27:06,359 Speaker 1: was, like, part of Zuckerberg's personal quest to build 483 00:27:06,359 --> 00:27:08,399 Speaker 1: a superintelligence lab. So I wasn't sure how to 484 00:27:08,400 --> 00:27:10,960 Speaker 1: balance that with, yeah, what I'd understood about Scale, which was 485 00:27:10,960 --> 00:27:12,800 Speaker 1: that it was a company that sort of specialized in 486 00:27:12,880 --> 00:27:17,000 Speaker 1: creating this very high-value data that's not available organically. 487 00:27:17,880 --> 00:27:20,679 Speaker 2: This is sort of my take on this. I mean, 488 00:27:20,720 --> 00:27:24,640 Speaker 2: it's easy to focus on, you know, the Scale CEO, 489 00:27:24,840 --> 00:27:27,000 Speaker 2: Alexandr Wang, is going to run it, and it's all 490 00:27:27,000 --> 00:27:30,560 Speaker 2: this talent. I mean, Wang is not an AI researcher.
491 00:27:31,160 --> 00:27:34,639 Speaker 2: He knows a lot about the AI industry, and he's 492 00:27:34,920 --> 00:27:37,879 Speaker 2: one of the world's foremost experts on, you know, data 493 00:27:38,160 --> 00:27:41,159 Speaker 2: being used in AI. He was roommates with Sam Altman 494 00:27:41,240 --> 00:27:44,760 Speaker 2: when basically the groundwork for ChatGPT was being built, 495 00:27:44,800 --> 00:27:46,960 Speaker 2: and that sort of brought him in. I mean, 496 00:27:47,000 --> 00:27:49,879 Speaker 2: I've listened to interviews with him where he said, 497 00:27:49,920 --> 00:27:53,560 Speaker 2: like, he didn't really understand what OpenAI was doing, 498 00:27:53,600 --> 00:27:58,320 Speaker 2: but it was clearly, like, this great opportunity. And so, yeah, 499 00:27:58,400 --> 00:28:02,520 Speaker 2: I mean, that helps them with recruiting, and it's great, 500 00:28:02,520 --> 00:28:04,639 Speaker 2: you know, it's great for marketing, and it 501 00:28:04,680 --> 00:28:07,159 Speaker 2: all helps. But I think the bread and butter of 502 00:28:07,200 --> 00:28:11,320 Speaker 2: that deal is really about Meta saying, well, okay, we 503 00:28:11,400 --> 00:28:14,160 Speaker 2: might not be the most cutting-edge AI company right now, 504 00:28:14,760 --> 00:28:16,679 Speaker 2: but we're putting a lot of money into having the 505 00:28:16,720 --> 00:28:17,359 Speaker 2: best data. 506 00:28:17,880 --> 00:28:20,639 Speaker 1: Reed, this is a question that is normally a cliché, but 507 00:28:20,680 --> 00:28:23,479 Speaker 1: in this case, I think it's appropriate. How did this 508 00:28:23,520 --> 00:28:24,040 Speaker 1: movie end? 509 00:28:26,280 --> 00:28:30,000 Speaker 2: Well, they all get to the Emerald City, which is AGI, 510 00:28:30,240 --> 00:28:33,560 Speaker 2: and it was all set in motion by this tornado, 511 00:28:33,840 --> 00:28:38,360 Speaker 2: which is ChatGPT. No, I don't think we know. This 512 00:28:38,400 --> 00:28:41,080 Speaker 2: one hasn't, this one hasn't ended yet. I think, 513 00:28:41,160 --> 00:28:43,520 Speaker 2: I think there are many alternate endings, just like all 514 00:28:43,560 --> 00:28:47,600 Speaker 2: the Wicked, and, and probably other fan fiction around 515 00:28:47,640 --> 00:28:48,400 Speaker 2: The Wizard of Oz. 516 00:28:48,800 --> 00:28:50,520 Speaker 1: But if you were looking at the landscape, and you 517 00:28:50,560 --> 00:28:53,520 Speaker 1: had to bet on, you had to bet on one company, 518 00:28:53,560 --> 00:28:56,000 Speaker 1: I mean, who do you think is really positioning themselves 519 00:28:56,040 --> 00:28:59,160 Speaker 1: in the smartest way for this immediate future of AI? 520 00:29:00,360 --> 00:29:02,480 Speaker 2: Yeah, if you had to bet on one company, that's 521 00:29:02,520 --> 00:29:05,800 Speaker 2: a, that is a really, that's a tough one. I mean, 522 00:29:05,840 --> 00:29:08,600 Speaker 2: because, you know, I don't think this is a winner- 523 00:29:08,640 --> 00:29:12,080 Speaker 2: take-all race either. I think that if you look 524 00:29:12,120 --> 00:29:15,120 Speaker 2: at the compute layer, I mean, this is a growing 525 00:29:15,200 --> 00:29:18,640 Speaker 2: pie that will be divided up. I mean, OpenAI 526 00:29:18,680 --> 00:29:22,040 Speaker 2: might be winner-take-all in the consumer chatbot space. 527 00:29:22,080 --> 00:29:25,000 Speaker 2: I could see them becoming sort of the Google 528 00:29:25,000 --> 00:29:29,400 Speaker 2: of that era.
But, you know, I look at 529 00:29:29,400 --> 00:29:32,840 Speaker 2: Google right now, and I think, it's 530 00:29:32,880 --> 00:29:34,800 Speaker 2: interesting, because I think, on the one hand, they have 531 00:29:34,920 --> 00:29:37,040 Speaker 2: so much promise. I mean, they have the best models 532 00:29:37,080 --> 00:29:42,000 Speaker 2: right now. They're applying them in the most diverse ways, 533 00:29:42,240 --> 00:29:45,240 Speaker 2: right? Because one is Waymo, right? They have robotaxis, 534 00:29:45,920 --> 00:29:49,320 Speaker 2: which are probably ultimately just gonna run on Gemini at 535 00:29:49,360 --> 00:29:52,160 Speaker 2: some point in the future. They're doing stuff in 536 00:29:52,600 --> 00:29:56,200 Speaker 2: other, you know, humanoid robotics as well. But they've also 537 00:29:56,280 --> 00:29:59,120 Speaker 2: got these, you know, they've got their own consumer chatbots. 538 00:29:59,160 --> 00:30:02,640 Speaker 2: They've got, you know, a lot of consumer distribution in 539 00:30:02,640 --> 00:30:07,520 Speaker 2: the software space. So I think they're very, they're set 540 00:30:07,560 --> 00:30:10,760 Speaker 2: up for success. The interesting part, though, is that their 541 00:30:10,840 --> 00:30:16,240 Speaker 2: core business of search advertising is also... It's, like, it's 542 00:30:16,280 --> 00:30:18,240 Speaker 2: not even just that it's under threat, but, like, the 543 00:30:18,320 --> 00:30:22,640 Speaker 2: whole, the whole concept of, like, the Internet economy 544 00:30:22,720 --> 00:30:26,160 Speaker 2: running off of advertising dollars is up in the air. 545 00:30:26,320 --> 00:30:28,720 Speaker 2: Like, we don't even know if that's the future. So 546 00:30:29,640 --> 00:30:34,000 Speaker 2: they're challenged, and they're having to completely change, like, reorient 547 00:30:34,080 --> 00:30:37,200 Speaker 2: themselves as a company. But at the same time, they 548 00:30:37,240 --> 00:30:39,720 Speaker 2: have such promise. So I don't know. I mean, it 549 00:30:39,760 --> 00:30:41,480 Speaker 2: would be sort of, it's sort of like a gamble 550 00:30:41,480 --> 00:30:43,640 Speaker 2: to put money on anything. But I mean, I think 551 00:30:43,680 --> 00:30:48,040 Speaker 2: they're just very well positioned, I think them and OpenAI. 552 00:30:48,480 --> 00:30:50,600 Speaker 2: I mean, OpenAI, just what they've built is 553 00:30:50,640 --> 00:30:54,600 Speaker 2: so, is so valuable, and not because of their models, 554 00:30:54,640 --> 00:30:58,440 Speaker 2: not because they have the best AI, but because they're, 555 00:30:58,520 --> 00:31:02,840 Speaker 2: they've become this consumer product now, and that 556 00:31:02,960 --> 00:31:05,520 Speaker 2: is, we've seen in history that that is something 557 00:31:05,560 --> 00:31:08,960 Speaker 2: that, you know, just will be hugely valuable in the 558 00:31:09,000 --> 00:31:11,080 Speaker 2: long run. It's kind of like their game to lose there. 559 00:31:14,720 --> 00:31:16,760 Speaker 1: On that note, Reed, thank you so much for your time. 560 00:31:17,080 --> 00:31:17,840 Speaker 1: Thank you so much. 561 00:31:17,880 --> 00:31:18,360 Speaker 2: It was great. 562 00:31:33,040 --> 00:31:35,800 Speaker 1: That's it for this week for Tech Stuff. I'm Oz Woloshyn. 563 00:31:36,320 --> 00:31:39,560 Speaker 1: This episode was produced by Eliza Dennis and Victoria Dominguez.
564 00:31:40,040 --> 00:31:43,000 Speaker 1: It was executive produced by me, Karah Preiss, and Kate 565 00:31:43,040 --> 00:31:48,200 Speaker 1: Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. 566 00:31:48,280 --> 00:31:51,560 Speaker 1: Beheed Fraser is our engineer. Jack Insley mixed this episode, 567 00:31:51,880 --> 00:31:54,960 Speaker 1: and Kyle Murdoch wrote our theme song. Join us next 568 00:31:55,000 --> 00:31:57,920 Speaker 1: Wednesday for Tech Stuff: The Story, when we'll share an 569 00:31:57,960 --> 00:32:01,760 Speaker 1: in-depth conversation with Yasmin Green, the CEO of Jigsaw, 570 00:32:02,360 --> 00:32:06,840 Speaker 1: about the tech that could change town halls forever. Please rate, review, 571 00:32:06,960 --> 00:32:09,280 Speaker 1: and reach out to us at Tech Stuff Podcast at 572 00:32:09,320 --> 00:32:11,320 Speaker 1: gmail dot com. We want to hear from you.