1 00:00:16,520 --> 00:00:18,960 Speaker 1: Welcome to Tech Stuff. I'm os Voloshen, and I'm excited 2 00:00:18,960 --> 00:00:22,720 Speaker 1: to welcome journalist isabel Busquett to talk about two of 3 00:00:22,760 --> 00:00:26,960 Speaker 1: her recent Wall Street Journal stories that caught my attention. First, 4 00:00:27,120 --> 00:00:30,240 Speaker 1: there's a battle happening on LinkedIn. It's a real sharks 5 00:00:30,320 --> 00:00:33,280 Speaker 1: versus jets, the moment between finance bros And tech bros. 6 00:00:33,760 --> 00:00:36,279 Speaker 1: And it's all part of this bigger picture of software 7 00:00:36,320 --> 00:00:41,000 Speaker 1: companies being existentially threatened by AI. Then, Isabelle tells us 8 00:00:41,040 --> 00:00:45,199 Speaker 1: about last week's GtC, the Massive Developer conference held by 9 00:00:45,240 --> 00:00:47,680 Speaker 1: the multi trillion dollar company in video. 10 00:00:48,200 --> 00:00:50,440 Speaker 2: Welcome, Isabelle, Thank you so much for having me. 11 00:00:50,680 --> 00:00:52,559 Speaker 1: Now this is I have to say I admire you 12 00:00:52,640 --> 00:00:57,560 Speaker 1: tremendously because you went literally straight from GtC on Tuesday 13 00:00:57,640 --> 00:01:00,520 Speaker 1: evening to the Red Eye to the podcast to do this. 14 00:01:01,000 --> 00:01:01,280 Speaker 3: Yeah. 15 00:01:01,400 --> 00:01:02,240 Speaker 2: Yeah, that's right. 16 00:01:02,440 --> 00:01:06,119 Speaker 3: I had a brief little breakfast at home first, but yeah, 17 00:01:06,600 --> 00:01:07,720 Speaker 3: fresh from San Jose. 18 00:01:08,000 --> 00:01:12,200 Speaker 1: So you're not a correct engineer, but you're a correct journalist. Yeah. 19 00:01:12,240 --> 00:01:14,039 Speaker 1: But I want to ask you what about GtC. But 20 00:01:14,480 --> 00:01:17,679 Speaker 1: the piece that caught my attention this week was about Bloomberg, 21 00:01:17,920 --> 00:01:19,480 Speaker 1: and I have to confess I used to work this, 22 00:01:19,640 --> 00:01:23,840 Speaker 1: so obviously I was particularly intrigued. Yeah, what happened? What 23 00:01:23,880 --> 00:01:25,920 Speaker 1: was this throwdown and how did it come about? 24 00:01:26,080 --> 00:01:30,520 Speaker 3: Okay, so I cover enterprise AI, and so for a 25 00:01:30,560 --> 00:01:34,240 Speaker 3: while we've been seeing crazy releases, these new models, these 26 00:01:34,240 --> 00:01:36,760 Speaker 3: new tools are coming out all the time. It's constant. 27 00:01:36,880 --> 00:01:39,360 Speaker 3: So the more these tools come out, the more capable 28 00:01:39,400 --> 00:01:43,480 Speaker 3: they are. It starts to raise questions about how fast 29 00:01:43,640 --> 00:01:50,080 Speaker 3: they can recreate versions of popular but very expensive software. 30 00:01:50,200 --> 00:01:53,920 Speaker 3: So Salesforce is a great example of this. There haven't 31 00:01:53,920 --> 00:01:56,400 Speaker 3: been a lot of sort of alternatives in the market, 32 00:01:56,720 --> 00:01:59,360 Speaker 3: and so now with a tool like claud code, some 33 00:01:59,400 --> 00:02:01,600 Speaker 3: people are saying, oh, we can just sort of code 34 00:02:01,640 --> 00:02:04,000 Speaker 3: our own version of Salesforce. So that's sort of the 35 00:02:04,040 --> 00:02:07,120 Speaker 3: context of this conversation. There's sort of this existential threat 36 00:02:07,160 --> 00:02:09,720 Speaker 3: to software and it's really been hitting some of the 37 00:02:09,720 --> 00:02:11,720 Speaker 3: software stocks odd. 38 00:02:11,560 --> 00:02:13,600 Speaker 1: Right, I mean Salesforce at lassion. 39 00:02:13,360 --> 00:02:16,880 Speaker 3: Yeah, the salesforces the workdays of the world, you know, 40 00:02:16,919 --> 00:02:18,359 Speaker 3: companies like Legal Zoom. 41 00:02:18,720 --> 00:02:20,840 Speaker 2: There's been like I think, like a trillion. 42 00:02:20,520 --> 00:02:23,680 Speaker 3: Dollars lost in the markets over this, And you can 43 00:02:23,760 --> 00:02:26,000 Speaker 3: argue about whether it's justified or not justified, but I 44 00:02:26,080 --> 00:02:28,480 Speaker 3: think the idea is like investors are spooked. 45 00:02:28,280 --> 00:02:31,320 Speaker 1: But let's anchor this in Bloomberg. I think most people 46 00:02:31,360 --> 00:02:34,160 Speaker 1: listening to this podcast may not know what the Bloomberg 47 00:02:34,240 --> 00:02:36,200 Speaker 1: terminal even is. So if you could start with explaining 48 00:02:36,200 --> 00:02:39,280 Speaker 1: the Bloomberg terminal and then how it fits into this story, 49 00:02:39,520 --> 00:02:41,320 Speaker 1: and then why it kind of went viral on LinkedIn. 50 00:02:41,520 --> 00:02:43,880 Speaker 3: Yeah, Yeah, the Bloomberg terminal is a little bit of 51 00:02:43,919 --> 00:02:47,720 Speaker 3: a unique beast. So it's it's it's essentially a computer system. 52 00:02:47,760 --> 00:02:50,239 Speaker 3: It's the hardware and software. It came out in nineteen 53 00:02:50,280 --> 00:02:53,560 Speaker 3: eighty two, and the interface looks remarkably similar to the 54 00:02:53,600 --> 00:02:56,919 Speaker 3: way it looked in nineteen eighty two. Yes, it has 55 00:02:56,960 --> 00:03:01,720 Speaker 3: that classic dark background, gold tags, very sort of classic feel. 56 00:03:02,360 --> 00:03:07,760 Speaker 3: And this machine is like the lifeblood of financial institutions. 57 00:03:07,840 --> 00:03:11,600 Speaker 3: So traders and research teams are on it for hours 58 00:03:11,639 --> 00:03:13,960 Speaker 3: and hours a day doing everything they need to do, 59 00:03:14,080 --> 00:03:18,760 Speaker 3: from tracking real time prices of commodities stocks and they 60 00:03:18,760 --> 00:03:22,600 Speaker 3: can research historical prices, They get news alerts, they can 61 00:03:22,680 --> 00:03:24,720 Speaker 3: do all the analysis about the way that you know 62 00:03:24,760 --> 00:03:26,799 Speaker 3: certain news is impacting certain prices. 63 00:03:26,880 --> 00:03:28,799 Speaker 2: They can execute trades on it, and. 64 00:03:29,480 --> 00:03:33,600 Speaker 3: One of the most beloved and critical features is called 65 00:03:33,680 --> 00:03:36,760 Speaker 3: instant Bloomberg, and it's a chat feature where you can 66 00:03:36,840 --> 00:03:39,040 Speaker 3: chat with anyone else who has a Bloomberg terminal. And 67 00:03:39,080 --> 00:03:41,880 Speaker 3: because the Bloomberg terminal is sort of so entrenched in 68 00:03:41,960 --> 00:03:45,600 Speaker 3: financial institutions, like everybody has it, so anyone who is 69 00:03:45,640 --> 00:03:49,440 Speaker 3: a colleague, a competitor, a customer is on there, and 70 00:03:49,680 --> 00:03:53,360 Speaker 3: it's sort of created this whole network of a community 71 00:03:53,360 --> 00:03:55,640 Speaker 3: that is just like on there all the time. That's 72 00:03:55,720 --> 00:03:56,920 Speaker 3: just their operating system. 73 00:03:57,040 --> 00:03:59,640 Speaker 1: It was sort of Select before select Bloomberg Messenger and 74 00:03:59,680 --> 00:04:01,240 Speaker 1: also outside your own organization. 75 00:04:01,720 --> 00:04:03,040 Speaker 2: Yeah yeah, yeah you met. 76 00:04:02,960 --> 00:04:06,480 Speaker 1: Their wives there, I mean. 77 00:04:05,240 --> 00:04:08,480 Speaker 3: True, Yeah, there is like a whole dating scene on 78 00:04:08,640 --> 00:04:13,119 Speaker 3: instance bloom Yeah. Yeah, people definitely mete their spouses on there, 79 00:04:13,200 --> 00:04:14,800 Speaker 3: and yeah, a lot going on. 80 00:04:15,160 --> 00:04:17,720 Speaker 1: So what what was a story you wrote? And because 81 00:04:17,720 --> 00:04:19,400 Speaker 1: I saw it on LinkedIn and I saw your story 82 00:04:19,400 --> 00:04:21,000 Speaker 1: and I was just delighted. But how did this kind 83 00:04:21,040 --> 00:04:21,559 Speaker 1: of come about? 84 00:04:21,720 --> 00:04:23,920 Speaker 3: Okay, so it sort of goes back to what we 85 00:04:23,920 --> 00:04:26,479 Speaker 3: were talking about a minute ago, which is that you know, 86 00:04:26,520 --> 00:04:29,359 Speaker 3: there's been this constant stream of releases of AI tools, 87 00:04:29,560 --> 00:04:32,440 Speaker 3: and every time there's a new AI tool released, there's 88 00:04:32,600 --> 00:04:36,240 Speaker 3: this really you know, broad conversation on social media being 89 00:04:36,320 --> 00:04:40,440 Speaker 3: like Salesforce is dead, workday is dead. This is the 90 00:04:40,520 --> 00:04:42,240 Speaker 3: end of this piece of SaaS software. This is the 91 00:04:42,320 --> 00:04:44,719 Speaker 3: end of this piece of SaaS software. So it was 92 00:04:44,720 --> 00:04:46,520 Speaker 3: on the heels of the launch of this tool called 93 00:04:46,560 --> 00:04:50,880 Speaker 3: Perplexity Computer, which is sort of an agent building system. 94 00:04:51,440 --> 00:04:55,080 Speaker 3: And Perplexity Computer comes out and some folks on you know, 95 00:04:55,200 --> 00:04:59,280 Speaker 3: social media, on acts on LinkedIn start posting that they 96 00:04:59,760 --> 00:05:03,440 Speaker 3: vib coded a version of a Bloomberg terminal in one shot, 97 00:05:03,600 --> 00:05:07,279 Speaker 3: so like one one natural language prompt, and they had 98 00:05:07,320 --> 00:05:08,360 Speaker 3: a Bloomberg terminal in. 99 00:05:08,279 --> 00:05:08,880 Speaker 2: Front of them. 100 00:05:09,600 --> 00:05:11,840 Speaker 1: By the way, Bloomberg terminals generate I think about ten 101 00:05:11,839 --> 00:05:13,440 Speaker 1: billion dollars in revenue a year today. 102 00:05:13,560 --> 00:05:15,800 Speaker 3: Right, It's a massively expensive tool. I mean it costs 103 00:05:15,800 --> 00:05:19,400 Speaker 3: thirty thousand a year per seat, per user. The Perplexity 104 00:05:19,440 --> 00:05:21,520 Speaker 3: tool costs like twenty four hundred dollars a year. So 105 00:05:21,560 --> 00:05:23,760 Speaker 3: the idea that you might be able to get what 106 00:05:23,800 --> 00:05:28,120 Speaker 3: you could get on Bloomberg via one shot prompt kind 107 00:05:28,160 --> 00:05:32,480 Speaker 3: of a wild idea. It doesn't really stack up theoretically, 108 00:05:32,600 --> 00:05:36,000 Speaker 3: but I think it touches on a lot of sort 109 00:05:36,000 --> 00:05:39,560 Speaker 3: of sensitive issues that are going on here. What struck 110 00:05:39,600 --> 00:05:42,279 Speaker 3: my interest about the conversation was just that, you know, 111 00:05:43,279 --> 00:05:49,680 Speaker 3: the visceral reaction from the finance community was so passionate 112 00:05:49,800 --> 00:05:54,320 Speaker 3: and energetic and angry, and people were so protective of 113 00:05:54,400 --> 00:05:59,240 Speaker 3: the Bloomberg terminal and just responded with such backlash, saying 114 00:05:59,279 --> 00:06:01,520 Speaker 3: like you will never replace this, Nothing you can ever 115 00:06:01,560 --> 00:06:05,360 Speaker 3: do will ever replace this, you know, questioning people's sanity, 116 00:06:05,480 --> 00:06:08,680 Speaker 3: saying the you know, users should be locked up, saying 117 00:06:08,720 --> 00:06:11,400 Speaker 3: that so there was a conspiracy theory that like some 118 00:06:11,440 --> 00:06:13,920 Speaker 3: of these people were paid by Perplexity to post it, 119 00:06:13,960 --> 00:06:17,599 Speaker 3: which Perplexity is said is not true. So the conversation 120 00:06:17,720 --> 00:06:19,240 Speaker 3: just kind of like spun out of control from there, 121 00:06:19,240 --> 00:06:21,080 Speaker 3: and I just thought it was so fascinating because like 122 00:06:21,400 --> 00:06:25,159 Speaker 3: nothing has sort of stirred that visceral reaction the way 123 00:06:25,279 --> 00:06:28,320 Speaker 3: that you know, this love of the Bloomberg terminal emerged. 124 00:06:28,600 --> 00:06:31,279 Speaker 1: Do you think the kind of Jets versus Sharks a 125 00:06:31,320 --> 00:06:34,960 Speaker 1: West Side Story comparison is apt? I mean, I guess 126 00:06:34,960 --> 00:06:39,400 Speaker 1: what I'm asking you really is finance bros were the 127 00:06:39,480 --> 00:06:45,400 Speaker 1: acme of professional brohood. Yeah, for twenty years. It's almost 128 00:06:45,440 --> 00:06:48,719 Speaker 1: like I'm wondering this is just a hypothesis. It feels 129 00:06:48,760 --> 00:06:53,200 Speaker 1: like finance people have essentially been replaced as in the 130 00:06:53,279 --> 00:06:58,400 Speaker 1: hierarchy of like where value creation and becoming incredibly rich happens. 131 00:06:58,560 --> 00:06:59,839 Speaker 1: That's been kind of happening for a while. But I 132 00:07:00,200 --> 00:07:02,640 Speaker 1: like the AI moment in the last couple of years 133 00:07:02,680 --> 00:07:04,880 Speaker 1: has really accelerated there. So do you think this is 134 00:07:04,920 --> 00:07:07,440 Speaker 1: like a howl in the wind of saying like no, no, 135 00:07:07,520 --> 00:07:10,680 Speaker 1: like no tech bros, like we still got us finance bros. 136 00:07:10,800 --> 00:07:12,360 Speaker 1: Is that fair or what? Do you think? 137 00:07:12,680 --> 00:07:15,720 Speaker 3: I think, Yeah, that's an interesting idea, Like I do think, 138 00:07:15,800 --> 00:07:20,880 Speaker 3: like one common criticism from the tech bro side was that, 139 00:07:21,120 --> 00:07:24,840 Speaker 3: you know, the finance people were just they just wanted 140 00:07:24,880 --> 00:07:28,360 Speaker 3: to stick their feet in the sand. They were ignorant of, 141 00:07:28,880 --> 00:07:32,440 Speaker 3: you know, how important and how critical AI advances were 142 00:07:32,480 --> 00:07:35,080 Speaker 3: going to be, that they were stuck in their old ways, 143 00:07:35,480 --> 00:07:37,800 Speaker 3: that it was naive of them to think that there's 144 00:07:37,880 --> 00:07:40,920 Speaker 3: any piece of software that's irreplaceable. But yeah, I mean, 145 00:07:40,960 --> 00:07:43,600 Speaker 3: I think, like what's so interesting about these two groups 146 00:07:43,760 --> 00:07:47,320 Speaker 3: is that the people who work in finance just have 147 00:07:47,400 --> 00:07:51,280 Speaker 3: this love of Bloomberg, Like it is like it is 148 00:07:51,280 --> 00:07:53,400 Speaker 3: like a phantom limb for that. One person I talked 149 00:07:53,400 --> 00:07:55,720 Speaker 3: to told me like using his terminal was like crack, 150 00:07:56,200 --> 00:07:59,960 Speaker 3: which I was like, okay. And on the other side 151 00:08:00,160 --> 00:08:03,080 Speaker 3: of that, like people worship AI. 152 00:08:03,640 --> 00:08:05,880 Speaker 2: So there's this like extreme. 153 00:08:05,440 --> 00:08:10,080 Speaker 3: Like almost religious like passion on both sides, and I think. 154 00:08:09,920 --> 00:08:11,880 Speaker 2: That's sort of the root of why we saw. 155 00:08:11,720 --> 00:08:14,239 Speaker 1: This clash that's so well put. You know, a friend 156 00:08:14,240 --> 00:08:17,320 Speaker 1: of mine worked in a scrapyard in France when he 157 00:08:17,360 --> 00:08:20,280 Speaker 1: was a teenager. Briefly we told me the other day 158 00:08:20,280 --> 00:08:24,680 Speaker 1: that there was an occasion when a guy who was 159 00:08:24,720 --> 00:08:27,480 Speaker 1: like a truck driver brought his truck to be scrapped 160 00:08:27,960 --> 00:08:29,520 Speaker 1: and he'd driven it for twenty years, and as it 161 00:08:29,560 --> 00:08:32,360 Speaker 1: was kind of entering a scrapyard, he just collapsed on 162 00:08:32,400 --> 00:08:34,600 Speaker 1: the floor and was screaming like a child in the 163 00:08:34,600 --> 00:08:37,760 Speaker 1: fetal position, saying look by to his truck. And we 164 00:08:37,760 --> 00:08:39,920 Speaker 1: were talking, my friend and I about this and how like, 165 00:08:40,760 --> 00:08:44,920 Speaker 1: as a person in the world, you ultimately spend like 166 00:08:44,960 --> 00:08:48,079 Speaker 1: the majority of your waking hours working, whether that's like 167 00:08:48,160 --> 00:08:51,280 Speaker 1: driving a truck or on the Bloomberg terminal, And so 168 00:08:51,480 --> 00:08:56,640 Speaker 1: that like extreme like nostalgia and personal identification with these tools, 169 00:08:56,679 --> 00:08:59,600 Speaker 1: whether it's a truck or a terminal, it's kind of interesting, 170 00:08:59,640 --> 00:09:03,440 Speaker 1: but is there something different about the terminal from say salesforce, Like, 171 00:09:03,480 --> 00:09:05,920 Speaker 1: what do you really think makes this like such a 172 00:09:06,040 --> 00:09:07,520 Speaker 1: rubicun for finance people. 173 00:09:07,640 --> 00:09:08,760 Speaker 2: I mean, it's a good question. 174 00:09:08,960 --> 00:09:11,360 Speaker 3: I think the fact that it's been around so long 175 00:09:11,520 --> 00:09:14,840 Speaker 3: and it's had such staying power throughout that period of 176 00:09:14,880 --> 00:09:18,480 Speaker 3: time is important. I think the fact that it is 177 00:09:18,559 --> 00:09:23,120 Speaker 3: so expensive also is important. You know, when you reach 178 00:09:23,200 --> 00:09:25,520 Speaker 3: the point in your career at which you get a 179 00:09:25,559 --> 00:09:28,360 Speaker 3: Bloomberg terminal, that's a big deal. That's sort of like 180 00:09:28,400 --> 00:09:30,560 Speaker 3: a rite of passage for you, and then you have it, 181 00:09:30,640 --> 00:09:33,280 Speaker 3: and it's sort of a status symbol. So I think 182 00:09:33,320 --> 00:09:36,320 Speaker 3: it's it's come to represent all these things. But I 183 00:09:36,400 --> 00:09:40,760 Speaker 3: also think the reality is that they do love the functionality, 184 00:09:41,000 --> 00:09:43,160 Speaker 3: like they just they've sort of come to depend on 185 00:09:43,160 --> 00:09:45,160 Speaker 3: that functionality and they couldn't really live without it. And 186 00:09:45,200 --> 00:09:49,160 Speaker 3: I think it's easy to be skeptical of, you know, 187 00:09:49,400 --> 00:09:52,880 Speaker 3: some sort of flashy new AI coming in and being like, oh, 188 00:09:52,920 --> 00:09:54,920 Speaker 3: we can do this, Like I think, you know, the 189 00:09:55,000 --> 00:09:58,080 Speaker 3: natural reaction is no, you can't. Bloomberg has been doing 190 00:09:58,120 --> 00:10:01,840 Speaker 3: this for decades. This AI tool was invented a couple 191 00:10:01,960 --> 00:10:04,920 Speaker 3: weeks ago, Like, what do you people know about this? 192 00:10:05,200 --> 00:10:07,880 Speaker 3: Don't try to come in and take away our Bloomberg 193 00:10:08,000 --> 00:10:11,800 Speaker 3: terminals and give us some shoddy, cheaper vibe coded replacement 194 00:10:12,000 --> 00:10:14,800 Speaker 3: because you're taking away like what we need to do 195 00:10:14,840 --> 00:10:15,719 Speaker 3: our jobs? 196 00:10:16,000 --> 00:10:19,720 Speaker 1: And did you experiment with the vibe coded replacement? I mean, 197 00:10:19,760 --> 00:10:23,040 Speaker 1: what was the take of people who actually used these 198 00:10:23,080 --> 00:10:25,600 Speaker 1: products in terms of how close did the perplexity vibe 199 00:10:25,600 --> 00:10:29,920 Speaker 1: coded version get to successfully replicating the Bloomberg functionality. 200 00:10:30,320 --> 00:10:32,640 Speaker 3: Yeah, so I think, to be honest, on both sides, 201 00:10:32,679 --> 00:10:37,320 Speaker 3: there's an acknowledgment that whatever you can vibe code in 202 00:10:37,559 --> 00:10:40,880 Speaker 3: one shot prompt is not going to be anywhere near 203 00:10:40,920 --> 00:10:44,240 Speaker 3: the Bloomberg terminal. Like I think even the tech people 204 00:10:44,960 --> 00:10:48,400 Speaker 3: realize how I acknowledge that, And I think they would say, like, oh, yeah, 205 00:10:48,400 --> 00:10:50,160 Speaker 3: but it's a great starting point and if you put 206 00:10:50,200 --> 00:10:51,720 Speaker 3: in a little more effort and maybe it's not all 207 00:10:51,800 --> 00:10:53,680 Speaker 3: vibe coding, maybe some of it's real coding, and maybe 208 00:10:53,679 --> 00:10:55,040 Speaker 3: you can't get all the way there, but maybe you 209 00:10:55,040 --> 00:10:58,600 Speaker 3: can get close. But these sort of like one shotted 210 00:10:59,240 --> 00:11:03,440 Speaker 3: vibe coded demonstrations, it's hard to tell because the people 211 00:11:03,440 --> 00:11:06,000 Speaker 3: that post them, you can't sort of go it. They 212 00:11:06,000 --> 00:11:08,520 Speaker 3: don't always share the prompt and they don't always share 213 00:11:08,559 --> 00:11:11,040 Speaker 3: the experience. They just share like a screen grap and 214 00:11:11,080 --> 00:11:13,840 Speaker 3: so a lot of what was being shared is these 215 00:11:13,880 --> 00:11:19,160 Speaker 3: tools for like tracking stocks and financial analysis, which people 216 00:11:19,160 --> 00:11:21,559 Speaker 3: were right to point out is just one part of Bloomberg. 217 00:11:21,920 --> 00:11:24,000 Speaker 3: Like a big question that came up is sort of 218 00:11:24,080 --> 00:11:27,200 Speaker 3: like the real time data inputs the Bloomberg gets. Like 219 00:11:27,200 --> 00:11:31,000 Speaker 3: Bloomberg has these proprietary data inputs, it gets them faster 220 00:11:31,120 --> 00:11:34,080 Speaker 3: than anybody else. It gets more data, one might say, 221 00:11:34,080 --> 00:11:36,959 Speaker 3: better data than anybody else. And yes, there is stock 222 00:11:37,040 --> 00:11:40,560 Speaker 3: data and commodities data available on the Internet, but it's 223 00:11:40,760 --> 00:11:44,320 Speaker 3: maybe not the same breath and depth that Bloomberg gets. 224 00:11:44,400 --> 00:11:48,080 Speaker 3: So I think, like no, to answer your question, like 225 00:11:48,200 --> 00:11:51,000 Speaker 3: the one shotted, vibeut it apps like don't come close 226 00:11:51,040 --> 00:11:53,679 Speaker 3: to what the Bloomberg terminal is. But again, like the 227 00:11:53,920 --> 00:11:56,280 Speaker 3: tech people would say, like bo we. 228 00:11:56,280 --> 00:11:56,960 Speaker 2: Can get there. 229 00:11:57,280 --> 00:12:00,360 Speaker 1: You know what I mean, you are another story couple 230 00:12:00,400 --> 00:12:04,000 Speaker 1: of weeks ago, meet the company's vibe coding that own CRM. 231 00:12:04,960 --> 00:12:08,040 Speaker 1: What's your like medium term perspective? You do you think 232 00:12:08,080 --> 00:12:11,520 Speaker 1: software companies are like literally gonna go away or do 233 00:12:11,600 --> 00:12:14,400 Speaker 1: you think a new crop of more targeted software companies 234 00:12:14,440 --> 00:12:17,600 Speaker 1: will emerge to have less legacy baggage and then probably 235 00:12:17,600 --> 00:12:20,160 Speaker 1: be consolidated into selfwaking commorts. Again, like, what do you 236 00:12:20,160 --> 00:12:21,120 Speaker 1: think is gonna happen here. 237 00:12:21,280 --> 00:12:22,920 Speaker 2: Yeah, No, it's a good question. 238 00:12:23,080 --> 00:12:26,880 Speaker 3: I mean, like there's been talk of the death of 239 00:12:26,960 --> 00:12:30,240 Speaker 3: software for a really really, really really long time, and 240 00:12:30,320 --> 00:12:34,240 Speaker 3: it has been resilient and it has stuck around. And 241 00:12:34,880 --> 00:12:37,640 Speaker 3: you know, three years ago when Chatchibt first came out, 242 00:12:37,720 --> 00:12:40,400 Speaker 3: everyone was like, oh, it's the end of software, like goodbye, 243 00:12:40,760 --> 00:12:43,440 Speaker 3: and that I think that hasn't really happened. Although, like 244 00:12:43,559 --> 00:12:47,280 Speaker 3: I think there is like if you're a legacy SaaS 245 00:12:47,280 --> 00:12:50,480 Speaker 3: company and you can pivot into the modern age and 246 00:12:50,760 --> 00:12:53,880 Speaker 3: you can invest really deeply in AI, invest really deeply 247 00:12:53,920 --> 00:12:56,600 Speaker 3: an agents, provide that to your customers and sort of 248 00:12:57,120 --> 00:13:01,160 Speaker 3: be a value proposition that's you know, not dissimilar from 249 00:13:01,200 --> 00:13:03,920 Speaker 3: what they could get from a more AI native startup, 250 00:13:04,440 --> 00:13:07,000 Speaker 3: then I think that those companies have a chance of succeeding. 251 00:13:07,480 --> 00:13:09,240 Speaker 3: If you're a legacy sasas provider and you kind of 252 00:13:09,280 --> 00:13:11,240 Speaker 3: stick your feet in the sand and you just count 253 00:13:11,240 --> 00:13:13,600 Speaker 3: on the fact that your customers are sort of locked 254 00:13:13,600 --> 00:13:15,760 Speaker 3: into you. They have years alone contracts, it's going to 255 00:13:15,760 --> 00:13:18,400 Speaker 3: be really hard for them to exit your platform. Then 256 00:13:18,600 --> 00:13:20,400 Speaker 3: maybe you're in a situation where you're getting in trouble 257 00:13:20,400 --> 00:13:24,080 Speaker 3: in a few years. But I do like, I think 258 00:13:24,120 --> 00:13:26,120 Speaker 3: most of the SaaS companies know they have to pivot, 259 00:13:26,400 --> 00:13:29,800 Speaker 3: Like AI is AI is the big thing. AI is here. 260 00:13:30,160 --> 00:13:33,559 Speaker 3: Everybody in the software world knows that, and so they're 261 00:13:33,720 --> 00:13:38,079 Speaker 3: all sort of jostling to position themselves as the provider 262 00:13:38,280 --> 00:13:42,280 Speaker 3: of AI and AI native companies are in a like 263 00:13:42,400 --> 00:13:44,280 Speaker 3: you could argue they're in a better position to do that, 264 00:13:44,520 --> 00:13:49,640 Speaker 3: but legacy companies have at yeah, big customer bases, more 265 00:13:49,679 --> 00:13:52,599 Speaker 3: of an understanding of like what you know, business requirements 266 00:13:52,600 --> 00:13:53,200 Speaker 3: and stuff like that. 267 00:13:53,520 --> 00:13:55,800 Speaker 2: So I don't know how it's all going to play out. 268 00:13:55,920 --> 00:13:58,760 Speaker 3: People like compare this sort of AI moment a lot 269 00:13:58,800 --> 00:14:01,000 Speaker 3: to like the dawn of the Internet and sort of 270 00:14:01,080 --> 00:14:04,360 Speaker 3: like the dot com must and you know, there will 271 00:14:04,400 --> 00:14:06,880 Speaker 3: be a lot of new companies that come up and 272 00:14:06,960 --> 00:14:09,440 Speaker 3: are great, and there will be other companies that sort 273 00:14:09,480 --> 00:14:12,640 Speaker 3: of go on the corner and die and other companies 274 00:14:12,640 --> 00:14:13,240 Speaker 3: that survive. 275 00:14:13,640 --> 00:14:16,280 Speaker 1: Yeah, we'll just have to say, Well, the company, which 276 00:14:16,320 --> 00:14:18,120 Speaker 1: is in many ways at the very center of all 277 00:14:18,160 --> 00:14:21,960 Speaker 1: this is in video when we come back, Isabelle tells 278 00:14:22,040 --> 00:14:24,960 Speaker 1: us all about her trip to Invidia's develop a conference 279 00:14:25,040 --> 00:14:28,520 Speaker 1: last week, what Jensen Huang has called the super Bowl 280 00:14:28,560 --> 00:14:56,960 Speaker 1: of AIM. So before the break. We were talking about 281 00:14:57,080 --> 00:15:00,320 Speaker 1: Bloomberg and software and whether AI will eat softwa where, 282 00:15:00,560 --> 00:15:02,520 Speaker 1: And I want to talk a bit more about that 283 00:15:02,560 --> 00:15:05,520 Speaker 1: in respect to what you learned this week at the 284 00:15:05,600 --> 00:15:09,440 Speaker 1: Nvidia conference. Before we get into those specific topics, what 285 00:15:09,600 --> 00:15:11,040 Speaker 1: was it like and describe the scene. 286 00:15:11,160 --> 00:15:17,280 Speaker 3: The energy at Nvidia GGC is very unique and it's 287 00:15:17,640 --> 00:15:21,160 Speaker 3: kind of electric. It takes over the entire city of 288 00:15:21,200 --> 00:15:23,360 Speaker 3: San Jose, thirty thousand people. 289 00:15:23,760 --> 00:15:25,200 Speaker 2: I interviewed the mayor of San. 290 00:15:25,120 --> 00:15:27,440 Speaker 3: Jose last year about this and he was like, the 291 00:15:27,520 --> 00:15:30,720 Speaker 3: coffee shops make their entire year's rent in the course 292 00:15:30,720 --> 00:15:34,120 Speaker 3: of a couple days. Of course, that means like if 293 00:15:34,160 --> 00:15:35,720 Speaker 3: you want to get a cup of coffee, you are 294 00:15:35,800 --> 00:15:38,160 Speaker 3: waiting in line for like an hour. You're waiting in 295 00:15:38,160 --> 00:15:40,400 Speaker 3: line for an hour for everything to like get into panels, 296 00:15:40,440 --> 00:15:42,560 Speaker 3: to get into buildings, to get coffee, to get a 297 00:15:42,560 --> 00:15:45,640 Speaker 3: granola bar. It's, you know, like you're out there in 298 00:15:45,680 --> 00:15:48,480 Speaker 3: the wilderness, like you have to fight to get like 299 00:15:48,520 --> 00:15:49,200 Speaker 3: a sandwich. 300 00:15:50,040 --> 00:15:53,440 Speaker 1: No, the Fortralling Company doesn't provide any free coffee and sandwich. 301 00:15:53,440 --> 00:15:54,040 Speaker 4: This is no. 302 00:15:54,120 --> 00:15:56,800 Speaker 3: I mean it's like it's wild though, I mean it 303 00:15:56,800 --> 00:15:59,240 Speaker 3: does like people say, it has this like rock concert 304 00:15:59,480 --> 00:16:04,480 Speaker 3: like sort of atmosphere. There's music and there's you know, 305 00:16:04,520 --> 00:16:08,040 Speaker 3: in the evening they like bring out shot luges. There's 306 00:16:08,840 --> 00:16:12,840 Speaker 3: robots everywhere of every different kind of roy you can imagine. 307 00:16:12,840 --> 00:16:15,560 Speaker 3: Because in Video is like really invested in this physical 308 00:16:15,600 --> 00:16:16,280 Speaker 3: AI thing. 309 00:16:16,520 --> 00:16:17,240 Speaker 2: They're also really. 310 00:16:17,160 --> 00:16:20,240 Speaker 3: Invested at autonomous driving, so they have like all these 311 00:16:20,280 --> 00:16:23,840 Speaker 3: like different cars. Like everything in the city is decked 312 00:16:23,880 --> 00:16:27,920 Speaker 3: out in in Vidia signature green and it's sort of 313 00:16:27,960 --> 00:16:31,720 Speaker 3: like Disneyland for like the AI industry, Like there's just 314 00:16:31,840 --> 00:16:34,320 Speaker 3: so much to do and see and play with and 315 00:16:34,800 --> 00:16:38,600 Speaker 3: it's just like a totally different world. Everyone is super 316 00:16:38,680 --> 00:16:41,760 Speaker 3: excited about the technology. Everyone is super bullish about it. 317 00:16:42,280 --> 00:16:44,440 Speaker 3: So yeah, it's it's sort of like this little like 318 00:16:44,560 --> 00:16:45,520 Speaker 3: hype of Vortex. 319 00:16:45,960 --> 00:16:47,600 Speaker 2: It's so easy to get sucked in. 320 00:16:47,800 --> 00:16:49,520 Speaker 1: Well, how do you avoid getting secked in? What the 321 00:16:49,600 --> 00:16:51,360 Speaker 1: kind of what were the more critical questions that you 322 00:16:51,440 --> 00:16:52,360 Speaker 1: had about what was going on? 323 00:16:52,440 --> 00:16:56,440 Speaker 3: I suppose yeah, yeah, I mean I think like if 324 00:16:56,440 --> 00:16:58,920 Speaker 3: you think about Nvidia as a chip company, like there's 325 00:16:58,960 --> 00:17:02,280 Speaker 3: sort of been this inflex point in the chip industry, 326 00:17:02,360 --> 00:17:05,600 Speaker 3: which is this shift from training to inference, Like for 327 00:17:05,680 --> 00:17:09,200 Speaker 3: the last few years, like the bulk of compute has 328 00:17:09,240 --> 00:17:12,880 Speaker 3: been used to train these really big, large language models, 329 00:17:13,359 --> 00:17:16,200 Speaker 3: and now we're getting to the point where a lot 330 00:17:16,240 --> 00:17:20,560 Speaker 3: of compute is being used to run these large language models. 331 00:17:21,080 --> 00:17:24,199 Speaker 3: And so in Vidia's chips historically have been really, really 332 00:17:24,240 --> 00:17:25,040 Speaker 3: really great. 333 00:17:24,840 --> 00:17:25,480 Speaker 2: At the training. 334 00:17:25,560 --> 00:17:29,120 Speaker 3: Like that is what propelled them to their current valuation. 335 00:17:29,560 --> 00:17:31,959 Speaker 3: And I think like there were always questions about what 336 00:17:31,960 --> 00:17:34,800 Speaker 3: does it mean for Nvidia in the era of AI 337 00:17:34,840 --> 00:17:41,240 Speaker 3: and friends, And in Vidia is an incredibly innovative company 338 00:17:41,320 --> 00:17:44,280 Speaker 3: and it comes from the top. Jensen is just an 339 00:17:44,359 --> 00:17:49,480 Speaker 3: incredibly curious, innovative person. They have a very flat structure 340 00:17:49,560 --> 00:17:51,960 Speaker 3: as a company, not a lot of middle management. 341 00:17:52,320 --> 00:17:54,719 Speaker 2: Jensen and sort of like all the top leaders have 342 00:17:54,800 --> 00:17:55,920 Speaker 2: a lot of direct reports. 343 00:17:55,960 --> 00:17:59,480 Speaker 3: There's a lot of autonomy, but there's just this constant 344 00:17:59,480 --> 00:18:01,680 Speaker 3: push to just like innovate as fast as you can 345 00:18:01,720 --> 00:18:03,199 Speaker 3: and always be on the next thing, and always be 346 00:18:03,200 --> 00:18:04,640 Speaker 3: on the next thing, and always be on the next thing. 347 00:18:05,200 --> 00:18:10,960 Speaker 3: So they acquired an inference chip startup called Grock recently, 348 00:18:11,119 --> 00:18:13,040 Speaker 3: so a lot of the conference was. 349 00:18:13,200 --> 00:18:15,200 Speaker 1: There's nothing to do with Elan's Grock, This is Grock 350 00:18:15,200 --> 00:18:16,400 Speaker 1: with Q or other case. 351 00:18:16,560 --> 00:18:20,280 Speaker 3: Yeah, it's incredibly confusing and a weird name to have 352 00:18:20,400 --> 00:18:24,119 Speaker 3: twice and what is stream But yeah, so there was 353 00:18:24,160 --> 00:18:26,119 Speaker 3: a lot of talk about how they're integrating sort of 354 00:18:26,160 --> 00:18:29,840 Speaker 3: like Grock's offerings into in video offerings and you know, 355 00:18:29,960 --> 00:18:33,240 Speaker 3: just this this big push to you know, stay on 356 00:18:33,280 --> 00:18:36,439 Speaker 3: top of things on the compute side, and then on 357 00:18:36,480 --> 00:18:38,920 Speaker 3: the flip side of that, there's also you know, part 358 00:18:38,920 --> 00:18:41,320 Speaker 3: of this conference is for Jensen to get out there 359 00:18:41,359 --> 00:18:46,280 Speaker 3: and you know, prove that demands for AI workloads is 360 00:18:47,359 --> 00:18:49,560 Speaker 3: going to be greater than ever before. And that's what 361 00:18:49,600 --> 00:18:50,920 Speaker 3: he says every year, and that's what he has to 362 00:18:50,920 --> 00:18:52,600 Speaker 3: go and prove out every year because that's what's sort 363 00:18:52,600 --> 00:18:55,280 Speaker 3: of spurring their demand. And so there was a lot 364 00:18:55,280 --> 00:18:58,440 Speaker 3: of talk about the software side and what companies are 365 00:18:58,720 --> 00:19:00,680 Speaker 3: doing with AI and what they're going to be doing 366 00:19:00,720 --> 00:19:03,840 Speaker 3: with AI, and there was a lot of talk about 367 00:19:03,880 --> 00:19:04,920 Speaker 3: something called open Claw. 368 00:19:06,280 --> 00:19:07,959 Speaker 1: I want to ask you about them, and particularly how 369 00:19:08,000 --> 00:19:11,000 Speaker 1: it relates to the Bloomberg and the software story we're 370 00:19:11,000 --> 00:19:13,320 Speaker 1: talking about earlier. Just before we get there, I want 371 00:19:13,359 --> 00:19:16,840 Speaker 1: to read a quote from your newsletter. Today's maturity curve 372 00:19:16,880 --> 00:19:19,560 Speaker 1: in AI is much higher, and the explosive growth and 373 00:19:19,600 --> 00:19:23,240 Speaker 1: improvement of AI coding tools has arguably given a gentic tech. 374 00:19:23,320 --> 00:19:27,520 Speaker 1: It's killer app Still, the technology continues to speed far 375 00:19:27,600 --> 00:19:30,840 Speaker 1: ahead of business adoption. Talk about that. Why is business 376 00:19:30,880 --> 00:19:33,280 Speaker 1: adoption slower than technology development? 377 00:19:33,400 --> 00:19:35,320 Speaker 2: I mean there's a lot of reasons for that. One 378 00:19:35,560 --> 00:19:39,160 Speaker 2: is just because technology development is so fast. Silicon Valley 379 00:19:39,240 --> 00:19:42,679 Speaker 2: is moving so fast right now with the models are 380 00:19:42,680 --> 00:19:45,400 Speaker 2: coming out all the time, they're always better. And then 381 00:19:45,440 --> 00:19:48,879 Speaker 2: there's you know, last year, agents were just starting to 382 00:19:48,880 --> 00:19:51,280 Speaker 2: become a thing. So I remember being a GtC. 383 00:19:51,119 --> 00:19:54,879 Speaker 3: Last year and everyone was talking about agents, but also 384 00:19:55,680 --> 00:19:58,520 Speaker 3: nobody was really using them. There was just this understanding that, like, 385 00:19:58,560 --> 00:20:00,199 Speaker 3: agents are going to be the next thing. And it 386 00:20:00,240 --> 00:20:02,840 Speaker 3: was this transition from we have chatbots and we can 387 00:20:02,840 --> 00:20:04,560 Speaker 3: ask them questions and we can get answers and this 388 00:20:04,640 --> 00:20:09,159 Speaker 3: is incredible to Okay, agents are actually going to do 389 00:20:09,359 --> 00:20:11,800 Speaker 3: things for us. And I think last year was just 390 00:20:11,800 --> 00:20:14,800 Speaker 3: sort of like a buzzword. This year is actually starting 391 00:20:14,800 --> 00:20:19,600 Speaker 3: to happen in businesses. But the tech is sort of 392 00:20:19,680 --> 00:20:23,280 Speaker 3: chugging along even beyond that too. It's not just agents 393 00:20:23,280 --> 00:20:25,560 Speaker 3: that will do a task for you, it's now these 394 00:20:25,600 --> 00:20:30,800 Speaker 3: sort of complex groups of agents that are delegating tasks 395 00:20:30,880 --> 00:20:35,320 Speaker 3: to each other and can access your documents and files 396 00:20:35,400 --> 00:20:39,240 Speaker 3: and business data and run for a really long time 397 00:20:39,400 --> 00:20:43,239 Speaker 3: and sort of do these complex workflows and then you know, 398 00:20:43,320 --> 00:20:46,560 Speaker 3: report back to you. So it's it's even beyond just 399 00:20:46,600 --> 00:20:49,919 Speaker 3: doing tasks to like you know, doing almost like entire jobs. 400 00:20:50,000 --> 00:20:53,000 Speaker 3: But I think that's where businesses are still sort of 401 00:20:53,160 --> 00:20:57,560 Speaker 3: like the initial agentic deployment point of just like figuring out, 402 00:20:57,600 --> 00:21:00,040 Speaker 3: like where do I even like trust this type of 403 00:21:00,040 --> 00:21:02,159 Speaker 3: offer to like do a task for me let alone, 404 00:21:02,320 --> 00:21:04,800 Speaker 3: like you know, give it access to like my entire 405 00:21:04,840 --> 00:21:05,680 Speaker 3: operating system. 406 00:21:06,080 --> 00:21:09,320 Speaker 1: And where does open claw fit into all of this? 407 00:21:09,520 --> 00:21:11,960 Speaker 1: Animo Claw which was announced at GtC. 408 00:21:12,320 --> 00:21:15,360 Speaker 3: Open Claw is another thing that just sort of it's 409 00:21:15,400 --> 00:21:17,760 Speaker 3: so hard to keep up in this industry. 410 00:21:18,440 --> 00:21:20,040 Speaker 2: Someone was saying the other day. 411 00:21:19,840 --> 00:21:23,199 Speaker 3: Like you have to be fully unemployed to keep up 412 00:21:23,200 --> 00:21:25,879 Speaker 3: with anything that's going on in AI, and like I 413 00:21:25,920 --> 00:21:26,840 Speaker 3: can really respect that. 414 00:21:27,400 --> 00:21:28,960 Speaker 1: I can relate to that too. I mean I do 415 00:21:29,000 --> 00:21:31,160 Speaker 1: this podcast twice a week, but it's just the deluge 416 00:21:31,160 --> 00:21:32,200 Speaker 1: of information. 417 00:21:32,800 --> 00:21:36,800 Speaker 3: Wild It's like one day and there's this thing called 418 00:21:37,160 --> 00:21:41,000 Speaker 3: open claw, and it's really confusing because open claw is 419 00:21:41,040 --> 00:21:43,000 Speaker 3: about creating these things called. 420 00:21:42,920 --> 00:21:45,800 Speaker 1: Clause clause agents. Yeah. 421 00:21:45,960 --> 00:21:51,720 Speaker 3: Yeah, Claws are basically these long running autonomous agents. And 422 00:21:52,600 --> 00:21:55,760 Speaker 3: open claw it's it's an open source standard for building 423 00:21:55,800 --> 00:21:59,800 Speaker 3: these sort of like complex orchestrations of all these different 424 00:21:59,800 --> 00:22:02,440 Speaker 3: age So you can set up a claw to do 425 00:22:02,520 --> 00:22:05,040 Speaker 3: something and that claw will like tap a bunch of 426 00:22:05,119 --> 00:22:08,800 Speaker 3: series of subagents delegate tasks. You can give it access 427 00:22:08,840 --> 00:22:11,359 Speaker 3: to your files. It can do things for you, and 428 00:22:11,440 --> 00:22:13,959 Speaker 3: it can do it like continuously over a long period 429 00:22:13,960 --> 00:22:17,640 Speaker 3: of time. Like you could do things like summarize all 430 00:22:17,680 --> 00:22:19,639 Speaker 3: the news of the week and shoot me an email 431 00:22:19,680 --> 00:22:22,199 Speaker 3: every Monday at noon with what's going on, and it 432 00:22:22,280 --> 00:22:24,720 Speaker 3: will just sort of keep running and do that. Or 433 00:22:25,040 --> 00:22:26,720 Speaker 3: you know, you could set it to do a task 434 00:22:27,240 --> 00:22:29,719 Speaker 3: overnight and then come back and it's done in the morning. 435 00:22:29,800 --> 00:22:32,840 Speaker 3: So I think like the biggest sort of shift with 436 00:22:33,040 --> 00:22:37,199 Speaker 3: open claw is this move from I'm talking to an 437 00:22:37,200 --> 00:22:39,080 Speaker 3: agent and I'm prompting an agent and it's giving me 438 00:22:39,160 --> 00:22:40,879 Speaker 3: a response, and then I'm giving it a response, and 439 00:22:40,920 --> 00:22:44,680 Speaker 3: it's sort of this like real time interaction collaboration versus 440 00:22:45,440 --> 00:22:49,400 Speaker 3: with clause, you're really just sort of like sending it 441 00:22:49,480 --> 00:22:53,960 Speaker 3: off okay for a while, and then it's coming back 442 00:22:54,000 --> 00:22:54,600 Speaker 3: to you later. 443 00:22:54,640 --> 00:22:56,680 Speaker 2: It's like it's asynchronously working with. 444 00:22:56,680 --> 00:22:59,920 Speaker 1: A single natural language prompt or a more complicated promo. 445 00:23:00,080 --> 00:23:02,240 Speaker 1: What's the use case of open clure that you've seen 446 00:23:02,320 --> 00:23:03,320 Speaker 1: that you find intriguing. 447 00:23:03,640 --> 00:23:05,600 Speaker 3: Yeah, yeah, no, I mean that's a great question, And 448 00:23:05,640 --> 00:23:07,680 Speaker 3: that was exactly my question when I was at the 449 00:23:07,720 --> 00:23:11,639 Speaker 3: conference and in video had this experience called build a Claw, 450 00:23:11,960 --> 00:23:15,320 Speaker 3: which was this tent where everybody got to go inside 451 00:23:15,320 --> 00:23:17,800 Speaker 3: and build a claw, and I think like it was 452 00:23:17,920 --> 00:23:21,840 Speaker 3: really interesting because I went to go build my own claw, 453 00:23:21,880 --> 00:23:24,040 Speaker 3: and I just said to the video exact I was 454 00:23:24,080 --> 00:23:29,400 Speaker 3: talking to, I can't think of something that would need 455 00:23:29,440 --> 00:23:32,359 Speaker 3: to be a claw that I couldn't just do with 456 00:23:32,520 --> 00:23:38,159 Speaker 3: an agent or a chat bot. And it was just 457 00:23:38,200 --> 00:23:42,120 Speaker 3: it's this hard conceptual shift that needs to happen. It's 458 00:23:42,160 --> 00:23:44,760 Speaker 3: like it's sort of like the same way when chattyb 459 00:23:44,880 --> 00:23:47,119 Speaker 3: Tea came out, we would all talk to it the 460 00:23:47,160 --> 00:23:47,639 Speaker 3: same way. 461 00:23:47,520 --> 00:23:48,440 Speaker 2: We talked to Google. 462 00:23:48,720 --> 00:23:50,560 Speaker 3: We would just put in a few keywords because that's 463 00:23:50,600 --> 00:23:53,479 Speaker 3: the way we were used to interacting with those systems. So 464 00:23:53,640 --> 00:23:56,119 Speaker 3: now it's like when we get the opportunity to use 465 00:23:56,119 --> 00:23:58,520 Speaker 3: a claw, we're kind of interacting with it the same 466 00:23:58,520 --> 00:24:01,000 Speaker 3: way we would do with chatchybet. Like I was with 467 00:24:01,000 --> 00:24:02,880 Speaker 3: my editor and he was like, oh, help me plan 468 00:24:02,960 --> 00:24:06,560 Speaker 3: a road trip, and it could do that, but it's 469 00:24:06,560 --> 00:24:10,200 Speaker 3: like it's capabilities are so far beyond that that again 470 00:24:10,240 --> 00:24:12,920 Speaker 3: it's hard to conceptualize, like how to actually do this. 471 00:24:13,040 --> 00:24:14,960 Speaker 1: Do the invideoor exec give you any advice on what 472 00:24:15,000 --> 00:24:16,680 Speaker 1: type of clore to build or were the other people 473 00:24:16,680 --> 00:24:20,520 Speaker 1: that build? Yeah, I mean I that you came. 474 00:24:20,920 --> 00:24:25,680 Speaker 3: I remember talking to this one marketing leader who said 475 00:24:25,720 --> 00:24:28,199 Speaker 3: she had like, you know, a dozen different ideas for 476 00:24:28,359 --> 00:24:32,119 Speaker 3: clause and a lot of it is, you know, scanning 477 00:24:32,920 --> 00:24:36,080 Speaker 3: information on the internet. So her idea is, you know, 478 00:24:36,600 --> 00:24:39,560 Speaker 3: go on social media and read all these news sites 479 00:24:39,560 --> 00:24:41,840 Speaker 3: and read all these newsletters and report back to me 480 00:24:41,880 --> 00:24:42,520 Speaker 3: every morning. 481 00:24:42,920 --> 00:24:43,840 Speaker 2: What are sort of like. 482 00:24:43,800 --> 00:24:47,240 Speaker 3: The key themes of you know, what we should be 483 00:24:47,520 --> 00:24:50,840 Speaker 3: putting our marketing and thought leadership content around, and then 484 00:24:50,920 --> 00:24:53,280 Speaker 3: like help us like make sure that whatever we plan 485 00:24:53,400 --> 00:24:54,879 Speaker 3: to put out that day is like sort of like 486 00:24:54,960 --> 00:24:58,080 Speaker 3: drafted around that. Another example was an Nvidia exec was 487 00:24:58,080 --> 00:25:00,520 Speaker 3: telling me that they had this college student come through 488 00:25:00,600 --> 00:25:04,000 Speaker 3: and she was job searching, and so she was like 489 00:25:04,640 --> 00:25:07,240 Speaker 3: kind of claw helped me, you know, identify jobs to 490 00:25:07,280 --> 00:25:11,159 Speaker 3: apply for. And he was like, yes, that's like a 491 00:25:11,160 --> 00:25:14,320 Speaker 3: great chop out question. But like take it a step further, 492 00:25:14,880 --> 00:25:17,439 Speaker 3: like once it finds the role, then have it like 493 00:25:17,960 --> 00:25:20,920 Speaker 3: identify the right skills and then like optimize and tweak 494 00:25:20,960 --> 00:25:23,159 Speaker 3: your resume. And she was like, okay, great, and then 495 00:25:23,200 --> 00:25:25,159 Speaker 3: in videos, I can then take it a step further 496 00:25:25,280 --> 00:25:27,600 Speaker 3: and then give it access to your LinkedIn and then 497 00:25:27,640 --> 00:25:29,639 Speaker 3: give it access to your data and like have it 498 00:25:29,680 --> 00:25:32,680 Speaker 3: build out these coverlead It's like, have it do these applications. 499 00:25:32,680 --> 00:25:34,639 Speaker 3: And so it's like you just have to have that 500 00:25:34,840 --> 00:25:39,159 Speaker 3: mindset shift of like always take it a step further. 501 00:25:39,359 --> 00:25:42,040 Speaker 3: And I think that's not totally clear to all of 502 00:25:42,119 --> 00:25:45,159 Speaker 3: us yet unless you're, like, you know, an AI. 503 00:25:45,040 --> 00:25:46,359 Speaker 2: Super super super user. 504 00:25:46,680 --> 00:25:48,640 Speaker 3: But even since the AI super users that were at 505 00:25:48,680 --> 00:25:51,200 Speaker 3: the video conference, I think we're just like having trouble 506 00:25:51,280 --> 00:25:54,880 Speaker 3: conceptualizing like the real capabilities of what we can. 507 00:25:54,800 --> 00:25:57,840 Speaker 1: Do with this. Jensen said, this is the new chat moment. 508 00:25:58,040 --> 00:26:01,400 Speaker 1: He said that we're having the che cheapy team moment 509 00:26:01,480 --> 00:26:03,800 Speaker 1: for robotics, so having one check cheap team. 510 00:26:05,400 --> 00:26:08,679 Speaker 3: He was like he was really really all in on this, 511 00:26:08,760 --> 00:26:11,440 Speaker 3: I mean all on open claw. 512 00:26:11,520 --> 00:26:11,840 Speaker 4: Yeah. 513 00:26:12,000 --> 00:26:14,199 Speaker 3: But okay, so here's the thing about open Claw we 514 00:26:14,200 --> 00:26:16,560 Speaker 3: haven't talked about, which is that like there's a lot 515 00:26:16,600 --> 00:26:21,439 Speaker 3: of security challenges, and I did not find anyone at 516 00:26:21,880 --> 00:26:24,679 Speaker 3: the video conference who was comfortable using open claw for 517 00:26:24,760 --> 00:26:29,080 Speaker 3: any context in their actual like work slash business. So 518 00:26:29,359 --> 00:26:31,960 Speaker 3: when people want to use open claw, a lot of 519 00:26:31,960 --> 00:26:34,479 Speaker 3: times they'll go out and get a separate computer and 520 00:26:34,600 --> 00:26:36,440 Speaker 3: just run it on that computer, like. 521 00:26:36,400 --> 00:26:37,720 Speaker 1: A clean computers. 522 00:26:37,760 --> 00:26:42,760 Speaker 3: So it's like it's literally just like a separate computer. Yeah, 523 00:26:42,800 --> 00:26:46,480 Speaker 3: and just for open claw, because if you have open 524 00:26:46,520 --> 00:26:48,439 Speaker 3: claw on your computer where you have all your business 525 00:26:48,480 --> 00:26:51,840 Speaker 3: data and you're giving it access to things, it can 526 00:26:51,920 --> 00:26:53,800 Speaker 3: really I mean there's been a lot of stories of 527 00:26:53,840 --> 00:26:56,840 Speaker 3: open claw deleted all your emails or if you know, 528 00:26:56,880 --> 00:26:58,520 Speaker 3: if you have a Claw that's on the internet and 529 00:26:58,520 --> 00:27:01,919 Speaker 3: interacting with people, you know, it's not that difficult to 530 00:27:01,960 --> 00:27:03,959 Speaker 3: like prompt injected and then open clock can like give 531 00:27:03,960 --> 00:27:07,119 Speaker 3: away or credit card details. So there's so many data 532 00:27:07,119 --> 00:27:09,880 Speaker 3: concerns about open claw, which is why it is not 533 00:27:10,200 --> 00:27:11,000 Speaker 3: enterprise ready. 534 00:27:11,240 --> 00:27:13,640 Speaker 1: And the video are trying to basically with Nemo Claw, 535 00:27:13,640 --> 00:27:17,800 Speaker 1: which is their product, create a version of layer on 536 00:27:17,840 --> 00:27:20,040 Speaker 1: top of the open source open flaw to make it 537 00:27:20,119 --> 00:27:20,760 Speaker 1: business safe. 538 00:27:21,040 --> 00:27:23,000 Speaker 2: Yes, yeah, yeah, that's what they're trying to do. 539 00:27:23,400 --> 00:27:25,720 Speaker 3: I don't know if they're fully there yet I think 540 00:27:25,760 --> 00:27:28,680 Speaker 3: there's this is this tack is so new, there's still 541 00:27:28,800 --> 00:27:32,840 Speaker 3: so many concerns because a lot could go wrong, clearly, 542 00:27:33,400 --> 00:27:36,000 Speaker 3: but I think they're excited about what could go right 543 00:27:36,040 --> 00:27:37,280 Speaker 3: and they just sort of want to put it in 544 00:27:37,280 --> 00:27:37,720 Speaker 3: that direction. 545 00:27:38,080 --> 00:27:40,120 Speaker 1: The other thing which came up, I think a fabit 546 00:27:40,160 --> 00:27:41,639 Speaker 1: was space, right AI and space? 547 00:27:42,040 --> 00:27:42,920 Speaker 4: Yeah? 548 00:27:43,240 --> 00:27:45,080 Speaker 1: Was that? What else kind of surprised you and caught 549 00:27:45,119 --> 00:27:46,600 Speaker 1: your eye while you were there on the ground. 550 00:27:46,840 --> 00:27:49,119 Speaker 3: Look, the amount of announcements and video puts out at 551 00:27:49,160 --> 00:27:52,320 Speaker 3: GtC is crazy. I mean there was there were more 552 00:27:52,320 --> 00:27:53,119 Speaker 3: than a dozen. 553 00:27:52,840 --> 00:27:56,160 Speaker 2: Press releases, Like it's just wild. 554 00:27:56,400 --> 00:27:59,560 Speaker 3: But yeah, I mean computers going into space, that's one 555 00:27:59,600 --> 00:28:02,399 Speaker 3: thing you know in videos playing a lot in the 556 00:28:02,520 --> 00:28:05,200 Speaker 3: quantum landscape, Like, that's another thing we should be paying 557 00:28:05,240 --> 00:28:09,480 Speaker 3: attention to. Autonomous driving is a really big focus, and 558 00:28:09,560 --> 00:28:12,480 Speaker 3: I think that caught my eye because you know, in 559 00:28:12,560 --> 00:28:17,160 Speaker 3: video they announced a partnership with Uber to sort of 560 00:28:17,280 --> 00:28:20,080 Speaker 3: invest in this robotaxi fleet and it's coming to a 561 00:28:20,280 --> 00:28:22,400 Speaker 3: bunch of cities around the world in the next couple 562 00:28:22,400 --> 00:28:25,520 Speaker 3: of years. And you know, you can imagine a future 563 00:28:25,600 --> 00:28:29,600 Speaker 3: where like you drive to work and then while you're 564 00:28:29,760 --> 00:28:33,119 Speaker 3: in your office, you send your car out to be 565 00:28:33,840 --> 00:28:36,520 Speaker 3: an autonomous uber for the day, and then it drives 566 00:28:36,520 --> 00:28:37,520 Speaker 3: back and picks you up. 567 00:28:37,640 --> 00:28:39,920 Speaker 2: So that was another interesting thing. 568 00:28:40,480 --> 00:28:42,560 Speaker 1: Just coming back to the to the beginning of our conversation. 569 00:28:42,640 --> 00:28:46,040 Speaker 1: I mean, the kind of open law world you're describing 570 00:28:47,040 --> 00:28:50,160 Speaker 1: does sound like a world whereas seems like there will 571 00:28:50,160 --> 00:28:53,400 Speaker 1: be serious competitive threats to people paying thirty thousand dollars 572 00:28:53,440 --> 00:28:54,240 Speaker 1: a year for Bloomberg. 573 00:28:54,400 --> 00:28:57,240 Speaker 3: Yeah, yeah, I mean you could definitely make that argument. 574 00:28:57,480 --> 00:28:59,520 Speaker 3: You know, you could also make the other argument that, 575 00:28:59,640 --> 00:29:03,200 Speaker 3: you know, there's a reason Bloomberg costs thirty thousand dollars 576 00:29:03,240 --> 00:29:07,440 Speaker 3: and it's incredibly secure and they have spent many, many 577 00:29:07,520 --> 00:29:11,800 Speaker 3: years investing in you know, the algorithms that model the 578 00:29:11,880 --> 00:29:16,840 Speaker 3: pricing for this, and that I think, you know, you 579 00:29:16,840 --> 00:29:20,440 Speaker 3: could build an alternative with open claw and people will 580 00:29:20,840 --> 00:29:24,080 Speaker 3: And you know, maybe if you're just sort of like 581 00:29:24,120 --> 00:29:28,160 Speaker 3: a less serious retail investor, like that will be great 582 00:29:28,200 --> 00:29:30,160 Speaker 3: for you because you're someone who's never going to invest 583 00:29:30,160 --> 00:29:33,920 Speaker 3: in a Bloomberg terminal. But if you're like a finance company, 584 00:29:34,720 --> 00:29:38,040 Speaker 3: it's hard to imagine you would walk away from like 585 00:29:38,320 --> 00:29:42,880 Speaker 3: the security and consistency of something like a Bloomberg terminal 586 00:29:42,920 --> 00:29:44,240 Speaker 3: At least in the near future. 587 00:29:44,480 --> 00:29:46,240 Speaker 1: I think brand metters, and I think about on the 588 00:29:46,280 --> 00:29:49,280 Speaker 1: legal side, for example, like sure, if you're a finance company, 589 00:29:49,280 --> 00:29:52,080 Speaker 1: you could use some you know, legal AI software like 590 00:29:52,920 --> 00:29:55,320 Speaker 1: or you could pay Lathan Watkins one thousand dollars an 591 00:29:55,360 --> 00:30:01,080 Speaker 1: hour perer because you know you've got Yeah. 592 00:30:01,160 --> 00:30:03,120 Speaker 3: Yeah, And I mean one thing we haven't touched on 593 00:30:03,160 --> 00:30:05,520 Speaker 3: either is the fact that, like you know, Bloomberg is 594 00:30:05,560 --> 00:30:08,560 Speaker 3: integrating AI. They have AI features of the terminal, and 595 00:30:08,640 --> 00:30:10,560 Speaker 3: so I think like that's their push to sort of say, 596 00:30:10,600 --> 00:30:12,680 Speaker 3: like this is not a point when you need to 597 00:30:12,680 --> 00:30:15,160 Speaker 3: go out and find an alternative, like we are everything 598 00:30:15,200 --> 00:30:19,080 Speaker 3: we've always been to you and more and we're worth 599 00:30:19,120 --> 00:30:22,880 Speaker 3: that price tag. So yeah, I mean again, there's arguments 600 00:30:22,920 --> 00:30:24,680 Speaker 3: on both sides, which is again why it's started up 601 00:30:24,720 --> 00:30:26,680 Speaker 3: playing so much angry debate online. 602 00:30:27,320 --> 00:30:30,240 Speaker 1: Isabel, thank you, thanks for coming straight from the Red Eye. 603 00:30:30,840 --> 00:30:32,200 Speaker 1: A good luck with the rest of your day. I 604 00:30:32,200 --> 00:30:33,800 Speaker 1: gather you're going into the office now to write some 605 00:30:33,840 --> 00:30:34,400 Speaker 1: more stories. 606 00:30:34,640 --> 00:30:36,840 Speaker 3: Yeah, yeah, I'm about to. Could I go write some 607 00:30:36,880 --> 00:30:38,360 Speaker 3: more Open Claws stories? 608 00:30:38,440 --> 00:31:02,040 Speaker 4: So thanks for having me, h for text stuff. 609 00:31:02,120 --> 00:31:04,920 Speaker 1: I'm as Volosian. This episode was produced by Eliza Dennis 610 00:31:04,920 --> 00:31:08,200 Speaker 1: and Melissa Slaughter. It was executive produced by me Julian 611 00:31:08,280 --> 00:31:11,320 Speaker 1: Nutter and Kate Osborne for Kaleidoscope and Katrina Norvell for 612 00:31:11,360 --> 00:31:15,840 Speaker 1: iHeart Podcasts. The engineer is Charles de Montebello for CDM Studios. 613 00:31:16,240 --> 00:31:19,120 Speaker 1: Jack Insley mixed this episode and Kyle murdoch Rodart theme 614 00:31:19,200 --> 00:31:22,600 Speaker 1: song and please do rate and review this show wherever 615 00:31:22,640 --> 00:31:23,680 Speaker 1: you listen to your podcast