1 00:00:14,840 --> 00:00:17,560 Speaker 1: Welcome to TechStuff. This is The Story, and today 2 00:00:17,720 --> 00:00:20,639 Speaker 1: I'm here with Cara Price. Hi. So we were talking 3 00:00:20,680 --> 00:00:24,520 Speaker 1: about these episodes of The Story, and we realized that 4 00:00:24,600 --> 00:00:26,119 Speaker 1: while the two of us often have an in 5 00:00:26,200 --> 00:00:29,440 Speaker 1: depth discussion about the guest or the episode, 6 00:00:29,440 --> 00:00:31,720 Speaker 1: we weren't capturing that discussion for the podcast, and that 7 00:00:31,720 --> 00:00:34,800 Speaker 1: felt like a missed opportunity. So today we're trying out something new, 8 00:00:35,000 --> 00:00:36,760 Speaker 1: which is to have a bit of a chat between 9 00:00:36,760 --> 00:00:38,920 Speaker 1: the two of us to intro the guest of the 10 00:00:38,920 --> 00:00:42,800 Speaker 1: week and in particular why we think their story matters 11 00:00:42,800 --> 00:00:45,960 Speaker 1: to us and to our listeners. So to start off 12 00:00:46,000 --> 00:00:49,280 Speaker 1: this week, I wanted to share a thought from our guest, 13 00:00:49,479 --> 00:00:53,120 Speaker 1: David Webster, about the importance of always considering the user 14 00:00:53,440 --> 00:00:54,840 Speaker 1: when developing new technology. 15 00:00:55,280 --> 00:00:57,600 Speaker 2: And it's often a blind spot for the technologists. I mean, 16 00:00:57,600 --> 00:01:03,520 Speaker 2: the technologists are not necessarily cold or impersonal or cynical. 17 00:01:03,760 --> 00:01:06,960 Speaker 2: They're proud parents of this new technology, but they might 18 00:01:06,959 --> 00:01:09,920 Speaker 2: not realize that their baby's... their baby's ugly. 19 00:01:11,120 --> 00:01:14,120 Speaker 1: So, Cara, that was David Webster speaking. He's head of 20 00:01:14,560 --> 00:01:18,680 Speaker 1: UX, or User Experience, at something called Google Labs, and 21 00:01:18,760 --> 00:01:20,800 Speaker 1: in my opinion, he has one of the most interesting 22 00:01:20,880 --> 00:01:24,520 Speaker 1: jobs at Google, because Labs is the division that's responsible 23 00:01:24,560 --> 00:01:27,840 Speaker 1: for making consumer facing products that sit on top of 24 00:01:27,880 --> 00:01:30,240 Speaker 1: Google's AI models. They've done stuff in the world of 25 00:01:30,319 --> 00:01:35,600 Speaker 1: video generation, but probably their most famous product is NotebookLM. 26 00:01:35,720 --> 00:01:37,440 Speaker 3: And that's the product that we thought was going to 27 00:01:37,480 --> 00:01:39,800 Speaker 3: put us out of business, the one where you 28 00:01:39,880 --> 00:01:43,520 Speaker 3: can feed in any websites, documents, or articles you like, and get 29 00:01:43,600 --> 00:01:46,200 Speaker 3: back a two-way podcast. 30 00:01:45,840 --> 00:01:49,640 Speaker 1: Two comforting voices giggling over whatever the topic at hand 31 00:01:49,680 --> 00:01:52,840 Speaker 1: may be. Much like this one, exactly. This NotebookLM 32 00:01:52,920 --> 00:01:55,920 Speaker 1: product kind of blew people's minds and it had fans 33 00:01:56,120 --> 00:02:02,880 Speaker 1: from Mark Cuban to AI pioneer Andrej Karpathy. Here's Mark Cuban. 34 00:02:02,840 --> 00:02:07,560 Speaker 2: Has anybody played with NotebookLM, the podcast feature? Oh 35 00:02:07,600 --> 00:02:08,480 Speaker 2: my God. 36 00:02:08,639 --> 00:02:12,079 Speaker 3: So you could take any PDF file, any text.
37 00:02:12,880 --> 00:02:15,000 Speaker 2: It could be like your onboarding manual. 38 00:02:15,120 --> 00:02:18,680 Speaker 3: It could be training manuals, it could be distribution manuals, 39 00:02:18,680 --> 00:02:20,840 Speaker 3: whatever it is, and you feed 40 00:02:20,560 --> 00:02:22,880 Speaker 2: it in there and it turns it into a podcast. 41 00:02:23,639 --> 00:02:27,000 Speaker 1: Cuban is very fixated on manuals for whatever reason, he is. 42 00:02:28,080 --> 00:02:31,440 Speaker 3: But what does Webster's role have to do with any 43 00:02:31,480 --> 00:02:31,720 Speaker 3: of this? 44 00:02:32,040 --> 00:02:36,000 Speaker 1: Well, David Webster, as I mentioned, runs Google Labs. Previously, 45 00:02:36,280 --> 00:02:39,680 Speaker 1: he worked for this design firm called IDEO, which is 46 00:02:39,760 --> 00:02:43,359 Speaker 1: famous for the concept of human centered design, in other words, 47 00:02:43,440 --> 00:02:46,960 Speaker 1: doing deep user research to understand what real people want 48 00:02:46,960 --> 00:02:50,320 Speaker 1: from a product and then integrating that feedback into the product. 49 00:02:50,720 --> 00:02:53,320 Speaker 1: IDEO is actually the firm that designed the first mouse 50 00:02:53,360 --> 00:02:56,960 Speaker 1: for Apple way back when, and Webster has 51 00:02:57,000 --> 00:03:00,519 Speaker 1: now been tasked with making delightful, engaging AI products 52 00:03:00,560 --> 00:03:01,560 Speaker 1: for Google. Yeah. 53 00:03:01,560 --> 00:03:03,919 Speaker 3: You know, it's an interesting moment for this conversation because 54 00:03:04,400 --> 00:03:08,800 Speaker 3: Google is getting absolutely beaten up, relatively speaking, by the 55 00:03:08,840 --> 00:03:12,399 Speaker 3: stock market because of questions about whether the ad supported 56 00:03:12,440 --> 00:03:15,840 Speaker 3: search engine, which is fundamental to its business, will survive 57 00:03:15,880 --> 00:03:17,440 Speaker 3: the age of AI. Exactly. 58 00:03:17,560 --> 00:03:20,440 Speaker 1: And David is on the front lines of building the 59 00:03:20,520 --> 00:03:23,920 Speaker 1: next wave of consumer products for Google. We spoke about 60 00:03:23,919 --> 00:03:27,680 Speaker 1: the philosophy behind how he does that and about his 61 00:03:27,800 --> 00:03:31,399 Speaker 1: views on the future of wearable AI. Here's the conversation. 62 00:03:32,240 --> 00:03:33,600 Speaker 1: David Webster, thanks for joining us today. 63 00:03:33,560 --> 00:03:34,880 Speaker 4: Thanks for having me. 64 00:03:35,000 --> 00:03:37,600 Speaker 1: You know, when people think about Google, they obviously think 65 00:03:37,600 --> 00:03:40,440 Speaker 1: about the search engine. In fact, you know, "to Google" 66 00:03:40,560 --> 00:03:44,720 Speaker 1: became a verb, much like "to hoover," where we come from, 67 00:03:45,280 --> 00:03:49,800 Speaker 1: became a verb for vacuum cleaning in Britain. You work 68 00:03:49,840 --> 00:03:53,760 Speaker 1: at Google Labs as the head of UX. Well, what 69 00:03:53,840 --> 00:03:55,760 Speaker 1: does Labs do and how does it interact with the 70 00:03:55,840 --> 00:03:57,200 Speaker 1: larger Google organization? 71 00:03:57,640 --> 00:04:02,160 Speaker 2: Well, Labs is a group that we started in twenty 72 00:04:02,240 --> 00:04:06,120 Speaker 2: twenty two.
It is actually our reboot of a group 73 00:04:06,160 --> 00:04:08,520 Speaker 2: that was around here in the early days of Google, 74 00:04:09,600 --> 00:04:15,520 Speaker 2: which was a place where we could experiment with emergent 75 00:04:15,560 --> 00:04:19,360 Speaker 2: capabilities and new technologies coming from the research side of 76 00:04:19,400 --> 00:04:25,719 Speaker 2: the organization and figure out how to quickly productize those 77 00:04:26,640 --> 00:04:30,000 Speaker 2: in new ways in service of Google's mission. So that's 78 00:04:30,000 --> 00:04:32,120 Speaker 2: why we brought Labs back into existence. 79 00:04:32,600 --> 00:04:36,000 Speaker 1: Google's mission being, is that still to organize the world's information? 80 00:04:35,760 --> 00:04:41,040 Speaker 2: Organize the world's information and make it universally accessible and useful. 81 00:04:42,400 --> 00:04:45,240 Speaker 2: I joined to stand up the UX component of Labs, 82 00:04:45,760 --> 00:04:49,760 Speaker 2: and the exciting thing for me was, having had a 83 00:04:49,800 --> 00:04:54,599 Speaker 2: career in design and UX, UX standing of course for 84 00:04:54,720 --> 00:05:00,200 Speaker 2: user experience, I was very aware that we were in 85 00:05:00,240 --> 00:05:04,280 Speaker 2: a moment where there was a host of unprecedentedly 86 00:05:04,279 --> 00:05:10,480 Speaker 2: powerful technologies whose time had arrived, and was very conscious 87 00:05:10,560 --> 00:05:17,000 Speaker 2: of the significance of user experience in inflecting those technologies 88 00:05:17,080 --> 00:05:19,560 Speaker 2: so that the pros outweigh the cons. There were a 89 00:05:19,640 --> 00:05:23,880 Speaker 2: bunch of things emerging around twenty twenty two that were contenders. 90 00:05:23,960 --> 00:05:26,880 Speaker 2: You know, the blockchain was still a thing. Metaverses were 91 00:05:26,920 --> 00:05:32,520 Speaker 2: still a thing. But obviously LLMs, their moment was arriving, 92 00:05:32,560 --> 00:05:36,919 Speaker 2: and very quickly we ascertained that they were becoming the 93 00:05:37,080 --> 00:05:42,240 Speaker 2: thing and started putting all our efforts into figuring out 94 00:05:42,240 --> 00:05:45,520 Speaker 2: how to take the potential of this, you know, significant 95 00:05:45,520 --> 00:05:50,080 Speaker 2: platform shift and format it so that it was accessible 96 00:05:50,480 --> 00:05:51,760 Speaker 2: and useful to people. 97 00:05:51,960 --> 00:05:54,120 Speaker 1: I just want to sort of go back in time 98 00:05:55,040 --> 00:05:57,640 Speaker 1: thirty odd years to kind of give some framing to 99 00:05:57,680 --> 00:06:01,360 Speaker 1: this conversation and to your background. I'm wondering if you can tell 100 00:06:01,440 --> 00:06:05,200 Speaker 1: us who Kenji Ekuan is, if I'm saying that correctly, 101 00:06:05,240 --> 00:06:08,880 Speaker 1: and what we can't understand about your 102 00:06:08,960 --> 00:06:10,400 Speaker 1: work without understanding him. 103 00:06:10,920 --> 00:06:17,760 Speaker 2: Kenji Ekuan. Kenji Ekuan was an absolutely legendary and iconic 104 00:06:17,880 --> 00:06:24,680 Speaker 2: Japanese designer. He was the founder of a company called 105 00:06:24,720 --> 00:06:28,560 Speaker 2: GK Design, which is a large Japanese group of design 106 00:06:28,600 --> 00:06:31,880 Speaker 2: and creative companies, and in 107 00:06:31,800 --> 00:06:33,600 Speaker 4: the early days of my career
108 00:06:36,480 --> 00:06:42,000 Speaker 2: I had the privilege of getting to meet him quite serendipitously. 109 00:06:42,720 --> 00:06:46,400 Speaker 2: I studied mechanical engineering originally, then studied industrial design at 110 00:06:46,400 --> 00:06:49,120 Speaker 2: the Royal College of Art. Wanted to be both a 111 00:06:49,160 --> 00:06:52,640 Speaker 2: deeply technical person and a highly creative person. You had 112 00:06:52,640 --> 00:06:54,920 Speaker 2: to kind of pick either or in the UK at 113 00:06:54,920 --> 00:06:59,440 Speaker 2: the time, so I was attracted to and moved to 114 00:06:59,480 --> 00:07:01,840 Speaker 2: a place where you didn't have to pick either or, 115 00:07:02,160 --> 00:07:05,680 Speaker 2: and one of those places was Japan. And so I 116 00:07:05,760 --> 00:07:08,320 Speaker 2: rolled up in Japan, but I didn't have a job 117 00:07:08,360 --> 00:07:11,800 Speaker 2: lined up, and via a friend of mine who was 118 00:07:11,840 --> 00:07:15,160 Speaker 2: already in Japan, I got an introduction to Kenji Ekuan, 119 00:07:15,360 --> 00:07:19,040 Speaker 2: who was this revered chairman of the most iconic design 120 00:07:19,080 --> 00:07:22,000 Speaker 2: firm in Japan. If you have ever used soy sauce 121 00:07:22,160 --> 00:07:25,880 Speaker 2: from one of those Kikkoman soy sauce bottles, the beautifully kind 122 00:07:25,920 --> 00:07:29,040 Speaker 2: of organically shaped one with the red lid, that's one 123 00:07:29,040 --> 00:07:30,560 Speaker 2: of Ekuan's designs. 124 00:07:30,840 --> 00:07:35,400 Speaker 1: One of the most ubiquitous and iconic consumer products there 125 00:07:35,240 --> 00:07:40,600 Speaker 2: is. Absolutely. And I found myself able to go and 126 00:07:40,640 --> 00:07:43,800 Speaker 2: meet the great man. I went into his office. I 127 00:07:45,120 --> 00:07:48,360 Speaker 2: was looking around me and there was this thing in 128 00:07:48,400 --> 00:07:52,080 Speaker 2: the corner that I couldn't help noticing, which was a 129 00:07:52,120 --> 00:07:59,640 Speaker 2: life size fiberglass human body kind of suspended in space 130 00:07:59,680 --> 00:08:03,080 Speaker 2: horizontally. There were two axles going through the shoulders 131 00:08:03,720 --> 00:08:08,080 Speaker 2: and the ankles, and big kind of chrome wagon wheels 132 00:08:08,080 --> 00:08:11,960 Speaker 2: on the axles. After we'd done with the sort of 133 00:08:11,960 --> 00:08:13,640 Speaker 2: introductory niceties, 134 00:08:13,200 --> 00:08:17,840 Speaker 4: I said, I couldn't help noticing that thing. What is that? 135 00:08:18,680 --> 00:08:20,200 Speaker 2: He just looked me in the eye and said, this 136 00:08:20,240 --> 00:08:23,680 Speaker 2: is my soul chariot, and pressed the switch and the 137 00:08:23,720 --> 00:08:27,600 Speaker 2: axles started rotating, and I was awestruck. I thought, this 138 00:08:27,640 --> 00:08:32,080 Speaker 2: guy's amazing. This is not your typical crusty senior executive. 139 00:08:32,160 --> 00:08:35,160 Speaker 2: He's talking about his soul chariot. And then we got 140 00:08:35,280 --> 00:08:40,080 Speaker 2: talking about what my aspirations were, and he looked at 141 00:08:40,120 --> 00:08:43,880 Speaker 2: some of my portfolio work from college and he said, 142 00:08:44,200 --> 00:08:48,720 Speaker 2: I want you to design motorcycles.
And through that experience, 143 00:08:48,760 --> 00:08:51,240 Speaker 2: as I worked in his design firm designing motorcycles, I 144 00:08:52,080 --> 00:08:54,760 Speaker 2: read a lot about him and learned his philosophy and 145 00:08:54,800 --> 00:08:57,760 Speaker 2: obviously had some exposure to him. And he has this 146 00:08:58,120 --> 00:09:00,679 Speaker 2: way of talking about the central endeavor of design, and 147 00:09:00,800 --> 00:09:03,320 Speaker 2: he calls it human machine soul energy. 148 00:09:03,600 --> 00:09:06,559 Speaker 1: That is the game: human machine soul energy. 149 00:09:06,640 --> 00:09:08,679 Speaker 2: That is the game, and that has stuck with me 150 00:09:08,800 --> 00:09:12,520 Speaker 2: ever since. So my career has been about the pursuit 151 00:09:12,559 --> 00:09:15,079 Speaker 2: of human machine soul energy. Still is. 152 00:09:15,880 --> 00:09:19,120 Speaker 1: That's a remarkable way of putting it, and especially at 153 00:09:19,160 --> 00:09:22,600 Speaker 1: a time when, I mean, there's all these questions about 154 00:09:23,360 --> 00:09:26,800 Speaker 1: emergent consciousness. There's people who are falling in love with 155 00:09:26,880 --> 00:09:31,040 Speaker 1: their LLMs, treating large language models as spiritual guides. I mean, 156 00:09:31,120 --> 00:09:35,239 Speaker 1: this quality of the soul when it comes to technology 157 00:09:35,320 --> 00:09:38,360 Speaker 1: is something which for me is a kind of 158 00:09:38,440 --> 00:09:41,760 Speaker 1: fascinating part at the reach of my understanding of what's 159 00:09:41,800 --> 00:09:44,360 Speaker 1: going on in Silicon Valley right now. But I haven't had 160 00:09:44,400 --> 00:09:47,240 Speaker 1: a chance to ask somebody really what they mean by 161 00:09:47,280 --> 00:09:48,559 Speaker 1: it and what they think about it. So I have 162 00:09:48,640 --> 00:09:51,120 Speaker 1: to ask you right now. I mean, how does 163 00:09:51,200 --> 00:09:54,760 Speaker 1: that phrase become newly relevant given everything we're living through 164 00:09:54,880 --> 00:09:55,160 Speaker 1: right now? 165 00:09:55,600 --> 00:09:59,079 Speaker 2: Well, it's newly relevant every time I visit it, but it's 166 00:10:00,480 --> 00:10:05,559 Speaker 2: particularly relevant right now because all of a sudden, really quickly, 167 00:10:06,720 --> 00:10:15,000 Speaker 2: there's a bunch of experiences and capabilities that were securely 168 00:10:15,080 --> 00:10:18,880 Speaker 2: on the human side of the human machine equation, and 169 00:10:19,760 --> 00:10:24,160 Speaker 2: now those are able to happen on the technology side. 170 00:10:24,760 --> 00:10:24,920 Speaker 3: You know. 171 00:10:25,840 --> 00:10:32,560 Speaker 2: Famously, there's a lot of anthropomorphization of what 172 00:10:32,679 --> 00:10:33,760 Speaker 4: the LLMs can do. 173 00:10:33,920 --> 00:10:37,360 Speaker 2: People use all this language that was previously used for 174 00:10:37,480 --> 00:10:40,440 Speaker 2: humans to describe how LLMs work. You know, they say 175 00:10:40,480 --> 00:10:45,000 Speaker 2: it's thinking, and, you know, the shoreline between human and 176 00:10:45,200 --> 00:10:51,160 Speaker 2: machine has changed rather radically, rather suddenly, and there's a 177 00:10:51,240 --> 00:10:55,080 Speaker 2: lot of figuring out that has to happen to ensure 178 00:10:55,120 --> 00:10:59,040 Speaker 2: that that change nets out positively for humans.
179 00:11:00,160 --> 00:11:01,920 Speaker 1: That's part of what you're working on, and I want to ask 180 00:11:02,000 --> 00:11:04,760 Speaker 1: you not just what the framing of the problem is, 181 00:11:04,840 --> 00:11:07,400 Speaker 1: but what some of the initial answers are. Before we 182 00:11:07,520 --> 00:11:10,880 Speaker 1: get there, though: you finish up in Japan and in 183 00:11:11,000 --> 00:11:14,319 Speaker 1: nineteen ninety seven you move to Silicon Valley, I believe, which is 184 00:11:14,400 --> 00:11:17,959 Speaker 1: the same year that Steve Jobs returns to Apple, and 185 00:11:18,040 --> 00:11:20,920 Speaker 1: you take a job at a firm called IDEO, which, 186 00:11:21,440 --> 00:11:23,439 Speaker 1: for those who don't know, I think, is kind of 187 00:11:23,520 --> 00:11:28,240 Speaker 1: a household name in this design thinking world. It was 188 00:11:28,280 --> 00:11:30,599 Speaker 1: also the company that worked with Apple, I think in 189 00:11:30,640 --> 00:11:35,160 Speaker 1: the eighties, to design Apple's first computer mouse. From nineteen 190 00:11:35,200 --> 00:11:39,080 Speaker 1: ninety seven to today, working at IDEO until twenty twenty two, 191 00:11:40,000 --> 00:11:44,640 Speaker 1: you witnessed firsthand revolution after revolution in terms of 192 00:11:44,720 --> 00:11:48,680 Speaker 1: the way humans interact with technology. Where do you place 193 00:11:49,000 --> 00:11:54,640 Speaker 1: twenty twenty two and the consumerization of generative AI in 194 00:11:54,720 --> 00:11:55,240 Speaker 1: that journey? 195 00:11:56,160 --> 00:11:59,400 Speaker 2: It's interesting because on the one hand, there's one way 196 00:11:59,440 --> 00:12:05,520 Speaker 2: of characterizing it, which is it's just another cycle, right, which is, okay, 197 00:12:05,600 --> 00:12:11,080 Speaker 2: there's a technology unlock, there's a bunch of investment that happens, 198 00:12:11,120 --> 00:12:14,319 Speaker 2: and a bunch of hype that happens, and a kind of 199 00:12:14,440 --> 00:12:20,920 Speaker 2: Darwinistic explosion of invention of new value that happens. All 200 00:12:20,960 --> 00:12:24,800 Speaker 2: of that is true. And so at some level it 201 00:12:24,920 --> 00:12:27,679 Speaker 2: feels like Web two point oh, or feels like the 202 00:12:27,760 --> 00:12:32,640 Speaker 2: dotcom moment. But to me it feels very different in 203 00:12:32,800 --> 00:12:38,679 Speaker 2: terms of how pervasive this technology is already and definitely 204 00:12:38,800 --> 00:12:43,640 Speaker 2: will be. But I think the fundamental character of the 205 00:12:44,240 --> 00:12:50,559 Speaker 2: technology makes its potential even more significant. And then for me, 206 00:12:51,600 --> 00:12:54,360 Speaker 2: the attraction to IDEO was a continuation of that through line. 207 00:12:54,360 --> 00:12:57,319 Speaker 2: You know, an engineer who went to art school, went to 208 00:12:57,480 --> 00:13:01,120 Speaker 2: Japan to be both technologist and creative, and then 209 00:13:01,240 --> 00:13:09,000 Speaker 2: IDEO famously resolved any dichotomy there. And so IDEO's culture's 210 00:13:09,640 --> 00:13:14,320 Speaker 2: point of view philosophically is that it's absolutely user 211 00:13:14,440 --> 00:13:20,120 Speaker 2: centered, and I was very motivated to bring that, you know, 212 00:13:20,280 --> 00:13:24,720 Speaker 2: solving for the user, as a kind of fundamental approach 213 00:13:26,240 --> 00:13:29,520 Speaker 2: to this extremely significant technology moment.
214 00:13:29,960 --> 00:13:32,000 Speaker 1: Yeah, and going back to the mission of Google Labs 215 00:13:32,000 --> 00:13:34,959 Speaker 1: that I read at the beginning: Google Labs is where you, 216 00:13:35,600 --> 00:13:38,240 Speaker 1: I guess "you" means the user in this case, can 217 00:13:38,320 --> 00:13:43,079 Speaker 1: discover and try Google's latest AI experimental products and help 218 00:13:43,160 --> 00:13:46,160 Speaker 1: shape the future of AI technology. A very interesting 219 00:13:46,240 --> 00:13:49,160 Speaker 1: second half of that sentence, especially because it's somewhat at 220 00:13:49,160 --> 00:13:52,000 Speaker 1: odds with how people think about the development of AI. 221 00:13:52,200 --> 00:13:52,320 Speaker 4: Right. 222 00:13:52,400 --> 00:13:55,880 Speaker 1: The general model is, here's a bunch of computer scientists 223 00:13:55,960 --> 00:13:59,920 Speaker 1: who are not particularly interested in humans or even humanity 224 00:14:00,200 --> 00:14:04,000 Speaker 1: more broadly, and are willing to write it off in 225 00:14:04,120 --> 00:14:08,360 Speaker 1: favor of a, you know, silicon superintelligence. But here you 226 00:14:08,520 --> 00:14:11,880 Speaker 1: are promising us a role in the development of 227 00:14:11,880 --> 00:14:12,559 Speaker 1: the technology. 228 00:14:13,480 --> 00:14:14,000 Speaker 4: How do you do that? 229 00:14:14,760 --> 00:14:18,559 Speaker 2: Yeah, that is absolutely the case, certainly something that I 230 00:14:18,640 --> 00:14:22,480 Speaker 2: bring from IDEO. And certainly in my experience of the 231 00:14:22,600 --> 00:14:26,000 Speaker 2: engineering culture at Google, I don't think it is in 232 00:14:26,120 --> 00:14:30,000 Speaker 2: denial about the user. But what you do is, 233 00:14:30,680 --> 00:14:34,080 Speaker 2: you know, in the approach that IDEO famously used across 234 00:14:34,120 --> 00:14:40,080 Speaker 2: all of these waves of technology, it's to characterize successful 235 00:14:40,120 --> 00:14:42,800 Speaker 2: innovation as being in the sort of sweet spot 236 00:14:42,920 --> 00:14:49,680 Speaker 2: between feasibility, desirability, and viability. Meaning there's a three circle 237 00:14:49,760 --> 00:14:55,800 Speaker 2: diagram which is technology, human, and business, and a successful 238 00:14:55,800 --> 00:14:58,000 Speaker 2: innovation has to be in the middle of that three 239 00:14:58,160 --> 00:15:02,880 Speaker 2: circle diagram. It has to matter to humans and be 240 00:15:03,000 --> 00:15:05,720 Speaker 2: desirable to them for some reason, and it doesn't get 241 00:15:05,720 --> 00:15:07,880 Speaker 2: to exist unless there's a business model that supports it, 242 00:15:08,760 --> 00:15:15,040 Speaker 2: and often the conditions around a new technology moment, a 243 00:15:15,120 --> 00:15:20,120 Speaker 2: new platform moment, can lead to getting to that middle 244 00:15:20,120 --> 00:15:24,640 Speaker 2: of the Venn diagram territory from the technology circle. So 245 00:15:24,800 --> 00:15:29,440 Speaker 2: a technology push like that can often feel like an answer 246 00:15:29,560 --> 00:15:30,360 Speaker 2: looking for a question. 247 00:15:30,680 --> 00:15:31,480 Speaker 4: It's like, we've 248 00:15:31,360 --> 00:15:36,160 Speaker 2: invented this new capability, let's figure out how to make 249 00:15:36,240 --> 00:15:36,960 Speaker 2: people want it.
250 00:15:37,280 --> 00:15:39,760 Speaker 1: With all due respect to Mark Zuckerberg, this is the, 251 00:15:40,320 --> 00:15:42,400 Speaker 1: if people are lonely, why not give them fifteen AI 252 00:15:42,600 --> 00:15:45,080 Speaker 1: friends, school of thought. I don't want to speak out 253 00:15:45,400 --> 00:15:46,480 Speaker 1: of turn, so... right. 254 00:15:48,560 --> 00:15:50,880 Speaker 2: And it's often a blind spot for the technologists. I mean, 255 00:15:50,880 --> 00:15:56,800 Speaker 2: the technologists are not necessarily cold or impersonal or cynical. 256 00:15:56,840 --> 00:16:00,240 Speaker 2: They're the proud parents of this new technology, but they might 257 00:16:00,280 --> 00:16:03,760 Speaker 2: not realize that their baby's... their baby's ugly, you know. 258 00:16:04,160 --> 00:16:07,520 Speaker 4: And what IDEO 259 00:16:07,720 --> 00:16:13,520 Speaker 2: strongly believes, what I believe, and what we practice in 260 00:16:13,680 --> 00:16:18,160 Speaker 2: Labs, is that you have to, at the very least, 261 00:16:19,280 --> 00:16:22,600 Speaker 2: also fold in the human at the earliest stage, 262 00:16:22,720 --> 00:16:27,480 Speaker 2: and preferably drive it from the user. 263 00:16:27,640 --> 00:16:29,600 Speaker 2: You have to go out there and get curious about 264 00:16:29,600 --> 00:16:33,000 Speaker 2: what matters to people on their terms and 265 00:16:33,320 --> 00:16:35,400 Speaker 2: use what you learn there as the jumping off point 266 00:16:35,600 --> 00:16:39,440 Speaker 2: for generating new ideas about value that you can deliver 267 00:16:39,560 --> 00:16:43,360 Speaker 2: with the new technology. When you go about it that way, 268 00:16:44,160 --> 00:16:50,920 Speaker 2: you're far more likely to create experiences that users love. 269 00:16:51,560 --> 00:16:53,880 Speaker 2: You're more likely to create the kind of value that 270 00:16:54,080 --> 00:16:58,160 Speaker 2: you can defend over time as a company because 271 00:16:58,200 --> 00:17:02,360 Speaker 2: you're in an emotional relationship with your users, 272 00:17:03,200 --> 00:17:06,480 Speaker 2: and it generally works out a lot better. The thing 273 00:17:06,520 --> 00:17:11,120 Speaker 2: that it's often in tension with is speed, because when 274 00:17:11,160 --> 00:17:14,399 Speaker 2: a new technology appears, time is of the essence, and 275 00:17:14,480 --> 00:17:17,400 Speaker 2: that's very real, and so you have to achieve both. 276 00:17:17,480 --> 00:17:25,080 Speaker 1: Ahead, when we come back: the double edged sword of 277 00:17:25,160 --> 00:17:29,520 Speaker 1: technology, and what happens when you innovate too quickly. Stay 278 00:17:29,560 --> 00:17:44,840 Speaker 1: with us. So, I mean, let's anchor this in the 279 00:17:44,960 --> 00:17:49,880 Speaker 1: tangible world of products that you have incubated and launched 280 00:17:50,240 --> 00:17:50,960 Speaker 1: within Labs. 281 00:17:51,040 --> 00:17:51,159 Speaker 4: Right. 282 00:17:51,280 --> 00:17:53,960 Speaker 1: So, Google's developer conference was in May of this year. 283 00:17:54,040 --> 00:17:58,320 Speaker 1: I think there were four Labs launches that were announced, 284 00:17:59,160 --> 00:18:00,879 Speaker 1: maybe five. I'm not sure, but correct me if I'm wrong. 285 00:18:00,920 --> 00:18:04,760 Speaker 1: But take one of them and explain how it reflects 286 00:18:04,840 --> 00:18:06,880 Speaker 1: the process you've just described.
287 00:18:07,359 --> 00:18:10,359 Speaker 2: Yeah, so let's take one of our recently 288 00:18:10,440 --> 00:18:15,639 Speaker 2: shared ones, Flow. So Flow is the surface that lets people 289 00:18:16,119 --> 00:18:20,400 Speaker 2: interact with the capabilities of the generative video model Veo three. 290 00:18:21,080 --> 00:18:25,080 Speaker 2: Veo is the name of the latest video generation model 291 00:18:25,960 --> 00:18:30,600 Speaker 2: Google has. So, you know, you can convey your intention 292 00:18:30,720 --> 00:18:35,840 Speaker 2: to this model and it will generate incredible, indiscernible from 293 00:18:35,880 --> 00:18:39,359 Speaker 2: reality videos. So Veo is the model, Flow is the product. 294 00:18:39,440 --> 00:18:44,240 Speaker 2: Flow is the means of conveying that intention to the model 295 00:18:44,280 --> 00:18:48,600 Speaker 2: as a filmmaker. So you communicate what you want the 296 00:18:48,680 --> 00:18:54,720 Speaker 2: scene to be, you communicate camera instructions, you communicate edits 297 00:18:54,800 --> 00:18:58,159 Speaker 2: to it. And Flow as a Labs product, just as 298 00:18:58,160 --> 00:19:01,520 Speaker 2: an example of how that comes about: you know, from 299 00:19:01,600 --> 00:19:05,639 Speaker 2: the beginning of Flow, we worked very closely, collaboratively, with 300 00:19:06,240 --> 00:19:10,399 Speaker 2: movie makers. One of our fundamental principles in Labs is 301 00:19:10,440 --> 00:19:13,600 Speaker 2: co-create. Find the people that you're interested in 302 00:19:14,520 --> 00:19:18,040 Speaker 2: making something for and set things up so that you 303 00:19:18,119 --> 00:19:21,679 Speaker 2: can actually work with them, and you're in a very, 304 00:19:21,800 --> 00:19:25,040 Speaker 2: very close feedback loop about how well you're doing in 305 00:19:25,160 --> 00:19:28,320 Speaker 2: terms of delivering the kind of experiences that matter to them. 306 00:19:28,359 --> 00:19:31,680 Speaker 2: So we worked with a bunch of filmmakers in developing Flow. 307 00:19:32,400 --> 00:19:36,080 Speaker 2: And it's really interesting, you mentioned NotebookLM earlier; there's 308 00:19:36,080 --> 00:19:39,000 Speaker 2: another kind of co-creation there. We worked with Steven Johnson, 309 00:19:39,560 --> 00:19:47,520 Speaker 2: who's a best selling writer who had been obsessed for years, 310 00:19:47,680 --> 00:19:50,600 Speaker 2: for like a decade, with this idea of tools for 311 00:19:50,800 --> 00:19:57,159 Speaker 2: thought and ways of collecting and formatting all of the 312 00:19:57,480 --> 00:20:01,520 Speaker 2: stuff that he uses when writing a novel, both 313 00:20:01,560 --> 00:20:06,159 Speaker 2: his own previous writing, his notes, his references, and figuring 314 00:20:06,160 --> 00:20:07,600 Speaker 2: out how to make it most useful. 315 00:20:07,680 --> 00:20:09,560 Speaker 4: So he was like the 316 00:20:10,760 --> 00:20:16,760 Speaker 2: original super user of NotebookLM before NotebookLM existed. 317 00:20:16,960 --> 00:20:19,800 Speaker 2: So we hired him and brought him onto the team 318 00:20:20,080 --> 00:20:22,959 Speaker 2: and co-created the product with him, and a lot 319 00:20:23,040 --> 00:20:26,920 Speaker 2: of the core DNA of NotebookLM is what it 320 00:20:27,119 --> 00:20:28,160 Speaker 2: is because of Steven.
321 00:20:29,040 --> 00:20:31,080 Speaker 1: What's cool about NotebookLM is rather than just 322 00:20:31,160 --> 00:20:35,920 Speaker 1: saying to it, you know, Gemini, let's say, I'm interested 323 00:20:35,960 --> 00:20:40,720 Speaker 1: in the history of the Nutcracker and, you know, ballet, 324 00:20:41,480 --> 00:20:44,480 Speaker 1: you can say, NotebookLM, here's, you know, a great 325 00:20:44,920 --> 00:20:48,600 Speaker 1: New Yorker piece about the Nutcracker, a review of a 326 00:20:48,680 --> 00:20:53,000 Speaker 1: contemporary production at the Brooklyn Academy of Music, and here 327 00:20:53,520 --> 00:20:57,040 Speaker 1: is, you know, a long document about the history of dance. 328 00:20:57,680 --> 00:20:59,280 Speaker 1: I don't want you to look at anything else, just these 329 00:20:59,320 --> 00:21:01,720 Speaker 1: three documents, and then provide me with insights. And so, 330 00:21:02,160 --> 00:21:04,080 Speaker 1: in a sense, it puts you in the driver's seat, 331 00:21:04,200 --> 00:21:06,760 Speaker 1: in terms of, it's more like having a research assistant 332 00:21:06,880 --> 00:21:09,479 Speaker 1: than just sending out a query and seeing what happens. 333 00:21:09,560 --> 00:21:09,720 Speaker 4: Right. 334 00:21:10,280 --> 00:21:13,720 Speaker 1: But then the kind of additional layer was, and you 335 00:21:13,800 --> 00:21:15,800 Speaker 1: can hear it as a two-way podcast, and that 336 00:21:16,320 --> 00:21:18,680 Speaker 1: final thing seemed to be what blew people's minds most 337 00:21:18,720 --> 00:21:18,959 Speaker 1: of all. 338 00:21:19,359 --> 00:21:19,879 Speaker 4: Absolutely. 339 00:21:19,960 --> 00:21:23,159 Speaker 2: I mean, I wasn't entirely surprised that it blew people's 340 00:21:23,200 --> 00:21:26,000 Speaker 2: minds, because we had been having our minds blown 341 00:21:27,000 --> 00:21:31,359 Speaker 2: internally before we launched it, and we knew that this 342 00:21:31,640 --> 00:21:34,720 Speaker 2: is gold. You know, this is amazing in terms 343 00:21:34,760 --> 00:21:39,679 Speaker 2: of, like, delivering on the make the world's information accessible 344 00:21:39,800 --> 00:21:46,000 Speaker 2: and useful mission. This directly served that in a super fresh way 345 00:21:46,560 --> 00:21:50,119 Speaker 2: that was super delightful. You know, a big part of 346 00:21:51,119 --> 00:21:54,880 Speaker 2: user experience design is, don't over index on yourself as the user, 347 00:21:55,200 --> 00:21:59,959 Speaker 2: but as a user of NotebookLM, the magic has 348 00:22:00,080 --> 00:22:03,520 Speaker 2: landed more often than not. One of the early comments in 349 00:22:03,640 --> 00:22:08,160 Speaker 2: our Discord feed from actual users about it, well, somebody said, 350 00:22:08,400 --> 00:22:13,320 Speaker 2: it makes boring stuff interesting and I use it all 351 00:22:13,400 --> 00:22:14,760 Speaker 2: day every day to do that. 352 00:22:15,200 --> 00:22:18,119 Speaker 1: I love it. To play devil's advocate for a moment: 353 00:22:18,800 --> 00:22:23,240 Speaker 1: some people say, well, you know, in the age of 354 00:22:23,400 --> 00:22:26,680 Speaker 1: LLMs and generative AI, anything that sits 355 00:22:26,720 --> 00:22:29,520 Speaker 1: on top of a model is just a wrapper. Uh, 356 00:22:30,240 --> 00:22:33,080 Speaker 1: and so is Labs in the wrapper business? 357 00:22:34,280 --> 00:22:37,120 Speaker 2: Yeah, I mean, like, who isn't in the wrapper business?
358 00:22:39,119 --> 00:22:41,879 Speaker 2: It's like, I guess I don't really agree with the frame. 359 00:22:42,480 --> 00:22:43,800 Speaker 1: No, I'm not. 360 00:22:44,240 --> 00:22:44,480 Speaker 3: I'm not. 361 00:22:46,160 --> 00:22:51,800 Speaker 4: Is every car brand a wrapper around internal combustion? Yeah. 362 00:22:51,920 --> 00:22:54,679 Speaker 1: But to be a bit more substantive about this, right: 363 00:22:56,160 --> 00:22:58,920 Speaker 1: why do I need Flow? Why can't I just tell 364 00:22:59,480 --> 00:23:02,040 Speaker 1: Veo three, really, in natural language, what I want to achieve? 365 00:23:02,160 --> 00:23:04,159 Speaker 1: Like, why, as a filmmaker, do I need like a 366 00:23:04,200 --> 00:23:07,080 Speaker 1: product layer beyond natural language processing between me and a 367 00:23:07,160 --> 00:23:08,200 Speaker 1: video generation model? 368 00:23:09,280 --> 00:23:12,600 Speaker 2: Yeah, I mean, I think you're talking to a UX person. 369 00:23:12,800 --> 00:23:15,200 Speaker 2: I think it's all about that layer. It's all about 370 00:23:15,200 --> 00:23:18,720 Speaker 2: the application layer. It's all about the qualities of the 371 00:23:18,880 --> 00:23:24,640 Speaker 2: interaction experience with the underlying technology. I think we've been 372 00:23:24,760 --> 00:23:30,879 Speaker 2: in the sort of command line era for AI 373 00:23:31,119 --> 00:23:34,639 Speaker 2: thus far, where it's all about kind of the user 374 00:23:35,760 --> 00:23:41,840 Speaker 2: learning this arcane language of prompts and contorting what they 375 00:23:41,880 --> 00:23:44,920 Speaker 2: want to make happen into that language to feed it 376 00:23:45,000 --> 00:23:47,760 Speaker 2: to the model. That doesn't bring me joy as a user. 377 00:23:48,320 --> 00:23:50,760 Speaker 2: I think it doesn't have to be that way. You know, 378 00:23:50,920 --> 00:23:56,000 Speaker 2: we can figure out delightful interactions for the user to 379 00:23:56,240 --> 00:23:58,840 Speaker 2: convey their intent to the model and for them to 380 00:23:58,920 --> 00:24:01,639 Speaker 2: work with the model. And I think it's all about that. 381 00:24:01,760 --> 00:24:03,960 Speaker 2: If you look at how value is built and 382 00:24:04,040 --> 00:24:07,800 Speaker 2: defended by the world's most successful companies, certainly in the 383 00:24:07,880 --> 00:24:13,439 Speaker 2: consumer sector, it's about UX, your UX. What are 384 00:24:13,480 --> 00:24:17,320 Speaker 2: the qualities of that experience? That's what builds the 385 00:24:17,440 --> 00:24:19,919 Speaker 2: character and the brand of a company. 386 00:24:20,240 --> 00:24:23,720 Speaker 1: You mentioned business models yourself, right? Like, where does a 387 00:24:23,960 --> 00:24:26,760 Speaker 1: Flow or a NotebookLM... like, how much is it 388 00:24:26,880 --> 00:24:30,040 Speaker 1: your job to monetize these new products that you're creating? 389 00:24:31,240 --> 00:24:35,840 Speaker 4: Certainly not my job to monetize them. You know, 390 00:24:35,880 --> 00:24:40,240 Speaker 2: the way tech companies in general are configured is you've 391 00:24:40,240 --> 00:24:43,040 Speaker 2: got engineering as a discipline. You've got UX as a discipline.
392 00:24:43,040 --> 00:24:45,000 Speaker 2: You've got product management as a discipline, and the product 393 00:24:45,040 --> 00:24:49,480 Speaker 2: managers are the CEOs, essentially, of the endeavor, 394 00:24:50,119 --> 00:24:54,520 Speaker 2: typically, and it's their job to take an innovation on 395 00:24:54,600 --> 00:25:01,800 Speaker 2: the trajectory from glimpse of value to users to successful product, 396 00:25:02,160 --> 00:25:10,680 Speaker 2: including monetization strategy, usually through phases. And so when you're 397 00:25:10,720 --> 00:25:17,040 Speaker 2: doing early stage, it's less about, are you hitting a 398 00:25:17,320 --> 00:25:20,240 Speaker 2: weekly active user target, and it's more about, are you 399 00:25:20,440 --> 00:25:24,240 Speaker 2: seeing fire in the eyes of the users that you're 400 00:25:24,280 --> 00:25:28,200 Speaker 2: putting it in front of. And certainly for me, my 401 00:25:28,400 --> 00:25:30,560 Speaker 2: currency is more of the latter than the former. 402 00:25:31,359 --> 00:25:33,480 Speaker 4: You know, I'm all about helping 403 00:25:33,280 --> 00:25:37,120 Speaker 2: to create experiences that induce that kind of 404 00:25:37,840 --> 00:25:40,320 Speaker 2: response in users, and then, 405 00:25:40,960 --> 00:25:44,920 Speaker 2: you know, PMs then have something to play with to 406 00:25:45,080 --> 00:25:51,000 Speaker 2: take on the journey towards, like, successful products, including monetization. 407 00:25:51,880 --> 00:25:55,200 Speaker 1: So that moment for you, when the internet and Mark 408 00:25:55,240 --> 00:25:59,000 Speaker 1: Cuban and Karpathy and regular people on Reddit are going 409 00:25:59,160 --> 00:26:02,200 Speaker 1: crazy for NotebookLM, that's a moment, 410 00:26:02,400 --> 00:26:04,560 Speaker 1: that is the magic in the bottle that you spend 411 00:26:04,600 --> 00:26:06,160 Speaker 1: every day in search of, essentially. 412 00:26:06,840 --> 00:26:10,160 Speaker 2: Yeah, I mean, certainly in terms of outcome. The other 413 00:26:10,280 --> 00:26:14,000 Speaker 2: magic in the bottle, like the thing that really motivates 414 00:26:14,080 --> 00:26:17,480 Speaker 2: me at least as much as that, is that there's 415 00:26:17,560 --> 00:26:20,000 Speaker 2: something about just the sort of thrill of the chase, 416 00:26:20,520 --> 00:26:25,280 Speaker 2: this idea of, kind of, before launch, when you've got 417 00:26:25,480 --> 00:26:31,480 Speaker 2: like a team, there's this sort of mutual creative euphoric 418 00:26:31,640 --> 00:26:35,399 Speaker 2: state you get into when you're pushing hard because 419 00:26:35,440 --> 00:26:38,320 Speaker 2: you know there's something that you can bring into existence, 420 00:26:38,960 --> 00:26:41,640 Speaker 2: and if you set things up right with the team, 421 00:26:42,560 --> 00:26:46,639 Speaker 2: they're all kind of egging each other on. They're causing 422 00:26:47,400 --> 00:26:52,480 Speaker 2: each other to be at their creative best. And then 423 00:26:53,119 --> 00:26:58,080 Speaker 2: when it results in an outcome that is obviously striking 424 00:26:58,119 --> 00:27:02,320 Speaker 2: a chord with users and hopefully making a difference in 425 00:27:02,359 --> 00:27:04,919 Speaker 2: the world, then that's... that's gravy, 426 00:27:05,480 --> 00:27:05,680 Speaker 4: you know.
427 00:27:07,080 --> 00:27:09,399 Speaker 1: One of the products that you guys announced at 428 00:27:09,640 --> 00:27:12,400 Speaker 1: the developer conference I'd love to spend just a couple 429 00:27:12,440 --> 00:27:16,639 Speaker 1: of minutes on is Project Mariner. I mean, hosting a 430 00:27:16,760 --> 00:27:20,200 Speaker 1: podcast called TechStuff, we're constantly hearing about agents and 431 00:27:20,359 --> 00:27:25,800 Speaker 1: agentic AI, and even I don't fully know what that means. 432 00:27:26,040 --> 00:27:28,800 Speaker 1: I think it means AIs that go out in the 433 00:27:28,880 --> 00:27:31,680 Speaker 1: world and are able to take action on your behalf 434 00:27:32,320 --> 00:27:36,640 Speaker 1: in different fields. So I might say, like, not just, hey 435 00:27:36,760 --> 00:27:39,280 Speaker 1: Gemini, what would be a great itinerary for ten days in 436 00:27:39,520 --> 00:27:41,959 Speaker 1: Italy if I want to avoid the crowds in August, 437 00:27:42,040 --> 00:27:45,720 Speaker 1: but, hey Agent, I want to go to Italy for 438 00:27:45,760 --> 00:27:49,640 Speaker 1: ten days in August. Here's my budget. Please come back 439 00:27:49,720 --> 00:27:52,960 Speaker 1: with a fully, you know, executed trip. Is that 440 00:27:53,040 --> 00:27:55,840 Speaker 1: a fair summary? And then what does Project Mariner do? 441 00:27:56,920 --> 00:28:02,080 Speaker 2: I think that's... your framing is a fair 442 00:28:02,320 --> 00:28:04,840 Speaker 2: summary of, you know, the idea of what an agent 443 00:28:04,960 --> 00:28:10,840 Speaker 2: is. Project Mariner is an early experiment in achieving that. So 444 00:28:11,840 --> 00:28:16,000 Speaker 2: it's aimed at creating what you just described, which is an agent 445 00:28:16,119 --> 00:28:19,680 Speaker 2: that can understand your intention and go and take action 446 00:28:19,840 --> 00:28:24,840 Speaker 2: in service of that, primarily via browser based activities. 447 00:28:25,440 --> 00:28:27,760 Speaker 1: And what does that mean? Like, if I 448 00:28:27,920 --> 00:28:31,560 Speaker 1: use Mariner, what will my experience be? And in 449 00:28:31,680 --> 00:28:34,640 Speaker 1: your user research, how long do you think it'll take 450 00:28:35,200 --> 00:28:38,760 Speaker 1: the average user to be comfortable with AI making purchase 451 00:28:38,800 --> 00:28:40,680 Speaker 1: decisions and spending their money, essentially? 452 00:28:40,840 --> 00:28:45,200 Speaker 2: Well, one of the things that it doesn't do 453 00:28:45,360 --> 00:28:48,080 Speaker 2: so far is make the purchases for you, right? It's 454 00:28:48,160 --> 00:28:51,720 Speaker 2: not at that point yet. But it will go through 455 00:28:51,800 --> 00:28:57,280 Speaker 2: a reasonable set of research and choice characterization and recommendation 456 00:28:57,800 --> 00:29:02,120 Speaker 2: steps that emulate what you would do. The purchase 457 00:29:02,120 --> 00:29:08,880 Speaker 2: decision is still left to the user. So that's exactly 458 00:29:08,880 --> 00:29:11,240 Speaker 2: the kind of thing that it does. But it's not 459 00:29:11,320 --> 00:29:15,000 Speaker 2: just for purchasing. It can be for research tasks.
You know, 460 00:29:15,240 --> 00:29:18,120 Speaker 2: go figure out all there is to know about this 461 00:29:18,280 --> 00:29:21,760 Speaker 2: thing that I'm interested in, do some reasoning associated with that, 462 00:29:22,720 --> 00:29:26,880 Speaker 2: and come back with a set of conclusions or recommendations 463 00:29:27,600 --> 00:29:30,720 Speaker 2: that help me. And so it's those sorts of things. 464 00:29:30,760 --> 00:29:34,040 Speaker 2: You can point it at tasks and it can 465 00:29:34,120 --> 00:29:38,479 Speaker 2: go and, you know, pursue those tasks on your behalf. 466 00:29:39,000 --> 00:29:41,160 Speaker 1: David, I have to ask you, because, you know, you had 467 00:29:41,200 --> 00:29:45,080 Speaker 1: this amazing experience of going to Japan as a young man, 468 00:29:46,840 --> 00:29:50,000 Speaker 1: you know, meeting one of your idols and then going 469 00:29:50,040 --> 00:29:54,040 Speaker 1: out and being assigned the task of, you know, designing 470 00:29:54,120 --> 00:29:57,640 Speaker 1: motorcycles. Right? Part of what you had to do 471 00:29:57,840 --> 00:30:02,400 Speaker 1: in that process, I assume, was study other motorcycles, meet 472 00:30:02,480 --> 00:30:07,240 Speaker 1: motorcycle riders, essentially develop the playbook that you refined at 473 00:30:07,800 --> 00:30:11,600 Speaker 1: IDEO and then are executing on now at Google Labs. 474 00:30:12,680 --> 00:30:16,320 Speaker 1: Do you worry about a future in which products 475 00:30:16,440 --> 00:30:21,000 Speaker 1: like Mariner, the agentic products, get so good that there 476 00:30:21,120 --> 00:30:25,560 Speaker 1: is no incentive for humans to go into the world 477 00:30:26,440 --> 00:30:30,560 Speaker 1: and develop their own skills and vision by grinding through 478 00:30:31,320 --> 00:30:34,520 Speaker 1: tasks that could be automated, but that it was to 479 00:30:34,600 --> 00:30:38,600 Speaker 1: our benefit that they were not automated, because it allowed 480 00:30:38,680 --> 00:30:41,920 Speaker 1: us to become the people and thinkers and doers that 481 00:30:42,000 --> 00:30:44,720 Speaker 1: we are today? I mean, how do 482 00:30:44,760 --> 00:30:45,400 Speaker 1: you grapple with that? 483 00:30:46,400 --> 00:30:46,760 Speaker 4: Yeah, I 484 00:30:48,720 --> 00:30:53,120 Speaker 2: acknowledge it as a reasonable question. Personally, I don't 485 00:30:53,120 --> 00:30:55,520 Speaker 2: worry too much about it because, it's funny, you 486 00:30:55,600 --> 00:31:02,880 Speaker 2: know, over a career working in design, to those who 487 00:31:03,000 --> 00:31:07,000 Speaker 2: aren't in it, it can often seem like the answer to 488 00:31:07,160 --> 00:31:12,560 Speaker 2: good design is remove friction, right? And hey, if you 489 00:31:12,680 --> 00:31:16,120 Speaker 2: just remove friction, then it's a great 490 00:31:16,160 --> 00:31:19,160 Speaker 2: design solution. I think there's a little bit more to 491 00:31:19,320 --> 00:31:21,400 Speaker 2: it than that, you know. To answer your question about 492 00:31:21,440 --> 00:31:26,440 Speaker 2: my career and all of the struggle and toil that 493 00:31:27,120 --> 00:31:31,200 Speaker 2: resulted in hard lessons learned: do I think those same 494 00:31:31,320 --> 00:31:35,920 Speaker 2: lessons would have been learned with some help with that toil? 495 00:31:36,280 --> 00:31:39,760 Speaker 2: I think they probably would. I think there's probably a 496 00:31:39,840 --> 00:31:42,520 Speaker 2: sweet spot.
Humans are all different, but I think most 497 00:31:42,640 --> 00:31:46,360 Speaker 2: humans are at their best when there's a little bit 498 00:31:46,480 --> 00:31:49,680 Speaker 2: of striving going on, when there's a little bit of 499 00:31:50,840 --> 00:31:54,760 Speaker 2: delightful struggle going on. That tends to bring out the 500 00:31:54,840 --> 00:31:55,320 Speaker 2: best in us. 501 00:31:56,480 --> 00:31:57,280 Speaker 4: And I think 502 00:31:58,600 --> 00:32:03,000 Speaker 2: maybe it's not about eliminating the possibility of that. 503 00:32:03,320 --> 00:32:08,240 Speaker 2: Maybe it's about being able to be more targeted 504 00:32:08,560 --> 00:32:10,600 Speaker 2: with that in service of one's growth. 505 00:32:11,160 --> 00:32:14,880 Speaker 1: On the subject of designing user experiences, do you have 506 00:32:15,640 --> 00:32:17,720 Speaker 1: a dream product in the back of your mind? 507 00:32:18,360 --> 00:32:20,680 Speaker 2: Rather than a specific dream product, there's a dream category 508 00:32:21,800 --> 00:32:26,600 Speaker 2: that I'm excited about personally, and it's this category of 509 00:32:26,720 --> 00:32:29,560 Speaker 2: physical AI. You know, what is the set of 510 00:32:30,800 --> 00:32:39,080 Speaker 2: opportunities for beautiful new user experiences beyond looking at pixels 511 00:32:39,120 --> 00:32:40,320 Speaker 2: and wiggling our fingers? 512 00:32:40,920 --> 00:32:42,360 Speaker 4: And I very much look forward to that. 513 00:32:42,480 --> 00:32:45,280 Speaker 2: If I look at the experiences I enjoy, if I 514 00:32:45,360 --> 00:32:47,800 Speaker 2: look at the weekend I just had, I would say 515 00:32:47,800 --> 00:32:51,240 Speaker 2: that my top five experiences were all physical in nature. 516 00:32:51,280 --> 00:32:54,360 Speaker 2: They were all about me being up and about, moving my 517 00:32:54,440 --> 00:33:00,800 Speaker 2: body in the world. I am really excited about new 518 00:33:02,240 --> 00:33:07,360 Speaker 2: presentations of this technology that enhance my physical experience, and 519 00:33:07,480 --> 00:33:09,040 Speaker 2: I think that is coming. 520 00:33:09,480 --> 00:33:12,680 Speaker 1: You gave a TED talk a few years ago, in Glasgow 521 00:33:12,760 --> 00:33:16,880 Speaker 1: I think, and one of your warnings was about how tech, 522 00:33:17,640 --> 00:33:20,440 Speaker 1: in its eagerness to scale, could sometimes sprint in the 523 00:33:20,480 --> 00:33:24,680 Speaker 1: wrong direction without kind of pausing to assess. You gave 524 00:33:24,760 --> 00:33:26,800 Speaker 1: that TED talk before you worked at one of the 525 00:33:26,840 --> 00:33:30,840 Speaker 1: world's largest and most powerful tech companies. But I'm curious, 526 00:33:30,920 --> 00:33:33,480 Speaker 1: you know, how do you apply that thinking and that 527 00:33:33,680 --> 00:33:36,920 Speaker 1: lesson that you developed on the outside now that you're 528 00:33:36,960 --> 00:33:37,520 Speaker 1: on the inside? 529 00:33:38,040 --> 00:33:39,680 Speaker 2: Yeah, I mean, I still very much believe in that. 530 00:33:40,160 --> 00:33:46,920 Speaker 2: Interestingly, I was, you know, not inside one of the 531 00:33:47,160 --> 00:33:50,640 Speaker 2: largest and most powerful tech companies, but I was consulting 532 00:33:50,760 --> 00:33:53,560 Speaker 2: with many of them at the time. So I did 533 00:33:53,640 --> 00:33:56,640 Speaker 2: have a front row seat into that dynamic.
You know, 534 00:33:57,640 --> 00:34:00,760 Speaker 2: when a platform shift occurs, in a very 535 00:34:00,880 --> 00:34:05,320 Speaker 2: real way, history shows time is of the essence. 536 00:34:06,640 --> 00:34:13,960 Speaker 2: It matters to ship fast, and that can have unintended 537 00:34:14,040 --> 00:34:19,640 Speaker 2: consequences if it's at the expense of curiosity, curiosity about users, 538 00:34:19,760 --> 00:34:24,680 Speaker 2: curiosity about consequences. I mean, you know, I guess the 539 00:34:24,760 --> 00:34:26,600 Speaker 2: fact that I am in this company doing what I 540 00:34:26,760 --> 00:34:33,279 Speaker 2: do is evidence of user curiosity, right? I mean, 541 00:34:33,320 --> 00:34:35,480 Speaker 2: my answer to your question, what do you do 542 00:34:35,600 --> 00:34:37,600 Speaker 2: about it, do you worry about it? Well, I don't 543 00:34:37,640 --> 00:34:41,040 Speaker 2: worry about it if you're folding in, like, strong professional 544 00:34:41,160 --> 00:34:44,040 Speaker 2: UX practice from the get go, you know, as you 545 00:34:44,600 --> 00:34:49,759 Speaker 2: frame your pursuits and you build that into 546 00:34:50,160 --> 00:34:53,280 Speaker 2: the process of bringing the new thing into the world. 547 00:34:54,440 --> 00:34:56,759 Speaker 4: That's... that's actually 548 00:34:56,560 --> 00:34:59,360 Speaker 2: kind of the job of UX in the mix: 549 00:34:59,440 --> 00:35:02,279 Speaker 2: to find out what matters to people and ensure that 550 00:35:02,400 --> 00:35:05,800 Speaker 2: that is factored into the path forward for bringing the 551 00:35:05,840 --> 00:35:06,960 Speaker 2: product into existence. 552 00:35:07,719 --> 00:35:09,360 Speaker 1: Just to close, I'd love you to reflect on that 553 00:35:09,560 --> 00:35:13,920 Speaker 1: word, curiosity. What's your personal philosophy on curiosity? 554 00:35:14,239 --> 00:35:19,040 Speaker 2: My personal philosophy on curiosity. Well, in the context of AI: 555 00:35:19,360 --> 00:35:23,640 Speaker 2: you know, AI has continued to provide all of these 556 00:35:23,760 --> 00:35:34,120 Speaker 2: new affordances for productivity and efficiency. So I think it 557 00:35:34,120 --> 00:35:39,520 Speaker 2: would be a terrible shame if all we did was 558 00:35:39,680 --> 00:35:45,400 Speaker 2: to turn efficiency and productivity up to eleven with this technology; 559 00:35:45,880 --> 00:35:49,120 Speaker 2: that would be doing the technology a disservice. 560 00:35:49,800 --> 00:35:55,799 Speaker 2: I think what we need to do is to use 561 00:35:55,880 --> 00:35:58,520 Speaker 2: the fact that we can get more done more efficiently 562 00:36:00,280 --> 00:36:05,000 Speaker 2: to afford the ability to broaden our gaze and look 563 00:36:05,040 --> 00:36:06,040 Speaker 2: for new opportunities. 564 00:36:06,920 --> 00:36:08,960 Speaker 1: Is that what you mean by "help technology be good," 565 00:36:09,000 --> 00:36:10,680 Speaker 1: which is a phrase from your TED talk? 566 00:36:11,320 --> 00:36:12,440 Speaker 4: It definitely 567 00:36:13,840 --> 00:36:19,560 Speaker 2: serves that aspiration. Every technology obviously is a double edged sword. 568 00:36:20,680 --> 00:36:22,640 Speaker 2: You know, you invent the ship, you invent the shipwreck, 569 00:36:22,880 --> 00:36:26,600 Speaker 2: kind of thing.
And I believe that design in particular 570 00:36:27,080 --> 00:36:29,960 Speaker 2: plays a role in the mix that allows it to 571 00:36:31,080 --> 00:36:36,400 Speaker 2: inflect how that technology plays out in constructive ways. And 572 00:36:36,560 --> 00:36:41,800 Speaker 2: so I think that is certainly sort of a central 573 00:36:41,920 --> 00:36:44,880 Speaker 2: endeavor of the designer. If I go 574 00:36:45,000 --> 00:36:48,200 Speaker 2: back to the very beginning and talk about human machine 575 00:36:48,200 --> 00:36:52,080 Speaker 2: soul energy, I think it's about finding virtuous ways 576 00:36:52,680 --> 00:36:55,200 Speaker 2: to express that relationship. 577 00:36:57,640 --> 00:36:59,160 Speaker 1: David Webster, thank you so much for joining us on 578 00:36:59,160 --> 00:37:00,080 Speaker 1: TechStuff today. 579 00:37:00,080 --> 00:37:01,640 Speaker 4: That was fun. Nice to speak to you. 580 00:37:17,760 --> 00:37:21,320 Speaker 1: That's it for TechStuff. I'm Oz Woloshyn. This episode was produced 581 00:37:21,360 --> 00:37:25,040 Speaker 1: by Eliza Dennis and Adriana Tapia. It was executive produced 582 00:37:25,040 --> 00:37:28,600 Speaker 1: by me, Cara Price, and Kate Osborne for Kaleidoscope, and 583 00:37:28,800 --> 00:37:33,120 Speaker 1: Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode, 584 00:37:33,280 --> 00:37:36,520 Speaker 1: and Kyle Murdoch wrote our theme song. Join us on 585 00:37:36,560 --> 00:37:39,160 Speaker 1: Friday for The Week in Tech, when Cara and I will run 586 00:37:39,239 --> 00:37:41,960 Speaker 1: through all the headlines you may have missed. And please 587 00:37:42,040 --> 00:37:44,279 Speaker 1: do rate and review the show, and reach out to 588 00:37:44,400 --> 00:37:47,080 Speaker 1: us with your feedback at TechStuff Podcast at gmail 589 00:37:47,120 --> 00:37:47,520 Speaker 1: dot com.