1 00:00:01,480 --> 00:00:03,760 Speaker 1: Welcome to Tech Stuff. This is the story. 2 00:00:04,519 --> 00:00:06,880 Speaker 2: Each week on Wednesdays, we bring you an in-depth 3 00:00:06,920 --> 00:00:09,240 Speaker 2: interview with someone who has a front row seat to 4 00:00:09,240 --> 00:00:12,960 Speaker 2: the most fascinating things happening in tech today. We're joined 5 00:00:12,960 --> 00:00:22,119 Speaker 2: by Astro Teller. Astro Teller is the Captain of Moonshots for X, 6 00:00:22,400 --> 00:00:28,320 Speaker 2: an innovation lab within Alphabet, Google's parent company. To understand Astro Teller, 7 00:00:28,760 --> 00:00:32,760 Speaker 2: you have to understand moonshots. Years ago, a moonshot might 8 00:00:32,800 --> 00:00:35,640 Speaker 2: have been slang for a long shot, but in the 9 00:00:35,640 --> 00:00:39,600 Speaker 2: technological age, it's more akin to a lofty goal or 10 00:00:39,880 --> 00:00:42,960 Speaker 2: a giant leap. So to be the Captain of Moonshots, 11 00:00:43,280 --> 00:00:44,479 Speaker 2: you sort of have to be. 12 00:00:44,440 --> 00:00:47,240 Speaker 1: The ringleader for pioneers. Now. 13 00:00:47,440 --> 00:00:49,920 Speaker 2: If you've never heard of X, you've probably heard of 14 00:00:49,920 --> 00:00:52,199 Speaker 2: the products that came out of what's known as the 15 00:00:52,240 --> 00:00:56,400 Speaker 2: Moonshot Factory: Google Glass, a leap forward for smart glasses 16 00:00:56,440 --> 00:01:01,160 Speaker 2: and computer vision; Waymo, actual self-driving tech; and 17 00:01:01,400 --> 00:01:06,080 Speaker 2: Google Brain, a groundbreaking AI research team. All of them 18 00:01:06,120 --> 00:01:08,840 Speaker 2: got their start at X. But X is more than 19 00:01:08,840 --> 00:01:13,080 Speaker 2: an incubator, it's a playground. And thanks to the Moonshot podcast, 20 00:01:13,280 --> 00:01:15,959 Speaker 2: we finally get a bird's-eye view of the factory floor.
21 00:01:17,240 --> 00:01:20,640 Speaker 2: Host Astro Teller leads us through what feels like an oral 22 00:01:20,800 --> 00:01:24,440 Speaker 2: history about how these innovations came to be from the 23 00:01:24,480 --> 00:01:27,240 Speaker 2: perspective of the people who built them: the problems that 24 00:01:27,280 --> 00:01:31,200 Speaker 2: sparked the idea, the setbacks, the successes, and how each 25 00:01:31,240 --> 00:01:35,319 Speaker 2: project evolved. I had the opportunity to interview Astro Teller about 26 00:01:35,360 --> 00:01:39,160 Speaker 2: the Moonshot podcast at South by Southwest. We met at 27 00:01:39,160 --> 00:01:42,319 Speaker 2: the Google office in Austin, Texas, and the first thing 28 00:01:42,360 --> 00:01:45,600 Speaker 2: I wanted to know was what motivated Alphabet to produce 29 00:01:45,640 --> 00:01:46,319 Speaker 2: this podcast. 30 00:01:47,000 --> 00:01:49,800 Speaker 3: Our original excuse for making it was that we're turning 31 00:01:49,840 --> 00:01:53,080 Speaker 3: fifteen years old and it just seemed like a nice time 32 00:01:53,240 --> 00:01:57,600 Speaker 3: to stop and look backwards and think about sort of 33 00:01:57,760 --> 00:02:02,280 Speaker 3: where have we come from? Not just for reminiscence, but also as 34 00:02:02,320 --> 00:02:05,360 Speaker 3: a way of educating ourselves and maybe sharing with other people, 35 00:02:05,800 --> 00:02:09,640 Speaker 3: like, what is a moonshot factory? Does the world only 36 00:02:09,680 --> 00:02:12,240 Speaker 3: need one? Does it need zero? Does it actually need 37 00:02:12,280 --> 00:02:14,600 Speaker 3: a hundred of them? I hope the answer is 38 00:02:14,600 --> 00:02:19,760 Speaker 3: the last. And what can we learn from all of 39 00:02:19,760 --> 00:02:24,040 Speaker 3: these people who've gone through this process?
Let me give 40 00:02:24,080 --> 00:02:27,080 Speaker 3: you a really concrete example, because if I stand on 41 00:02:27,160 --> 00:02:29,800 Speaker 3: stage and I say one of the mantras at X 42 00:02:30,080 --> 00:02:33,880 Speaker 3: is get into the real world as fast as possible, 43 00:02:33,919 --> 00:02:37,760 Speaker 3: get in contact with it, and get these sort of painful, complex, 44 00:02:37,960 --> 00:02:42,160 Speaker 3: dirty lessons from getting into the world and then 45 00:02:42,360 --> 00:02:45,360 Speaker 3: realizing in all kinds of ways you're wrong, I can 46 00:02:45,400 --> 00:02:47,919 Speaker 3: talk about why that's a good idea. It is sincerely 47 00:02:47,960 --> 00:02:49,720 Speaker 3: one of the mantras and sort of the ways that 48 00:02:49,760 --> 00:02:55,120 Speaker 3: we operate at X, but that's very abstract, it's very philosophical. On 49 00:02:55,200 --> 00:02:58,280 Speaker 3: the first podcast, there's a nice moment where the Wing 50 00:02:58,400 --> 00:03:01,480 Speaker 3: team is talking about the fact that when they got 51 00:03:01,520 --> 00:03:04,280 Speaker 3: out in the world first really doing deliveries, which was 52 00:03:04,360 --> 00:03:07,240 Speaker 3: first in Australia, they were worried that people would be annoyed 53 00:03:07,320 --> 00:03:09,799 Speaker 3: by the sound of their drones, and so they had 54 00:03:09,800 --> 00:03:12,760 Speaker 3: worked really hard on the propellers that are operating when 55 00:03:12,760 --> 00:03:15,000 Speaker 3: it's in hover mode, so it's just hovering above a 56 00:03:15,080 --> 00:03:17,640 Speaker 3: house and it's lowering a package on a string for 57 00:03:17,680 --> 00:03:20,840 Speaker 3: a delivery. They were worried that that sound is the 58 00:03:20,880 --> 00:03:22,720 Speaker 3: sound that would bother people.
So they'd done all this 59 00:03:22,800 --> 00:03:25,799 Speaker 3: work ahead of time to make that as quiet as possible, 60 00:03:25,960 --> 00:03:27,760 Speaker 3: and when they got out there, they found out no 61 00:03:27,800 --> 00:03:28,800 Speaker 3: one cared about. 62 00:03:28,520 --> 00:03:30,880 Speaker 1: That sound because they were excited to be receiving the packages. 63 00:03:30,960 --> 00:03:35,240 Speaker 3: Right, exactly, and it was the forward flight, this "zsh," 64 00:03:35,440 --> 00:03:38,560 Speaker 3: right, as something was going over houses, you know, sixty, 65 00:03:38,640 --> 00:03:40,680 Speaker 3: seventy miles an hour, two hundred feet in the air, 66 00:03:40,760 --> 00:03:43,920 Speaker 3: which we had never considered would be the problem. That 67 00:03:44,080 --> 00:03:47,440 Speaker 3: was actually the sound that people were bothered by. And 68 00:03:47,480 --> 00:03:49,240 Speaker 3: then we went and we did the hard work of 69 00:03:49,280 --> 00:03:51,360 Speaker 3: getting that to be much quieter. And now that doesn't 70 00:03:51,360 --> 00:03:54,680 Speaker 3: bother people either. But hearing the chief engineer for Wing 71 00:03:54,840 --> 00:03:58,280 Speaker 3: unpack the discovery of that and then how they worked 72 00:03:58,320 --> 00:04:03,240 Speaker 3: on it makes this very abstract idea, let's say, 73 00:04:03,240 --> 00:04:05,480 Speaker 3: get in contact with the real world, all of a sudden 74 00:04:05,640 --> 00:04:07,120 Speaker 3: really concrete for people.
75 00:04:07,480 --> 00:04:09,800 Speaker 2: So they say that, you know, a great podcast always 76 00:04:09,840 --> 00:04:13,440 Speaker 2: drops you into a scene, and your podcast drops you 77 00:04:13,560 --> 00:04:16,120 Speaker 2: into quite a fascinating scene as a listener, with you 78 00:04:16,360 --> 00:04:20,039 Speaker 2: and Sebastian Thrun, who originally founded Google X and 79 00:04:20,160 --> 00:04:22,440 Speaker 2: hired you, in the back of a Waymo, I 80 00:04:22,440 --> 00:04:24,520 Speaker 2: mean a project that, you know, you guys had worked 81 00:04:24,520 --> 00:04:26,840 Speaker 2: on that really came to life. But I'm wondering if 82 00:04:26,839 --> 00:04:30,520 Speaker 2: you can drop our listeners into a scene, describe what 83 00:04:30,560 --> 00:04:33,320 Speaker 2: it's like to be in the Moonshot Factory. I'm thinking 84 00:04:33,320 --> 00:04:35,120 Speaker 2: about roller skates. I'm thinking about the line that you 85 00:04:35,200 --> 00:04:36,080 Speaker 2: cross every morning. 86 00:04:36,520 --> 00:04:40,440 Speaker 3: Sure. So the building itself originally was the first air-conditioned 87 00:04:40,480 --> 00:04:44,799 Speaker 3: mall in California. So it's a relatively old mall, 88 00:04:45,200 --> 00:04:50,320 Speaker 3: very high ceilings. We've left it quite raw: polished cement floors, 89 00:04:50,680 --> 00:04:54,520 Speaker 3: the concrete beams on the sides, like this one that's 90 00:04:54,600 --> 00:04:56,680 Speaker 3: right here where we're recording. This has a lot of 91 00:04:56,680 --> 00:04:59,240 Speaker 3: graffiti on it. Not this concrete one, because people have 92 00:04:59,279 --> 00:05:03,800 Speaker 3: cleaned this one up, but we've left the original construction 93 00:05:04,040 --> 00:05:07,200 Speaker 3: markings, you know, gas line that way, don't drill, or 94 00:05:07,200 --> 00:05:11,719 Speaker 3: whatever, on the concrete. It's not a decoration for us.
95 00:05:11,800 --> 00:05:14,000 Speaker 3: But it was like, A, why would we spend time 96 00:05:14,040 --> 00:05:16,800 Speaker 3: cleaning it up? Is that really the money we want 97 00:05:16,839 --> 00:05:19,800 Speaker 3: to spend? B, we want the place, and I think 98 00:05:19,839 --> 00:05:23,360 Speaker 3: it successfully feels like, a work in progress. And if 99 00:05:23,440 --> 00:05:27,200 Speaker 3: we as humans, as professionals, and our projects are constantly 100 00:05:27,240 --> 00:05:30,320 Speaker 3: a work in progress, how can we send a lot 101 00:05:30,360 --> 00:05:35,000 Speaker 3: of unconscious signals to encourage that work-in-progress mentality? 102 00:05:35,520 --> 00:05:37,920 Speaker 3: And so the walls, a lot of them, are made 103 00:05:37,920 --> 00:05:41,720 Speaker 3: of plywood. Could we have used something other than plywood? Yes, 104 00:05:42,160 --> 00:05:45,680 Speaker 3: but plywood works fine, and it sends that same signal. 105 00:05:46,000 --> 00:05:50,240 Speaker 3: When you come into the main lobby, we have a 106 00:05:50,279 --> 00:05:54,080 Speaker 3: lot of things hanging there on the walls, or large robots. 107 00:05:54,120 --> 00:05:56,040 Speaker 3: I guess they're sitting on the floor in some cases, 108 00:05:56,520 --> 00:06:00,839 Speaker 3: but pretty universally, they're not the finished product that we made. 109 00:06:01,360 --> 00:06:03,960 Speaker 3: They're something that we made along the way, and that's 110 00:06:04,000 --> 00:06:08,360 Speaker 3: another opportunity for us to signal to each other and 111 00:06:08,400 --> 00:06:11,160 Speaker 3: to everyone who visits us: we're more proud of the 112 00:06:11,240 --> 00:06:16,720 Speaker 3: process than we are of the outcome, including, as you're referencing.
113 00:06:16,839 --> 00:06:20,080 Speaker 3: There's this huge line across the floor, and it says, 114 00:06:20,160 --> 00:06:23,280 Speaker 3: in all caps, right by the line, "You may never 115 00:06:23,400 --> 00:06:28,560 Speaker 3: cross this line," exclamation point, "ever," exclamation point. And it's 116 00:06:28,600 --> 00:06:31,000 Speaker 3: a stupid rule. You can't get into the building unless 117 00:06:31,000 --> 00:06:34,600 Speaker 3: you cross the line, and it's a way to help 118 00:06:34,720 --> 00:06:39,320 Speaker 3: people practice breaking stupid rules. We don't want people to, 119 00:06:39,440 --> 00:06:45,599 Speaker 3: like, commit fraud or embezzle money or hurt somebody, but 120 00:06:45,839 --> 00:06:49,400 Speaker 3: so many rules are actually in our heads. They're 121 00:06:49,400 --> 00:06:53,360 Speaker 3: assumptions about how the world works that we haven't realized 122 00:06:53,600 --> 00:06:56,599 Speaker 3: are just assumptions, and they can and need to be questioned 123 00:06:56,600 --> 00:06:58,880 Speaker 3: if you're going to do something unusual. 124 00:06:59,040 --> 00:07:01,280 Speaker 1: Can you describe what it feels like to work at 125 00:07:01,279 --> 00:07:02,320 Speaker 1: the Moonshot Factory? 126 00:07:03,080 --> 00:07:10,200 Speaker 3: There is a happy, mild manicness to being around us 127 00:07:10,320 --> 00:07:15,600 Speaker 3: every day. The Moonshot Factory is a very matrixed place where, 128 00:07:16,240 --> 00:07:18,960 Speaker 3: in some great ways and maybe in some not ideal ways, 129 00:07:18,960 --> 00:07:21,280 Speaker 3: we're sort of all in each other's business all the time. 130 00:07:21,720 --> 00:07:24,200 Speaker 3: There's a lot of "Hey, can I help with that?"
131 00:07:24,280 --> 00:07:26,320 Speaker 3: In fact, just as someone was dropping me off in 132 00:07:26,360 --> 00:07:29,320 Speaker 3: this room, we were ending a conversation about ways we 133 00:07:29,360 --> 00:07:32,560 Speaker 3: could be even better at not worrying about whose job 134 00:07:32,680 --> 00:07:34,680 Speaker 3: was what, so we could just, like, jump in and 135 00:07:34,760 --> 00:07:37,160 Speaker 3: help, not only things that are still at X, but 136 00:07:37,200 --> 00:07:39,840 Speaker 3: we were actually talking about something that has left X, 137 00:07:40,880 --> 00:07:43,120 Speaker 3: and people at X were saying, oh, do we still 138 00:07:43,160 --> 00:07:44,840 Speaker 3: need to be helping them? It was like, who cares? 139 00:07:45,080 --> 00:07:47,360 Speaker 3: They're part of us. They were a graduate of ours. 140 00:07:47,520 --> 00:07:49,880 Speaker 3: Just, like, jump in and help. That's a lot of 141 00:07:49,880 --> 00:07:52,880 Speaker 3: what it feels like in the conversations in the hallways 142 00:07:52,920 --> 00:07:58,440 Speaker 3: every day. There's an ethos of helpfulness, of excitement, a 143 00:07:58,520 --> 00:08:05,320 Speaker 3: sense of purpose. The philosophy of experimentation plays out all 144 00:08:05,360 --> 00:08:08,960 Speaker 3: the time, every day. There's very much a "why don't 145 00:08:08,960 --> 00:08:11,960 Speaker 3: we just try it?", like, instead of talking about it, 146 00:08:12,040 --> 00:08:14,040 Speaker 3: when we don't really know what the right answer is. 147 00:08:14,280 --> 00:08:17,960 Speaker 3: And this might be about some HR issue or about 148 00:08:18,040 --> 00:08:21,760 Speaker 3: some public relations issue. It might be a very technical issue. 149 00:08:21,880 --> 00:08:24,240 Speaker 3: It's actually the same thing. It's, do we really know 150 00:08:24,280 --> 00:08:27,440 Speaker 3: what the right answer is?
Let's make a hypothesis and 151 00:08:27,480 --> 00:08:31,440 Speaker 3: then find the fastest, simplest, cheapest way to test that hypothesis. 152 00:08:31,640 --> 00:08:34,080 Speaker 3: That's kind of how it feels in all the conversations 153 00:08:34,120 --> 00:08:34,600 Speaker 3: all the time. 154 00:08:34,760 --> 00:08:37,400 Speaker 2: So I think the process is fascinating, but the output 155 00:08:37,440 --> 00:08:39,920 Speaker 2: is also fascinating, right? Like, I've also been in a 156 00:08:39,960 --> 00:08:42,680 Speaker 2: Waymo, and you know you have that experience, like, 157 00:08:42,720 --> 00:08:46,079 Speaker 2: this is totally uncanny, and then it becomes normal very quickly. 158 00:08:46,600 --> 00:08:51,160 Speaker 2: Google Brain obviously also spun out of X and also 159 00:08:51,280 --> 00:08:55,120 Speaker 2: gave birth, in partnership with DeepMind, to last year's 160 00:08:55,320 --> 00:08:59,200 Speaker 2: Nobel Prize in Chemistry. What marked out, let's say, Waymo 161 00:08:59,320 --> 00:09:04,600 Speaker 2: and Brain as huge successes versus the projects that 162 00:09:04,840 --> 00:09:05,600 Speaker 2: didn't graduate? 163 00:09:05,840 --> 00:09:08,400 Speaker 1: What characterizes a success and a failure? 164 00:09:08,920 --> 00:09:11,960 Speaker 3: At least the way it feels at X is we 165 00:09:12,040 --> 00:09:15,480 Speaker 3: start on a whole bunch of unlikely journeys. But I 166 00:09:15,480 --> 00:09:17,880 Speaker 3: don't know that we're much better than random at predicting, 167 00:09:17,920 --> 00:09:19,959 Speaker 3: ahead of time, what's going to be a great idea. 168 00:09:20,040 --> 00:09:22,960 Speaker 3: If we were great at predicting what's going to be 169 00:09:22,960 --> 00:09:25,200 Speaker 3: a great idea, we would only work on the great ideas. 170 00:09:25,880 --> 00:09:28,960 Speaker 3: I don't think anybody gets that privilege. There are people 171 00:09:29,160 --> 00:09:32,200 Speaker 3: who pick one thing.
They announce that they have the 172 00:09:32,280 --> 00:09:34,880 Speaker 3: right idea, and then they work really hard on it 173 00:09:34,760 --> 00:09:36,800 Speaker 3: and it turns out they did have a great idea. 174 00:09:37,720 --> 00:09:41,040 Speaker 3: That's just survivor bias, though. Those are just the ones 175 00:09:41,040 --> 00:09:43,280 Speaker 3: you hear. There are a thousand people who said they 176 00:09:43,320 --> 00:09:45,559 Speaker 3: had a great idea and worked really hard on it, 177 00:09:45,760 --> 00:09:48,560 Speaker 3: and they just didn't have a great idea, it turns out, 178 00:09:48,760 --> 00:09:51,440 Speaker 3: and so they went away. The one person who made 179 00:09:51,440 --> 00:09:54,959 Speaker 3: it is not necessarily smarter. Mostly, I believe, we believe 180 00:09:55,000 --> 00:09:58,280 Speaker 3: at X, they just got lucky. So we start about 181 00:09:58,280 --> 00:10:01,760 Speaker 3: a thousand things per decade, and then we're always looking 182 00:10:01,800 --> 00:10:05,080 Speaker 3: for evidence: is this really a once-in-a-generation 183 00:10:05,200 --> 00:10:07,760 Speaker 3: opportunity for the world, to make the world better and 184 00:10:07,800 --> 00:10:11,480 Speaker 3: to create an enduring business? And when the evidence starts 185 00:10:11,520 --> 00:10:14,600 Speaker 3: to pile up no, or at least we can't get 186 00:10:14,640 --> 00:10:17,920 Speaker 3: good evidence that the answer is yes, we throw it away. 187 00:10:18,400 --> 00:10:22,880 Speaker 3: So Brain and Waymo survived those pressure tests, that's 188 00:10:22,920 --> 00:10:26,640 Speaker 3: really what happened. It wasn't that this was the good stuff 189 00:10:27,000 --> 00:10:30,200 Speaker 3: ahead of time.
It's more like the evidence piled up 190 00:10:30,240 --> 00:10:32,960 Speaker 3: that it turns out these were particularly good ideas. And, 191 00:10:33,040 --> 00:10:35,640 Speaker 3: you know, maybe there were things that we've tried that 192 00:10:35,720 --> 00:10:38,920 Speaker 3: were really good ideas, but we just didn't find the 193 00:10:39,080 --> 00:10:43,200 Speaker 3: right wedge in on the problem. So maybe somebody else will. 194 00:10:43,520 --> 00:10:46,000 Speaker 3: And in a number of cases, when we've 195 00:10:46,040 --> 00:10:49,520 Speaker 3: wound something down, we've actually published to the rest of the 196 00:10:49,559 --> 00:10:52,120 Speaker 3: world everything that we learned so that they could build 197 00:10:52,160 --> 00:10:52,800 Speaker 3: on top of that. 198 00:11:00,360 --> 00:11:01,200 Speaker 1: When we come back. 199 00:11:01,400 --> 00:11:04,640 Speaker 2: How an innovation playground birthed in the era of endless 200 00:11:04,640 --> 00:11:17,480 Speaker 2: optimism maintains its momentum. Stay with us. Welcome back to 201 00:11:17,520 --> 00:11:20,920 Speaker 2: Tech Stuff. I'm talking to Astro Teller about the new podcast 202 00:11:21,080 --> 00:11:25,520 Speaker 2: out of Alphabet's X, called the Moonshot podcast. There's an 203 00:11:25,559 --> 00:11:28,719 Speaker 2: interesting moment in the podcast where Sebastian Thrun, the co-founder 204 00:11:28,760 --> 00:11:32,480 Speaker 2: of X, describes a conversation he had with former 205 00:11:32,520 --> 00:11:35,720 Speaker 2: Google CEO Larry Page back in two thousand and five.
206 00:11:36,840 --> 00:11:39,400 Speaker 2: Larry asked Thrun to come to Google and make a 207 00:11:39,440 --> 00:11:43,280 Speaker 2: self-driving car essentially out of nowhere, and Thrun, though 208 00:11:43,320 --> 00:11:46,360 Speaker 2: skeptical of the outcome, took a leap of faith and 209 00:11:46,400 --> 00:11:52,000 Speaker 2: started experimenting, creating what would ultimately become the Moonshot Factory. 210 00:11:52,080 --> 00:11:53,280 Speaker 1: This was during the era. 211 00:11:53,160 --> 00:11:56,800 Speaker 2: Of endless optimism in Silicon Valley, but fifteen years later 212 00:11:57,240 --> 00:11:59,400 Speaker 2: there seems to have been something of a vibe shift: 213 00:12:00,160 --> 00:12:03,480 Speaker 2: cuts at all the big tech companies, including cuts that have 214 00:12:03,520 --> 00:12:07,720 Speaker 2: affected X. And I wanted to know if Astro Teller thought 215 00:12:07,760 --> 00:12:09,560 Speaker 2: this vibe shift affects X. 216 00:12:09,559 --> 00:12:14,240 Speaker 3: Today, we're constantly learning and trying to improve, but I 217 00:12:14,320 --> 00:12:17,679 Speaker 3: hope that that vibe shift hasn't per se changed us 218 00:12:18,840 --> 00:12:21,520 Speaker 3: at all. Let me describe that in another way. From 219 00:12:21,600 --> 00:12:24,920 Speaker 3: the very beginning, we've been trying to find ways to 220 00:12:25,040 --> 00:12:30,560 Speaker 3: make a moonshot factory. That is, keep our audacity really high, 221 00:12:30,840 --> 00:12:34,960 Speaker 3: but find ways to systematize the process. So we've been 222 00:12:34,960 --> 00:12:37,600 Speaker 3: committed to that for fifteen years.
And as soon as 223 00:12:37,640 --> 00:12:40,000 Speaker 3: you commit to the factory part of it, not just 224 00:12:40,040 --> 00:12:43,080 Speaker 3: the moonshot part of it, you are pre-committing to 225 00:12:43,200 --> 00:12:47,680 Speaker 3: a constant attempt to up the rigor without killing off 226 00:12:47,720 --> 00:12:51,840 Speaker 3: the magic. And so we didn't need a vibe shift 227 00:12:51,840 --> 00:12:56,040 Speaker 3: to get interested in efficiency. We've always been interested in 228 00:12:56,160 --> 00:12:59,320 Speaker 3: how to keep ratcheting up that efficiency. That's not a 229 00:12:59,360 --> 00:13:02,319 Speaker 3: new thing for us. And so that vibe shift, if 230 00:13:02,400 --> 00:13:06,280 Speaker 3: anything, has aligned the world better with what we were 231 00:13:06,320 --> 00:13:09,200 Speaker 3: already trying to do, which is, we want to be 232 00:13:09,360 --> 00:13:12,560 Speaker 3: creating a great return on investment, where the things that 233 00:13:12,559 --> 00:13:16,560 Speaker 3: we produce are worth more than enough to justify the 234 00:13:16,559 --> 00:13:18,480 Speaker 3: money we've spent and the time that it took to 235 00:13:18,559 --> 00:13:22,839 Speaker 3: make, over a ten or fifteen year period. Now it happens that 236 00:13:22,960 --> 00:13:25,960 Speaker 3: Alphabet is very long-term in its thinking, which it 237 00:13:26,000 --> 00:13:29,200 Speaker 3: does to its great long-term benefit. And you know, 238 00:13:29,440 --> 00:13:32,480 Speaker 3: it has become a very large business by thinking long 239 00:13:32,559 --> 00:13:36,640 Speaker 3: term and by having the bravery to place these much 240 00:13:36,640 --> 00:13:41,720 Speaker 3: longer-term bets, and so X has received the support 241 00:13:41,760 --> 00:13:46,560 Speaker 3: from Alphabet, which I'm very grateful for. And in that context, 242 00:13:46,720 --> 00:13:49,360 Speaker 3: we're continuing to do our job.
How can we take 243 00:13:49,640 --> 00:13:53,880 Speaker 3: these audacious attempts to find something really great for the 244 00:13:53,920 --> 00:13:58,840 Speaker 3: world that is also, you know, significant shareholder value production, 245 00:13:59,320 --> 00:14:03,520 Speaker 3: and do that as efficiently as possible, understanding we're 246 00:14:03,559 --> 00:14:06,120 Speaker 3: still going to be wrong ninety nine percent of the time? 247 00:14:06,800 --> 00:14:08,720 Speaker 3: The question is not how do we get it so 248 00:14:08,760 --> 00:14:11,840 Speaker 3: we're only wrong ninety percent of the time. The question 249 00:14:11,880 --> 00:14:15,000 Speaker 3: at X is how do we discover the ninety nine 250 00:14:15,040 --> 00:14:18,800 Speaker 3: percent where we're wrong as fast and as cheaply as possible. 251 00:14:19,320 --> 00:14:22,200 Speaker 3: The better we get at that part of the rigor, 252 00:14:23,000 --> 00:14:26,280 Speaker 3: that one percent that comes through will come through more 253 00:14:26,320 --> 00:14:28,400 Speaker 3: and more efficiently, because we've spent less and less time 254 00:14:28,440 --> 00:14:30,240 Speaker 3: and money on the stuff where it turned out not 255 00:14:30,320 --> 00:14:31,200 Speaker 3: to be a great idea. 256 00:14:31,320 --> 00:14:33,000 Speaker 2: And one of the things you said was that as 257 00:14:33,040 --> 00:14:35,520 Speaker 2: a manager, you don't want to be telling people, I 258 00:14:35,560 --> 00:14:37,520 Speaker 2: want to kill that idea. You want them to come 259 00:14:37,520 --> 00:14:39,400 Speaker 2: to you proactively and kill their own ideas. 260 00:14:39,560 --> 00:14:41,520 Speaker 3: That's right.
So if you worked at X and you're 261 00:14:41,600 --> 00:14:47,480 Speaker 3: building the Teleporter project or whatever, if I believe that 262 00:14:47,640 --> 00:14:52,080 Speaker 3: you can't practice intellectual honesty and work for the betterment, 263 00:14:52,720 --> 00:14:57,400 Speaker 3: for X and Alphabet, of X's overall portfolio, including a 264 00:14:58,160 --> 00:15:02,520 Speaker 3: dispassionate view of your own work, at least periodically, you 265 00:15:02,560 --> 00:15:06,400 Speaker 3: and I are fundamentally in antagonism with each other. You're 266 00:15:06,440 --> 00:15:09,000 Speaker 3: not only not on my team, you're actually working against 267 00:15:09,120 --> 00:15:12,960 Speaker 3: my team. If you're being overly partisan to what you do, 268 00:15:13,840 --> 00:15:16,280 Speaker 3: then why do I even. 269 00:15:16,080 --> 00:15:16,640 Speaker 1: Have you here? 270 00:15:16,680 --> 00:15:20,200 Speaker 3: That's horrible. If we were an actual incubator, I would 271 00:15:20,200 --> 00:15:23,760 Speaker 3: get it, because then we're just this system and you're 272 00:15:23,760 --> 00:15:27,080 Speaker 3: trying to leech energy off of us to launch your thing. 273 00:15:27,320 --> 00:15:30,480 Speaker 3: You're supposed to be partisan to your thing, but that's 274 00:15:30,520 --> 00:15:32,880 Speaker 3: not what we're doing at the Moonshot Factory. All of 275 00:15:32,960 --> 00:15:35,520 Speaker 3: us here, including you, if you've joined the Moonshot Factory, 276 00:15:35,840 --> 00:15:38,920 Speaker 3: are working to systematize innovation. And while I hope that 277 00:15:38,960 --> 00:15:41,760 Speaker 3: your teleporter works out, I don't hope that nearly as 278 00:15:41,840 --> 00:15:44,360 Speaker 3: much as that the factory works out. And if you 279 00:15:44,440 --> 00:15:46,920 Speaker 3: aren't on that team, you're not going to be happy 280 00:15:46,920 --> 00:15:47,320 Speaker 3: at X.
281 00:15:48,240 --> 00:15:54,120 Speaker 2: So you've, obviously, through an iterative process, taken ideas to fruition, 282 00:15:54,280 --> 00:15:58,720 Speaker 2: whether it's Waymo, Brain, Wing, or others. What has the 283 00:15:58,760 --> 00:16:02,360 Speaker 2: iterative process of the Moonshot Factory itself been? I 284 00:16:02,440 --> 00:16:06,600 Speaker 2: think specifically you spoke about measuring or creating a 285 00:16:06,640 --> 00:16:09,160 Speaker 2: balance between efficiency and magic-making. 286 00:16:10,320 --> 00:16:12,600 Speaker 3: I mean yes, and I can give you examples of that. 287 00:16:12,640 --> 00:16:15,880 Speaker 3: But let me tell you about something that has evolved 288 00:16:16,040 --> 00:16:19,880 Speaker 3: for us over time. We've been realizing more and more 289 00:16:20,320 --> 00:16:23,080 Speaker 3: that for at least many of the things that we make, 290 00:16:23,800 --> 00:16:28,640 Speaker 3: landing them outside Alphabet is actually better for Alphabet and 291 00:16:28,880 --> 00:16:32,880 Speaker 3: for the project, the proto-company, as it becomes a company. 292 00:16:33,400 --> 00:16:37,400 Speaker 3: So Alphabet can still have a large minority interest in 293 00:16:37,440 --> 00:16:41,200 Speaker 3: this business. But if it's outside of Alphabet and Alphabet 294 00:16:41,200 --> 00:16:46,760 Speaker 3: doesn't control it, then it can participate in capital 295 00:16:46,800 --> 00:16:50,480 Speaker 3: markets and get strategic partners in a way that's different 296 00:16:50,560 --> 00:16:53,840 Speaker 3: than if it's inside Alphabet. It can go faster in 297 00:16:53,880 --> 00:16:58,200 Speaker 3: some ways for being decoupled from Alphabet, which is also 298 00:16:58,240 --> 00:17:02,240 Speaker 3: a complex, very large business.
So we're finding ways to 299 00:17:02,280 --> 00:17:05,280 Speaker 3: sort of systematize the landing of things more and more 300 00:17:05,320 --> 00:17:08,959 Speaker 3: outside of Alphabet. And that's something we've learned through the 301 00:17:09,000 --> 00:17:12,600 Speaker 3: process of making these other bets. And it sometimes happens 302 00:17:12,800 --> 00:17:18,720 Speaker 3: that being in a culturally, operationally, and legally separate entity 303 00:17:18,920 --> 00:17:23,800 Speaker 3: within Alphabet, Waymo, Wing, Intrinsic, Verily, these kinds of 304 00:17:23,800 --> 00:17:26,840 Speaker 3: things that came from X, it works. It's sometimes good 305 00:17:26,880 --> 00:17:29,639 Speaker 3: for them, but it's not for all of them. And 306 00:17:29,680 --> 00:17:32,879 Speaker 3: so that's an example where we've been trying to learn 307 00:17:32,920 --> 00:17:36,200 Speaker 3: ourselves how we can do our job better, systematize our 308 00:17:36,280 --> 00:17:40,800 Speaker 3: process and the conveyor belt for these ideas, to optimize 309 00:17:40,960 --> 00:17:43,000 Speaker 3: their chances of being really great for the world. 310 00:17:44,160 --> 00:17:45,840 Speaker 2: I don't want to drag you into politics, but this 311 00:17:45,920 --> 00:17:49,600 Speaker 2: word efficiency has obviously become very, very loaded recently, and 312 00:17:49,640 --> 00:17:52,760 Speaker 2: there's a kind of wider debate within the country about 313 00:17:53,160 --> 00:17:56,679 Speaker 2: the value of a government supporting long-term science 314 00:17:56,680 --> 00:18:01,360 Speaker 2: and research initiatives versus cutting waste and inefficiency. If
If 315 00:18:01,359 --> 00:18:03,920 Speaker 2: you have one sort of piece of advice to another organization, 316 00:18:04,080 --> 00:18:06,879 Speaker 2: be it government or another company, What would you say 317 00:18:06,920 --> 00:18:10,400 Speaker 2: about what you've learned in terms of balancing those two imperatives. 318 00:18:11,320 --> 00:18:18,399 Speaker 3: Well, first of all, getting efficient is generally a good goal, 319 00:18:19,200 --> 00:18:22,239 Speaker 3: but you have to know what game you're playing. In 320 00:18:22,280 --> 00:18:25,280 Speaker 3: our case, because we believe that we're in the moonshot business, 321 00:18:25,520 --> 00:18:29,120 Speaker 3: it's super important, as I've been describing, that the efficiency 322 00:18:29,160 --> 00:18:31,880 Speaker 3: stays balanced with what we're trying to do, and there 323 00:18:31,920 --> 00:18:34,560 Speaker 3: has to be a lot of exploration and a lot 324 00:18:34,560 --> 00:18:38,920 Speaker 3: of being wrong. So if you propose the teleporter project, 325 00:18:39,280 --> 00:18:42,560 Speaker 3: and I start with here's the thirty reasons, that's stupid. 326 00:18:43,080 --> 00:18:47,280 Speaker 3: One you will never bring up a creative idea ever again. Two, 327 00:18:47,880 --> 00:18:51,960 Speaker 3: it's easy to say that thirty thousand reasons why some 328 00:18:52,240 --> 00:18:56,480 Speaker 3: unusual idea isn't going to work. But then we aren't 329 00:18:56,480 --> 00:18:58,600 Speaker 3: going to go on any adventures, which, by the way, 330 00:18:58,880 --> 00:19:02,879 Speaker 3: I don't believe in Taro cards metaphysically, but there is 331 00:19:02,920 --> 00:19:07,000 Speaker 3: a tarot card poster of the fool. It's the only 332 00:19:07,040 --> 00:19:10,879 Speaker 3: thing on the door of my huddle. 
Because the activity 333 00:19:11,280 --> 00:19:14,080 Speaker 3: of setting out on a new journey is the activity 334 00:19:14,160 --> 00:19:17,880 Speaker 3: of creation, and that's the job that we're all in 335 00:19:18,080 --> 00:19:21,480 Speaker 3: at X. And so if you ran a widget factory, 336 00:19:21,640 --> 00:19:24,440 Speaker 3: you might have a very different set of goals with efficiency, 337 00:19:24,480 --> 00:19:26,760 Speaker 3: and you might do Six Sigma and that might be 338 00:19:26,840 --> 00:19:30,159 Speaker 3: reasonable for you. Six Sigma is the wrong way to 339 00:19:30,160 --> 00:19:34,919 Speaker 3: think about efficiency at a moonshot factory. And you know, 340 00:19:34,960 --> 00:19:36,800 Speaker 3: I'll leave it to people who are smarter than me 341 00:19:36,880 --> 00:19:40,840 Speaker 3: to figure out how the government should focus on efficiency productively. 342 00:19:41,080 --> 00:19:52,639 Speaker 2: Fair enough. Coming up, what Astro Teller learned from his 343 00:19:52,720 --> 00:20:08,439 Speaker 2: grandfather about innovation. Stay with us. Welcome back to Tech Stuff. 344 00:20:08,920 --> 00:20:11,919 Speaker 2: Before I had the opportunity to interview Astro Teller at 345 00:20:11,960 --> 00:20:15,000 Speaker 2: the Google office in Austin, Texas, I saw him on 346 00:20:15,040 --> 00:20:18,760 Speaker 2: a live panel moderated by Nicholas Thompson, CEO of The Atlantic. 347 00:20:19,560 --> 00:20:20,760 Speaker 1: At one point, Nick. 348 00:20:20,600 --> 00:20:22,800 Speaker 2: Started to say that government had created a lot of 349 00:20:22,800 --> 00:20:26,359 Speaker 2: innovation throughout the twentieth century.
At one point, Nicholas Thompson 350 00:20:26,400 --> 00:20:29,600 Speaker 2: started to ask a question around how government had driven 351 00:20:29,600 --> 00:20:32,520 Speaker 2: a lot of innovation in the twentieth century, and Astro 352 00:20:32,600 --> 00:20:36,840 Speaker 2: quickly interjected, saying, actually, government funded research, which in turn 353 00:20:36,920 --> 00:20:37,640 Speaker 2: drove innovation. 354 00:20:38,119 --> 00:20:39,400 Speaker 1: It was a fascinating conversation. 355 00:20:39,800 --> 00:20:41,480 Speaker 2: You'll be able to hear the whole thing because it's 356 00:20:41,480 --> 00:20:44,880 Speaker 2: going to be episode ten of the Moonshot podcast, and 357 00:20:45,080 --> 00:20:48,520 Speaker 2: Astro Teller has firsthand knowledge of these relationships because 358 00:20:48,560 --> 00:20:52,200 Speaker 2: his grandfather, Edward Teller, was one of the key members 359 00:20:52,240 --> 00:20:55,080 Speaker 2: of the Manhattan Project, the R and D project that 360 00:20:55,160 --> 00:20:59,040 Speaker 2: developed nuclear weapons during World War Two. So I asked Teller, 361 00:20:59,480 --> 00:21:02,199 Speaker 2: how does he think about the balance between academia and 362 00:21:02,280 --> 00:21:06,480 Speaker 2: deep research, government, and private labs like Google X when 363 00:21:06,480 --> 00:21:07,800 Speaker 2: it comes to building a future? 364 00:21:08,400 --> 00:21:10,719 Speaker 3: I think there's a place for all of them.
You know, 365 00:21:10,880 --> 00:21:13,879 Speaker 3: there will probably from time to time always be some 366 00:21:14,160 --> 00:21:19,560 Speaker 3: issues which are national security level issues, and it's rational 367 00:21:19,600 --> 00:21:22,639 Speaker 3: for any government, including the United States government, to spend 368 00:21:22,720 --> 00:21:25,680 Speaker 3: money to solve those things in a somewhat cost 369 00:21:25,760 --> 00:21:28,440 Speaker 3: insensitive way because it's a national security issue. The Manhattan 370 00:21:28,440 --> 00:21:29,719 Speaker 3: Project was an example of that. 371 00:21:29,800 --> 00:21:31,720 Speaker 1: Your grandfather was working on that, by the way. 372 00:21:31,560 --> 00:21:36,479 Speaker 3: Frequently. Yeah, there will always be a place 373 00:21:36,600 --> 00:21:42,360 Speaker 3: in all countries for basic science because the raw material 374 00:21:42,480 --> 00:21:46,679 Speaker 3: of training the next generation of people in all the 375 00:21:46,720 --> 00:21:52,480 Speaker 3: STEM fields creates this sort of rich soil from which 376 00:21:52,720 --> 00:21:58,520 Speaker 3: really great new things can spring. And then obviously private organizations, 377 00:21:58,920 --> 00:22:02,800 Speaker 3: either ones in history like the Bell Labs or Xerox PARC, 378 00:22:02,920 --> 00:22:06,160 Speaker 3: or maybe more modern things like X, the Moonshot Factory, 379 00:22:06,560 --> 00:22:09,879 Speaker 3: I think also have a place, because something has to 380 00:22:10,000 --> 00:22:15,880 Speaker 3: bridge between the Silicon Valley venture world of we can 381 00:22:15,920 --> 00:22:17,800 Speaker 3: see where we're going. It's kind of a ways off, 382 00:22:17,800 --> 00:22:19,200 Speaker 3: but we can see it, and we just want to 383 00:22:19,280 --> 00:22:23,520 Speaker 3: rush there as fast as possible.
Academia: oh my god, 384 00:22:23,600 --> 00:22:25,879 Speaker 3: we found a frictionless surface, but we have no idea 385 00:22:25,920 --> 00:22:29,159 Speaker 3: what this is good for. There's a big gap between 386 00:22:29,200 --> 00:22:33,359 Speaker 3: those two things, and I think that moonshot factories, not 387 00:22:33,600 --> 00:22:35,879 Speaker 3: just the one that we're making, but I think others 388 00:22:35,920 --> 00:22:39,560 Speaker 3: out in the world could over time fill that gap 389 00:22:39,600 --> 00:22:40,399 Speaker 3: really effectively. 390 00:22:40,960 --> 00:22:42,679 Speaker 2: I wonder if you could talk briefly about your 391 00:22:42,920 --> 00:22:45,280 Speaker 2: grandfather, and was there one conversation or one piece of 392 00:22:45,320 --> 00:22:47,680 Speaker 2: advice he gave you from the point of view of 393 00:22:47,680 --> 00:22:49,560 Speaker 2: the Manhattan Project that put you on the path you're 394 00:22:49,600 --> 00:22:50,000 Speaker 2: on today? 395 00:22:51,400 --> 00:22:53,080 Speaker 3: I mean, I'd give a few things. First of all, 396 00:22:53,119 --> 00:22:57,680 Speaker 3: he liked to quote Niels Bohr, who said that an 397 00:22:57,720 --> 00:23:00,680 Speaker 3: expert is someone who's made the majority of the mistakes 398 00:23:00,680 --> 00:23:04,400 Speaker 3: in their field. And I think that that is one 399 00:23:04,480 --> 00:23:06,600 Speaker 3: of the things that helped me lock in on the 400 00:23:06,720 --> 00:23:11,800 Speaker 3: understanding that failure is an inherent part of becoming an expert, 401 00:23:12,040 --> 00:23:14,719 Speaker 3: of becoming really good at something, or of learning, like, 402 00:23:14,760 --> 00:23:20,560 Speaker 3: what anything should be.
My grandfather was a great orator, 403 00:23:20,880 --> 00:23:22,960 Speaker 3: and I had a mild speech impediment when I was 404 00:23:23,000 --> 00:23:26,080 Speaker 3: a kid, and I learned a lot from him by 405 00:23:26,160 --> 00:23:29,800 Speaker 3: watching him speak in private settings, in interviews like this, 406 00:23:30,320 --> 00:23:33,720 Speaker 3: on stages. I learned a lot from him about how 407 00:23:33,760 --> 00:23:38,080 Speaker 3: he connected with individuals and with an audience. I really 408 00:23:38,080 --> 00:23:40,560 Speaker 3: looked up to that particular skill of his and I 409 00:23:40,640 --> 00:23:45,439 Speaker 3: learned a lot from it. Also, the Manhattan Project. He 410 00:23:45,640 --> 00:23:49,840 Speaker 3: had no real interest in bombs. That's not actually what 411 00:23:49,960 --> 00:23:53,160 Speaker 3: got him at all excited. He had a great interest 412 00:23:53,320 --> 00:23:58,040 Speaker 3: in physics and technology, but his single largest interest was 413 00:23:58,080 --> 00:24:02,800 Speaker 3: in being around phenomenally interesting and creative other people, 414 00:24:03,640 --> 00:24:07,280 Speaker 3: and the idea that you could get a group of 415 00:24:07,320 --> 00:24:12,800 Speaker 3: people together and create a subculture somewhat protected from the 416 00:24:12,840 --> 00:24:15,920 Speaker 3: rest of the world, so that you could foment some 417 00:24:16,040 --> 00:24:21,400 Speaker 3: really new ideas together in a very creative kind of crucible. 418 00:24:22,800 --> 00:24:25,719 Speaker 3: That is the thing I most took away from his 419 00:24:25,840 --> 00:24:29,080 Speaker 3: experience with the Manhattan Project. And obviously the Moonshot Factory 420 00:24:29,359 --> 00:24:33,320 Speaker 3: is pointed in very different directions, but I think I 421 00:24:33,440 --> 00:24:34,359 Speaker 3: was inspired by that. 422 00:24:34,840 --> 00:24:35,240 Speaker 1: I love that.
423 00:24:35,280 --> 00:24:37,240 Speaker 2: I'm sure, as somebody who knows the real story, you 424 00:24:37,280 --> 00:24:40,119 Speaker 2: had a bunch of issues with Oppenheimer the movie, but 425 00:24:40,200 --> 00:24:43,440 Speaker 2: that sense that a group of people physically co-located 426 00:24:43,440 --> 00:24:47,000 Speaker 2: in a space working on a mission was very fascinating. 427 00:24:47,200 --> 00:24:49,560 Speaker 2: I think it's part of what comes through with the 428 00:24:49,600 --> 00:24:50,440 Speaker 2: podcast as well. 429 00:24:50,520 --> 00:24:50,639 Speaker 3: Right. 430 00:24:50,680 --> 00:24:53,760 Speaker 1: I mean, you have collected all of these people. 431 00:24:53,480 --> 00:24:55,400 Speaker 2: Who've been on the journey with you for fifteen years 432 00:24:55,400 --> 00:24:58,560 Speaker 2: and kind of sharing stories with one another and with 433 00:24:58,600 --> 00:25:00,920 Speaker 2: the public about how that drove innovation. 434 00:25:01,160 --> 00:25:02,359 Speaker 1: At South by Southwest. 435 00:25:02,560 --> 00:25:06,680 Speaker 2: One of the biggest stories was the woolly mouse, Colossal 436 00:25:06,760 --> 00:25:10,600 Speaker 2: Biosciences and the gene editing project to revive the woolly mammoth, 437 00:25:10,600 --> 00:25:13,159 Speaker 2: which has created the woolly mouse along the way. A 438 00:25:13,200 --> 00:25:16,680 Speaker 2: lot of these, like, science fiction type stories are becoming 439 00:25:17,000 --> 00:25:21,520 Speaker 2: science fact, right? AI, machine learning, gene editing, quantum has 440 00:25:21,520 --> 00:25:23,600 Speaker 2: been in the news in the last couple of weeks. 441 00:25:24,200 --> 00:25:28,040 Speaker 2: As these platform technologies emerge around you, like, how do 442 00:25:28,080 --> 00:25:31,840 Speaker 2: you think about the role of X in terms of 443 00:25:32,520 --> 00:25:34,920 Speaker 2: figuring out the paths, interacting with them?
444 00:25:35,560 --> 00:25:37,520 Speaker 3: One of the things that's really important in the Moonshot 445 00:25:37,560 --> 00:25:41,720 Speaker 3: Factory is we're playing such a long game that there's 446 00:25:41,760 --> 00:25:44,399 Speaker 3: a temptation for the whole world, and it does seep 447 00:25:44,440 --> 00:25:47,719 Speaker 3: into X, to what I think of as kind of 448 00:25:47,800 --> 00:25:49,760 Speaker 3: swarm with everybody 449 00:25:49,280 --> 00:25:50,880 Speaker 1: else, skate towards the puck. 450 00:25:51,520 --> 00:25:55,239 Speaker 3: Well, yeah, I mean, but I'm interested in, like, the 451 00:25:55,280 --> 00:25:58,639 Speaker 3: pucks that other people aren't watching. By the time everyone 452 00:25:58,880 --> 00:26:02,240 Speaker 3: is, like, over-fixated on a puck and everyone is 453 00:26:02,359 --> 00:26:04,840 Speaker 3: rushing towards that puck, as they are right now with 454 00:26:05,400 --> 00:26:08,399 Speaker 3: LLMs and generative AI... I mean, a lot of value 455 00:26:08,440 --> 00:26:10,480 Speaker 3: will be created over time, a lot of goodness for 456 00:26:10,520 --> 00:26:13,800 Speaker 3: the world, through these foundation models and what we can 457 00:26:13,840 --> 00:26:16,520 Speaker 3: do with them. The world does not need us rushing 458 00:26:16,560 --> 00:26:19,679 Speaker 3: at that. Everyone else is rushing at that. And you know, 459 00:26:19,720 --> 00:26:23,080 Speaker 3: we were one of the groups that set off that 460 00:26:23,920 --> 00:26:27,440 Speaker 3: sort of ripple effect because of Google Brain. Because of 461 00:26:27,480 --> 00:26:30,239 Speaker 3: Google Brain, our job now should be: what can we 462 00:26:30,320 --> 00:26:34,159 Speaker 3: do today that thirteen or fifteen years from now is 463 00:26:34,280 --> 00:26:39,639 Speaker 3: as important then as the effects of Google Brain are today?
464 00:26:40,359 --> 00:26:42,760 Speaker 3: That's our real job, is to be working on things 465 00:26:42,800 --> 00:26:44,439 Speaker 3: that when you looked at them, if you came and 466 00:26:44,440 --> 00:26:47,520 Speaker 3: looked at our earliest stuff, you should say, I don't know, 467 00:26:47,600 --> 00:26:51,000 Speaker 3: there's probably nothing, and you'd be mostly right, but not 468 00:26:51,359 --> 00:26:53,879 Speaker 3: entirely right. And that's our job, is for one in 469 00:26:53,960 --> 00:26:56,000 Speaker 3: a hundred of those things to turn out to be 470 00:26:56,280 --> 00:26:59,320 Speaker 3: Google Brain level important, and we don't know ahead of 471 00:26:59,320 --> 00:27:00,359 Speaker 3: time which one it'll be. 472 00:27:00,680 --> 00:27:01,480 Speaker 1: And that takes like. 473 00:27:01,560 --> 00:27:08,080 Speaker 3: Constant bravery and creativity and open-mindedness paired with humility. 474 00:27:08,359 --> 00:27:10,920 Speaker 3: We're wrong most of the time. How can we get 475 00:27:10,960 --> 00:27:13,600 Speaker 3: the evidence that verifies this isn't one of those things, 476 00:27:13,640 --> 00:27:14,879 Speaker 3: so we can stop doing it? 477 00:27:15,440 --> 00:27:17,800 Speaker 2: And I mean, I'm sure that you have to be 478 00:27:17,880 --> 00:27:20,040 Speaker 2: very sensitive in terms of what you share publicly, but 479 00:27:20,080 --> 00:27:23,560 Speaker 2: are there any early signals you're getting from interaction with 480 00:27:23,640 --> 00:27:26,040 Speaker 2: the real world about things that you're working on today 481 00:27:26,080 --> 00:27:29,199 Speaker 2: that maybe, you know, in season ten of 482 00:27:29,240 --> 00:27:32,560 Speaker 2: the Google X podcast, the Moonshot Factory podcast, will be 483 00:27:32,800 --> 00:27:34,040 Speaker 2: featured ten years from now?
484 00:27:34,960 --> 00:27:38,520 Speaker 3: I'm increasingly confident, by watching the experiments that we've done, 485 00:27:38,640 --> 00:27:40,720 Speaker 3: and this isn't a single project. 486 00:27:40,720 --> 00:27:41,800 Speaker 1: This is a range of them. 487 00:27:41,960 --> 00:27:45,520 Speaker 3: That biology is moving at a decent clip from being 488 00:27:45,560 --> 00:27:47,720 Speaker 3: a science to being an engineering discipline. 489 00:27:47,960 --> 00:27:48,600 Speaker 1: What does that mean? 490 00:27:48,800 --> 00:27:52,399 Speaker 3: What that means is it is already true today that 491 00:27:52,480 --> 00:27:56,560 Speaker 3: you can go into E. coli or yeast or a bacterial cell 492 00:27:56,920 --> 00:28:00,960 Speaker 3: and reprogram it. You can change its DNA, you can 493 00:28:01,240 --> 00:28:05,520 Speaker 3: change its environment and ask it, as best you can, 494 00:28:05,760 --> 00:28:08,480 Speaker 3: to do something other than what it would normally do. 495 00:28:08,720 --> 00:28:11,160 Speaker 3: Let's say to produce a lot of a thing it's 496 00:28:11,200 --> 00:28:13,800 Speaker 3: not used to producing, but that would be useful for people. 497 00:28:14,080 --> 00:28:16,800 Speaker 3: That might be an enzyme that goes into laundry detergent 498 00:28:16,840 --> 00:28:19,160 Speaker 3: for breaking down things when you put it in the wash. 499 00:28:19,600 --> 00:28:22,520 Speaker 3: That could be a human milk sugar that you want 500 00:28:22,560 --> 00:28:24,560 Speaker 3: to produce so it can go into baby formula. There's 501 00:28:25,440 --> 00:28:30,360 Speaker 3: a universe of things you might ask these self-replicating, carbon 502 00:28:30,440 --> 00:28:34,680 Speaker 3: negative machines that biology has invented for us to make. 503 00:28:35,960 --> 00:28:38,200 Speaker 3: The problem is you don't know what it will do 504 00:28:38,280 --> 00:28:41,000 Speaker 3: when you reprogram it.
There is no simulator where you 505 00:28:41,040 --> 00:28:44,440 Speaker 3: can test it out. So it was incredibly, like, trial 506 00:28:44,520 --> 00:28:46,960 Speaker 3: and error. If you're a strain engineer, you just have 507 00:28:47,000 --> 00:28:51,680 Speaker 3: to make some change in the code of this little 508 00:28:51,800 --> 00:28:55,920 Speaker 3: tiny factory of a cell and then stick it into a 509 00:28:55,960 --> 00:28:58,360 Speaker 3: Petri dish and watch it, and, like, well, what does 510 00:28:58,400 --> 00:29:03,320 Speaker 3: it do? That caused very slow innovation in this space. 511 00:29:03,720 --> 00:29:06,960 Speaker 3: But if you could try this in a computer, you 512 00:29:07,000 --> 00:29:10,600 Speaker 3: would be going thousands, tens of thousands of times faster 513 00:29:10,720 --> 00:29:14,720 Speaker 3: and discovering stuff. So we're seeing more and more evidence 514 00:29:14,760 --> 00:29:16,520 Speaker 3: that that's going to be a thing. I think that's 515 00:29:16,520 --> 00:29:18,520 Speaker 3: going to turn out to be really important for humanity. 516 00:29:18,920 --> 00:29:25,080 Speaker 3: For healthcare, sure, like the making of drugs. Almost anything that 517 00:29:25,280 --> 00:29:30,080 Speaker 3: you would call manufacturing could be the domain of biology 518 00:29:30,120 --> 00:29:33,440 Speaker 3: to make. So the clothing that we're wearing, like, there's 519 00:29:33,440 --> 00:29:37,640 Speaker 3: no reason biology couldn't be producing this stuff, turning plastics 520 00:29:37,960 --> 00:29:41,400 Speaker 3: back into the raw materials, or making those raw materials 521 00:29:41,440 --> 00:29:43,280 Speaker 3: in the first place so that we don't have to 522 00:29:43,680 --> 00:29:47,200 Speaker 3: burn fossil fuels in order to make plastics. There's no 523 00:29:47,280 --> 00:29:51,160 Speaker 3: reason biology couldn't do that.
Yes, medicine for people. There's 524 00:29:52,480 --> 00:29:56,560 Speaker 3: trillions of dollars a year that humans produce in various 525 00:29:56,680 --> 00:30:00,720 Speaker 3: kinds of what look like factories or refineries, that we 526 00:30:00,840 --> 00:30:04,280 Speaker 3: do in a very industrial way today because we know 527 00:30:04,360 --> 00:30:08,320 Speaker 3: how to mechanically make things. We have some facility to 528 00:30:08,480 --> 00:30:11,160 Speaker 3: chemically make things, but we haven't figured out how to 529 00:30:11,200 --> 00:30:14,760 Speaker 3: program biology to make those things. So I see that 530 00:30:14,840 --> 00:30:17,160 Speaker 3: as a big shift during the twenty-first century. 531 00:30:17,760 --> 00:30:21,760 Speaker 2: You were talking at South by Southwest about turning trash back 532 00:30:21,800 --> 00:30:26,120 Speaker 2: into treasure. I was a big Dickens fan growing up, 533 00:30:26,160 --> 00:30:28,520 Speaker 2: and Dickens is always writing about kind of hunting through 534 00:30:28,600 --> 00:30:32,120 Speaker 2: trash heaps to find these miraculous pieces of treasure and stuff. 535 00:30:32,640 --> 00:30:34,520 Speaker 2: You used a phrase that I've never heard before, but 536 00:30:34,560 --> 00:30:38,360 Speaker 2: that I'd love you to expound on, which is Moonshot compost. 537 00:30:39,040 --> 00:30:42,360 Speaker 3: Yes. So, I mean, let me unpack both a little 538 00:30:42,360 --> 00:30:45,000 Speaker 3: bit so that the difference is clear. When I say 539 00:30:45,040 --> 00:30:48,320 Speaker 3: turning trash into treasure.
What I was talking about was: 540 00:30:49,440 --> 00:30:52,440 Speaker 3: humanity spends, sort of, depending on how you count, five 541 00:30:52,520 --> 00:30:55,360 Speaker 3: or six trillion dollars a year making stuff, and then 542 00:30:55,400 --> 00:30:59,360 Speaker 3: the leftovers, which again reasonable people could disagree, but it 543 00:30:59,440 --> 00:31:02,720 Speaker 3: is arguably worth at least a few trillion dollars a 544 00:31:02,960 --> 00:31:06,600 Speaker 3: year, goes into landfill of various kinds. This is plastics, 545 00:31:06,840 --> 00:31:10,840 Speaker 3: this is e-waste, like leftover computers. This is things 546 00:31:10,920 --> 00:31:13,880 Speaker 3: like we break down a building like the one we're in, 547 00:31:14,440 --> 00:31:16,920 Speaker 3: and all of the rubble, all of the metal, it 548 00:31:17,000 --> 00:31:20,720 Speaker 3: all just goes to landfill. We reuse this stuff terribly. 549 00:31:20,800 --> 00:31:23,400 Speaker 3: Think how hard we worked to get this metal out 550 00:31:23,400 --> 00:31:25,600 Speaker 3: of the ground in the first place. It's already refined, 551 00:31:25,880 --> 00:31:28,239 Speaker 3: but we don't quite know how to reuse it. It just 552 00:31:28,320 --> 00:31:32,640 Speaker 3: all goes to landfill. If we could make it profitable 553 00:31:32,840 --> 00:31:37,000 Speaker 3: to take that several trillion dollars a year of stuff 554 00:31:37,360 --> 00:31:39,160 Speaker 3: that right now is going to landfill and turn it 555 00:31:39,280 --> 00:31:42,200 Speaker 3: back into the raw material for humanity: one, there's a 556 00:31:42,280 --> 00:31:45,120 Speaker 3: ridiculous amount of money to be made doing that, but two, 557 00:31:45,320 --> 00:31:49,200 Speaker 3: it would actually cause human existence on earth to be 558 00:31:49,280 --> 00:31:52,320 Speaker 3: much more circular.
We could stop trying to be so 559 00:31:52,480 --> 00:31:55,880 Speaker 3: extractive from the world and be more circular. So that's 560 00:31:55,920 --> 00:31:58,880 Speaker 3: what I meant by turning trash into treasure, And there's 561 00:31:59,080 --> 00:32:02,880 Speaker 3: lots to be said about that separately. Metaphorically, we think 562 00:32:02,880 --> 00:32:05,880 Speaker 3: about the exact same thing at X. So if you 563 00:32:05,920 --> 00:32:09,200 Speaker 3: work on the Teleporter project, you're just passionate about it, 564 00:32:09,280 --> 00:32:11,640 Speaker 3: you decide that it's not as good as you thought, 565 00:32:11,840 --> 00:32:16,240 Speaker 3: we're going to end the Teleporter project. Congratulations, Good for you, 566 00:32:16,560 --> 00:32:19,800 Speaker 3: high five that you're doing the right thing. Here's a 567 00:32:19,840 --> 00:32:22,680 Speaker 3: bonus for you and for your whole team. Now you 568 00:32:22,720 --> 00:32:24,560 Speaker 3: have a few months where you can sort of explore 569 00:32:24,600 --> 00:32:27,400 Speaker 3: the factory and find out what your next thing at 570 00:32:27,440 --> 00:32:30,160 Speaker 3: the factory is. We don't have to throw away the people. 571 00:32:30,360 --> 00:32:32,440 Speaker 3: We don't have to throw away the code that you wrote. 572 00:32:32,520 --> 00:32:34,920 Speaker 3: We don't have to throw away the patents that you filed. 573 00:32:35,240 --> 00:32:37,120 Speaker 3: We don't have to throw away the partnerships that you 574 00:32:37,200 --> 00:32:41,320 Speaker 3: built or the hardware. There's so much that might get 575 00:32:41,360 --> 00:32:44,280 Speaker 3: reused in some way from what you did, even if 576 00:32:44,320 --> 00:32:49,200 Speaker 3: it's in surprising, very alternate uses, and so the process 577 00:32:49,360 --> 00:32:52,680 Speaker 3: of reminding ourselves over and over again. 
Just because we 578 00:32:52,720 --> 00:32:55,320 Speaker 3: stopped the project, it doesn't mean there isn't a lot 579 00:32:55,360 --> 00:32:58,240 Speaker 3: of value here. That's what we mean when we say 580 00:32:58,280 --> 00:33:02,040 Speaker 3: Moonshot compost. It is the reusing and the sort of 581 00:33:02,120 --> 00:33:06,520 Speaker 3: second and third lives of all of this knowledge creation 582 00:33:07,080 --> 00:33:10,560 Speaker 3: that, because it stays in the factory, it's frictionless for 583 00:33:10,640 --> 00:33:12,960 Speaker 3: us to reuse it. And that also makes it easier 584 00:33:13,000 --> 00:33:15,760 Speaker 3: when you stop a project to know that it isn't 585 00:33:15,800 --> 00:33:18,320 Speaker 3: just, like, zeroed out, that it's back in the dirt 586 00:33:18,360 --> 00:33:26,360 Speaker 3: and it's going to come back in some interesting new form. 587 00:33:26,480 --> 00:33:30,640 Speaker 2: That was Astro Teller, Alphabet's Captain of Moonshots. Check out the 588 00:33:30,640 --> 00:33:34,479 Speaker 2: Moonshot podcast wherever you get your podcasts. For Tech Stuff, 589 00:33:34,680 --> 00:33:38,360 Speaker 2: I'm Oz Woloshyn. This episode was produced by Eliza Dennis 590 00:33:38,400 --> 00:33:41,920 Speaker 2: and Victoria Dominguez. It was executive produced by me, Karah 591 00:33:41,960 --> 00:33:46,120 Speaker 2: Preiss, and Kate Osborne for Kaleidoscope, and Katrina Norvell for 592 00:33:46,160 --> 00:33:50,680 Speaker 2: iHeart Podcasts. Nomad Sound recorded this interview. Jack Insley mixed 593 00:33:50,680 --> 00:33:54,120 Speaker 2: the episode and Kyle Murdoch wrote our theme song. Join 594 00:33:54,200 --> 00:33:57,360 Speaker 2: us this Friday for Tech Stuff's Week in Tech.
We'll 595 00:33:57,400 --> 00:33:59,480 Speaker 2: run through the headlines and hear from 404 596 00:33:59,560 --> 00:34:03,480 Speaker 2: Media's Joseph Cox about a tool that allows one ICE 597 00:34:03,680 --> 00:34:07,520 Speaker 2: surveillance contractor to scrape over two hundred sites, apps, and 598 00:34:07,600 --> 00:34:12,279 Speaker 2: services for data on targeted individuals. Please rate, review, and 599 00:34:12,360 --> 00:34:14,880 Speaker 2: reach out to us at Tech Stuff podcast at gmail 600 00:34:14,920 --> 00:34:15,560 Speaker 2: dot com. 601 00:34:15,600 --> 00:34:16,560 Speaker 1: We really want to hear from you.