1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. 2 00:00:11,800 --> 00:00:14,320 Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host 3 00:00:14,400 --> 00:00:17,119 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio. And 4 00:00:17,160 --> 00:00:20,640 Speaker 1: how the tech are you? Listener Charlie Kniehouse asked if 5 00:00:20,680 --> 00:00:24,040 Speaker 1: I might do an episode about Google X, which these 6 00:00:24,120 --> 00:00:27,520 Speaker 1: days is just called X. So that is what we 7 00:00:27,560 --> 00:00:33,120 Speaker 1: are going to do, and X focuses on moonshots. Well, 8 00:00:33,360 --> 00:00:35,760 Speaker 1: what is a moonshot, some of you might be asking? 9 00:00:36,400 --> 00:00:40,839 Speaker 1: That would be a project that's really ambitious. There's no 10 00:00:40,880 --> 00:00:44,880 Speaker 1: assurance that the project is going to be successful, and 11 00:00:44,960 --> 00:00:48,000 Speaker 1: even if it is successful, there's no assurance that it 12 00:00:48,000 --> 00:00:52,599 Speaker 1: would be marketable or profitable, at least in the short term. 13 00:00:52,640 --> 00:00:55,880 Speaker 1: And the term itself, moonshot, is a play off of 14 00:00:56,160 --> 00:00:59,520 Speaker 1: long shot. You know, a bet that has really tough odds, 15 00:00:59,600 --> 00:01:02,680 Speaker 1: but if it works, it will have an enormous payout. 16 00:01:03,360 --> 00:01:07,280 Speaker 1: And it references also the Apollo program and the Space Race, 17 00:01:07,560 --> 00:01:10,120 Speaker 1: you know, the one that sent human beings to 18 00:01:10,120 --> 00:01:13,039 Speaker 1: the Moon. The space missions are a great example of 19 00:01:13,400 --> 00:01:17,360 Speaker 1: ambitious projects that had no guarantee of success and that 20 00:01:17,480 --> 00:01:22,920 Speaker 1: subsequently provided tremendous benefits beyond just the program itself. 
Sure, 21 00:01:23,000 --> 00:01:26,480 Speaker 1: the Space Race likely would not have been nearly as 22 00:01:26,520 --> 00:01:29,440 Speaker 1: productive had it not also been for the Cold War 23 00:01:29,920 --> 00:01:33,319 Speaker 1: between the United States and the Soviet Union. But whatever 24 00:01:33,400 --> 00:01:36,600 Speaker 1: the driving factors of the Space Race were, the truth 25 00:01:36,840 --> 00:01:40,240 Speaker 1: is that the work that was done by scientists, engineers, 26 00:01:40,640 --> 00:01:44,040 Speaker 1: mathematicians and others, it did more than just see people 27 00:01:44,120 --> 00:01:45,880 Speaker 1: land on the moon. And I can't believe I just 28 00:01:45,920 --> 00:01:48,760 Speaker 1: said just see people land on the moon, because that 29 00:01:48,800 --> 00:01:52,680 Speaker 1: alone is truly monumental. But no, their work would make 30 00:01:52,720 --> 00:01:56,040 Speaker 1: possible new technologies that were useful in lots of other ways. 31 00:01:56,200 --> 00:01:58,520 Speaker 1: So there were a lot of other benefits that came 32 00:01:58,520 --> 00:02:01,600 Speaker 1: out of the Space Race. Like the primary goal obviously 33 00:02:01,720 --> 00:02:04,680 Speaker 1: was let's get there before the other guys do. But 34 00:02:04,960 --> 00:02:08,760 Speaker 1: in order to do that, there were so many innovations 35 00:02:08,880 --> 00:02:12,440 Speaker 1: made that would filter their way into other aspects of 36 00:02:12,440 --> 00:02:17,400 Speaker 1: our lives that we saw an incredible benefit from that work. 37 00:02:17,520 --> 00:02:19,919 Speaker 1: And that's kind of the idea here, that's the hope 38 00:02:19,919 --> 00:02:25,160 Speaker 1: with a moonshot project. You identify a really tough goal 39 00:02:25,440 --> 00:02:27,600 Speaker 1: that you want to achieve, so it has to be 40 00:02:27,680 --> 00:02:31,119 Speaker 1: really challenging. 
It may be something that is years out from 41 00:02:31,280 --> 00:02:34,680 Speaker 1: even being possible, and it may eventually turn out that 42 00:02:34,680 --> 00:02:37,880 Speaker 1: that goal is just not achievable, at least not with 43 00:02:37,960 --> 00:02:41,800 Speaker 1: whatever current resources are available. But the hope is, even 44 00:02:41,840 --> 00:02:45,840 Speaker 1: if you can't realize the end goal, you'll discover useful 45 00:02:45,880 --> 00:02:49,240 Speaker 1: stuff along your journey, and it may turn out that 46 00:02:49,280 --> 00:02:51,720 Speaker 1: you didn't get to the destination you had planned on, 47 00:02:52,160 --> 00:02:55,120 Speaker 1: but you did create something truly remarkable all the same. 48 00:02:56,080 --> 00:03:00,760 Speaker 1: X serves that purpose for Google, but to be more accurate, 49 00:03:00,840 --> 00:03:04,400 Speaker 1: it really serves that purpose for Alphabet. That's the parent 50 00:03:04,560 --> 00:03:08,480 Speaker 1: company that oversees Google, as well as several other spin 51 00:03:08,520 --> 00:03:12,560 Speaker 1: off companies like Waymo, which actually got its start as 52 00:03:12,560 --> 00:03:16,160 Speaker 1: a Google X project. Now, to be clear, Google slash 53 00:03:16,200 --> 00:03:19,680 Speaker 1: Alphabet is not the only company to have a department 54 00:03:19,960 --> 00:03:24,440 Speaker 1: like this. Over at Lockheed Martin, you've got Skunk Works. 55 00:03:24,480 --> 00:03:28,280 Speaker 1: That division serves a similar purpose. It has produced notable 56 00:03:28,440 --> 00:03:31,840 Speaker 1: aircraft like the U-2 spy plane and the F 57 00:03:31,960 --> 00:03:36,200 Speaker 1: twenty two Raptor fighter aircraft, among many others. AT&T 58 00:03:36,320 --> 00:03:39,600 Speaker 1: has Bell Labs, which is notable for being 59 00:03:39,600 --> 00:03:43,600 Speaker 1: the think tank that produced transformative technologies like the solid 60 00:03:43,640 --> 00:03:48,800 Speaker 1: state transistor and the laser. Xerox had the Palo Alto 61 00:03:49,000 --> 00:03:53,360 Speaker 1: Research Center, or PARC. PARC is now 62 00:03:53,400 --> 00:03:56,400 Speaker 1: a standalone company, so it's no longer a Xerox division, 63 00:03:57,000 --> 00:03:59,240 Speaker 1: and PARC was notable for producing a lot of the 64 00:03:59,240 --> 00:04:02,600 Speaker 1: innovations Steve Jobs would later lift to use in the 65 00:04:02,680 --> 00:04:06,240 Speaker 1: Macintosh computer, you know, things like the graphical user interface 66 00:04:06,320 --> 00:04:09,440 Speaker 1: and the computer mouse. Jobs was given a tour of PARC, 67 00:04:09,920 --> 00:04:14,240 Speaker 1: saw that Xerox engineers had developed this incredible technology, but 68 00:04:14,360 --> 00:04:18,919 Speaker 1: that Xerox never really invested in making that a consumer product. 69 00:04:19,000 --> 00:04:21,039 Speaker 1: So it was this technology that was just kind of 70 00:04:22,000 --> 00:04:25,159 Speaker 1: wasting away, and so Steve Jobs made sure it didn't 71 00:04:25,200 --> 00:04:29,200 Speaker 1: go to waste. However, let's focus on X and its 72 00:04:29,360 --> 00:04:32,800 Speaker 1: story now. These things can be a little bit complicated 73 00:04:32,800 --> 00:04:36,760 Speaker 1: to unravel, largely because Google has always been a company 74 00:04:36,960 --> 00:04:41,400 Speaker 1: of engineers, and their ways can be complex and mysterious 75 00:04:42,200 --> 00:04:47,080 Speaker 1: or sometimes haphazard and difficult to untangle. Engineers are great. 76 00:04:47,560 --> 00:04:52,120 Speaker 1: I love engineers. I love talking with them. 
However, sometimes 77 00:04:52,120 --> 00:04:54,240 Speaker 1: it takes an engineer to suss out what the heck 78 00:04:54,279 --> 00:04:57,159 Speaker 1: another engineer has been doing this whole time. I actually 79 00:04:57,160 --> 00:04:59,240 Speaker 1: feel like a lot of Google products fall into that 80 00:04:59,320 --> 00:05:02,520 Speaker 1: category where if you are an engineer, you start to 81 00:05:02,600 --> 00:05:05,920 Speaker 1: grok what is going on pretty quickly. But if you 82 00:05:06,000 --> 00:05:09,640 Speaker 1: are a mere mortal like myself, you might encounter a 83 00:05:09,680 --> 00:05:12,400 Speaker 1: Google product and you're spending the first, you know, hour 84 00:05:12,560 --> 00:05:15,960 Speaker 1: or so just trying to figure out how you're supposed 85 00:05:16,000 --> 00:05:18,760 Speaker 1: to use it. And once you do, and you realize 86 00:05:18,800 --> 00:05:21,800 Speaker 1: why the decisions were made to make it that way, 87 00:05:21,839 --> 00:05:26,160 Speaker 1: it clicks. But it's not as intuitive as, say, Apple's 88 00:05:26,279 --> 00:05:30,400 Speaker 1: approach to things like user interfaces. Anyway, one way to 89 00:05:30,440 --> 00:05:33,279 Speaker 1: tell the story of Google X is to talk about 90 00:05:33,400 --> 00:05:37,560 Speaker 1: Sebastian Thrun. Sebastian Thrun was born in what at 91 00:05:37,600 --> 00:05:40,440 Speaker 1: that point was West Germany. Now you would just call 92 00:05:40,480 --> 00:05:43,400 Speaker 1: it Germany, and this was back in nineteen sixty seven. 93 00:05:43,680 --> 00:05:49,160 Speaker 1: He studied computer science, among many other subjects, and eventually 94 00:05:49,440 --> 00:05:52,240 Speaker 1: he had become part of the computer science department at 95 00:05:52,279 --> 00:05:56,520 Speaker 1: Carnegie Mellon University. 
In two thousand one, he took a 96 00:05:56,600 --> 00:06:00,240 Speaker 1: year off from CMU and he spent a year at 97 00:06:00,279 --> 00:06:02,960 Speaker 1: Stanford, and that must have been quite a year, because 98 00:06:03,000 --> 00:06:06,920 Speaker 1: in two thousand three he would leave CMU officially and 99 00:06:07,040 --> 00:06:11,839 Speaker 1: join the teaching staff at Stanford University. He also became 100 00:06:11,880 --> 00:06:15,320 Speaker 1: the head of the artificial intelligence laboratory there, and he 101 00:06:15,360 --> 00:06:20,880 Speaker 1: would participate in a really important competition, actually several of them, 102 00:06:21,320 --> 00:06:24,560 Speaker 1: but the one I'm specifically talking about is the DARPA 103 00:06:24,800 --> 00:06:30,440 Speaker 1: Grand Challenge. So DARPA is the United States Defense Advanced 104 00:06:30,480 --> 00:06:35,960 Speaker 1: Research Projects Agency. This is part of the Department of Defense, 105 00:06:36,240 --> 00:06:42,560 Speaker 1: and it is instrumental in developing the next generation of 106 00:06:42,600 --> 00:06:46,719 Speaker 1: technologies that the Department of Defense and, by extension, the U. 107 00:06:46,839 --> 00:06:50,119 Speaker 1: S. military will depend upon. So this is all about 108 00:06:50,160 --> 00:06:55,800 Speaker 1: creating technologies that ultimately contribute to national defense. Now, DARPA 109 00:06:56,160 --> 00:06:58,960 Speaker 1: doesn't really do research on its own. It's not like 110 00:06:59,000 --> 00:07:03,520 Speaker 1: it's some secret underground government lab filled with robots and 111 00:07:03,600 --> 00:07:07,040 Speaker 1: panels with flashing lights. I wish it were. Now it's 112 00:07:07,080 --> 00:07:11,640 Speaker 1: more filled with, like, administrators and budget sheets. 
DARPA really 113 00:07:11,720 --> 00:07:16,960 Speaker 1: awards grants to organizations like universities or research facilities, sometimes 114 00:07:16,960 --> 00:07:22,440 Speaker 1: private companies, for specific projects. And occasionally DARPA holds competitions 115 00:07:22,480 --> 00:07:26,160 Speaker 1: in which the agency identifies a goal or a task 116 00:07:26,200 --> 00:07:29,680 Speaker 1: of some sort and the various participants try to create 117 00:07:29,720 --> 00:07:32,840 Speaker 1: a solution that meets the criteria and beats out all 118 00:07:32,880 --> 00:07:36,720 Speaker 1: the other competitors and usually will win some sort of prize, like, 119 00:07:36,840 --> 00:07:38,720 Speaker 1: you know, a couple of million dollars in some cases. 120 00:07:39,200 --> 00:07:42,640 Speaker 1: Now I should note that Thrun had previously participated in 121 00:07:42,720 --> 00:07:46,840 Speaker 1: other DARPA challenges. These were not Grand Challenges, but these 122 00:07:46,840 --> 00:07:50,240 Speaker 1: were various robotics challenges, and so he was working on 123 00:07:50,280 --> 00:07:52,880 Speaker 1: things with robotics and artificial intelligence, so he was no 124 00:07:52,960 --> 00:07:56,400 Speaker 1: stranger to competing, and he had been involved in those 125 00:07:56,440 --> 00:07:58,640 Speaker 1: kinds of competitions even before he was part of 126 00:07:58,720 --> 00:08:01,760 Speaker 1: Carnegie Mellon University. I keep wanting to say Carnegie 127 00:08:01,760 --> 00:08:05,600 Speaker 1: because that's how Andrew Carnegie said it, but I've been reprimanded. 128 00:08:05,640 --> 00:08:09,920 Speaker 1: It's Carnegie Mellon University. And he had been participating on 129 00:08:10,000 --> 00:08:13,040 Speaker 1: teams with Carnegie Mellon. 
He did so a few times 130 00:08:13,040 --> 00:08:16,120 Speaker 1: before he even joined Carnegie Mellon, and then he did 131 00:08:16,160 --> 00:08:18,840 Speaker 1: it again with Stanford. So in two thousand four he 132 00:08:18,960 --> 00:08:22,960 Speaker 1: participated in the first of several Grand Challenges that would 133 00:08:23,000 --> 00:08:27,280 Speaker 1: kick off the quest to create autonomous vehicles. The two 134 00:08:27,280 --> 00:08:32,320 Speaker 1: thousand four DARPA Grand Challenge invited racing teams from various 135 00:08:32,440 --> 00:08:37,240 Speaker 1: universities and research facilities to build an autonomous vehicle that 136 00:08:37,280 --> 00:08:40,160 Speaker 1: would be capable of traveling a hundred fifty miles or 137 00:08:40,200 --> 00:08:45,880 Speaker 1: two hundred forty kilometers along a predetermined desert route. And it 138 00:08:45,960 --> 00:08:49,559 Speaker 1: turned out this was a moonshot sort of project because 139 00:08:49,760 --> 00:08:53,479 Speaker 1: ultimately not a single one of the vehicles that participated 140 00:08:54,120 --> 00:08:57,640 Speaker 1: was able to complete the course. In fact, the vehicle 141 00:08:57,679 --> 00:09:01,040 Speaker 1: that traveled the furthest only made it seven point three 142 00:09:01,120 --> 00:09:05,320 Speaker 1: two miles down the road, or eleven point seven eight kilometers. 143 00:09:06,000 --> 00:09:08,800 Speaker 1: That one was actually from Thrun's old stomping grounds of 144 00:09:08,880 --> 00:09:14,360 Speaker 1: Carnegie Mellon. Undaunted, DARPA then held another Grand Challenge the 145 00:09:14,400 --> 00:09:17,920 Speaker 1: following year. They changed up the course, they changed up 146 00:09:17,960 --> 00:09:21,440 Speaker 1: some of the requirements, and that year would be very different. 
147 00:09:21,480 --> 00:09:24,360 Speaker 1: For one thing, there was now an actual community that 148 00:09:24,440 --> 00:09:29,040 Speaker 1: had grown up around this quest to build autonomous vehicle technology. 149 00:09:29,160 --> 00:09:32,960 Speaker 1: There were people who were developing new approaches and, 150 00:09:33,040 --> 00:09:35,559 Speaker 1: you know, new technologies to make this happen, and that 151 00:09:35,600 --> 00:09:39,000 Speaker 1: meant there were new opportunities to collaborate and share knowledge. 152 00:09:39,320 --> 00:09:42,000 Speaker 1: And while this was a competition, there were teams that 153 00:09:42,040 --> 00:09:45,079 Speaker 1: were eager to develop or learn best practices and then 154 00:09:45,120 --> 00:09:48,679 Speaker 1: share them with others. It became a very communal experience 155 00:09:48,720 --> 00:09:52,320 Speaker 1: and the engineers learned through their failures, and 156 00:09:52,360 --> 00:09:56,520 Speaker 1: in two thousand five, Stanford's team would take first place 157 00:09:56,800 --> 00:10:01,400 Speaker 1: in the competition. Interestingly, Carnegie Mellon had two teams, and 158 00:10:01,520 --> 00:10:04,840 Speaker 1: both of those teams took second and third place. So 159 00:10:05,720 --> 00:10:09,720 Speaker 1: you know, Thrun certainly had his influence on Carnegie Mellon. 160 00:10:09,800 --> 00:10:13,040 Speaker 1: I mean he had been there previously, and so presumably 161 00:10:13,080 --> 00:10:14,560 Speaker 1: some of the people who were working on the team 162 00:10:14,600 --> 00:10:19,280 Speaker 1: were people who either had studied directly under Thrun or 163 00:10:19,679 --> 00:10:22,560 Speaker 1: at least knew people who had. So yeah, he was 164 00:10:23,120 --> 00:10:26,400 Speaker 1: very much at the center of this work. 
Now, I've 165 00:10:26,440 --> 00:10:29,160 Speaker 1: done episodes about the Grand Challenges before, so we're not 166 00:10:29,200 --> 00:10:31,400 Speaker 1: going to go over all that. There were more after 167 00:10:32,240 --> 00:10:35,600 Speaker 1: the two thousand five one anyway, and that's 168 00:10:35,600 --> 00:10:39,760 Speaker 1: a story that's worthy of its own series of podcasts. 169 00:10:39,800 --> 00:10:43,080 Speaker 1: So we'll stick with Thrun's work, and it definitely caught 170 00:10:43,080 --> 00:10:46,600 Speaker 1: the attention of Google. In two thousand seven, Thrun would 171 00:10:46,679 --> 00:10:50,839 Speaker 1: join Google as a Google Fellow, that is, a level 172 00:10:51,000 --> 00:10:55,720 Speaker 1: ten Googler or level ten Google engineer, which sounds kind 173 00:10:55,720 --> 00:10:57,960 Speaker 1: of like I'm talking about a cult at that point, right. 174 00:10:58,360 --> 00:11:03,440 Speaker 1: So in Google there was, and I presume still is, 175 00:11:04,160 --> 00:11:06,960 Speaker 1: sort of a hierarchy for engineers. So if you're like 176 00:11:07,120 --> 00:11:09,560 Speaker 1: level one, you're kind of at the entry level of 177 00:11:09,600 --> 00:11:12,320 Speaker 1: engineers at Google, so you might do something like work 178 00:11:12,360 --> 00:11:16,320 Speaker 1: in IT support. By the time you hit around 179 00:11:16,400 --> 00:11:20,000 Speaker 1: level five, you're talking about folks who frequently have a 180 00:11:20,040 --> 00:11:23,280 Speaker 1: doctorate in their respective fields; they're an expert at 181 00:11:23,280 --> 00:11:26,720 Speaker 1: whatever they do. But beyond level five, you've got the 182 00:11:26,760 --> 00:11:30,439 Speaker 1: engineers at Google that the company views as being pivotal 183 00:11:30,600 --> 00:11:33,040 Speaker 1: for a project to succeed. 
So these are people who 184 00:11:33,120 --> 00:11:37,480 Speaker 1: not only are experts, but they are drivers for success. 185 00:11:37,520 --> 00:11:40,640 Speaker 1: These are people with vision and capability, and they make 186 00:11:40,679 --> 00:11:43,400 Speaker 1: the impossible possible. By the time you're hitting up to 187 00:11:43,480 --> 00:11:47,080 Speaker 1: level nine, it means you are a distinguished engineer and 188 00:11:47,200 --> 00:11:50,800 Speaker 1: you are highly respected, generally speaking at least, for your 189 00:11:51,240 --> 00:11:55,080 Speaker 1: expertise and ability. And then to become a level ten 190 00:11:55,240 --> 00:11:57,839 Speaker 1: is to be a Google Fellow, which usually means you're 191 00:11:57,880 --> 00:12:01,440 Speaker 1: also the leading expert in whatever your particular field is. 192 00:12:01,840 --> 00:12:03,800 Speaker 1: So it's not just that you're an expert, you are 193 00:12:03,840 --> 00:12:07,960 Speaker 1: the expert that other experts refer to when they need help. 194 00:12:08,000 --> 00:12:10,160 Speaker 1: And there have also been a couple of Google Senior 195 00:12:10,240 --> 00:12:14,040 Speaker 1: Fellows, level elevens, but, you know, enough about that. So 196 00:12:14,160 --> 00:12:16,960 Speaker 1: Thrun as a Google Fellow worked on some interesting challenges. 197 00:12:17,600 --> 00:12:20,160 Speaker 1: He helped build out tech that would make turn by 198 00:12:20,200 --> 00:12:23,560 Speaker 1: turn directions possible in Google Maps. And y'all, I don't 199 00:12:23,600 --> 00:12:26,560 Speaker 1: know how many of you remember this, but way back 200 00:12:26,600 --> 00:12:30,839 Speaker 1: in the early days of web based map solutions, you 201 00:12:30,920 --> 00:12:34,160 Speaker 1: had to plot out your route and then print 202 00:12:34,200 --> 00:12:36,440 Speaker 1: it out, right. 
I'm talking back in the old 203 00:12:36,559 --> 00:12:40,160 Speaker 1: MapQuest days, and this was definitely a huge jump from 204 00:12:40,200 --> 00:12:44,200 Speaker 1: just using a regular old map or atlas, right, where 205 00:12:44,200 --> 00:12:46,199 Speaker 1: you just had to turn the page, find the right 206 00:12:46,240 --> 00:12:49,360 Speaker 1: map and trace your route. It was better than that, 207 00:12:49,960 --> 00:12:52,720 Speaker 1: but it did also mean that you typically needed to 208 00:12:52,760 --> 00:12:56,400 Speaker 1: have an extra person, like an actual navigator, looking after 209 00:12:56,440 --> 00:12:59,080 Speaker 1: the instructions so that you didn't miss a turn, because, 210 00:12:59,120 --> 00:13:02,360 Speaker 1: you know, it was printed, you couldn't know 211 00:13:02,480 --> 00:13:04,480 Speaker 1: how close you were to the next turn. You just 212 00:13:04,520 --> 00:13:07,280 Speaker 1: had to really pay attention. And it was only later 213 00:13:07,320 --> 00:13:10,000 Speaker 1: that we would get the turn by turn capabilities coupled 214 00:13:10,040 --> 00:13:14,559 Speaker 1: with accurate GPS receivers that would transform navigation forever. I'm 215 00:13:14,559 --> 00:13:17,320 Speaker 1: sure several of you out there have used maps to 216 00:13:17,320 --> 00:13:20,679 Speaker 1: get around, either because you've been driving for a while, 217 00:13:21,280 --> 00:13:25,920 Speaker 1: or maybe you drive in places where there's little to 218 00:13:26,000 --> 00:13:28,440 Speaker 1: no reception, so you have to have those physical maps. 219 00:13:28,760 --> 00:13:30,679 Speaker 1: But for a lot of people that is a lost art; 220 00:13:30,800 --> 00:13:33,480 Speaker 1: it is something that they are not used to and 221 00:13:33,600 --> 00:13:36,679 Speaker 1: it would be very challenging for them to do it today. 
222 00:13:37,080 --> 00:13:40,719 Speaker 1: But this turn by turn approach, in part, it's one 223 00:13:40,760 --> 00:13:43,320 Speaker 1: reason why people don't get lost as frequently, and in 224 00:13:43,320 --> 00:13:47,760 Speaker 1: another part, it's kind of made us less capable of 225 00:13:47,840 --> 00:13:50,199 Speaker 1: reading maps, so you could look at it as both 226 00:13:50,200 --> 00:13:53,000 Speaker 1: a positive and a negative. Thrun had also worked on 227 00:13:53,080 --> 00:13:56,320 Speaker 1: Street View, which is the Google project that saw vehicles 228 00:13:56,320 --> 00:13:59,679 Speaker 1: with cameras mounted on their rooftops drive slowly through various 229 00:13:59,720 --> 00:14:03,600 Speaker 1: neighborhoods and map the photographs of actual locations to 230 00:14:03,640 --> 00:14:07,040 Speaker 1: their coordinates on a map. It's also one that attracted a 231 00:14:07,040 --> 00:14:10,079 Speaker 1: lot of controversy as it went into various neighborhoods around 232 00:14:10,080 --> 00:14:13,120 Speaker 1: the United States and later on the world. And perhaps 233 00:14:13,160 --> 00:14:17,360 Speaker 1: most importantly, Thrun brought his experience in developing autonomous vehicle 234 00:14:17,400 --> 00:14:21,600 Speaker 1: technology to a project that at the time was called Chauffeur. 235 00:14:22,400 --> 00:14:27,440 Speaker 1: This was Google's own driverless vehicle effort. We'll talk more 236 00:14:27,560 --> 00:14:31,920 Speaker 1: about how that became kind of the seed for 237 00:14:32,040 --> 00:14:43,080 Speaker 1: Google X, but first let's take a quick break. Okay, 238 00:14:43,120 --> 00:14:46,640 Speaker 1: before we went to break, Sebastian Thrun had started work 239 00:14:46,760 --> 00:14:51,360 Speaker 1: on Chauffeur, Google's driverless car initiative, way back in the day. 
240 00:14:51,400 --> 00:14:55,520 Speaker 1: It's around like two thousand and nine ish, and Google's 241 00:14:55,560 --> 00:14:58,120 Speaker 1: co founders Larry Page and Sergey Brin were really 242 00:14:58,160 --> 00:15:01,400 Speaker 1: impressed by Thrun's work, and they approached him and gave 243 00:15:01,480 --> 00:15:04,760 Speaker 1: him the chance to spearhead an entire department dedicated to 244 00:15:04,840 --> 00:15:09,560 Speaker 1: pursuing challenging opportunities. They also gave him a new title, 245 00:15:09,680 --> 00:15:14,960 Speaker 1: the Director of Other, which, I don't know, that kind 246 00:15:14,960 --> 00:15:16,560 Speaker 1: of sounds like it came out of like a Neil 247 00:15:16,680 --> 00:15:19,200 Speaker 1: Gaiman book or something. To me, it's got that 248 00:15:19,320 --> 00:15:22,280 Speaker 1: kind of ring to it. But yeah, this is still 249 00:15:22,520 --> 00:15:25,080 Speaker 1: early on in the phase where he was Director of Other, 250 00:15:25,920 --> 00:15:31,120 Speaker 1: as this concept of Google X began to take shape. Now, 251 00:15:31,920 --> 00:15:34,840 Speaker 1: at that stage, Google's whole R and D approach was 252 00:15:35,360 --> 00:15:39,360 Speaker 1: super duper secret. Like the outside world had no knowledge 253 00:15:39,920 --> 00:15:45,400 Speaker 1: of this developing department within Google. Later we would have 254 00:15:45,440 --> 00:15:47,640 Speaker 1: a better idea of what kinds of projects were going 255 00:15:47,680 --> 00:15:50,280 Speaker 1: on inside that department. But for a while, it was 256 00:15:50,400 --> 00:15:54,400 Speaker 1: really hush hush, so much so that I actually remember 257 00:15:54,520 --> 00:15:58,480 Speaker 1: visiting someone at Google around this time and then getting 258 00:15:58,520 --> 00:16:01,920 Speaker 1: quite cross with them once word about Google's 259 00:16:01,960 --> 00:16:06,640 Speaker 1: self driving vehicles got out to the public. 
I mean, if 260 00:16:06,680 --> 00:16:10,080 Speaker 1: you can't trust a nice guy like me who just 261 00:16:10,120 --> 00:16:13,280 Speaker 1: happens to be the host of a widely distributed tech podcast, 262 00:16:13,520 --> 00:16:17,800 Speaker 1: whom can you trust? But I should be fair. Even 263 00:16:17,920 --> 00:16:22,320 Speaker 1: within Google itself, the X department, as it would become known, 264 00:16:22,960 --> 00:16:27,720 Speaker 1: was mysterious. To get entry to their buildings, you needed 265 00:16:27,720 --> 00:16:30,480 Speaker 1: an authorized key card, and not very many people had 266 00:16:30,520 --> 00:16:32,960 Speaker 1: one of those. Like, they had normal key cards where 267 00:16:32,960 --> 00:16:35,720 Speaker 1: they could get through most of Google campus. But once 268 00:16:35,760 --> 00:16:38,120 Speaker 1: you got up to this, yeah, you had to have 269 00:16:38,440 --> 00:16:42,920 Speaker 1: special authorization essentially. And while the standard Googler at the 270 00:16:42,960 --> 00:16:45,720 Speaker 1: time was given up to twenty percent of their work week to 271 00:16:45,760 --> 00:16:51,320 Speaker 1: tackle personal projects which could one day become a Google product, 272 00:16:51,440 --> 00:16:53,920 Speaker 1: a lot of Google products actually began as one of 273 00:16:53,960 --> 00:16:58,320 Speaker 1: those twenty percent time projects. The X team was actually less structured 274 00:16:58,440 --> 00:17:04,000 Speaker 1: than that. So within the Google X division, creativity 275 00:17:04,359 --> 00:17:11,240 Speaker 1: reigned supreme. The department welcomed unlikely, outlandish ideas. Many, in fact 276 00:17:11,600 --> 00:17:15,440 Speaker 1: probably most, of them wouldn't pan out, but it sounds 277 00:17:15,520 --> 00:17:17,679 Speaker 1: like the early motto was something along the lines of 278 00:17:17,800 --> 00:17:21,320 Speaker 1: "you never know if you don't try." So 
Thrun was 279 00:17:21,359 --> 00:17:24,919 Speaker 1: also fiercely protective of his team. He wanted them to 280 00:17:24,920 --> 00:17:28,119 Speaker 1: have all the freedom to experiment and test ideas and 281 00:17:28,160 --> 00:17:31,600 Speaker 1: to collaborate across teams. So maybe one team that's working 282 00:17:31,640 --> 00:17:35,280 Speaker 1: on one really big problem ends up collaborating with a 283 00:17:35,280 --> 00:17:37,600 Speaker 1: team that's working on a totally different problem, and that 284 00:17:37,680 --> 00:17:42,760 Speaker 1: cross pollination ends up fueling new creativity. He didn't want 285 00:17:42,880 --> 00:17:45,159 Speaker 1: his team members to ever have to worry about attending 286 00:17:45,240 --> 00:17:48,720 Speaker 1: meetings where they have to give status updates and budget 287 00:17:48,760 --> 00:17:51,000 Speaker 1: reports and that kind of thing. He wanted them to 288 00:17:51,040 --> 00:17:54,560 Speaker 1: focus completely on their projects and not worry about having 289 00:17:54,560 --> 00:17:58,440 Speaker 1: to justify those projects to top brass. So 290 00:17:58,640 --> 00:18:03,240 Speaker 1: Google effectively funded the X division, but the individual projects 291 00:18:03,240 --> 00:18:06,119 Speaker 1: inside didn't have to break down costs or anything like that, 292 00:18:06,200 --> 00:18:09,439 Speaker 1: at least not to a really granular level, certainly not 293 00:18:09,520 --> 00:18:12,000 Speaker 1: to a level that you would expect for most businesses. 294 00:18:12,520 --> 00:18:17,120 Speaker 1: Thrun led the division from twenty ten until two thousand twelve. 
295 00:18:17,359 --> 00:18:21,240 Speaker 1: He actually got really focused on the driverless car project, 296 00:18:21,920 --> 00:18:26,360 Speaker 1: ultimately kind of stepping back from overseeing the overall 297 00:18:26,359 --> 00:18:30,560 Speaker 1: projects in Google X, and then ultimately left Google altogether 298 00:18:30,800 --> 00:18:34,720 Speaker 1: in order to found a new education company called Udacity. 299 00:18:34,840 --> 00:18:39,479 Speaker 1: So taking his place was Eric "Astro" Teller. 300 00:18:39,560 --> 00:18:43,000 Speaker 1: Astro is his nickname; apparently that's what he goes by all the time. 301 00:18:43,560 --> 00:18:45,960 Speaker 1: And boy howdy, does this guy have a pedigree. So 302 00:18:46,520 --> 00:18:51,359 Speaker 1: Astro Teller's grandfather is the guy who spearheaded the development 303 00:18:51,440 --> 00:18:56,119 Speaker 1: of the hydrogen bomb, a.k.a. Edward Teller. One day, 304 00:18:56,160 --> 00:18:59,320 Speaker 1: I'm gonna have to do a full episode about Edward Teller, 305 00:18:59,400 --> 00:19:05,000 Speaker 1: because that was one complicated dude. A brilliant person, 306 00:19:05,480 --> 00:19:09,800 Speaker 1: a vilified person, for reasons that would become clear if 307 00:19:09,800 --> 00:19:12,440 Speaker 1: I do a full episode. So we'll put that aside for 308 00:19:12,600 --> 00:19:15,640 Speaker 1: the future. Maybe I'll do an Edward Teller episode. Anyway, 309 00:19:15,680 --> 00:19:20,399 Speaker 1: Astro's other grandfather had also won a Nobel Prize. So 310 00:19:20,400 --> 00:19:22,280 Speaker 1: yeah, Astro came from a family that had a 311 00:19:22,320 --> 00:19:25,840 Speaker 1: reputation for big thinking, and he would be very modest 312 00:19:25,920 --> 00:19:29,679 Speaker 1: when describing his own role within his family. 
And Astro 313 00:19:29,800 --> 00:19:32,840 Speaker 1: Teller had kind of a moonshot task of his own, 314 00:19:32,880 --> 00:19:37,360 Speaker 1: which was to take this creative, whirlwind, chaotic department and 315 00:19:37,400 --> 00:19:40,040 Speaker 1: give it a little more structure. You know, there was 316 00:19:40,080 --> 00:19:43,320 Speaker 1: still a clear need for lots of freedom, but without 317 00:19:43,400 --> 00:19:46,240 Speaker 1: any structure, there was little chance of being able to 318 00:19:46,240 --> 00:19:50,000 Speaker 1: actually harness whatever came out of the research, right? Like, 319 00:19:50,080 --> 00:19:52,800 Speaker 1: you want to make sure that if you do come 320 00:19:52,880 --> 00:19:55,879 Speaker 1: up with things that are useful, that you have a 321 00:19:55,920 --> 00:19:59,720 Speaker 1: process in place to make that come to fruition. Otherwise, 322 00:20:00,560 --> 00:20:02,800 Speaker 1: if you're just creating cool stuff but nobody ever gets 323 00:20:02,840 --> 00:20:04,679 Speaker 1: to see it, it would be kind of like, you know, 324 00:20:05,560 --> 00:20:08,240 Speaker 1: Willy Wonka's chocolate factory before he ever gave out the 325 00:20:08,240 --> 00:20:11,080 Speaker 1: golden tickets. Yeah, it's an amazing place, but who gets 326 00:20:11,119 --> 00:20:13,879 Speaker 1: to see it? So to that end, Teller brought 327 00:20:13,960 --> 00:20:18,040 Speaker 1: in a woman named Obi Felton, who had previously worked 328 00:20:18,040 --> 00:20:21,040 Speaker 1: in Google's marketing department. 
So she was not from an 329 00:20:21,040 --> 00:20:25,359 Speaker 1: engineering background, she was from a product marketing background, and 330 00:20:25,720 --> 00:20:28,240 Speaker 1: she was specifically brought in because she could bring that 331 00:20:28,320 --> 00:20:31,920 Speaker 1: kind of perspective to the department, and she helped design 332 00:20:32,080 --> 00:20:35,479 Speaker 1: Google X so that it had a little more framework to it. 333 00:20:35,840 --> 00:20:39,320 Speaker 1: Her title became head of getting moonshots ready for 334 00:20:39,520 --> 00:20:42,560 Speaker 1: contact with the real world, and she would later go 335 00:20:42,600 --> 00:20:45,239 Speaker 1: on to leave Google and found her own startup 336 00:20:45,280 --> 00:20:49,000 Speaker 1: with a focus on mental health. So yeah, she 337 00:20:49,000 --> 00:20:51,639 Speaker 1: and many others who have worked at Google X have 338 00:20:51,800 --> 00:20:53,919 Speaker 1: gone on to become sort of a who's who of 339 00:20:54,000 --> 00:20:58,600 Speaker 1: startup founders and industry leaders. It's really fascinating. In fact, 340 00:20:58,640 --> 00:21:04,240 Speaker 1: I even got to interview a former director over 341 00:21:04,280 --> 00:21:07,520 Speaker 1: at Google X, and it's just fascinating to talk 342 00:21:07,520 --> 00:21:11,119 Speaker 1: with people who have been in those positions. Now, according 343 00:21:11,160 --> 00:21:14,880 Speaker 1: to an article titled The Truth About Google X by 344 00:21:14,960 --> 00:21:19,080 Speaker 1: Jon Gertner in Fast Company, for a project to be 345 00:21:19,320 --> 00:21:23,280 Speaker 1: considered within Google X, for this to be something 346 00:21:23,320 --> 00:21:28,400 Speaker 1: that the department actually pursues, it has to meet 347 00:21:28,480 --> 00:21:33,080 Speaker 1: three criteria.
One is that the project should address a 348 00:21:33,119 --> 00:21:37,480 Speaker 1: problem that affects millions or potentially billions of people. Two 349 00:21:37,680 --> 00:21:40,440 Speaker 1: is that some part of the solution should at least 350 00:21:40,440 --> 00:21:44,760 Speaker 1: resemble science fiction. It should be, you know, incorporating innovation 351 00:21:44,880 --> 00:21:48,439 Speaker 1: in some way. Third is that these solutions need to 352 00:21:48,480 --> 00:21:52,000 Speaker 1: rely on technologies that are either currently available or kind 353 00:21:52,000 --> 00:21:56,879 Speaker 1: of on the cusp of availability. Alternatively, Wired says that 354 00:21:56,960 --> 00:22:00,159 Speaker 1: third criterion is really that the solution should produce 355 00:22:00,200 --> 00:22:03,879 Speaker 1: a radically positive outcome, and that the outcome should be 356 00:22:04,200 --> 00:22:08,320 Speaker 1: ten times better than whatever exists today. So, whatever the 357 00:22:08,320 --> 00:22:12,919 Speaker 1: challenge is, whatever the goal is, the realization of that 358 00:22:13,000 --> 00:22:16,440 Speaker 1: goal should mean that where we end up is ten 359 00:22:16,480 --> 00:22:19,640 Speaker 1: times better than where we were before. Now, this appears 360 00:22:19,680 --> 00:22:23,320 Speaker 1: to be when the department also began to formalize its 361 00:22:23,320 --> 00:22:27,040 Speaker 1: approach a little bit, which was trying to make a 362 00:22:27,160 --> 00:22:31,240 Speaker 1: project fail right away. And I really do mean that.
363 00:22:31,320 --> 00:22:33,560 Speaker 1: I mean the team would get together and they would 364 00:22:33,600 --> 00:22:36,639 Speaker 1: start to identify, you know, some sort of approach to 365 00:22:37,359 --> 00:22:40,840 Speaker 1: solving a problem, and they would first try to tackle 366 00:22:40,840 --> 00:22:44,200 Speaker 1: the hardest part of whatever that problem was right away, 367 00:22:44,600 --> 00:22:46,720 Speaker 1: you know, identify what is going to be the most 368 00:22:46,840 --> 00:22:52,000 Speaker 1: difficult thing to get beyond with that problem, and try 369 00:22:52,080 --> 00:22:55,680 Speaker 1: to solve for that first. And the wisdom was that if 370 00:22:55,720 --> 00:22:58,480 Speaker 1: the team discovered that the hardest part of the challenge 371 00:22:58,600 --> 00:23:01,479 Speaker 1: was totally beyond reach, it would be best to just 372 00:23:01,560 --> 00:23:04,640 Speaker 1: admit failure, at least for now, and to move on. 373 00:23:05,119 --> 00:23:09,560 Speaker 1: So Google X would pursue some goals but abandon lots 374 00:23:09,560 --> 00:23:12,240 Speaker 1: of others. Like they might say, you know what, this 375 00:23:12,440 --> 00:23:14,840 Speaker 1: just isn't going to be a thing yet, let's put 376 00:23:14,880 --> 00:23:18,760 Speaker 1: it aside. Now, reasons for abandoning a project were varied 377 00:23:18,920 --> 00:23:21,719 Speaker 1: and are varied to this day. So it might just 378 00:23:21,800 --> 00:23:25,159 Speaker 1: be that the costs of the project are estimated to 379 00:23:25,280 --> 00:23:29,280 Speaker 1: be beyond whatever the benefits are. Now, by costs, I 380 00:23:29,320 --> 00:23:32,560 Speaker 1: don't necessarily just mean money, although it could be that; 381 00:23:33,080 --> 00:23:37,360 Speaker 1: those costs could be in stuff like effort.
So if 382 00:23:37,359 --> 00:23:40,800 Speaker 1: a project aims to create a new way to do something, 383 00:23:41,359 --> 00:23:44,240 Speaker 1: but it turns out that this new way actually requires 384 00:23:44,240 --> 00:23:48,840 Speaker 1: more work to complete than existing ways, well, that's a 385 00:23:48,880 --> 00:23:51,600 Speaker 1: good reason to pull the eject lever on that project. 386 00:23:51,920 --> 00:23:54,400 Speaker 1: You know, I'm often left thinking about people, and I'm 387 00:23:54,400 --> 00:23:56,879 Speaker 1: guilty of this too, people who would spend more time 388 00:23:56,880 --> 00:23:59,760 Speaker 1: and effort getting out of doing something than it would 389 00:23:59,800 --> 00:24:02,760 Speaker 1: take for them to actually do the thing. That's the 390 00:24:02,840 --> 00:24:06,280 Speaker 1: kind of stuff that the department wants to avoid. So 391 00:24:06,440 --> 00:24:10,199 Speaker 1: an example of such a project was an attempt to 392 00:24:10,320 --> 00:24:13,840 Speaker 1: build a safe jet pack. If you've been asking, where's 393 00:24:13,880 --> 00:24:19,359 Speaker 1: my jet pack? Well, Google's answer would be, we tried. So, 394 00:24:19,400 --> 00:24:22,480 Speaker 1: at a TechCrunch Disrupt conference back in twenty fourteen, 395 00:24:22,560 --> 00:24:25,879 Speaker 1: Astro Teller was a speaker there and revealed that the 396 00:24:25,920 --> 00:24:29,480 Speaker 1: Google X division had been at work on a jet pack, 397 00:24:29,600 --> 00:24:32,439 Speaker 1: because that was one of those projects that Astro himself 398 00:24:32,480 --> 00:24:34,240 Speaker 1: was saying, I just think it would be cool to 399 00:24:34,320 --> 00:24:39,160 Speaker 1: have one.
But ultimately the team concluded that it would 400 00:24:39,200 --> 00:24:42,560 Speaker 1: be such an inefficient device from a power perspective, that 401 00:24:43,080 --> 00:24:47,119 Speaker 1: it would be so fuel inefficient, that it just made 402 00:24:47,160 --> 00:24:49,720 Speaker 1: no sense to pursue. You would be able to get 403 00:24:49,760 --> 00:24:54,199 Speaker 1: around in other ways using way less fuel than if 404 00:24:54,200 --> 00:24:56,399 Speaker 1: you were to use the jet pack. It would just 405 00:24:56,440 --> 00:25:01,000 Speaker 1: be this noisy, inefficient means of transportation, and that's before 406 00:25:01,000 --> 00:25:03,920 Speaker 1: you even get to the safety concerns. So they said, yeah, 407 00:25:03,960 --> 00:25:06,640 Speaker 1: this is a non starter, because we can already get 408 00:25:06,680 --> 00:25:11,240 Speaker 1: around using way less fuel, using, you know, conventional means, 409 00:25:11,400 --> 00:25:15,280 Speaker 1: so there's no point in pursuing this. Another abandoned project 410 00:25:15,680 --> 00:25:18,879 Speaker 1: was a hoverboard, another thing Astro Teller really wanted, and 411 00:25:18,920 --> 00:25:20,800 Speaker 1: this would be kind of like what you would see 412 00:25:20,920 --> 00:25:24,000 Speaker 1: in the film Back to the Future Two, and in 413 00:25:24,040 --> 00:25:26,640 Speaker 1: a small bit in Back to the Future Three. So 414 00:25:26,920 --> 00:25:29,840 Speaker 1: way back when Back to the Future Two was in production, 415 00:25:30,119 --> 00:25:34,520 Speaker 1: the production team teased the general public. They said that 416 00:25:34,560 --> 00:25:38,880 Speaker 1: the hoverboard technology actually existed. They used it in the movie, 417 00:25:39,280 --> 00:25:42,719 Speaker 1: but the toy companies figured out that it was too 418 00:25:42,840 --> 00:25:45,280 Speaker 1: dangerous for the public, so it would never go on sale.
419 00:25:45,720 --> 00:25:49,399 Speaker 1: So it was this kind of joke that these hoverboards 420 00:25:49,400 --> 00:25:51,399 Speaker 1: really existed, but it was all a prank. Of course, 421 00:25:51,440 --> 00:25:54,639 Speaker 1: they didn't really exist. However, a team at Google tried 422 00:25:54,680 --> 00:25:57,919 Speaker 1: to make it work. They used stuff like graphite and 423 00:25:58,040 --> 00:26:02,119 Speaker 1: magnets to make a very small version of this work, 424 00:26:02,280 --> 00:26:03,960 Speaker 1: and they could get it to work on a very 425 00:26:03,960 --> 00:26:07,520 Speaker 1: small scale with models, but then they found it very 426 00:26:07,560 --> 00:26:11,920 Speaker 1: difficult to scale that up to a hoverboard that would 427 00:26:11,920 --> 00:26:15,119 Speaker 1: actually be human size and capable of holding a human up. 428 00:26:16,000 --> 00:26:19,639 Speaker 1: And the problems might not have been insurmountable. Maybe with 429 00:26:19,800 --> 00:26:23,000 Speaker 1: enough time and resources they could have gotten around them. 430 00:26:23,520 --> 00:26:25,840 Speaker 1: But they figured that even if they did suss out 431 00:26:25,960 --> 00:26:30,320 Speaker 1: all the problems and solve them, the benefits of the 432 00:26:30,320 --> 00:26:34,880 Speaker 1: technology are so niche that it's not worth all that 433 00:26:35,000 --> 00:26:38,520 Speaker 1: time and effort and other resources that would be necessary 434 00:26:38,560 --> 00:26:43,160 Speaker 1: to make it become a reality. So the hoverboard went 435 00:26:43,200 --> 00:26:47,439 Speaker 1: into the project graveyard at Google X. Now, in other cases, 436 00:26:47,480 --> 00:26:49,920 Speaker 1: it might be that the actual challenge is just so 437 00:26:49,960 --> 00:26:54,679 Speaker 1: ambitious as to be deemed impossible.
One such idea, at 438 00:26:54,720 --> 00:26:57,400 Speaker 1: least according to an article in Forbes by Eric Mack 439 00:26:57,920 --> 00:27:02,200 Speaker 1: and repeated in the Fast Company piece, was a supposed 440 00:27:02,359 --> 00:27:07,120 Speaker 1: teleportation method, as in, yeah, you get zapped from one 441 00:27:07,160 --> 00:27:13,399 Speaker 1: physical location to appear in another. However, the Google team 442 00:27:13,440 --> 00:27:18,679 Speaker 1: concluded that this particular concept violates a few basic laws 443 00:27:18,680 --> 00:27:22,000 Speaker 1: of physics as we understand them, which even Scotty on 444 00:27:22,080 --> 00:27:24,760 Speaker 1: Star Trek would say is a real deal breaker. You 445 00:27:24,880 --> 00:27:29,240 Speaker 1: cannot break the laws of physics. Now, there are likely 446 00:27:29,880 --> 00:27:34,000 Speaker 1: dozens or hundreds or maybe thousands of projects that X 447 00:27:34,040 --> 00:27:36,800 Speaker 1: has tackled and abandoned since it was founded back in 448 00:27:36,840 --> 00:27:40,200 Speaker 1: twenty ten that we will never really hear about. It is 449 00:27:40,320 --> 00:27:42,679 Speaker 1: rare to get a glimpse at what was going on 450 00:27:42,840 --> 00:27:47,040 Speaker 1: within this division, first within Google and then a subsidiary company 451 00:27:47,160 --> 00:27:50,760 Speaker 1: of Alphabet. However, we can talk about some of the 452 00:27:50,800 --> 00:27:54,439 Speaker 1: work that happened within the department that did see at 453 00:27:54,520 --> 00:27:59,600 Speaker 1: least some level of mainstream visibility. And one thing that 454 00:27:59,640 --> 00:28:02,000 Speaker 1: we can talk about right off the bat is the 455 00:28:02,119 --> 00:28:06,280 Speaker 1: driverless car technology that kind of started the whole thing.
Now, 456 00:28:06,320 --> 00:28:10,680 Speaker 1: eventually people began to spot vehicles that were operating around 457 00:28:10,720 --> 00:28:14,479 Speaker 1: the Google campus in California, and some of these vehicles 458 00:28:14,480 --> 00:28:17,280 Speaker 1: had these weird frames attached to the rooftops that had 459 00:28:17,400 --> 00:28:21,120 Speaker 1: all this electronic equipment on them, and some folks even 460 00:28:21,160 --> 00:28:23,959 Speaker 1: noticed that the people in the cars weren't necessarily touching 461 00:28:24,000 --> 00:28:27,720 Speaker 1: the steering wheel while the car was in motion. Eventually, 462 00:28:28,119 --> 00:28:32,240 Speaker 1: that project would evolve into the spinoff company Waymo. So 463 00:28:32,320 --> 00:28:35,800 Speaker 1: Google did at one point come forward and say, yes, 464 00:28:35,800 --> 00:28:38,600 Speaker 1: we're working on driverless cars. That was the point where 465 00:28:38,640 --> 00:28:42,560 Speaker 1: I got really snitty at the Googler I know who 466 00:28:42,600 --> 00:28:46,560 Speaker 1: didn't tell me about it, because they weren't supposed to, 467 00:28:46,680 --> 00:28:48,520 Speaker 1: so they were doing the right thing. I was just 468 00:28:48,600 --> 00:28:52,920 Speaker 1: being kind of cranky about it. And once we began 469 00:28:52,960 --> 00:28:55,480 Speaker 1: to learn about that, it became known as Project Waymo, 470 00:28:55,640 --> 00:28:59,760 Speaker 1: and then eventually was spun off as a startup called Waymo, 471 00:29:00,000 --> 00:29:04,120 Speaker 1: which is, again, still a subsidiary of Alphabet. So Alphabet 472 00:29:04,120 --> 00:29:08,280 Speaker 1: being the parent company: Google exists beneath Alphabet, so does Waymo, 473 00:29:08,680 --> 00:29:11,120 Speaker 1: you know, so does YouTube, that kind of thing.
So 474 00:29:11,200 --> 00:29:15,120 Speaker 1: Waymo continues to work on bringing driverless car technology to market, 475 00:29:15,360 --> 00:29:19,480 Speaker 1: and it operates some limited implementations. There is a 476 00:29:19,560 --> 00:29:24,560 Speaker 1: limited self-driving taxi service in the Phoenix, Arizona area 477 00:29:24,720 --> 00:29:27,960 Speaker 1: that, you know, is very tightly controlled, right? Like, 478 00:29:28,040 --> 00:29:34,360 Speaker 1: you can't go beyond a heavily defined region within which 479 00:29:34,440 --> 00:29:37,800 Speaker 1: this service can operate. So if you needed a ride 480 00:29:38,040 --> 00:29:41,200 Speaker 1: that was outside that region, you can't do it in 481 00:29:41,280 --> 00:29:46,080 Speaker 1: a Waymo, because it needs that controlled environment in order to 482 00:29:46,240 --> 00:29:50,880 Speaker 1: operate safely. But it still is a platform upon which 483 00:29:50,920 --> 00:29:55,720 Speaker 1: the company is learning new things that are informing 484 00:29:56,000 --> 00:29:59,320 Speaker 1: the future of autonomous driving. Much of the company's work 485 00:29:59,360 --> 00:30:04,080 Speaker 1: also is focusing on automating the trucking industry, which would 486 00:30:04,120 --> 00:30:07,560 Speaker 1: be huge. So this is one of the projects that 487 00:30:07,680 --> 00:30:11,960 Speaker 1: grew out of Google X that's actually stuck around. Now, 488 00:30:12,520 --> 00:30:15,040 Speaker 1: when we come back, I'm going to talk about other projects, 489 00:30:15,080 --> 00:30:17,560 Speaker 1: some of which also stuck around, and some of which 490 00:30:18,600 --> 00:30:22,760 Speaker 1: did not. But they did get going in Google X, 491 00:30:22,880 --> 00:30:25,600 Speaker 1: and they had a good amount of steam behind them 492 00:30:25,600 --> 00:30:28,720 Speaker 1: at one point, but some of them just didn't pan out.
493 00:30:29,480 --> 00:30:31,920 Speaker 1: We're going to talk about more of those when we 494 00:30:31,960 --> 00:30:42,720 Speaker 1: come back after this break. So before the break, I 495 00:30:43,400 --> 00:30:46,600 Speaker 1: hinted that there are some projects that started in Google 496 00:30:46,800 --> 00:30:51,240 Speaker 1: X that got really far but ultimately didn't pan out. 497 00:30:51,440 --> 00:30:55,600 Speaker 1: One of those had to do with high altitude balloons. 498 00:30:56,360 --> 00:31:00,120 Speaker 1: So the problem that was identified was that there are 499 00:31:00,000 --> 00:31:04,400 Speaker 1: areas in the world that have limited to no connectivity 500 00:31:04,440 --> 00:31:07,560 Speaker 1: to the Internet, and the Internet is such an important 501 00:31:07,680 --> 00:31:11,640 Speaker 1: tool that this was seen as a massive detriment to 502 00:31:11,640 --> 00:31:16,320 Speaker 1: those communities and that there should be solutions to provide 503 00:31:16,800 --> 00:31:20,320 Speaker 1: the potential at least to connect to the Internet. And 504 00:31:20,600 --> 00:31:24,960 Speaker 1: the concept here was to use high altitude balloons flying 505 00:31:25,160 --> 00:31:29,800 Speaker 1: well above weather systems, and that these balloons would carry 506 00:31:29,920 --> 00:31:34,280 Speaker 1: essentially Internet transmitters, kind of similar to what satellite Internet does, 507 00:31:34,320 --> 00:31:37,800 Speaker 1: except it's not all the way out into orbit. It's 508 00:31:37,840 --> 00:31:41,240 Speaker 1: being carried by these balloons, and you could use these 509 00:31:41,360 --> 00:31:45,280 Speaker 1: networks of balloons to provide connectivity to remote and underserved 510 00:31:45,520 --> 00:31:48,880 Speaker 1: locations like parts of Africa. 
This would become known as 511 00:31:48,920 --> 00:31:51,920 Speaker 1: Project Loon. I actually wrote an article about this for 512 00:31:51,960 --> 00:31:53,880 Speaker 1: How Stuff Works way back in the day, and this 513 00:31:53,920 --> 00:31:58,080 Speaker 1: project started around two thousand eleven. Engineers worked on it 514 00:31:58,200 --> 00:32:01,840 Speaker 1: in secret for about two years, and then Google announced 515 00:32:01,840 --> 00:32:04,000 Speaker 1: the program in two thousand thirteen. That's when it 516 00:32:04,080 --> 00:32:07,480 Speaker 1: got the name Project Loon, or at least publicly it 517 00:32:07,560 --> 00:32:10,400 Speaker 1: was known as Project Loon. They might have been calling 518 00:32:10,400 --> 00:32:12,760 Speaker 1: it that within the department for a while, I don't know. 519 00:32:13,520 --> 00:32:16,800 Speaker 1: But like Waymo, Google would eventually create a spin off 520 00:32:16,840 --> 00:32:21,080 Speaker 1: company called Loon to bring this technology to market. That 521 00:32:21,160 --> 00:32:23,680 Speaker 1: happened in two thousand eighteen, when Loon would become its 522 00:32:23,720 --> 00:32:27,800 Speaker 1: own company as a subsidiary under Alphabet. The new company 523 00:32:27,920 --> 00:32:31,760 Speaker 1: landed a deal to provide Internet connectivity over parts of Kenya, 524 00:32:31,880 --> 00:32:34,120 Speaker 1: and while it seemed like Loon was on its way 525 00:32:34,120 --> 00:32:38,160 Speaker 1: to success, in reality it became clear that the operational 526 00:32:38,280 --> 00:32:41,880 Speaker 1: costs and the risks were too great to make Loon 527 00:32:42,120 --> 00:32:46,600 Speaker 1: a viable business, so Google popped the Loon project in 528 00:32:46,720 --> 00:32:50,360 Speaker 1: twenty twenty-one.
But like I said earlier in this episode, 529 00:32:50,880 --> 00:32:54,360 Speaker 1: sometimes the pursuit of a goal can generate benefits even 530 00:32:54,400 --> 00:32:57,960 Speaker 1: if you never achieve the final goal itself. So in 531 00:32:58,080 --> 00:33:01,760 Speaker 1: the case of Loon, scientists were able to leverage 532 00:33:01,880 --> 00:33:05,440 Speaker 1: high resolution data that was gathered by the project's high 533 00:33:05,440 --> 00:33:10,520 Speaker 1: altitude balloons and study stuff like climate modeling or how 534 00:33:10,600 --> 00:33:16,360 Speaker 1: gravity waves move through the stratosphere. So again, without this project, 535 00:33:16,440 --> 00:33:19,920 Speaker 1: that data wouldn't have been available to the scientific community. 536 00:33:19,960 --> 00:33:24,160 Speaker 1: So there were benefits that emerged from this project, even 537 00:33:24,160 --> 00:33:27,360 Speaker 1: though the project itself you might describe as a failure, 538 00:33:27,400 --> 00:33:29,880 Speaker 1: because it was unable to achieve the goal that was, 539 00:33:30,520 --> 00:33:33,800 Speaker 1: you know, established from the get go. Now, another famous 540 00:33:33,920 --> 00:33:38,160 Speaker 1: project to emerge from Google X was Google Glass. That's 541 00:33:38,160 --> 00:33:43,360 Speaker 1: the augmented reality headset Google unveiled way back in twenty twelve. Now, 542 00:33:43,400 --> 00:33:46,560 Speaker 1: for those of y'all not familiar with Google Glass, the 543 00:33:46,560 --> 00:33:51,480 Speaker 1: original version looked kind of like frames for eyeglasses.
No 544 00:33:51,680 --> 00:33:55,920 Speaker 1: lenses in the original, and they had stems that 545 00:33:55,960 --> 00:33:58,320 Speaker 1: would go over the ears. So, you know, you had 546 00:33:58,320 --> 00:34:00,960 Speaker 1: a little nose rest that would sit on the bridge 547 00:34:01,000 --> 00:34:03,280 Speaker 1: of your nose, and the stems would go 548 00:34:03,840 --> 00:34:07,000 Speaker 1: over the backs of your ears. And instead of lenses, 549 00:34:07,080 --> 00:34:09,920 Speaker 1: you had this clear prism that was located a little 550 00:34:09,960 --> 00:34:12,840 Speaker 1: bit above and to the right of your right eye, 551 00:34:13,239 --> 00:34:16,160 Speaker 1: and a tiny projector from the stem on that side 552 00:34:16,160 --> 00:34:20,040 Speaker 1: could beam images into that prism. So when you 553 00:34:20,120 --> 00:34:23,520 Speaker 1: looked up and to the right a little bit through 554 00:34:23,560 --> 00:34:26,120 Speaker 1: this prism, you could see the images that were projected 555 00:34:26,160 --> 00:34:30,560 Speaker 1: on there. They kind of overlaid whatever was behind them 556 00:34:31,000 --> 00:34:33,439 Speaker 1: in your field of view, so the prism 557 00:34:33,520 --> 00:34:36,640 Speaker 1: wouldn't block your forward view. So in theory it would 558 00:34:36,680 --> 00:34:38,640 Speaker 1: be safe to wear even if you were, you know, 559 00:34:39,040 --> 00:34:41,279 Speaker 1: driving a car or something, because the idea was that 560 00:34:41,320 --> 00:34:44,920 Speaker 1: it wasn't supposed to interfere with your vision. And you 561 00:34:44,960 --> 00:34:49,440 Speaker 1: would pair the Google Glass with an Android smartphone, and 562 00:34:49,480 --> 00:34:52,560 Speaker 1: the smartphone would provide the connectivity for the Glass itself. 563 00:34:52,560 --> 00:34:55,560 Speaker 1: It acted kind of like a modem for the Google Glass.
564 00:34:55,960 --> 00:34:59,359 Speaker 1: The headset included a small bone conduction speaker in it so 565 00:34:59,400 --> 00:35:01,960 Speaker 1: that you could hear your audio through it. The headset had 566 00:35:02,000 --> 00:35:05,960 Speaker 1: a microphone so it could pick up voice commands. 567 00:35:06,000 --> 00:35:08,800 Speaker 1: It also could pick up gesture commands, both from physical 568 00:35:08,840 --> 00:35:11,600 Speaker 1: touch as well as moving your head in certain ways. 569 00:35:11,680 --> 00:35:13,920 Speaker 1: You could use that to have it do 570 00:35:14,160 --> 00:35:18,319 Speaker 1: specific actions. And it had a camera incorporated as well, 571 00:35:18,360 --> 00:35:21,120 Speaker 1: so you could actually use your Google Glass to take photos. 572 00:35:21,560 --> 00:35:24,680 Speaker 1: And I actually had a Google Glass headset once upon 573 00:35:24,719 --> 00:35:28,080 Speaker 1: a time, and I wore it to Dragon Con that year. 574 00:35:28,760 --> 00:35:31,719 Speaker 1: That's a big science fiction, fantasy, horror, and comic book 575 00:35:31,719 --> 00:35:35,680 Speaker 1: convention that takes place here in Atlanta, and it's known 576 00:35:35,719 --> 00:35:39,200 Speaker 1: for being incredibly popular with cosplayers. In fact, sometimes 577 00:35:39,239 --> 00:35:41,560 Speaker 1: it feels like cosplayers outnumber people in 578 00:35:41,680 --> 00:35:44,960 Speaker 1: mundane clothing by a factor of two to one or 579 00:35:45,000 --> 00:35:50,480 Speaker 1: maybe more. So I wore this brand new Google Glass headset, 580 00:35:51,400 --> 00:35:53,319 Speaker 1: which lots of folks had not even heard of at 581 00:35:53,320 --> 00:35:56,360 Speaker 1: that point, and I went to Dragon Con. I used 582 00:35:56,400 --> 00:35:59,560 Speaker 1: the headset to take photos of people after first asking 583 00:36:00,080 --> 00:36:03,320 Speaker 1: for permission to do so.
Always important for anyone going to 584 00:36:03,400 --> 00:36:06,240 Speaker 1: a convention: you see some cosplay you love, ask permission 585 00:36:06,280 --> 00:36:09,120 Speaker 1: before you start taking photos. It's just polite. So that 586 00:36:09,200 --> 00:36:11,160 Speaker 1: was a lot of fun for me. And partly the 587 00:36:11,200 --> 00:36:13,279 Speaker 1: reason it was so much fun for me is a 588 00:36:13,280 --> 00:36:15,080 Speaker 1: lot of the pictures I got, at least the initial 589 00:36:15,120 --> 00:36:18,319 Speaker 1: pictures I got, were people looking at me very confused, 590 00:36:18,719 --> 00:36:23,720 Speaker 1: because I chose to use the voice command to take 591 00:36:23,760 --> 00:36:26,960 Speaker 1: a photo using Google Glass. So I would use the 592 00:36:27,000 --> 00:36:30,080 Speaker 1: activation phrase, which I'm not gonna say simply because it's 593 00:36:30,120 --> 00:36:34,520 Speaker 1: the same phrase Google uses today for its smart speakers, 594 00:36:34,560 --> 00:36:37,319 Speaker 1: and I have one behind me, and if I say it, 595 00:36:37,640 --> 00:36:39,960 Speaker 1: I'll wake it up. But then you would say take 596 00:36:40,000 --> 00:36:42,560 Speaker 1: a picture. So I would say take a picture, and 597 00:36:42,640 --> 00:36:44,799 Speaker 1: the person I was looking at would say huh or 598 00:36:44,840 --> 00:36:47,520 Speaker 1: you know what or something like that, and that would 599 00:36:47,520 --> 00:36:49,320 Speaker 1: be the photo I would end up with: someone 600 00:36:49,360 --> 00:36:51,480 Speaker 1: looking confused and saying, why are you talking to me 601 00:36:51,560 --> 00:36:53,920 Speaker 1: like this? And then I would explain, oh, I just 602 00:36:54,000 --> 00:36:58,880 Speaker 1: took a photo using this headset, and they went bonkers.
603 00:36:59,040 --> 00:37:01,800 Speaker 1: It was like I was the future, like 604 00:37:01,960 --> 00:37:04,799 Speaker 1: I was science fiction at this science fiction convention. I 605 00:37:04,800 --> 00:37:07,279 Speaker 1: would show them the picture that would come up on 606 00:37:07,360 --> 00:37:10,799 Speaker 1: my phone that was taken by the headset, and they 607 00:37:11,040 --> 00:37:14,840 Speaker 1: loved it. Well, that was a fun, positive experience for 608 00:37:14,880 --> 00:37:18,439 Speaker 1: me with Google Glass. However, Google Glass in general got 609 00:37:18,480 --> 00:37:22,759 Speaker 1: some awful reviews. For one thing, it was really expensive, 610 00:37:23,280 --> 00:37:25,560 Speaker 1: and this was a pilot program; like, these were not 611 00:37:26,520 --> 00:37:30,640 Speaker 1: rolled out for general consumption. That also meant they were 612 00:37:30,680 --> 00:37:33,960 Speaker 1: available in very limited numbers, so it was also kind 613 00:37:34,000 --> 00:37:36,920 Speaker 1: of elitist. Although I will point out there was at 614 00:37:37,000 --> 00:37:39,120 Speaker 1: least one other person at Dragon Con that same year 615 00:37:39,160 --> 00:37:41,520 Speaker 1: wearing Google Glass, because we took a picture of each 616 00:37:41,520 --> 00:37:44,080 Speaker 1: other wearing them. There was also a concern for 617 00:37:44,120 --> 00:37:47,360 Speaker 1: privacy, because you're wearing a camera on your face, so 618 00:37:47,440 --> 00:37:50,440 Speaker 1: there was this fear that someone could be taking photos 619 00:37:50,600 --> 00:37:54,759 Speaker 1: or videos surreptitiously, and admittedly that is creepy. Now, 620 00:37:54,960 --> 00:37:58,719 Speaker 1: you could counter that by saying everyone is 621 00:37:58,840 --> 00:38:01,400 Speaker 1: carrying a camera with them all the time, it's in 622 00:38:01,440 --> 00:38:05,240 Speaker 1: their pocket, it's a smartphone; this is already happening.
623 00:38:05,280 --> 00:38:10,200 Speaker 1: People are taking photos and videos surreptitiously all the time. However, 624 00:38:10,840 --> 00:38:14,040 Speaker 1: there's just, like, a psychological difference when it's something 625 00:38:14,080 --> 00:38:16,239 Speaker 1: that you're wearing on your face as opposed to something 626 00:38:16,280 --> 00:38:19,520 Speaker 1: you're holding in your hand. And, you know, I 627 00:38:19,560 --> 00:38:22,720 Speaker 1: gotta admit, the first time I wore my Google 628 00:38:22,719 --> 00:38:25,640 Speaker 1: Glass without thinking and walked right into a public restroom, 629 00:38:25,880 --> 00:38:29,560 Speaker 1: I realized, whoa, this is a problem. This is not 630 00:38:29,760 --> 00:38:33,400 Speaker 1: okay; there's got to be protocol here, because this 631 00:38:33,480 --> 00:38:37,240 Speaker 1: is inappropriate. I did a Grandpa Simpson: an 632 00:38:37,320 --> 00:38:40,000 Speaker 1: immediate one-eighty turn, and walked right back out of 633 00:38:40,040 --> 00:38:43,520 Speaker 1: the restroom, and then handed my Google Glass to 634 00:38:43,640 --> 00:38:46,400 Speaker 1: my wife, who looked after them while I went right 635 00:38:46,440 --> 00:38:50,799 Speaker 1: back in. Anyway, Google Glass ultimately failed to emerge as 636 00:38:50,840 --> 00:38:54,879 Speaker 1: a consumer product. Google made the determination that it just 637 00:38:55,040 --> 00:38:57,520 Speaker 1: wasn't going to work. It was going to be too expensive, 638 00:38:57,640 --> 00:39:00,280 Speaker 1: it was going to be too niche. There already 639 00:39:00,360 --> 00:39:05,600 Speaker 1: was this big backlash against it. So instead they focused 640 00:39:05,840 --> 00:39:10,560 Speaker 1: Google Glass on the enterprise market, enterprise customers and businesses.
641 00:39:11,040 --> 00:39:14,440 Speaker 1: So businesses buy them and give them to their employees, 642 00:39:14,480 --> 00:39:17,000 Speaker 1: who then use Google Glass to do stuff like keep 643 00:39:17,000 --> 00:39:20,560 Speaker 1: track of checklists or instructions and that kind of thing 644 00:39:20,560 --> 00:39:23,880 Speaker 1: while the employees are working on a job, which is nifty. 645 00:39:24,200 --> 00:39:26,799 Speaker 1: It's just not quite as amazing as what Google was 646 00:39:26,880 --> 00:39:30,480 Speaker 1: kind of hoping for, or seeming to hope for, back 647 00:39:30,520 --> 00:39:34,720 Speaker 1: when they unveiled the technology nearly a decade ago. Glass 648 00:39:35,040 --> 00:39:39,040 Speaker 1: does remain active. It is still under X, so that 649 00:39:39,360 --> 00:39:42,839 Speaker 1: is still an ongoing project, and it did not 650 00:39:43,000 --> 00:39:46,200 Speaker 1: spin off. It did not quote unquote graduate. That's what 651 00:39:46,280 --> 00:39:50,080 Speaker 1: Google says whenever one of their projects spins off 652 00:39:50,120 --> 00:39:53,520 Speaker 1: into a subsidiary company. Now, another initiative that began under 653 00:39:53,560 --> 00:39:56,839 Speaker 1: Google X was a service in which flying vehicles would 654 00:39:56,880 --> 00:40:00,480 Speaker 1: deliver small packages. So we're talking about drone delivery, 655 00:40:00,880 --> 00:40:04,239 Speaker 1: meaning a service that uses drones to deliver stuff, you know, 656 00:40:04,719 --> 00:40:08,120 Speaker 1: not a service that delivers drones, though I guess a 657 00:40:08,200 --> 00:40:11,520 Speaker 1: larger drone could potentially deliver a smaller one. You know 658 00:40:11,520 --> 00:40:14,680 Speaker 1: what, I'm getting lost in the weeds here. Engineers began 659 00:40:14,800 --> 00:40:17,319 Speaker 1: serious work on this project in two thousand twelve.
Two 660 00:40:17,400 --> 00:40:20,520 Speaker 1: years later, Google acknowledged what it had been working on, 661 00:40:20,560 --> 00:40:23,400 Speaker 1: because, keep in mind, early on this was in the 662 00:40:23,400 --> 00:40:27,400 Speaker 1: top-secret phase, and Google began to test this technology 663 00:40:27,440 --> 00:40:30,920 Speaker 1: in Australia. Google would spin this project off as a 664 00:40:30,960 --> 00:40:36,120 Speaker 1: startup called Wing. So, like Waymo and formerly Loon, Wing 665 00:40:36,160 --> 00:40:40,440 Speaker 1: has become another company under Alphabet. Wing delivers packages kind 666 00:40:40,440 --> 00:40:43,120 Speaker 1: of the same way that NASA used a sky crane to 667 00:40:43,160 --> 00:40:46,880 Speaker 1: lower the recent Mars rovers onto the Red Planet. Namely, 668 00:40:47,280 --> 00:40:50,040 Speaker 1: the delivery system, the drone in this case, hovers above 669 00:40:50,040 --> 00:40:53,800 Speaker 1: the ground. It lowers the package or payload using cables 670 00:40:53,800 --> 00:40:56,880 Speaker 1: and a winch, and then once it detects that the 671 00:40:56,960 --> 00:41:00,319 Speaker 1: payload has touched down, it detaches the cable from the 672 00:41:00,360 --> 00:41:04,200 Speaker 1: package and then the drone can fly back home, presumably 673 00:41:04,280 --> 00:41:08,920 Speaker 1: to get a treat, probably a microchip. In two thousand thirteen, 674 00:41:08,960 --> 00:41:12,400 Speaker 1: Google acquired a company called Makani, m a k a 675 00:41:12,719 --> 00:41:17,360 Speaker 1: n i. That company was working on a wind energy solution, 676 00:41:17,400 --> 00:41:21,560 Speaker 1: a really interesting idea, and it used kites with turbines 677 00:41:21,800 --> 00:41:25,680 Speaker 1: in the kites rather than a traditional wind turbine. The 678 00:41:25,719 --> 00:41:29,880 Speaker 1: big benefit here being that, one, the kites are deployable 679 00:41:29,880 --> 00:41:32,680 Speaker 1: at different places.
They are still tethered to ground units 680 00:41:32,680 --> 00:41:35,120 Speaker 1: so that you can actually harness the electricity that's being 681 00:41:35,160 --> 00:41:39,280 Speaker 1: generated and store it in batteries or transmit it to wherever 682 00:41:39,280 --> 00:41:42,640 Speaker 1: you need to use it. But they use way 683 00:41:42,760 --> 00:41:49,080 Speaker 1: less material than your traditional wind farm would. So Google 684 00:41:49,120 --> 00:41:52,720 Speaker 1: acquired them in two thousand thirteen. The Google X division 685 00:41:52,760 --> 00:41:56,360 Speaker 1: got to work on the technology, and by two thousand nineteen, 686 00:41:56,400 --> 00:42:00,239 Speaker 1: Alphabet was ready to graduate Makani as its own company, 687 00:42:00,400 --> 00:42:05,200 Speaker 1: again as a subsidiary to Alphabet. But Makani met 688 00:42:05,239 --> 00:42:08,640 Speaker 1: a similar fate to Loon. Engineers were unable to 689 00:42:08,640 --> 00:42:10,799 Speaker 1: find a way to make Makani work so that it 690 00:42:10,840 --> 00:42:15,040 Speaker 1: was both practical and reliable, and essentially the company said 691 00:42:15,080 --> 00:42:17,919 Speaker 1: that the road to commercialization was too risky and thus 692 00:42:18,000 --> 00:42:21,360 Speaker 1: didn't meet the threshold for support. So Alphabet ultimately shut 693 00:42:21,400 --> 00:42:25,600 Speaker 1: down Makani in twenty twenty, just a year after it had graduated. 694 00:42:26,200 --> 00:42:29,520 Speaker 1: There are other projects that emerged from Google X, or 695 00:42:30,320 --> 00:42:33,160 Speaker 1: just X, because, you know, like I said, Google X 696 00:42:33,200 --> 00:42:37,080 Speaker 1: itself would get spun off into its own company, so it's 697 00:42:37,200 --> 00:42:41,480 Speaker 1: also a subsidiary to Alphabet. So there's another project that's 698 00:42:41,480 --> 00:42:44,440 Speaker 1: called Malta.
This is a company that is working to 699 00:42:44,520 --> 00:42:48,480 Speaker 1: store energy in the form of heat using tanks of 700 00:42:48,520 --> 00:42:54,560 Speaker 1: molten salt. So imagine that you've generated electrical energy in 701 00:42:54,600 --> 00:42:57,920 Speaker 1: some way. You've created this electrical energy. You 702 00:42:57,960 --> 00:43:01,080 Speaker 1: need to store that electrical energy till you actually need 703 00:43:01,120 --> 00:43:04,040 Speaker 1: to use it, right, because electricity is something where you 704 00:43:04,080 --> 00:43:06,600 Speaker 1: either use it, or you store it, or you lose it. 705 00:43:07,239 --> 00:43:12,080 Speaker 1: So in this case, Malta converts the electricity into thermal energy, 706 00:43:12,120 --> 00:43:16,440 Speaker 1: which is stored in these vats of molten salts. Then, 707 00:43:16,560 --> 00:43:19,640 Speaker 1: when you need electricity, you essentially reverse this process: you 708 00:43:19,680 --> 00:43:23,919 Speaker 1: convert the thermal energy into electricity. So this is kind 709 00:43:23,960 --> 00:43:27,840 Speaker 1: of like batteries, right, it's energy storage. Now, obviously batteries 710 00:43:27,880 --> 00:43:31,399 Speaker 1: are electrochemical and this is good old heat we're 711 00:43:31,400 --> 00:43:36,000 Speaker 1: talking about with molten salts, but it's still similar in concept. 712 00:43:36,000 --> 00:43:37,879 Speaker 1: The idea is that this is a way of 713 00:43:37,960 --> 00:43:44,239 Speaker 1: storing the electricity you've generated. Another graduated project is Intrinsic. 714 00:43:44,520 --> 00:43:48,799 Speaker 1: This company aims to revolutionize industrial robots and make them 715 00:43:48,840 --> 00:43:52,280 Speaker 1: capable of handling different tasks so that as a company's 716 00:43:52,320 --> 00:43:56,080 Speaker 1: business changes, the robots can continue to be useful.
Because 717 00:43:56,440 --> 00:43:59,400 Speaker 1: typically your industrial robot is something that's designed to 718 00:43:59,400 --> 00:44:05,200 Speaker 1: do a very specific series of actions and that's it. Like, 719 00:44:05,400 --> 00:44:08,160 Speaker 1: it's great at doing those, but you can't tell it 720 00:44:08,200 --> 00:44:11,680 Speaker 1: to do anything else, typically. So Intrinsic's part in this 721 00:44:11,760 --> 00:44:14,360 Speaker 1: is to develop the software that would allow 722 00:44:14,440 --> 00:44:17,480 Speaker 1: robots to learn and adapt to changing 723 00:44:17,520 --> 00:44:21,960 Speaker 1: situations rather than be tied down to one specific process. Now, 724 00:44:22,000 --> 00:44:24,920 Speaker 1: obviously this also requires the robot to be capable of 725 00:44:24,960 --> 00:44:29,000 Speaker 1: doing whatever the extra activities are, but it is meant 726 00:44:29,040 --> 00:44:34,280 Speaker 1: to increase a robot's usefulness. And this sounds deceptively simple, 727 00:44:34,320 --> 00:44:37,720 Speaker 1: but it turns out it is super complicated. It touches 728 00:44:37,760 --> 00:44:42,160 Speaker 1: on tons of advanced computer science problems like perception, which 729 00:44:42,200 --> 00:44:47,239 Speaker 1: is still something that's really tricky, motion planning, simulation, and 730 00:44:47,320 --> 00:44:50,799 Speaker 1: tons more. It also involves working on force control, so 731 00:44:50,840 --> 00:44:53,560 Speaker 1: that robots use the appropriate amount of force for whatever 732 00:44:53,600 --> 00:44:56,600 Speaker 1: the task at hand happens to be, or the task at 733 00:44:56,719 --> 00:45:00,640 Speaker 1: clamp, as it were. Intrinsic's work is contributing to a 734 00:45:00,680 --> 00:45:02,960 Speaker 1: growing wealth of knowledge and expertise when it comes to 735 00:45:03,040 --> 00:45:07,480 Speaker 1: robotics and AI.
While Intrinsic is focused on industrial robotics, 736 00:45:07,800 --> 00:45:10,160 Speaker 1: those same advances are going to play an important role 737 00:45:10,320 --> 00:45:14,800 Speaker 1: for robotics in general, which I anticipate will include robots 738 00:45:14,840 --> 00:45:19,359 Speaker 1: that share spaces with human beings. Now, there are other 739 00:45:19,400 --> 00:45:22,560 Speaker 1: ones that we could talk about. There are contact lenses that 740 00:45:22,640 --> 00:45:27,040 Speaker 1: can detect glucose levels in tears, which could be really 741 00:45:27,120 --> 00:45:31,279 Speaker 1: useful for people who are dealing with diabetes. You know, 742 00:45:31,560 --> 00:45:35,600 Speaker 1: these kinds of concepts are really, really mind-blowing. 743 00:45:36,000 --> 00:45:39,839 Speaker 1: But I think the truly remarkable thing about X is that, 744 00:45:40,040 --> 00:45:44,440 Speaker 1: unlike your traditional company, X isn't focused on producing the 745 00:45:44,440 --> 00:45:47,680 Speaker 1: biggest return on investment within the shortest amount of time. 746 00:45:48,360 --> 00:45:50,400 Speaker 1: This is not the type of business that is looking 747 00:45:50,440 --> 00:45:53,520 Speaker 1: forward to the next quarter. It's the type that looks 748 00:45:53,560 --> 00:45:56,800 Speaker 1: forward to the next ten years. And while it's true 749 00:45:56,840 --> 00:46:00,319 Speaker 1: that more ideas get shot down than move forward, it's 750 00:46:00,360 --> 00:46:02,359 Speaker 1: also true that some of the ones that do move 751 00:46:02,440 --> 00:46:04,920 Speaker 1: forward ultimately have to be put on the shelf, like 752 00:46:05,120 --> 00:46:08,560 Speaker 1: Loon and Makani. But what the engineers learn along the way 753 00:46:08,600 --> 00:46:12,040 Speaker 1: can often find its way into other products.
The benefits 754 00:46:12,040 --> 00:46:14,680 Speaker 1: of the research manifest in ways that the team couldn't 755 00:46:14,840 --> 00:46:18,080 Speaker 1: possibly have predicted when they first started. And while we 756 00:46:18,160 --> 00:46:21,000 Speaker 1: might not ever get that jetpack or that hoverboard, 757 00:46:21,400 --> 00:46:25,200 Speaker 1: we might discover that some elements that were uncovered during 758 00:46:25,239 --> 00:46:29,840 Speaker 1: those projects get incorporated into stuff we use today. Now, 759 00:46:30,560 --> 00:46:32,320 Speaker 1: I want to be clear, I still have some major 760 00:46:32,440 --> 00:46:35,800 Speaker 1: issues with Alphabet the company. I often say Google, 761 00:46:35,880 --> 00:46:38,800 Speaker 1: but really at this point I just mean Alphabet, because 762 00:46:39,320 --> 00:46:42,040 Speaker 1: the company as a whole is a data 763 00:46:42,120 --> 00:46:44,920 Speaker 1: black hole. I mean, it sucks up information at a 764 00:46:44,960 --> 00:46:47,120 Speaker 1: scale that is impossible for me to wrap my mind 765 00:46:47,120 --> 00:46:51,800 Speaker 1: around. Google has benefited from our personal information. I 766 00:46:51,840 --> 00:46:55,800 Speaker 1: mean, because of us and our data, Google has become 767 00:46:55,920 --> 00:46:59,960 Speaker 1: ridiculously profitable. And we shouldn't give Alphabet a free 768 00:47:00,080 --> 00:47:03,319 Speaker 1: pass just because one of its subsidiaries is working on 769 00:47:03,360 --> 00:47:06,800 Speaker 1: some truly difficult problems and coming up with novel solutions 770 00:47:06,840 --> 00:47:10,600 Speaker 1: that could potentially be of enormous benefit.
That is good, 771 00:47:11,480 --> 00:47:16,560 Speaker 1: and all props to the X company and Alphabet for 772 00:47:16,640 --> 00:47:21,040 Speaker 1: even doing this. But we can't just, you know, paint 773 00:47:21,040 --> 00:47:24,239 Speaker 1: everything with a happy brush. We have to 774 00:47:24,320 --> 00:47:27,319 Speaker 1: keep everything else in mind too. Still, it was fun 775 00:47:27,360 --> 00:47:30,439 Speaker 1: to look into this, and a lot of these could 776 00:47:30,480 --> 00:47:32,640 Speaker 1: be their own episodes, right. There could be a Waymo 777 00:47:32,640 --> 00:47:35,600 Speaker 1: episode. There could be a Loon episode. In fact, 778 00:47:35,640 --> 00:47:37,959 Speaker 1: I think I have done a Loon episode. So maybe 779 00:47:38,000 --> 00:47:41,680 Speaker 1: I'll do some more episodes about specific parts of X 780 00:47:41,840 --> 00:47:45,120 Speaker 1: in the future. In the meantime, if you have suggestions 781 00:47:45,120 --> 00:47:47,759 Speaker 1: for topics I should tackle on TechStuff (there's some 782 00:47:48,160 --> 00:47:50,960 Speaker 1: alliteration for you), you can reach out to me in 783 00:47:50,960 --> 00:47:53,000 Speaker 1: a couple of ways. One way is on Twitter. We 784 00:47:53,080 --> 00:47:56,440 Speaker 1: have the handle TechStuffHSW; that's 785 00:47:56,560 --> 00:47:59,840 Speaker 1: how we got this request, for example. Another way is 786 00:47:59,840 --> 00:48:02,719 Speaker 1: to download the iHeartRadio app and navigate over 787 00:48:02,760 --> 00:48:05,520 Speaker 1: to the TechStuff part of the app. There's a 788 00:48:05,560 --> 00:48:08,000 Speaker 1: little microphone icon. You can click on that, leave a 789 00:48:08,080 --> 00:48:10,839 Speaker 1: voice message up to thirty seconds in length, and let 790 00:48:10,840 --> 00:48:13,319 Speaker 1: me know what you would like me to cover that way.
791 00:48:13,960 --> 00:48:16,680 Speaker 1: Either way, I hope you're doing well, and I'll talk 792 00:48:16,719 --> 00:48:25,640 Speaker 1: to you again really soon. TechStuff is an 793 00:48:25,640 --> 00:48:29,320 Speaker 1: iHeartRadio production. For more podcasts from iHeartRadio, 794 00:48:29,680 --> 00:48:32,840 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever 795 00:48:32,920 --> 00:48:34,440 Speaker 1: you listen to your favorite shows.