Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It's time for another classic episode of TechStuff. This one is titled The Great Google Car Crash of Twenty Sixteen. Scott Benjamin joined the show to talk about this. This was one of those news stories that was a pretty big deal in twenty sixteen. The episode originally published on April twenty-seventh, twenty sixteen. And you know, autonomous cars, well, really, we don't have a truly autonomous car yet, but it was kind of a new concept, like new in the sense that we hadn't really seen too many of them out on actual streets back in twenty sixteen. So Google getting into a little car accident was big news. Now, obviously, in the years since, we have had a lot more accidents involving autonomous vehicles or semi-autonomous vehicles, so things have definitely continued to be complicated and sometimes tragic, but this one was more of an interesting, lighthearted look at why an autonomous car might get into an accident. So sit back and enjoy this episode from twenty sixteen.

We're talking today about a peculiar event, something that happened in February twenty sixteen, and it's just taken me this long to finally get around to addressing it. You might have heard in previous episodes of TechStuff about how I would champion the fact that Google, with their self-driving cars, had had enormous success. Flawless success, you might argue: something like one point four five million miles driven without a single accident caused by the autonomous system. There had been about fourteen or so accidents, but all of those were either the fault of a person driving the car in manual mode or another driver colliding with the autonomous car, but never the fault of the autonomous car itself. It was a perfect system, until February twenty sixteen.
Yeah, and that is the day of what I'll call, and I really don't mean to overdramatize this at all, so maybe I'm titling this episode... I think maybe we should call this the Saint Valentine's Day Google Self-Driving Car Massacre. Oh, that's an excellent title, and I'm all for it. Definitely not overly dramatic in any way. Well, I mean, it's also funny because we're recording this the week after I've gotten back from South by Southwest, and this was a topic that was discussed heavily at South by Southwest, because until this incident, it was a very easy sell to say autonomous cars are the way to go. And then this little accident happened, and it wasn't terrible. We'll get into the details of the accident, but this little accident happened, and suddenly it sounded like Google's autonomous car had caused an enormous pileup on the highway. Everyone was much more cautious. So you're maybe not buying my alternate title, then, is that what you're saying? What's your alternate title? No, no, I'm totally... no. I think it is a massacre, but I think it's a massacre in the sense of the public perception of autonomous cars. I see, okay. Yeah, I'm thinking of it from a PR standpoint. I could take this opportunity to gloat and say, ah, they're not as infallible as you thought, they're not perfect. But I'm going to take a different stance here in this podcast, and I think that as we talk through this, we're going to realize that they've been held to a much higher standard than they probably need to be, right. And I know that's tough to take, you know, when you just hear it that way. But listen to our argument back and forth about this, and understand that they're being held to perfection when they probably shouldn't be.
Whereas humans, I mean, we're not perfect, of course, right. There's a vast chasm between the standards that they're held to versus the standard that human test drivers are held to. Sure, yeah, if you look at the standard driving test that you have to pass before you get a license, I would argue most autonomous cars could likely pass such a test as close to flawlessly as you can get. But you don't have to be flawless when you take a driver's test. There's room for you to not completely do something perfectly. Like, if your parallel parking isn't exactly right, you're going to get points deducted from your total, but you may still score high enough so that you could pass the full driver's test. Exactly. You knock over a cone, it's not really a big deal. But an autonomous car knocks over a cone, everybody points at it and says, oh, look at that thing, it's a pile of junk. Yeah, exactly. It's interesting that you point that out too, because that ties into a different discussion I saw at South by Southwest that wasn't specifically about autonomous cars. It was about robots. So this is a little bit of a tangent, but it goes to illustrate the point you just made. I love tangents. So, this panel about robots: there was a woman, Leila Takayama, who used to work for Google X, but not in the autonomous car division. She talked about how she ran an experiment. She got a guy from Pixar to do a series of very simple animations to show people the interactions between a robot and a person, and then to judge which robot is considered intelligent versus not intelligent. And the whole point of this was to show the differences between succeeding and failing but giving no indication that the robot understands it succeeded or failed, versus building in expressions for the robot following a success or failure to indicate it quote unquote understands what happened.
And it was fascinating, because they showed a very simple experiment with a robot trying to open a door. And again, this is an animation, so there were different scenarios. There's one where the robot opens the door, and the door opens, and then the robot just sits there, and it's done, because it's done what it was supposed to do. There's one where the robot opens the door and then kind of perks up, like, oh, I did what I wanted to do. There was one where the robot fails to open the door and then does nothing, and then there was one where the robot fails to open the door and then slumps down a little bit, as if to say, oh, I'm disappointed I didn't succeed. They then asked people to judge which robots they thought were the most intelligent, and everyone said the robot that failed but showed disappointment was more intelligent than the robot that succeeded but didn't show any expression at all. No kidding. And when you think about that, again, it's holding robots to a standard that doesn't necessarily apply to them because of the human element, this human-robot interaction. We're holding autonomous cars to a similar standard that perhaps is not fair. We're holding robots to a standard that's not fair. But that means that people who are designing autonomous cars and people who are designing robots have to take that into consideration, because that's the way humans are. Well, this is interesting, because you're mentioning specifically imitating human behavior. Yes. And this comes up in an article that I read in, let's see, it was in The Verge. Yes, and the Verge article, you might have read the same thing. An excellent article. Yeah, it really is. And a person by the name of Jennifer Haroon, she's the head of business operations for Google's self-driving car project. And by the way, let's come back to the details of the accident in just a moment.
Sure, we'll describe what happened, but she says that... well, you know what, maybe I need to back this up just a second here. How about this: let's describe the accident, and then we'll talk about what she said, because it plays perfectly into it. So, to help understand what happened, here's how you can imagine it. All right, you've got an intersection in Mountain View, California, which is where Google's headquarters is located, and you've got the Google self-driving car. Correct me if I get any of this wrong, Scott, I'm going from memory. That's all right, I'm doing mostly the same. So the Google self-driving car is in the right lane, and it's planning on making a right turn at this intersection. Yes. Now, at the corner of the intersection, there were some sandbags that were a partial obstruction of the lane. Yeah, I think they were blocking a sewer entry, a grate maybe, or something. Right. So the Google car detected that there was an obstruction, and so it had to plan an alternate way to make its right turn. It still wanted to follow the route that it had planned, so the change would have been for it to kind of edge over into the next lane to the left before making the right turn. Behind the Google car, approaching at a blistering speed of fifteen miles per hour, was a bus, and so the Google car recognized there was a bus coming. It was itself moving at a very slow speed, at two miles per hour. The Google car said, well, based upon my programming, what I should expect to happen is that the bus will slow down and allow me to move through. I'll clear the intersection, the bus will continue. What actually happened was the Google car made the move into the lane, the bus continued, and there was a low-speed collision. And there were no injuries, no one was hurt. There was, in fact, a driver behind the wheel of the Google car. It's just the driver wasn't in control. The autonomous system was in control.
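To make the reasoning described here a little more concrete, here is a minimal, purely illustrative sketch of the kind of yield-expectation decision involved. None of this is Google's actual code; the function name, thresholds, and probability value are invented for illustration only.

```python
# Purely illustrative sketch of the yield-expectation logic described above.
# None of these names or thresholds come from Google; they are made up here
# just to make the reasoning concrete.

def should_merge(own_speed_mph, gap_vehicle_speed_mph, gap_vehicle_distance_ft,
                 expected_yield_probability):
    """Decide whether to edge into the adjacent lane ahead of an approaching vehicle."""
    # Rough time (seconds) before the approaching vehicle closes the gap,
    # assuming it does NOT slow down. 1 mph is roughly 1.47 ft/s.
    closing_speed_fps = max(gap_vehicle_speed_mph - own_speed_mph, 0.1) * 1.47
    time_to_contact_s = gap_vehicle_distance_ft / closing_speed_fps

    # If there is comfortably enough time even without any courtesy from the
    # other driver, merge. Otherwise, merge only if we are betting that the
    # other driver will yield -- the bet that failed on February fourteenth.
    if time_to_contact_s > 5.0:
        return True
    return expected_yield_probability > 0.9

# The accident scenario, roughly: car at 2 mph, bus at 15 mph, and a high
# (mistaken) expectation that the bus would slow down and let the car in.
print(should_merge(2, 15, 30, expected_yield_probability=0.95))  # True -> merge -> collision
```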
And some people might say, well, you know, the bus driver just didn't let the car in. But Google actually said, and this is important, Google came out and said, we accept responsibility for this. This is something where it's valuable that this information has come to light, because it means that we need to revisit this particular part of the autonomous car programming. Now, I thought that was really interesting. First of all, I have never heard of a company accepting responsibility for something so fast in my life. Well, they did and they didn't. I mean, there's a couple of versions of this. Now, you got the details of the accident correct, although I did hear, and this is a bit confusing, that the lanes in this particular part of town are extremely wide, and so what happened was the car kind of edged itself over toward the curb. So it was, I guess, mimicking human behavior again, and I'll get to that in just a minute. But, um, you know, once this accident happened and they said, you know, we do need to investigate this, they did that to the tune of about thirty-five hundred new tests that they've now implemented since this accident, that said, we're going to watch for this. You know, we need to understand a little more deeply that some of these larger vehicles may have a different, more difficult time stopping in traffic than a smaller vehicle will, and that's the reason why some of these bigger vehicles like to continue on their path and think, well, maybe someone behind me will let them in. Google did say, we were relying on an element of human kindness to let us into that lane, and that's normally what happens. It really does. Usually there's a back and forth, or, you know, there's always going to be that offhanded time where, you know, someone does cut through, and they're just like, no, I need to get through that intersection in this light cycle, and no one is going to stop me.
Yeah, I mean, I'm gonna be ten feet ahead of you when all this happens, right? That's where I... you know, that's my goal. But usually what happens is it's an alternating pattern, and they expected that to happen, and it didn't happen in this case. And this is where it plays right into what Jennifer had mentioned. Now, Jennifer Haroon, who is the head of business operations for Google's self-driving car project, explained it, and I think she was at the South by Southwest conference as well, and she said that the Lexus, it's a Lexus vehicle that was outfitted with this gear, struck the bus in part because it was imitating human behavior. And I found that interesting, that she is kind of deferring the fault here, saying, well, we were just imitating what we see on the streets, and that's kind of what happened. Well, it's a double deferral in a way, right, because first they say we were counting on an element of human kindness, which is already kind of a deferral in itself. Sure. Yeah, you're essentially saying, well, we were thinking that the bus driver would be a decent human being and not a Lexus-hating scumbag. I'm paraphrasing what they said, obviously, I'm taking a little liberty. And then in this one, she's also saying, well, we designed the car to behave the way we see actual cars behaving on the road. So in both cases, it's a little bit of backing away from taking full responsibility. Exactly. And this imitating human behavior that she's talking about was that they had recently taught the vehicles to hug the right-hand side of that lane when they're making that right-hand turn, and that's when it encountered the sandbags that were unexpected.
And so, what I find interesting is, if it's a wide lane and she's saying that it was hugging the right-hand side of that lane trying to make the turn as most humans do, then if it was in the center of the lane, she's saying, if it had just behaved as they normally would have it, you know, in the exact dead center of that lane, the bus wouldn't have had the gap, I guess, to try to make it through, so it would have just stayed behind the car. Never would have happened. So she's saying, in effect, because we're trying to make it mimic human behavior and we were hugging that right side, that's why this accident happened. Maybe we shouldn't have done that. But then again, they come back and say that it's absolutely necessary for them to mimic human behavior, because if they don't, that causes trouble as well. There are other issues there, right. If a vehicle, let's say it's an autonomous car, is heading toward an intersection, the light goes from green to amber, and there's technically enough space for the car to brake safely and come to a complete stop as the light turns red, then, knowing that most humans would just gun it, or at least just continue at the same speed to go through the intersection while it's still amber, you might want to think about that when you're designing your autonomous car, so that you don't cause a pileup behind you. Like, if the person directly behind the autonomous car expects the car in front of them to continue through the intersection, you could potentially get rear-ended. Yeah, that happens a thousand times a day, I mean more than that, but it happens all over the world, really. Especially in Atlanta, where the rule is if the light turns red, three cars get to go through. It is so true, isn't it? Yeah, it seems like once one goes through, two more follow. Yeah, it's crazy. Yeah, I've seen it happen in multiple places around the city.
There are some neighborhoods where it's worse. I'm not going to name any names, Buckhead, but I'm just saying, yeah, never accelerate immediately on a green light anywhere in this area. You can expect there's going to be that oddball car that comes through, right, and there's gonna be, like, a car that's been waiting to turn left the whole time, and they say, I'm not waiting another light cycle, I'm just going now, even if they're behind the stop line. Yeah, I'm sure it happens everywhere; it's particularly bad in these congested areas. Yes. So, to your point, it is important to take those things into consideration when designing the autonomous car. You don't want an autonomous car to drive like an inconsiderate jerk of a driver. But at the same time, you can't have it be so clinically precise that it's standing out from all the other drivers. The only way that works is if you get to a point where you're at a saturation point with autonomous cars on the road, where then you can affect the behavior on a mass scale across a fleet of cars and not have that issue of human drivers having awful interactions with robotic drivers. Exactly. Here's the way they stated it, the spokesman stated: they say, it's vital for us to develop advanced skills that respect not just the letter of the traffic code, but the spirit of the road. I think that's a good way to put it, the spirit of the road. I understand that, I completely get that. When I read it, it's that, yeah, there are little rules here and there that we'd bend, but everybody bends them, and you understand how other drivers are going to behave in the same situation, and you expect that to happen, right, and you behave in that way and it all works. But when something comes in, a spoiler comes in, and it follows exactly the letter of the law, the way it's supposed to happen, that person is maybe, you know, the standout, right?
Another great example of that in Atlanta would be, we have a couple of different highways that run through the city, and one that runs around the city, two eighty-five. And two eighty-five is often thought of as the type of highway that, if you get on it, you have to speed. You cannot go the speed limit on two eighty-five. It's just too dangerous, because everyone else is going above the speed limit. Massive truck traffic. Yeah, yeah, and there are enormous, enormous semis rushing down there, and you don't want to be poking along when they come up behind you. So again, an autonomous car would need to have that information and take that into account, unless you got to a point where you had so many autonomous vehicles on the road that it was no longer a concern. Yeah. And this is where, we discussed this yesterday, because we were talking off air about this just a little bit to prep for today, and the idea would be that it's kind of like schooling, almost like fish schooling, in that the cars know where the other ones are at all times and they can communicate between them. The problem is when you throw the human driver element into that mix, or, you know, if you have just one autonomous car among all humans, that's the other problem, the other issue, and right now that's the battle that they're fighting. Right, right. So once we get to a point where there's that tipping point one way or the other, then things will be very different. But there's going to be some growing pains. And this also leads into something that I talked about earlier in twenty sixteen, when I went to CES. Toyota had their big AI discussion. You know, they're investing millions of dollars in AI research for autonomous cars and beyond, and one of the things they talked about was how autonomous cars in general are really, really good at handling all the mundane stuff that you would typically encounter on a normal day driving from point A to point B.
What they are not good at is dealing with stuff that's outside of that norm, and the sandbags that we talked about earlier would be a great example of that. It's some form of obstruction that's partially blocking off part of the road, and that ends up causing a different scenario, and sometimes the car behaves in a way that works out for everybody. In this case, it didn't. And it's not that the car couldn't handle the situation. It's just that the method that the car used turned out to not be reliable. Yeah, this was an extremely slow-speed crash, as we've said. Yes, the bus was traveling fifteen miles per hour in the other lane trying to get through that gap, but the Google car was traveling, I think they said, at two miles per hour. Yeah, very, very slow, very slow speed. So the thing is, with the compensation for this, you know, the thirty-five hundred additional tests that they're now going to run, uh, to determine or to find a way around that situation so it never happens again, we're gonna do everything we can. But to think about it that way, to say the thirty-five hundred tests are going to allow this vehicle to think about that exact situation and never let it happen again, where it just kind of noses out into a lane that appears open, yeah, that's remarkable. I mean, it just lets you know that there are tens of thousands of, um, programs or thoughts, I don't know how to say it, right, that are going through this thing at all times, you know, um, as calculations and parameters, and just, you know, if this, then that. Those scenarios are being run all the time. It's just incredible, mind-boggling. It really is. And I was looking into, you said one point four five million miles have been driven, uh, flawlessly, right? I mean, they hadn't had any problems, you know, at fault, I guess, for the autonomous vehicle.
Do you know how much they test on a daily basis, actually on a daily basis? No? Okay, well, let's see, I've got a note here. I should have looked for that before reading that. Okay, here we go. All right, so actually this is a per-week and then a per-day thing. All right. They drive ten thousand miles per week, and that's, like, you know, somebody in a vehicle on the road, ten thousand miles per week. Per day, though, this number is incredible: per day, they are driving three million miles of computer-simulated miles. Oh, so the equivalent of three million miles. That's because they can quickly just go through that and have multiple systems running these things, you know. So the amount of testing that they do in a year is just unbelievable. I don't have yearly stats or anything, but you can extrapolate those numbers out to that. Well, and the other point in the Toyota press conference that was interesting to me, and this goes back to what you were saying at the beginning of the show about holding autonomous cars to a different standard than we hold human drivers: they talked about how a lot of the autonomous car industry talks about the one-hundred-million-mile benchmark, saying that you want one hundred million miles traveled of proven safety. And they said, you know, that's not enough. You need to go much bigger than that: one hundred billion miles. And I thought, wow. I mean, I get it, for you want that many miles so that you can encounter as many possible different situations as you might encounter on the road, because obviously, if you plan a system and it's great for handling ninety-nine percent of the situations, that's fine until you run into that one percent.
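As a quick back-of-the-envelope check on those benchmarks, using only the figures quoted in this episode (ten thousand real-world miles per week and three million simulated miles per day), here is a rough calculation; it is just illustrative arithmetic, not anything Google published.

```python
# Back-of-the-envelope arithmetic on the mileage figures mentioned above.
# The rates (10,000 real miles/week, 3 million simulated miles/day) are the
# ones quoted in this episode; everything else is just division.

real_miles_per_week = 10_000
sim_miles_per_day = 3_000_000

benchmarks = {"100 million miles": 100e6, "100 billion miles": 100e9}

for label, miles in benchmarks.items():
    weeks_real = miles / real_miles_per_week
    days_sim = miles / sim_miles_per_day
    print(f"{label}: ~{weeks_real / 52:,.0f} years of real-world driving, "
          f"or ~{days_sim / 365:,.1f} years of simulation at 3M miles/day")

# 100 million miles: roughly 192 years of real-world driving, or about a month of simulation.
# 100 billion miles: roughly 192,000 years of real-world driving, or about 91 years of simulation.
```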
And when you do figure out how many cars are on the road in the United States alone, traveling on any given day, you realize the odds are, eventually, I mean, statistics show, statistics prove, that sooner rather than later, one of those autonomous cars will encounter a situation that would have been impossible to anticipate in the programming phase. Yeah. So, well, I get it, on one hand. On the other hand, I get frustrated, because I really want to see this future get here as soon as possible. But I totally understand the need for that level of precision that's demanded, so that you can be sure that nothing catastrophic happens when a car encounters something that the programmers just did not anticipate. Now, Chris Urmson, again, he was the, what was his title, shoot, I think it was director of the self-driving car project. I couldn't remember, director or not, but he did say, and you can find this troubling, I guess, if you want to, but I understand what he's saying. He says that, you know, of course February fourteenth was a tough day for his team, obviously, but he says, and I thought this was interesting, he said, we've got to be prepared for more days just like that if we're ever going to succeed in creating this project, you know, making this work. And we're actually gonna have worse days than that. And when I hear that, you know, we're gonna have worse days than that, of course you think of the worst. You think that it's going to be involved in an accident that is fatal, or, you know, harms somebody, anybody, in any way. And of course that would be an awful day. That'd be a worse day than what we've seen. But you kind of have to expect something like that is going to happen if you're traveling, if you're traveling three hundred billion miles like you said, or, you know, whatever the enormous number of miles on the road that they want to travel is.
Um, I would guess that, you know, when you're talking about three hundred million or billion or whatever it was, those might be computer-simulated miles, because, you know, the three million a day is an enormous number and that adds up quickly. But, you know, ten thousand miles per week of actual, you know, physical on-the-road testing, that's pretty impressive still. But how long would that take to get up to three hundred million? And, you know, I think somebody who laid it out pretty clearly here is the US Transportation Secretary. His name is Anthony Foxx. And, you know, he's the one who said, where I initially read it, I guess, don't compare these to perfection, you can't do that. And in one of the quotes here, in an article that I read from the BBC, he says, it's not a surprise that at some point there would be a crash when they've got this brand-new technology on the road. But what I would challenge anyone to do is to look at the number of crashes that occurred on the same day that were the result of human behavior. And that gets right back to what you were saying, in that, you know, there are so many miles driven every day, just here in the US, around the world, all over the place, that you just know that bad stuff is happening all the time, every minute, literally. All right, but this is a great opportunity for me to transition from the Google story, which, you know, again, it has huge implications for the autonomous car industry. Even though it was, in the grand scheme of things, a minor accident, it was something that, once you realize, oh, they're not perfect, then it starts raising some questions. These talks were a bit more subdued after that point. Yeah, and at South by Southwest, that was definitely happening, although I went to a couple of different panels about autonomous cars where they didn't even bring it up. They were gung ho.
I mean, the general feeling at South by Southwest is that autonomous cars are a definitive future that is coming, and that most likely there will be some form of shared-services model for autonomous cars. I think most people agreed that personal ownership is going to slowly phase out, largely because younger generations don't necessarily see the necessity of owning a car. And there were some interesting statistics too. I saw a panel called Robot Cars and Sharing: Road Rage or Smooth Sailing, and this had three panelists on it, a moderator and two panelists. The moderator was Frederick Soo of a company called Nauto. Nauto creates an app and a camera setup where you can essentially upgrade your car into a smart car, not an autonomous car, but a smart car, where it's able to use information from the camera and run it through some algorithms that are on the back end of the data system, which then transmits to your app to let you know things like, well, how good of a driver is the driver, that kind of stuff. So it's also useful for, like, fleet management. You can use it to figure out, for the driver you've just hired to be one of your employees, if that was a good choice or not, or maybe you need to rethink that, that kind of stuff, based on driving. Yeah, and, uh, it pulls information from a lot of different sources, but the camera is the primary one. He was the moderator. And then you had Shad Laws from Renault, who was funny, because he talked about how Renault is a brand that is famous around the world but not here in the US, but you might know our partner Nissan. And then, uh, yeah, we knew Renault back in, what, the mid-eighties, I think. Yeah, that's about it. Yeah. And then there was Mark Platchin from BMW, who was actually a substitute. Originally it was supposed to be Marianne Wu of GE Ventures.
We'll be back with more of The Great Google Car Crash of Twenty Sixteen after we take this quick break.

So let me throw some statistics at you, or some of the facts that Soo brought up. One of the things he said was that the typical American car spends ninety-six percent of its life parked. That's an enormous chunk of time. Yeah, so only four percent of your typical American car's time, knowing that there are cases outside that on either end, four percent of it is actually used driving around. So when you hit someone with that, assuming that that is in fact correct, I don't know where his source was for ninety-six percent, but assuming that is in fact correct, you can start to see an argument for a fleet of autonomous vehicles that can drive around on demand and pick someone up and drop them off, because that means you could free up the space that would be taken by a parked car and use it for something else. Because, I mean, a lot of our spaces are reserved for parking. In fact, there are regulations for office buildings about how much square footage you have to set aside for parking in certain cities. It depends on the city. But imagine that you have a world where people are relying on autonomous cars to pick them up and drop them off. You don't need that space for parking anymore. You can actually dedicate that to something else and make beaucoup money, I think is the way they put it. But anyway, plant a tree. Plant a tree, you could also do that, I think. Come on, you tree hugger. No, I also think that would be awesome. So, one of the things that I thought was shocking, and I think the effect on me was not what the speaker was planning: Shad Laws of Renault was talking about the safety factor of autonomous cars, and his argument went something like this.
First of all, we can't determine that they're more safe than human-driven cars yet, because we don't have enough information. We don't have enough autonomous cars on the road, we haven't had enough scenarios to really tell. But then he also said safety is really not as big a deal as you might think, because the safety benchmark is to try and have fewer than one fatality per one hundred million kilometers driven. Now, in the United States, it is one point zero eight fatalities per one hundred million miles. But a mile is longer than a kilometer, right, one mile is one point six kilometers, so it's still below that one fatality per one hundred million kilometers. And then he said for most countries that's the case. There are a few that are above it, but not many. So is this an unrealistic standard to be held to? Well, I think what he was trying to say is that human drivers are pretty safe already, and therefore you can't sell autonomous cars on the promise of safety, because we're so safe already. I would counter that argument by saying, more than thirty thousand people died last year as a result of car accidents. That's thirty thousand fewer people around today because of car accidents, and something on the order of ninety percent of car accidents are the fault of a human driver, at least one human driver. And so my counter to that argument is that it may be, statistically speaking, a safe thing, but when you get down to actual numbers, with real human lives attached to it, I would argue that the autonomous vehicles so far have proven to be a really good move in the right direction to reduce that number dramatically. This is dangerous territory you're wading into here, because on our show, on CarStuff, we sometimes talk about, you know, the incredible rise in fatalities on Georgia highways last year, because there was a huge increase, like a twenty-five percent increase or something, you know, year over year.
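For reference, here is the unit conversion behind that comparison, using only the figures quoted above (1.08 fatalities per hundred million miles, and roughly 1.6 kilometers per mile); it is simple arithmetic, not an official statistic.

```python
# Quick unit-conversion check on the fatality figures discussed above.
us_fatalities_per_100m_miles = 1.08
km_per_mile = 1.609

# Convert to the benchmark's units: fatalities per 100 million kilometers.
us_fatalities_per_100m_km = us_fatalities_per_100m_miles / km_per_mile
print(round(us_fatalities_per_100m_km, 2))  # ~0.67, below the 1-per-100-million-km benchmark
```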
Wow, 583 00:31:50,480 --> 00:31:52,680 Speaker 1: And it was really big, and it was the first 584 00:31:52,720 --> 00:31:54,440 Speaker 1: time in a long, long time, a long stretch of 585 00:31:54,480 --> 00:31:57,000 Speaker 1: time where, you know, it had actually been on the rise. 586 00:31:57,040 --> 00:31:59,120 Speaker 1: It was going down up until that point, and 587 00:31:59,200 --> 00:32:02,040 Speaker 1: then suddenly this big spike and trying to figure out why, 588 00:32:02,080 --> 00:32:04,000 Speaker 1: and we're talking about distraction and all that stuff, you know, 589 00:32:04,120 --> 00:32:06,520 Speaker 1: smartphones and things behind the wheel and trying to just 590 00:32:06,960 --> 00:32:10,120 Speaker 1: you know, guess why it's happening that way. And of 591 00:32:10,200 --> 00:32:12,600 Speaker 1: course somebody writes in and says, well, thirty eight thousand 592 00:32:12,640 --> 00:32:15,320 Speaker 1: people is not that many people. And you say, well, 593 00:32:15,400 --> 00:32:18,760 Speaker 1: that's a lot of people to die on US highways. And 594 00:32:18,920 --> 00:32:20,880 Speaker 1: they say, well, back in the nineteen fifties the 595 00:32:20,960 --> 00:32:24,080 Speaker 1: number was like forty four thousand, and that, you know, 596 00:32:24,360 --> 00:32:26,000 Speaker 1: and that was with fewer drivers on the road. And 597 00:32:26,040 --> 00:32:28,760 Speaker 1: they give you all these stats about population and number 598 00:32:28,800 --> 00:32:30,840 Speaker 1: of miles driven and all that, and I 599 00:32:30,920 --> 00:32:32,800 Speaker 1: gotta be honest, I get kind of confused with 600 00:32:33,000 --> 00:32:35,840 Speaker 1: that, you know, with that angle, like trying 601 00:32:35,880 --> 00:32:38,480 Speaker 1: to compare apples to apples, you know, back then, you know, 602 00:32:38,840 --> 00:32:41,440 Speaker 1: sixty years ago to today. That's kind of tough 603 00:32:41,480 --> 00:32:44,440 Speaker 1: to do. Well, especially, you know, there's so many other 604 00:32:44,560 --> 00:32:46,680 Speaker 1: factors there, right, Yeah, you might have fewer drivers on 605 00:32:46,720 --> 00:32:50,080 Speaker 1: the road, but your safety regulations weren't anything like they 606 00:32:50,160 --> 00:32:53,160 Speaker 1: are today. Exactly. That was one thing. And 607 00:32:53,280 --> 00:32:56,240 Speaker 1: we always argue that point too, there were no crumple zones, 608 00:32:56,400 --> 00:32:58,280 Speaker 1: there were no air bags, there's none of that stuff 609 00:32:58,320 --> 00:33:00,320 Speaker 1: going on, so maybe that accounts for it. But 610 00:33:00,400 --> 00:33:03,280 Speaker 1: then they counter with another argument. So I'm just saying that, 611 00:33:03,560 --> 00:33:06,320 Speaker 1: you know, I feel that somebody out there is going 612 00:33:06,360 --> 00:33:08,520 Speaker 1: to have some kind of issue with, you know, mentioning 613 00:33:08,600 --> 00:33:10,840 Speaker 1: that thirty eight thousand is a huge number, 614 00:33:10,880 --> 00:33:13,480 Speaker 1: but that was what it was last year in the US alone. 615 00:33:14,200 --> 00:33:16,000 Speaker 1: That's a huge number, no matter how you look at it.
616 00:33:16,080 --> 00:33:18,960 Speaker 1: I mean, even if there are more people driving, 617 00:33:19,040 --> 00:33:21,720 Speaker 1: I think my response to anyone who would argue like 618 00:33:22,320 --> 00:33:25,080 Speaker 1: that, that this is less than in the past, I 619 00:33:25,160 --> 00:33:28,320 Speaker 1: would say, that's good, but it could be lower. And a 620 00:33:28,880 --> 00:33:31,880 Speaker 1: lower number of people who die as a result of 621 00:33:31,960 --> 00:33:34,320 Speaker 1: car accidents, I think it's hard to argue that that's 622 00:33:34,440 --> 00:33:37,320 Speaker 1: a bad thing. No, you certainly want that number to 623 00:33:37,400 --> 00:33:39,320 Speaker 1: be as low, as close to zero as you could 624 00:33:39,360 --> 00:33:42,280 Speaker 1: possibly make it. Of course, automakers strive for that. They're trying, 625 00:33:42,360 --> 00:33:45,080 Speaker 1: they're trying everything they can to make essentially a 626 00:33:45,200 --> 00:33:47,400 Speaker 1: deathproof car. I mean, you can't, you know, you can't 627 00:33:47,400 --> 00:33:51,200 Speaker 1: account for every situation, right, every single situation, but they're 628 00:33:51,240 --> 00:33:54,440 Speaker 1: doing their best to make what is essentially a deathproof car. 629 00:33:54,680 --> 00:33:58,000 Speaker 1: And there are several, you know, several marques that have 630 00:33:58,080 --> 00:34:01,360 Speaker 1: gone years without one. Yeah, I've got, you know, 631 00:34:01,520 --> 00:34:03,720 Speaker 1: the stats somewhere back on my desk, but there's a 632 00:34:03,800 --> 00:34:05,840 Speaker 1: few that have gone, I'm going to guess here, just 633 00:34:06,080 --> 00:34:07,720 Speaker 1: based off my memory, it was like five or six 634 00:34:07,840 --> 00:34:11,719 Speaker 1: years without a fatality, right, caused by a fault in 635 00:34:11,760 --> 00:34:14,600 Speaker 1: a system in their vehicle. Right. Well, and that also 636 00:34:14,719 --> 00:34:16,759 Speaker 1: leads me to a different panel that I saw. We'll 637 00:34:16,800 --> 00:34:18,600 Speaker 1: come back to the road rage one because 638 00:34:18,600 --> 00:34:21,640 Speaker 1: we've got to get to the BMW one. Yeah. But the 639 00:34:22,440 --> 00:34:24,480 Speaker 1: other panel I saw that was related to this was 640 00:34:24,560 --> 00:34:27,440 Speaker 1: called Looking Forward to Rush Hour: The Future of Transit. 641 00:34:27,760 --> 00:34:29,759 Speaker 1: Looking forward to rush hour? Yeah. This was from a 642 00:34:29,840 --> 00:34:33,720 Speaker 1: couple of industrial designers with a design firm talking about 643 00:34:33,760 --> 00:34:37,480 Speaker 1: the future of transportation, and it wasn't just autonomous cars 644 00:34:37,600 --> 00:34:40,200 Speaker 1: or even just the future of cars. That was one 645 00:34:40,440 --> 00:34:44,040 Speaker 1: half of the panel, and that was done by a guy. 646 00:34:44,640 --> 00:34:47,759 Speaker 1: The guy who led that part was Dan Dorley, but 647 00:34:47,840 --> 00:34:51,040 Speaker 1: there was also Chip Walters who did the other half, 648 00:34:51,080 --> 00:34:54,279 Speaker 1: which was more about the Hyperloop, also fascinating, but 649 00:34:54,400 --> 00:34:56,200 Speaker 1: we're not talking about the Hyperloop today. So 650 00:34:57,000 --> 00:34:59,080 Speaker 1: switching back over to Dorley.
One of the things Dorley 651 00:34:59,120 --> 00:35:02,680 Speaker 1: said that I thought was really interesting was that once 652 00:35:02,719 --> 00:35:04,759 Speaker 1: you get to a level where you have a lot 653 00:35:04,800 --> 00:35:08,360 Speaker 1: of autonomous cars on the road, like let's say the 654 00:35:08,440 --> 00:35:12,719 Speaker 1: majority of cars on the road are autonomous, and you 655 00:35:12,880 --> 00:35:17,200 Speaker 1: have proof, I mean, obviously this only works if everything's 656 00:35:17,239 --> 00:35:20,279 Speaker 1: working properly, but you have proof that because of the 657 00:35:20,400 --> 00:35:22,600 Speaker 1: number of autonomous vehicles on the road, the number of 658 00:35:22,719 --> 00:35:28,759 Speaker 1: crashes decreases dramatically, the number of deaths decreases dramatically. Then 659 00:35:28,880 --> 00:35:31,440 Speaker 1: you can start to play around with other stuff, because 660 00:35:32,200 --> 00:35:35,360 Speaker 1: if the autonomous cars are a proven technology that's safe, 661 00:35:36,040 --> 00:35:40,120 Speaker 1: you can let up on some of the major safety 662 00:35:40,280 --> 00:35:42,279 Speaker 1: considerations you've had to put into place over the last 663 00:35:42,280 --> 00:35:44,800 Speaker 1: few years in order to minimize that number that we 664 00:35:44,920 --> 00:35:48,799 Speaker 1: talked about, that thirty thousand or higher number. You could 665 00:35:48,800 --> 00:35:52,640 Speaker 1: remove crumple zones. You can make cars smaller and lighter, 666 00:35:53,120 --> 00:35:56,680 Speaker 1: which is especially important if your cars also are electric, 667 00:35:57,200 --> 00:35:59,440 Speaker 1: because the battery will have less weight to have to 668 00:35:59,520 --> 00:36:03,400 Speaker 1: move around. That will extend the driving range of your 669 00:36:03,520 --> 00:36:06,800 Speaker 1: vehicle because you've made your vehicle lighter. Not 670 00:36:06,920 --> 00:36:08,960 Speaker 1: that the battery's gotten any better, but it doesn't have 671 00:36:09,040 --> 00:36:11,279 Speaker 1: to push as much weight around. Sure. And again this 672 00:36:11,640 --> 00:36:14,480 Speaker 1: only works, though, if every vehicle out there is the same. Yeah, 673 00:36:14,600 --> 00:36:17,800 Speaker 1: you have to have enough autonomous vehicles, 674 00:36:18,360 --> 00:36:22,319 Speaker 1: at least the majority, if not all of them, out there, 675 00:36:22,400 --> 00:36:26,759 Speaker 1: so that you can be confident that by eliminating those 676 00:36:27,200 --> 00:36:31,040 Speaker 1: safety features that are important right now, it's not going 677 00:36:31,080 --> 00:36:35,080 Speaker 1: to make any difference. And I think 678 00:36:35,120 --> 00:36:37,440 Speaker 1: we're pretty far away from that. But I thought it 679 00:36:37,520 --> 00:36:41,040 Speaker 1: was an interesting point. He also talked about more car 680 00:36:41,120 --> 00:36:45,279 Speaker 1: manufacturers creating a sort of a universal chassis where lots 681 00:36:45,320 --> 00:36:47,600 Speaker 1: of different bodies of vehicles could fit on top of 682 00:36:47,680 --> 00:36:51,960 Speaker 1: the same basic chassis, leading to a future where ultimately 683 00:36:52,520 --> 00:36:54,600 Speaker 1: you can, and you can do this now.
Actually, if 684 00:36:54,640 --> 00:36:57,800 Speaker 1: you've got enough money, you can go to certain specialty 685 00:36:57,880 --> 00:37:02,080 Speaker 1: companies and 3D print a car design. You could 686 00:37:02,120 --> 00:37:04,759 Speaker 1: design a car if you wanted to, and 3D 687 00:37:04,880 --> 00:37:07,920 Speaker 1: print a car body that fits on top of a 688 00:37:08,200 --> 00:37:12,640 Speaker 1: particular chassis and motor drivetrain configuration, and so you could 689 00:37:12,680 --> 00:37:15,440 Speaker 1: have your own. Like, people would say, well, what kind 690 00:37:15,440 --> 00:37:17,160 Speaker 1: of car is that? Well, that's my car. I call it 691 00:37:17,239 --> 00:37:22,520 Speaker 1: a Strickland. Yeah, it's a Strickland. It doesn't drive anywhere. 692 00:37:24,280 --> 00:37:27,800 Speaker 1: Yeah, that's a joke about me not driving. 693 00:37:28,320 --> 00:37:30,440 Speaker 1: But yeah. I thought it was interesting that he was 694 00:37:30,560 --> 00:37:35,480 Speaker 1: looking into implications of autonomous cars well beyond safety, well 695 00:37:35,560 --> 00:37:40,080 Speaker 1: beyond the shared model. He was looking at autonomous 696 00:37:40,160 --> 00:37:42,120 Speaker 1: cars like, well, what does that do to the design 697 00:37:42,239 --> 00:37:44,440 Speaker 1: of the car itself? That's interesting, that, you know, you 698 00:37:44,520 --> 00:37:46,920 Speaker 1: could eliminate the things that we find that we have 699 00:37:47,120 --> 00:37:49,440 Speaker 1: to have now. Yeah, and that's an interesting way 700 00:37:49,440 --> 00:37:51,239 Speaker 1: to think about it, like, if it's just 701 00:37:51,400 --> 00:37:54,919 Speaker 1: not necessary, what could you really pare that design down 702 00:37:55,000 --> 00:37:57,080 Speaker 1: to, what smart things could you do with 703 00:37:57,200 --> 00:38:00,359 Speaker 1: that to make it work better as an electric platform, 704 00:38:00,440 --> 00:38:02,919 Speaker 1: as an autonomous platform? Right, you know, it all makes 705 00:38:03,160 --> 00:38:05,640 Speaker 1: good sense. But again, you're counting on, 706 00:38:06,280 --> 00:38:09,600 Speaker 1: you know, participation in this. Yeah, you 707 00:38:09,640 --> 00:38:12,440 Speaker 1: would need to have enough buy in so that there 708 00:38:12,600 --> 00:38:15,759 Speaker 1: isn't a risk of having something like we saw with 709 00:38:16,120 --> 00:38:18,440 Speaker 1: the previous Google tests. You know, we talked about how there 710 00:38:18,480 --> 00:38:23,120 Speaker 1: were more than a dozen accidents involving Google self driving cars, 711 00:38:23,480 --> 00:38:29,480 Speaker 1: and the previous ones, before February twenty sixteen, 712 00:38:30,040 --> 00:38:33,279 Speaker 1: were all the fault of a human driver, 713 00:38:33,520 --> 00:38:36,759 Speaker 1: either the person manually controlling the self driving car or 714 00:38:36,800 --> 00:38:40,000 Speaker 1: another driver. So the same thing is true.
If you're 715 00:39:40,040 --> 00:39:42,560 Speaker 1: in an autonomous vehicle and there are human drivers on 716 00:39:42,600 --> 00:39:44,680 Speaker 1: the road, then there's a chance that one of them 717 00:39:44,960 --> 00:39:47,840 Speaker 1: could make a terrible mistake, like there could just be an accident, 718 00:39:47,920 --> 00:39:51,120 Speaker 1: it could be a failure, it could be a distracted driver, 719 00:39:51,320 --> 00:39:55,600 Speaker 1: drunk driver, it could be anything. And until you eliminate 720 00:39:55,640 --> 00:39:58,960 Speaker 1: those possibilities, it is pretty dangerous to just say, let's 721 00:39:58,960 --> 00:40:01,239 Speaker 1: eliminate crumple zones. But you could do 722 00:40:01,360 --> 00:40:04,600 Speaker 1: other stuff that's not very dangerous, like, imagine, you know, you have no need 723 00:40:04,680 --> 00:40:08,320 Speaker 1: for controls, so you free up all that space in 724 00:40:08,440 --> 00:40:12,000 Speaker 1: the front that would normally be dedicated to the steering 725 00:40:12,000 --> 00:40:14,000 Speaker 1: wheel and pedals and that kind of stuff. You could 726 00:40:14,000 --> 00:40:18,040 Speaker 1: have a workstation or an entertainment station. Because you're not driving, 727 00:40:18,320 --> 00:40:20,800 Speaker 1: you don't even necessarily have to face forward. Yeah, you 728 00:40:20,840 --> 00:40:23,000 Speaker 1: can face backward. I had a discussion about this on 729 00:40:23,120 --> 00:40:27,560 Speaker 1: Forward Thinking and Lauren immediately said, yeah, I could never do that. 730 00:40:27,680 --> 00:40:29,839 Speaker 1: I'd be yakking all over the inside of that car. Yeah. 731 00:40:30,040 --> 00:40:31,560 Speaker 1: I think a lot of people have that trouble on 732 00:40:31,640 --> 00:40:34,960 Speaker 1: a train already or a bus, you know, in certain situations. Right, 733 00:40:35,120 --> 00:40:38,680 Speaker 1: but imagine if you could sit sideways. Yeah, the design 734 00:40:38,719 --> 00:40:40,800 Speaker 1: of the vehicle could just be so radically different that 735 00:40:41,560 --> 00:40:43,720 Speaker 1: none of that really matters. You could probably 736 00:40:43,760 --> 00:40:45,879 Speaker 1: design, you know, those honeycomb systems where you could sleep 737 00:40:45,920 --> 00:40:47,279 Speaker 1: in the car if you wanted. Yeah. It's kind 738 00:40:47,280 --> 00:40:52,440 Speaker 1: of funny because it actually opens up an enormous opportunity 739 00:40:52,520 --> 00:40:56,719 Speaker 1: for designers. Yeah, right, unprecedented opportunity, because you would be 740 00:40:56,800 --> 00:40:59,879 Speaker 1: completely transforming the interior of a car. All the things 741 00:40:00,000 --> 00:40:03,920 Speaker 1: we associate, well, not all, but a lot 742 00:40:03,960 --> 00:40:06,880 Speaker 1: of the things we associate as being the definition of 743 00:40:06,960 --> 00:40:08,719 Speaker 1: what the inside of a car would look like go 744 00:40:08,840 --> 00:40:11,400 Speaker 1: out the window, I mean figuratively speaking. And so you 745 00:40:11,440 --> 00:40:15,920 Speaker 1: could then have all kinds of different configurations and designs, 746 00:40:15,960 --> 00:40:19,160 Speaker 1: almost more like home design really, or room design, 747 00:40:19,800 --> 00:40:22,880 Speaker 1: some sort of interior design for vehicles. Yeah, it's a strange, 748 00:40:23,160 --> 00:40:25,280 Speaker 1: strange thought.
Hey, by the way, I want to clarify 749 00:40:25,360 --> 00:40:27,560 Speaker 1: one thing, really, sure, just something's been bugging me for 750 00:40:27,600 --> 00:40:29,800 Speaker 1: the last ten minutes. All right, I do know that 751 00:40:29,840 --> 00:40:32,160 Speaker 1: there were Renault cars on US roads prior to the 752 00:40:32,280 --> 00:40:34,520 Speaker 1: nineteen eighties. I was just mentioning their brief comeback, you know, 753 00:40:34,600 --> 00:40:38,000 Speaker 1: with the Alliance lineup, and, I guess I'm going to mention it, the kind 754 00:40:38,040 --> 00:40:40,080 Speaker 1: of failure 755 00:40:40,560 --> 00:40:43,040 Speaker 1: that that was. It was not all 756 00:40:43,080 --> 00:40:45,480 Speaker 1: that well received. Yeah, but I think most of my 757 00:40:45,640 --> 00:40:49,200 Speaker 1: listeners, let me clarify, 758 00:40:49,400 --> 00:40:53,759 Speaker 1: most of my listeners in the US, are probably unfamiliar 759 00:40:53,880 --> 00:40:57,399 Speaker 1: with the brand Renault. Yeah, probably because I'm guessing many 760 00:40:57,440 --> 00:40:59,560 Speaker 1: of them were born in the eighties. Yeah, it's a 761 00:40:59,760 --> 00:41:03,759 Speaker 1: it is a seldom seen vehicle on the roads here 762 00:41:03,800 --> 00:41:05,480 Speaker 1: in the United States, but in other parts of the 763 00:41:05,560 --> 00:41:07,400 Speaker 1: world it is a very popular make. Yeah, I mean, 764 00:41:07,440 --> 00:41:09,480 Speaker 1: a lot like Peugeot or something like that. You know, 765 00:41:09,560 --> 00:41:13,040 Speaker 1: there's reasons, but that's not the right show for it. It's so funny 766 00:41:13,440 --> 00:41:17,120 Speaker 1: when you start throwing around car manufacturer names and car 767 00:41:17,200 --> 00:41:21,480 Speaker 1: brand names, and then you come to that weird realization 768 00:41:21,560 --> 00:41:23,320 Speaker 1: that in other parts of the world there are totally 769 00:41:23,440 --> 00:41:26,319 Speaker 1: different ones, and some of the ones that are 770 00:41:26,440 --> 00:41:29,720 Speaker 1: prevalent in the United States are largely unknown in other 771 00:41:29,760 --> 00:41:32,440 Speaker 1: parts of the world. And it just reminds you like, oh, yeah, 772 00:41:32,480 --> 00:41:34,840 Speaker 1: that's right. The whole world isn't the US. Yeah, well 773 00:41:34,880 --> 00:41:37,400 Speaker 1: it is strange. And once you travel outside 774 00:41:37,440 --> 00:41:39,200 Speaker 1: and you see that, you see the same vehicle but 775 00:41:39,239 --> 00:41:41,960 Speaker 1: it's named something different or something like that, it's just unusual. 776 00:41:42,040 --> 00:41:45,520 Speaker 1: It is eye opening, really. Now, 777 00:41:45,880 --> 00:41:50,239 Speaker 1: I've been teasing this for the whole episode, but let's 778 00:41:50,280 --> 00:41:54,640 Speaker 1: get back to BMW and Mark Platchin. Is this intended 779 00:41:54,680 --> 00:41:56,520 Speaker 1: to hurt me? No, it's not intended to hurt you. 780 00:41:56,680 --> 00:41:59,240 Speaker 1: I just want to see what your reaction is, Scott. 781 00:42:00,400 --> 00:42:03,759 Speaker 1: Scott and I started talking about this off microphone yesterday, 782 00:42:04,239 --> 00:42:06,759 Speaker 1: and as I was talking a little voice in my 783 00:42:06,880 --> 00:42:10,960 Speaker 1: head said, shut up, Jonathan, save it for the show. 784 00:42:11,560 --> 00:42:13,880 Speaker 1: So that's what we're gonna do.
It's not, really. You 785 00:42:14,280 --> 00:42:17,279 Speaker 1: might just shrug and say, oh, all right. But in 786 00:42:17,480 --> 00:42:21,240 Speaker 1: order to set this up first, what is BMW's slogan? 787 00:42:21,600 --> 00:42:24,239 Speaker 1: Now they're known as, it's a driver's car, right? Yeah, 788 00:42:24,480 --> 00:42:27,560 Speaker 1: it's the ultimate driving machine. Yeah, I mean 789 00:42:27,640 --> 00:42:30,320 Speaker 1: the ultimate driving machine, the ultimate driving machine. So you 790 00:42:30,360 --> 00:42:32,360 Speaker 1: would think that, you know, of course they're going to 791 00:42:32,480 --> 00:42:35,479 Speaker 1: dabble in autonomous systems, like, you know, maybe adaptive cruise 792 00:42:35,520 --> 00:42:38,480 Speaker 1: control, something like that. But I just, I've had a 793 00:42:38,520 --> 00:42:42,960 Speaker 1: hard time all along seeing BMW going fully autonomous because 794 00:42:43,000 --> 00:42:44,839 Speaker 1: of the way they market their company right now. It 795 00:42:44,920 --> 00:42:47,480 Speaker 1: is the ultimate driving machine. It's a driver's vehicle. 796 00:42:47,719 --> 00:42:50,719 Speaker 1: If you want something that's fun to drive, that's an experience, 797 00:42:51,320 --> 00:42:53,120 Speaker 1: you get a BMW. You get, you know, something 798 00:42:53,200 --> 00:42:56,280 Speaker 1: that's top of the line, it's expensive, it's plush, 799 00:42:56,760 --> 00:42:59,399 Speaker 1: it's a well handling car, it's powerful, it's 800 00:42:59,480 --> 00:43:03,000 Speaker 1: everything you want, and again, the ultimate driving machine. So 801 00:43:03,480 --> 00:43:06,440 Speaker 1: why are they messing around with autonomous vehicles? That's 802 00:43:06,480 --> 00:43:13,320 Speaker 1: my thought. Platchin works specifically with the autonomous vehicle section 803 00:43:13,719 --> 00:43:17,840 Speaker 1: at BMW, and his response to the first part would be, 804 00:43:18,360 --> 00:43:20,719 Speaker 1: I imagine, and I'm putting some words into his mouth, so 805 00:43:21,400 --> 00:43:23,080 Speaker 1: take this with a grain of salt, but I imagine 806 00:43:23,080 --> 00:43:27,520 Speaker 1: he would say, it's where the future of vehicles definitely 807 00:43:27,840 --> 00:43:31,320 Speaker 1: happens to be. They completely understand that, and you cannot ignore it. 808 00:43:31,520 --> 00:43:35,359 Speaker 1: If you do, you'll be left behind. Yes, he said 809 00:43:35,440 --> 00:43:41,759 Speaker 1: that the company was at a real... they were in 810 00:43:41,800 --> 00:43:44,440 Speaker 1: a quandary. He actually said that. I think it was 811 00:43:44,560 --> 00:43:47,719 Speaker 1: last year, or maybe 812 00:43:47,719 --> 00:43:49,359 Speaker 1: it was a few years ago, he was brought 813 00:43:49,400 --> 00:43:53,080 Speaker 1: in to talk about the concepts they needed to address 814 00:43:53,120 --> 00:43:56,640 Speaker 1: in an upcoming conversation. They were going to hit 815 00:43:56,719 --> 00:44:00,280 Speaker 1: like some corporate milestone, and they wanted to talk about 816 00:44:00,480 --> 00:44:03,960 Speaker 1: what are the next one hundred years of BMW going 817 00:44:04,000 --> 00:44:07,160 Speaker 1: to look like? Now, anyone who's listened to Forward Thinking 818 00:44:07,360 --> 00:44:10,319 Speaker 1: knows, predicting the future is hard. Predicting five years 819 00:44:10,360 --> 00:44:13,160 Speaker 1: out is hard.
Predicting one hundred years out is impossible. 820 00:44:13,160 --> 00:44:17,040 Speaker 1: I can't. Yeah, the only thing I can predict about tomorrow 821 00:44:17,160 --> 00:44:18,520 Speaker 1: is that if I don't wear sunblock, I will 822 00:44:18,560 --> 00:44:20,920 Speaker 1: be sunburnt. That's it, because I know I'm gonna be 823 00:44:20,920 --> 00:44:24,799 Speaker 1: outside a lot. But he said it was his job 824 00:44:25,120 --> 00:44:28,640 Speaker 1: to try and help coordinate this vision of BMW for 825 00:44:28,680 --> 00:44:32,719 Speaker 1: the next one hundred years. And taking into account the 826 00:44:32,800 --> 00:44:36,359 Speaker 1: fact that autonomous cars are, I mean, everyone at South 827 00:44:36,400 --> 00:44:38,200 Speaker 1: by Southwest was talking about them as if it's a 828 00:44:38,239 --> 00:44:42,080 Speaker 1: foregone conclusion. That's the future, that's where we're going. So 829 00:44:42,520 --> 00:44:46,000 Speaker 1: taking that as part of it, they actually had serious 830 00:44:46,200 --> 00:44:51,320 Speaker 1: internal discussions: what does this mean for our slogan, the 831 00:44:51,600 --> 00:44:56,200 Speaker 1: ultimate driving machine? What do we do? Do we rebrand? 832 00:44:56,800 --> 00:45:00,839 Speaker 1: How do we rebrand? This is something we pride ourselves upon. 833 00:45:00,960 --> 00:45:04,440 Speaker 1: It is a corporate identity. It's kind of the 834 00:45:04,560 --> 00:45:08,279 Speaker 1: central mantra of the company. It's the DNA of 835 00:45:08,320 --> 00:45:12,600 Speaker 1: the company. They started playing with alternatives to the slogan, 836 00:45:12,719 --> 00:45:15,719 Speaker 1: like maybe we change it to something else, and they 837 00:45:15,800 --> 00:45:19,200 Speaker 1: tried a few different things out, all internally, and 838 00:45:19,440 --> 00:45:21,839 Speaker 1: no one liked them. No one liked them. And then 839 00:45:21,960 --> 00:45:27,640 Speaker 1: finally someone said, well, technically it's a driving machine. It's 840 00:45:27,640 --> 00:45:31,960 Speaker 1: a machine that's driving. It is the driving machine. We 841 00:45:32,080 --> 00:45:36,160 Speaker 1: can make the ultimate driving machine. So it's still the 842 00:45:36,239 --> 00:45:41,160 Speaker 1: same slogan. The context is redefined. Oh boy. Yeah, boy, 843 00:45:41,239 --> 00:45:44,080 Speaker 1: I don't know 844 00:45:44,200 --> 00:45:48,640 Speaker 1: about this, all right. Well, so it's almost like you're 845 00:45:48,680 --> 00:45:51,840 Speaker 1: putting the emphasis on the other part. I 846 00:45:51,880 --> 00:45:53,080 Speaker 1: don't know, how do you even look at that? 847 00:45:53,120 --> 00:45:56,080 Speaker 1: I guess it's how you would say it. I'd say the emphasis 848 00:45:56,760 --> 00:45:59,719 Speaker 1: previously was on driving, because you think of 849 00:45:59,800 --> 00:46:02,640 Speaker 1: driving as a verb that people indulge in. So now it's 850 00:46:02,640 --> 00:46:06,160 Speaker 1: the ultimate driving machine. I see. Okay, well, boy, that's 851 00:46:06,200 --> 00:46:09,120 Speaker 1: so subtle.
So, and this was also 852 00:46:09,160 --> 00:46:12,320 Speaker 1: an interesting discussion, because people asked questions like, what 853 00:46:12,480 --> 00:46:16,040 Speaker 1: happens to brand identity in a future of autonomous vehicles 854 00:46:16,160 --> 00:46:19,719 Speaker 1: that are likely not going to be owned by individuals 855 00:46:19,840 --> 00:46:22,880 Speaker 1: but will be in some form of shared economy? And 856 00:46:24,000 --> 00:46:26,479 Speaker 1: they had a really good response for this. They said, well, 857 00:46:27,960 --> 00:46:30,960 Speaker 1: you could argue that all autonomous cars would essentially be 858 00:46:31,000 --> 00:46:34,840 Speaker 1: alike, that one, you know, robo Uber car would be 859 00:46:34,960 --> 00:46:38,080 Speaker 1: the same as the next robo Uber car, except eventually 860 00:46:38,120 --> 00:46:39,759 Speaker 1: someone would come along and say, you know what, we're 861 00:46:39,800 --> 00:46:41,960 Speaker 1: going to make a different robo Uber car that has 862 00:46:42,560 --> 00:46:46,960 Speaker 1: X features in it, which appeals to Y demographic. Yeah, 863 00:46:46,960 --> 00:46:49,360 Speaker 1: somebody will pay a premium for that feature, right, because 864 00:46:49,360 --> 00:46:53,560 Speaker 1: if you're like, hey, we noticed that young people 865 00:46:53,680 --> 00:46:55,680 Speaker 1: between the ages of such and such and such and such, 866 00:46:55,760 --> 00:46:58,080 Speaker 1: they really care about these things and they don't care 867 00:46:58,080 --> 00:47:01,040 Speaker 1: about these other things. Let's make some cars that go 868 00:47:01,360 --> 00:47:04,400 Speaker 1: straight to what they care about, and we'll be 869 00:47:04,480 --> 00:47:07,040 Speaker 1: able to dominate that market. And then you get competition 870 00:47:07,160 --> 00:47:09,520 Speaker 1: there because other companies will follow, which means you still 871 00:47:09,680 --> 00:47:13,040 Speaker 1: end up getting that differentiation, you still get the brand identity. 872 00:47:13,120 --> 00:47:16,800 Speaker 1: The question is how do they define themselves so that 873 00:47:16,880 --> 00:47:20,399 Speaker 1: the experience of being in, say, a BMW autonomous car 874 00:47:21,120 --> 00:47:24,239 Speaker 1: is different from being in a Lexus autonomous car. Well, now 875 00:47:24,280 --> 00:47:26,360 Speaker 1: we know, all they do is put the emphasis on machine. 876 00:47:27,239 --> 00:47:29,440 Speaker 1: That's it, right. But I thought, I guess it beats 877 00:47:29,520 --> 00:47:36,000 Speaker 1: something like, BMW: we give up, or BMW: it was fun 878 00:47:36,080 --> 00:47:40,600 Speaker 1: while it lasted. Yeah, yeah, you can come up with 879 00:47:40,680 --> 00:47:42,520 Speaker 1: a bunch of funny slogans for it, I'm sure, but 880 00:47:42,760 --> 00:47:46,000 Speaker 1: honestly, I like that they stick with what they have. Really, 881 00:47:46,040 --> 00:47:47,920 Speaker 1: I think maybe if that's what they're gonna do, 882 00:47:47,960 --> 00:47:50,920 Speaker 1: and they're gonna push it that way, that may be 883 00:47:51,000 --> 00:47:52,759 Speaker 1: exactly what they do. You may hear them 884 00:47:52,880 --> 00:47:56,600 Speaker 1: emphasize machine over driving, which we are now. So 885 00:47:57,000 --> 00:47:58,840 Speaker 1: that's gonna be really weird, isn't it?
I think so. 886 00:47:59,040 --> 00:48:00,719 Speaker 1: I mean, I think it's going to be 887 00:48:00,800 --> 00:48:03,560 Speaker 1: weird to be in a world where, assuming that the 888 00:48:03,680 --> 00:48:07,840 Speaker 1: shared car approach is what wins out, it'll be 889 00:48:08,000 --> 00:48:10,560 Speaker 1: weird to live in that world for lots of different reasons, 890 00:48:10,560 --> 00:48:12,439 Speaker 1: because a lot of us are very used to having 891 00:48:12,480 --> 00:48:15,520 Speaker 1: our own personal vehicle for multiple reasons, not just for 892 00:48:15,600 --> 00:48:19,080 Speaker 1: convenience sake, but convenience outside of just, I have 893 00:48:19,200 --> 00:48:21,879 Speaker 1: a car whenever I need to go someplace, assuming it's 894 00:48:21,920 --> 00:48:26,320 Speaker 1: not broken down. Scott and I will conclude our discussion 895 00:48:26,360 --> 00:48:29,120 Speaker 1: about the Great Google car crash of twenty sixteen, but 896 00:48:29,239 --> 00:48:42,520 Speaker 1: first we need to take another break. What about all 897 00:48:42,560 --> 00:48:45,360 Speaker 1: the stuff that's in a car? Like, a lot of 898 00:48:45,400 --> 00:48:48,840 Speaker 1: people have stuff that they keep in their car, and 899 00:48:48,920 --> 00:48:51,040 Speaker 1: it might be work equipment or, you know, things like 900 00:48:51,160 --> 00:48:53,960 Speaker 1: that. Diapers. Yeah, stuff like, new parents might have 901 00:48:54,239 --> 00:48:55,960 Speaker 1: a box of diapers in the car so that when 902 00:48:56,000 --> 00:48:58,520 Speaker 1: they travel places they have their supply right there. Sure, 903 00:48:58,600 --> 00:48:59,920 Speaker 1: if they need to run out to the car, they 904 00:49:00,120 --> 00:49:04,240 Speaker 1: can. But in the future, if you have shared vehicles, 905 00:49:04,280 --> 00:49:06,359 Speaker 1: obviously you can't just keep stuff in a car. You'd 906 00:49:06,400 --> 00:49:08,600 Speaker 1: have to carry everything you need with you all the time, 907 00:49:09,800 --> 00:49:12,320 Speaker 1: and you would either have to pare down the stuff 908 00:49:12,440 --> 00:49:15,120 Speaker 1: so that you're saying, well, I might not be prepared 909 00:49:15,160 --> 00:49:17,759 Speaker 1: for certain situations. But yeah, it's a trade off. 910 00:49:17,840 --> 00:49:19,440 Speaker 1: Isn't it really nice to just kind of leave an 911 00:49:19,520 --> 00:49:21,279 Speaker 1: umbrella in your car and have it when you need 912 00:49:21,320 --> 00:49:23,479 Speaker 1: it, and you don't have to remember it every single 913 00:49:23,560 --> 00:49:24,720 Speaker 1: time you go out the door. And it was funny 914 00:49:24,719 --> 00:49:28,120 Speaker 1: because Platchin actually said, well, maybe we'll have services where 915 00:49:28,880 --> 00:49:33,280 Speaker 1: you could actually store your stuff in what would 916 00:49:33,400 --> 00:49:35,520 Speaker 1: end up being like a mobile storage unit, 917 00:49:35,560 --> 00:49:37,239 Speaker 1: and you can just call upon it to come to 918 00:49:37,320 --> 00:49:39,640 Speaker 1: you whenever you needed it. And I thought, that puts 919 00:49:39,760 --> 00:49:42,719 Speaker 1: more cars on the road. That's a terrible idea. Yeah, 920 00:49:42,800 --> 00:49:44,680 Speaker 1: I'll just tell you right now, that's a bad idea. 921 00:49:44,840 --> 00:49:47,319 Speaker 1: Joe and Lauren both agree with you, and I do too.
922 00:49:47,719 --> 00:49:51,080 Speaker 1: I also thought, like, well, that doesn't sound like that's ideal. 923 00:49:51,440 --> 00:49:54,040 Speaker 1: So yeah, there's obviously some huge trade offs that would happen. 924 00:49:54,080 --> 00:49:56,640 Speaker 1: Well, imagine a block of lockers, you know. Would 925 00:49:56,640 --> 00:49:58,640 Speaker 1: it be a block of lockers driving down the road 926 00:49:58,800 --> 00:50:01,160 Speaker 1: with your stuff and everybody else's stuff in it? Yeah. 927 00:50:01,200 --> 00:50:03,000 Speaker 1: And what if someone across town needs it? I know 928 00:50:03,040 --> 00:50:05,680 Speaker 1: that they would probably keep it in a central area. 929 00:50:05,840 --> 00:50:08,680 Speaker 1: A central area, yeah, but people don't all... Like, let's 930 00:50:08,680 --> 00:50:10,480 Speaker 1: say that my next door neighbor and I both use 931 00:50:10,560 --> 00:50:13,000 Speaker 1: the same unit because we both live next door to 932 00:50:13,080 --> 00:50:15,880 Speaker 1: each other. We don't necessarily work anywhere close to each other, 933 00:50:16,440 --> 00:50:17,960 Speaker 1: so he might work on the other side of town. 934 00:50:18,080 --> 00:50:20,239 Speaker 1: He needs his umbrella, I need my umbrella. That car, 935 00:50:20,400 --> 00:50:24,239 Speaker 1: I mean, easily you could see problems with that model. 936 00:50:24,960 --> 00:50:26,680 Speaker 1: There was a similar model that I also heard. Let me 937 00:50:26,680 --> 00:50:29,320 Speaker 1: see what you think about this one, Scott. So, 938 00:50:29,920 --> 00:50:33,480 Speaker 1: talking about shared cars. Now, in the examples I've been 939 00:50:33,480 --> 00:50:36,600 Speaker 1: giving so far, it's essentially a fleet of service vehicles, 940 00:50:36,760 --> 00:50:38,680 Speaker 1: something along the lines of an Uber or a Lyft, 941 00:50:39,000 --> 00:50:42,880 Speaker 1: only with no human drivers. Right. Yes, one of the 942 00:50:42,960 --> 00:50:47,839 Speaker 1: alternatives I heard, Shad Laws actually mentioned this possibility, which 943 00:50:47,880 --> 00:50:54,280 Speaker 1: I think is almost as bad as the traveling locker idea. 944 00:50:55,200 --> 00:50:59,680 Speaker 1: What if, instead of it being a fleet car, it's 945 00:51:00,040 --> 00:51:03,880 Speaker 1: a communal car among multiple households, and you own like 946 00:51:04,400 --> 00:51:08,239 Speaker 1: a sixth of that car? Can you imagine that working out? 947 00:51:09,320 --> 00:51:12,120 Speaker 1: How would you guarantee that the car would be available 948 00:51:12,239 --> 00:51:15,799 Speaker 1: for all the households? Well, I guess the... okay, 949 00:51:15,880 --> 00:51:18,120 Speaker 1: this isn't as bad an idea as... I mean, I 950 00:51:18,239 --> 00:51:20,800 Speaker 1: understand that it's not great. Yeah, and there's a 951 00:51:20,840 --> 00:51:23,120 Speaker 1: lot of flaws to this one as well. But 952 00:51:23,320 --> 00:51:25,560 Speaker 1: isn't this kind of the idea behind, you know, the 953 00:51:25,680 --> 00:51:27,400 Speaker 1: companies that allow you to have kind of a 954 00:51:27,800 --> 00:51:30,719 Speaker 1: lease on three different types of vehicles at one time, 955 00:51:31,080 --> 00:51:33,359 Speaker 1: and you can use the one that you need when 956 00:51:33,400 --> 00:51:36,720 Speaker 1: you need it?
So you lease, you know, a sedan, 957 00:51:37,080 --> 00:51:39,000 Speaker 1: you lease a compact car that's very good, you know, 958 00:51:39,000 --> 00:51:41,959 Speaker 1: with mileage, and you lease, you know, a pickup truck, 959 00:51:42,320 --> 00:51:44,200 Speaker 1: and when you need the pickup truck on the weekend, 960 00:51:44,400 --> 00:51:45,840 Speaker 1: you can rent that. You can have that 961 00:51:46,000 --> 00:51:47,920 Speaker 1: brought to you, or you can go get it, 962 00:51:48,160 --> 00:51:49,480 Speaker 1: use it for that amount of time. But what if 963 00:51:49,719 --> 00:51:52,440 Speaker 1: somebody else is using that sedan when you 964 00:51:52,520 --> 00:51:54,319 Speaker 1: need it? You know, they need it for the week 965 00:51:54,360 --> 00:51:55,719 Speaker 1: and you also need it for the week. I mean, 966 00:51:55,760 --> 00:51:58,839 Speaker 1: how does that all work? I don't know. Again, same set 967 00:51:58,880 --> 00:52:02,239 Speaker 1: of problems, I think, yeah, maybe at a smaller scale than 968 00:52:02,239 --> 00:52:04,320 Speaker 1: the one that I'm talking about. Maybe 969 00:52:05,360 --> 00:52:08,600 Speaker 1: the only way I can see it working is 970 00:52:08,680 --> 00:52:10,760 Speaker 1: that you again go back to the fleet of cars, 971 00:52:11,200 --> 00:52:14,160 Speaker 1: so you've got a fleet of autonomous cars. You've got 972 00:52:14,200 --> 00:52:18,200 Speaker 1: a group of people who have essentially collectively invested so 973 00:52:18,400 --> 00:52:21,640 Speaker 1: that they quote unquote own one of those cars. 974 00:52:22,200 --> 00:52:24,400 Speaker 1: They don't actually own a car, they just own a 975 00:52:24,640 --> 00:52:27,359 Speaker 1: share. A time share. Yeah, it's like a timeshare for those 976 00:52:27,440 --> 00:52:30,640 Speaker 1: vehicles that are on demand. So if I call for 977 00:52:30,760 --> 00:52:32,759 Speaker 1: a car and my neighbor calls for a car, and 978 00:52:32,880 --> 00:52:36,360 Speaker 1: we're both on this plan, two different 979 00:52:36,400 --> 00:52:40,480 Speaker 1: cars come, because of the way we've agreed with this fleet, 980 00:52:40,719 --> 00:52:44,839 Speaker 1: and it's the purchase price of the vehicle that ends 981 00:52:44,880 --> 00:52:48,000 Speaker 1: up covering the cost of the individual trip, as opposed 982 00:52:48,040 --> 00:52:51,320 Speaker 1: to a fee per trip, 983 00:52:51,640 --> 00:52:54,600 Speaker 1: like a typical rental car now. Yeah, so essentially 984 00:52:54,600 --> 00:52:57,879 Speaker 1: it would be like, all right, well, collectively, we all 985 00:52:58,080 --> 00:53:01,680 Speaker 1: got together and we put in thirty five thousand dollars 986 00:53:01,960 --> 00:53:05,160 Speaker 1: to quote unquote buy a car. What that really does 987 00:53:05,280 --> 00:53:09,600 Speaker 1: is give us unlimited travel using the service within its 988 00:53:09,840 --> 00:53:14,080 Speaker 1: range of service, you know, assuming that it isn't, you know, 989 00:53:14,320 --> 00:53:17,640 Speaker 1: statewide or countrywide or whatever.
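To put rough numbers on that collective buy-in idea, here is a hypothetical sketch in Python. The thirty five thousand dollar figure and the one-sixth share come straight from the conversation; the per-trip fee used for comparison is an invented placeholder, not anything quoted in the episode.

```python
# Hypothetical sketch of the shared buy-in model described above.
pooled_buy_in = 35_000        # dollars pooled, as quoted in the discussion
households = 6                # "you own like a sixth of that car"
per_household = pooled_buy_in / households

assumed_fee_per_trip = 12.0   # placeholder figure for a comparable per-trip service
trips_to_break_even = per_household / assumed_fee_per_trip

print(f"Each household puts in about ${per_household:,.0f} up front")
print(f"At an assumed ${assumed_fee_per_trip:.0f} per trip, that's roughly "
      f"{trips_to_break_even:.0f} trips before the buy-in pays for itself")
```

Under those assumptions each household is in for a bit under six thousand dollars, and the model only starts to look attractive if a household would otherwise be paying for several hundred individual trips; the real comparison would obviously depend on actual fleet pricing.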
Yeah. And I could 990 00:53:17,680 --> 00:53:20,920 Speaker 1: see it working that way, maybe, but I can't see 991 00:53:20,960 --> 00:53:24,520 Speaker 1: it working in such a way where you actually physically 992 00:53:24,719 --> 00:53:28,279 Speaker 1: have one car to share between multiple households. That 993 00:53:28,280 --> 00:53:30,880 Speaker 1: would never work. Yeah, just wouldn't work out. There's got 994 00:53:30,960 --> 00:53:32,239 Speaker 1: to be a way around it, like you said, there's 995 00:53:32,239 --> 00:53:34,040 Speaker 1: gotta be a pool of vehicles 996 00:53:34,080 --> 00:53:36,480 Speaker 1: to draw from. Yeah, it just wouldn't work. Yeah. But 997 00:53:36,640 --> 00:53:38,880 Speaker 1: I wish you could have gone, because I 998 00:53:38,960 --> 00:53:42,000 Speaker 1: wish you could have seen panels like these and some 999 00:53:42,080 --> 00:53:43,719 Speaker 1: of the other ones too. Like, I was only able 1000 00:53:43,760 --> 00:53:47,440 Speaker 1: to go to three panels total, but there were so many 1001 00:53:47,640 --> 00:53:50,759 Speaker 1: that were all about autonomous vehicles. It sounds fascinating. I 1002 00:53:50,920 --> 00:53:53,040 Speaker 1: really didn't know, until some of the, you know, the 1003 00:53:53,120 --> 00:53:55,880 Speaker 1: reports that I've been reading just for this podcast, that 1004 00:53:56,239 --> 00:53:58,840 Speaker 1: this show is so focused on that type of technology. 1005 00:53:58,880 --> 00:54:01,320 Speaker 1: I tend to think of CES as being more 1006 00:54:01,520 --> 00:54:04,600 Speaker 1: like that, yeah, rather than South by Southwest. Yeah, 1007 00:54:04,600 --> 00:54:08,120 Speaker 1: it's interesting, because South by Southwest Interactive for 1008 00:54:08,320 --> 00:54:12,279 Speaker 1: many years was focused on mobile apps, like, that was 1009 00:54:12,360 --> 00:54:16,560 Speaker 1: the big thing, mobile apps and some gaming. But usually 1010 00:54:16,600 --> 00:54:20,360 Speaker 1: you're talking about the next Twitter. You know, Meerkat and 1011 00:54:20,440 --> 00:54:24,239 Speaker 1: Periscope both came out in twenty fifteen. Meerkat went under 1012 00:54:24,280 --> 00:54:27,600 Speaker 1: in twenty sixteen. Periscope is still around because it's owned 1013 00:54:27,600 --> 00:54:32,640 Speaker 1: by Twitter. Anyway, that's the kind of stuff you would expect. 1014 00:54:32,760 --> 00:54:36,680 Speaker 1: But they had different tracks of programming under Interactive, and 1015 00:54:36,880 --> 00:54:39,920 Speaker 1: one of the tracks was titled Intelligent Future, and that's 1016 00:54:39,960 --> 00:54:43,239 Speaker 1: where all the robotics and autonomous vehicles and AI 1017 00:54:43,280 --> 00:54:46,960 Speaker 1: discussions fell. And so a lot of 1018 00:54:47,000 --> 00:54:49,640 Speaker 1: it had to do with the future of cars, and 1019 00:54:49,760 --> 00:54:53,960 Speaker 1: again not just autonomous cars, but the idea of what 1020 00:54:54,320 --> 00:54:57,560 Speaker 1: is it going to look like from multiple standpoints. 1021 00:54:58,200 --> 00:55:00,400 Speaker 1: I think autonomous played a huge role because everyone just 1022 00:55:00,440 --> 00:55:02,759 Speaker 1: assumes that's going to be part of the future, no 1023 00:55:02,880 --> 00:55:05,160 Speaker 1: matter how it turns out. Yeah.
You know, one thing 1024 00:55:05,200 --> 00:55:07,000 Speaker 1: we should probably point out here is that we always 1025 00:55:07,040 --> 00:55:10,279 Speaker 1: talk about how it's happening. It's incrementally happening, is what's 1026 00:55:10,280 --> 00:55:12,360 Speaker 1: going on, and we're getting little bits and pieces of 1027 00:55:12,440 --> 00:55:14,160 Speaker 1: it now and we see it, you know, in our 1028 00:55:14,200 --> 00:55:17,160 Speaker 1: everyday cars, but not the whole package yet. And the 1029 00:55:17,239 --> 00:55:19,560 Speaker 1: whole package, it seems like it's always ten 1030 00:55:19,640 --> 00:55:21,439 Speaker 1: years out, is what they say. But yeah, I'm seeing 1031 00:55:21,560 --> 00:55:25,080 Speaker 1: estimates now that range anywhere from three years to thirty years. Yes, 1032 00:55:25,280 --> 00:55:27,880 Speaker 1: and to be honest, those are all realistic. 1033 00:55:28,200 --> 00:55:29,920 Speaker 1: I mean, it could take thirty years, it could be 1034 00:55:30,480 --> 00:55:32,600 Speaker 1: faster than that. We could have 1035 00:55:32,760 --> 00:55:35,799 Speaker 1: this by twenty twenty. You never know. Yeah, I think 1036 00:55:35,920 --> 00:55:38,440 Speaker 1: three years is probably when we 1037 00:55:38,440 --> 00:55:41,200 Speaker 1: would start to see actual vehicles make their way onto 1038 00:55:41,239 --> 00:55:43,560 Speaker 1: the roads. Thirty years is where you get to the 1039 00:55:43,560 --> 00:55:45,880 Speaker 1: point where you're at saturation. Yeah, and you know, I 1040 00:55:46,040 --> 00:55:48,520 Speaker 1: know on this podcast, and especially on Car Stuff, but 1041 00:55:48,600 --> 00:55:51,920 Speaker 1: on this one too, we've mentioned before there are already out 1042 00:55:51,920 --> 00:55:53,799 Speaker 1: there cars that can drive you home from work 1043 00:55:53,880 --> 00:55:57,040 Speaker 1: without you touching the wheel or doing anything. But 1044 00:55:57,160 --> 00:55:59,759 Speaker 1: they simply can't say it's an autonomous vehicle. You have 1045 00:55:59,800 --> 00:56:01,680 Speaker 1: to be sitting at the wheel, and 1046 00:56:02,040 --> 00:56:03,640 Speaker 1: you can allow it to do it, but you have 1047 00:56:03,719 --> 00:56:05,239 Speaker 1: to be there, and you have to be ready to 1048 00:56:05,280 --> 00:56:07,680 Speaker 1: take control at any moment. And often you'll get a 1049 00:56:07,719 --> 00:56:11,120 Speaker 1: little beep asking you to make sure you make 1050 00:56:11,160 --> 00:56:13,359 Speaker 1: contact with the wheel to prove that you're still paying 1051 00:56:13,400 --> 00:56:15,920 Speaker 1: attention and everything. Exactly. You're not taking a nap on 1052 00:56:15,960 --> 00:56:17,520 Speaker 1: the way home. You don't want to, you know, those 1053 00:56:17,560 --> 00:56:20,760 Speaker 1: companies don't want to be liable for a terrible accident. 1054 00:56:20,920 --> 00:56:23,120 Speaker 1: So the three to thirty years we're talking about is 1055 00:56:23,320 --> 00:56:26,360 Speaker 1: where the companies are actually confident enough to say this 1056 00:56:26,560 --> 00:56:30,120 Speaker 1: is an autonomous self driving car. Let it do it. Yeah, 1057 00:56:30,840 --> 00:56:34,320 Speaker 1: and I'm so glad you're able to join me on 1058 00:56:34,400 --> 00:56:36,600 Speaker 1: this episode and talk about this kind of stuff.
1059 00:56:37,400 --> 00:56:39,320 Speaker 1: I know that I come across as... I love to 1060 00:56:39,440 --> 00:56:41,400 Speaker 1: needle you with these because you're the car guy and 1061 00:56:41,880 --> 00:56:44,120 Speaker 1: it's fun. It's all good fun. I should also mention 1062 00:56:44,520 --> 00:56:50,080 Speaker 1: that pretty much everyone agreed that personal car ownership is 1063 00:56:50,120 --> 00:56:52,720 Speaker 1: not ever gonna go away entirely in the United States, 1064 00:56:52,840 --> 00:56:55,640 Speaker 1: that no one seemed to believe that that was the case. 1065 00:56:57,160 --> 00:57:00,640 Speaker 1: People said that it may be that fewer people own 1066 00:57:00,920 --> 00:57:04,160 Speaker 1: their own vehicles, but you'll still be allowed to 1067 00:57:04,239 --> 00:57:06,719 Speaker 1: own and operate your own vehicle. Yeah, I could see 1068 00:57:06,719 --> 00:57:09,959 Speaker 1: that happening, especially for things like rural areas. It doesn't 1069 00:57:10,000 --> 00:57:13,640 Speaker 1: make sense to have an autonomous car service serving 1070 00:57:14,239 --> 00:57:17,000 Speaker 1: way out in rural areas. Like, you know, cattle 1071 00:57:17,080 --> 00:57:18,960 Speaker 1: ranchers aren't going to have any need for that. Now, 1072 00:57:19,080 --> 00:57:22,600 Speaker 1: this is a congested city situation. Yes, this is for 1073 00:57:22,800 --> 00:57:26,720 Speaker 1: dense urban environments, and it's not ideal for other situations. 1074 00:57:27,160 --> 00:57:31,000 Speaker 1: But one guy did say that he could envision 1075 00:57:31,000 --> 00:57:34,520 Speaker 1: a future in which car ownership, like being an actual 1076 00:57:35,040 --> 00:57:40,160 Speaker 1: car owner, will be about as rare as owning a horse 1077 00:57:40,240 --> 00:57:43,280 Speaker 1: is today. Really. Yeah, so there are plenty of people 1078 00:57:43,320 --> 00:57:48,480 Speaker 1: who still own horses, just not the general population. Yeah, 1079 00:57:48,560 --> 00:57:51,000 Speaker 1: there's a lot of wide open space out there, 1080 00:57:51,120 --> 00:57:52,880 Speaker 1: and I think that, you know, that's where they'll 1081 00:57:53,480 --> 00:57:57,280 Speaker 1: still be used. Of course, yeah, maybe in cities. You know, 1082 00:57:57,360 --> 00:57:58,960 Speaker 1: I hate to say it, but there may be a 1083 00:57:59,040 --> 00:58:00,720 Speaker 1: point where, you know, you can't drive into the city 1084 00:58:00,760 --> 00:58:03,120 Speaker 1: in your own personal vehicle. You maybe have a giant 1085 00:58:03,160 --> 00:58:04,920 Speaker 1: parking lot on the outskirts of the city and you 1086 00:58:05,000 --> 00:58:07,160 Speaker 1: get out from there and then you take your city 1087 00:58:07,880 --> 00:58:10,520 Speaker 1: approved transportation once you're inside. And wouldn't that 1088 00:58:10,600 --> 00:58:13,920 Speaker 1: be something? I mean, it'd be a dramatic change in 1089 00:58:14,000 --> 00:58:15,280 Speaker 1: the way that we do things now. I mean, it 1090 00:58:15,440 --> 00:58:19,160 Speaker 1: would significantly change, well, the entire cityscape. Really, 1091 00:58:19,480 --> 00:58:23,040 Speaker 1: everything would be different. So it's a fascinating topic. And again, 1092 00:58:23,120 --> 00:58:24,960 Speaker 1: thanks for inviting me in today to do this.
I 1093 00:58:25,040 --> 00:58:26,720 Speaker 1: always have fun talking with you, and I know you 1094 00:58:26,840 --> 00:58:29,480 Speaker 1: like to rib me a little bit about car ownership 1095 00:58:29,560 --> 00:58:31,880 Speaker 1: and, you know, the way it's going, and, you know, 1096 00:58:31,960 --> 00:58:33,440 Speaker 1: I agree on a lot of this stuff. I mean, 1097 00:58:33,480 --> 00:58:35,760 Speaker 1: I think we can have a decent conversation back and 1098 00:58:35,840 --> 00:58:39,520 Speaker 1: forth about it. I understand that, you know, things are moving 1099 00:58:39,560 --> 00:58:42,480 Speaker 1: towards autonomous vehicles, but I'm also glad that 1100 00:58:42,520 --> 00:58:44,320 Speaker 1: you said it too, that, you know, it's never gonna 1101 00:58:44,360 --> 00:58:47,520 Speaker 1: go completely away. Well, and to be fair, we're so 1102 00:58:48,440 --> 00:58:52,680 Speaker 1: in the baby stage of this, right, we're 1103 00:58:52,720 --> 00:58:56,960 Speaker 1: in the earliest stages of this autonomous era. Sure. That 1104 00:58:57,840 --> 00:59:02,360 Speaker 1: making any definitive statement, like that autonomous cars will completely replace 1105 00:59:02,480 --> 00:59:06,200 Speaker 1: manual cars, or that car ownership will completely become a 1106 00:59:06,280 --> 00:59:10,000 Speaker 1: thing of the past, or even that manual cars will 1107 00:59:10,160 --> 00:59:13,600 Speaker 1: no longer be allowed within city limits, any of those, 1108 00:59:13,800 --> 00:59:16,240 Speaker 1: it's so premature to make any kind of statement like that. 1109 00:59:17,880 --> 00:59:21,720 Speaker 1: And honestly, it may turn out that we just see 1110 00:59:21,800 --> 00:59:25,000 Speaker 1: that the ideal mix is somewhere in the middle, with 1111 00:59:25,720 --> 00:59:28,720 Speaker 1: a mixture of autonomous cars and manually driven cars. We 1112 00:59:28,840 --> 00:59:33,600 Speaker 1: don't know. You know, the mathematical models suggest that if 1113 00:59:33,600 --> 00:59:35,880 Speaker 1: you went all autonomous, you avoid a lot of problems, 1114 00:59:35,960 --> 00:59:38,200 Speaker 1: but that's not necessarily the way it will actually shake 1115 00:59:38,280 --> 00:59:40,080 Speaker 1: out in real life. Well, I'm with you. I try 1116 00:59:40,120 --> 00:59:42,000 Speaker 1: to avoid the predictions because it just ends up making 1117 00:59:42,040 --> 00:59:44,920 Speaker 1: you look like a fool later, right, when it happens eventually. 1118 00:59:45,400 --> 00:59:48,120 Speaker 1: But that's pretty much status quo for me. Well, you 1119 00:59:48,240 --> 00:59:51,320 Speaker 1: kind of have to, though. You know, anyway, I really 1120 00:59:51,440 --> 00:59:53,160 Speaker 1: don't like to do that. I like 1121 00:59:53,240 --> 00:59:55,920 Speaker 1: to just kind of sit back and kind of 1122 00:59:56,000 --> 00:59:57,960 Speaker 1: just take it all in, because there's so many changes 1123 00:59:58,000 --> 01:00:01,080 Speaker 1: happening right now. It's actually pretty exciting. Yeah.
Yeah, I mean, 1124 01:00:01,640 --> 01:00:05,120 Speaker 1: and you know, you've got to remember, with autonomous cars, 1125 01:00:05,160 --> 01:00:07,920 Speaker 1: if they become a thing, like a real serious thing, 1126 01:00:08,320 --> 01:00:11,760 Speaker 1: like most people believe, there are implications well beyond the 1127 01:00:11,840 --> 01:00:15,400 Speaker 1: auto industry, things that could really be affected, like 1128 01:00:15,840 --> 01:00:19,200 Speaker 1: the airline industry. You know, if you're able to jump 1129 01:00:19,320 --> 01:00:22,479 Speaker 1: into an autonomously driven car and you can do work 1130 01:00:22,960 --> 01:00:25,120 Speaker 1: or you can go to sleep, and you 1131 01:00:25,880 --> 01:00:28,400 Speaker 1: don't need to get to whatever your destination is 1132 01:00:28,480 --> 01:00:31,400 Speaker 1: within a couple of hours, that could really impact a 1133 01:00:31,480 --> 01:00:34,520 Speaker 1: lot of airline travel. Well, yeah, I mean, okay, 1134 01:00:34,640 --> 01:00:36,680 Speaker 1: I know we've got to wrap up here, but you're 1135 01:00:36,720 --> 01:00:38,480 Speaker 1: making me think of, you know, the 1136 01:00:38,960 --> 01:00:41,600 Speaker 1: kind of pros and cons you weigh if you're 1137 01:00:41,640 --> 01:00:43,600 Speaker 1: making a short trip on a plane, you know, if 1138 01:00:43,600 --> 01:00:46,000 Speaker 1: you're flying from here to Orlando, sure, yeah, which 1139 01:00:46,080 --> 01:00:48,520 Speaker 1: is, like, from here, 1140 01:00:48,520 --> 01:00:50,400 Speaker 1: a little less than an hour and a half flight. Yeah, 1141 01:00:50,440 --> 01:00:52,360 Speaker 1: but then you have to take into account, you 1142 01:00:52,360 --> 01:00:53,760 Speaker 1: gotta get up early, you gotta pack the car, 1143 01:00:53,840 --> 01:00:55,440 Speaker 1: you gotta get to the airport and park and all 1144 01:00:55,480 --> 01:00:58,360 Speaker 1: that stuff. It ends up taking more than half the day. 1145 01:00:59,160 --> 01:01:00,720 Speaker 1: But you could just drive there too. And if you 1146 01:01:00,800 --> 01:01:03,560 Speaker 1: can do that in a way that's not 1147 01:01:03,680 --> 01:01:06,560 Speaker 1: taxing on you, right, it's actually comfortable and 1148 01:01:06,640 --> 01:01:09,200 Speaker 1: you're in your own car, it's a lot 1149 01:01:09,280 --> 01:01:12,880 Speaker 1: more comfortable than being crammed into an airplane. Sure, why 1150 01:01:12,920 --> 01:01:14,920 Speaker 1: would you not do that? You have the opportunity to 1151 01:01:15,000 --> 01:01:17,880 Speaker 1: stop at a specific place to have food rather than 1152 01:01:17,960 --> 01:01:20,720 Speaker 1: just buying whatever little snack box happens to be on 1153 01:01:20,760 --> 01:01:23,040 Speaker 1: the plane. Yeah, so you're right. It does change even, 1154 01:01:23,360 --> 01:01:25,760 Speaker 1: you know, that short distance travel. Sure.
Yeah. Now, for 1155 01:01:25,880 --> 01:01:30,320 Speaker 1: long distances, obviously, unless you're determined 1156 01:01:30,360 --> 01:01:34,080 Speaker 1: to do the great autonomous American road trip, I think 1157 01:01:34,400 --> 01:01:38,120 Speaker 1: the airlines will still be very much a strong 1158 01:01:38,240 --> 01:01:40,760 Speaker 1: player in that, but it will affect their bottom line, 1159 01:01:40,960 --> 01:01:44,760 Speaker 1: and that will affect how they route planes, how they 1160 01:01:44,840 --> 01:01:50,120 Speaker 1: design planes, how they price tickets. So there's 1161 01:01:50,160 --> 01:01:54,520 Speaker 1: some big, potentially disruptive things that could ripple out 1162 01:01:54,720 --> 01:01:57,680 Speaker 1: from the automotive industry into many other ones, so 1163 01:01:57,960 --> 01:02:01,320 Speaker 1: it's pretty interesting stuff. I hope you enjoyed that classic 1164 01:02:01,400 --> 01:02:03,640 Speaker 1: episode of tech Stuff. Like I said at the beginning, 1165 01:02:03,680 --> 01:02:06,480 Speaker 1: there have been a lot more accidents involving cars that 1166 01:02:06,560 --> 01:02:11,600 Speaker 1: are in autonomous or semi autonomous modes. Obviously the most 1167 01:02:11,680 --> 01:02:15,600 Speaker 1: publicized ones involve Teslas, but it's not like Tesla has, 1168 01:02:16,520 --> 01:02:21,640 Speaker 1: you know, exclusive rights to accidents. It's just that 1169 01:02:22,160 --> 01:02:26,200 Speaker 1: they tend to make headlines across the world when it happens. 1170 01:02:26,400 --> 01:02:30,400 Speaker 1: And yeah, of course Tesla has the ongoing issue 1171 01:02:30,560 --> 01:02:34,480 Speaker 1: of calling its products things that make them sound 1172 01:02:34,560 --> 01:02:38,280 Speaker 1: like autonomous vehicle products, while also saying this is 1173 01:02:38,360 --> 01:02:41,400 Speaker 1: not an autonomous vehicle product. So we get into that 1174 01:02:41,520 --> 01:02:44,720 Speaker 1: complication as well. If you have suggestions for topics I 1175 01:02:44,720 --> 01:02:46,960 Speaker 1: should cover on future episodes of tech Stuff, please reach 1176 01:02:47,000 --> 01:02:49,200 Speaker 1: out to me and let me know. You can download 1177 01:02:49,240 --> 01:02:52,200 Speaker 1: the iHeartRadio app and you can navigate over to tech 1178 01:02:52,320 --> 01:02:55,040 Speaker 1: Stuff by putting that into the little search engine. You 1179 01:02:55,120 --> 01:02:57,040 Speaker 1: will see that when you go to the tech Stuff page, 1180 01:02:57,080 --> 01:02:59,320 Speaker 1: there's a microphone icon. If you click on that, you 1181 01:02:59,360 --> 01:03:01,880 Speaker 1: can actually leave a voice message for me and let 1182 01:03:01,960 --> 01:03:03,200 Speaker 1: me know what you would like me to cover in 1183 01:03:03,200 --> 01:03:05,640 Speaker 1: the future, or if you prefer, you can head on 1184 01:03:05,720 --> 01:03:08,960 Speaker 1: over to Twitter and you can tweet me. The show's 1185 01:03:09,000 --> 01:03:12,720 Speaker 1: handle is tech Stuff HSW and let me know what 1186 01:03:12,880 --> 01:03:14,520 Speaker 1: you would like me to cover in the future and 1187 01:03:14,640 --> 01:03:24,200 Speaker 1: I'll talk to you again really soon. Tech Stuff is 1188 01:03:24,240 --> 01:03:28,760 Speaker 1: an iHeartRadio production.
For more podcasts from iHeartRadio, visit the 1189 01:03:28,840 --> 01:03:32,440 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 1190 01:03:32,480 --> 01:03:33,200 Speaker 1: favorite shows.