Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for a classic episode. This one has the title High Tech at High Speed and originally published in February 2016. And Scott, who at the time was hosting a podcast called CarStuff, joined the show to talk on the subject of tech in vehicle form and the kind of tech you're finding in vehicles. Keep in mind this is from 2016; things have changed. This is before we started talking about stuff like paying a subscription fee in order to access your heated car seats. I'm sure that they would have a lot to say about that as well. But let's sit back and enjoy this classic episode.
So today we're gonna talk a little more in depth about some of the technology that was on display at CES, specifically in regards to cars, which is why I got Scott onto the show here, to kind of gauge his reactions and maybe even hear your insight on some of this stuff.

Well, you know, I've read about it. Now, you were at the show; of course, you saw this stuff firsthand. A lot of this is just me looking at reviews of, you know, what everybody else has written about this stuff. So I'm getting, you know, a different view of the whole thing. I don't get the hands-on experience, but I'll do my best to try to keep up with you. But really, the reaction may be more of what you're going for here, because we've got opinions both ways about this. I mean, I know that your listeners have heard us, I guess, butt heads a couple of times in the past, or at least they heard that one very special episode where I pulled the rug out from under you.

I still have nightmares.
That was the one where I think the question was: Scott, I'm gonna give you a scenario that in, like, ten years it's illegal to drive a car. And then I watched you just kind of wilt from that point on.

Yeah.

So one of the things I want to just mention right off the top, before going into any specifics, is that the car industry has always played a part in CES, as long as I have been going. This was my eighth year going, and typically the North Hall of the convention center is where you'll find the car tech stuff. Occasionally there will be some things out in the actual parking lots that are around the convention center to show practical demonstrations of stuff like driver assist technologies, that sort of thing. But typically most of the exhibits are inside, so they're static displays; you're not gonna get in the car and drive around or anything. But I've noticed that the trend has been growing year over year, that we're seeing more and more car technologies make their way onto the show floor, which is kind of interesting.
I mean, I get that they're consumer technology, but often when I hear consumer technology, I'm thinking of smaller things, things smaller than a vehicle, right? Like, maybe as big as a refrigerator. But mostly I'm thinking of things in the television and stereo system world, not things that you park in a garage.

Well, you know, sometimes they get around this by just bringing in maybe the cockpit, the driver cockpit and the seat, whatever is, you know, integral to whatever they're trying to show or display, and that's enough. But sometimes they want to bring the whole vehicle, because it's a whole package now. At this point, everything is connected to everything else, and they want to show you how it all works together and how it feels when you're actually in there, the true-to-life product that they're going to be coming out with soon. And in fact, I noticed that there were a couple of early reveals that happened at this show that I just wanted to mention.
They brought, well, Chevrolet brought two brand new vehicles that actually aren't gonna be out until the end of the year, and this is what's really weird to me. They brought out the Chevy Bolt, which is the hatchback, or the smaller version of the Chevy Volt, I guess, the updated version. And it kind of... it wasn't really a reveal; it sort of was. They wanted to show off the whole technology, the whole vehicle. But later they took it to the Detroit auto show, the, I guess, North American International Auto Show in Detroit, and that's where they officially revealed that vehicle, even though everybody at CES had already seen it; it had been shown off like a week and a half earlier. Yeah, and similar with the Chevy Cruze hatchback. And I think that they wanted to show some of their safety technologies and some of their advanced technologies that they were going to be bringing out in about, what, a year from now at this point? A year and a half? Something like that.
So it's interesting how they would really reveal them to the public there and then give them the official reveal, you know, where they do the dramatic yank of the cover, or have it drive out on stage at the Detroit show.

Yeah, it's interesting to see CES play home to that sort of announcement, or non-announcement almost, where you do get this early look at stuff that normally would be reserved for an auto show, or even just a press event for that specific company. It's kind of showing how CES and cars are meshing together. In fact, that's going to be a theme throughout this episode: how we're seeing sort of high-tech stuff outside of the direct control of an automobile make its way into the vehicle, and some things you wouldn't expect to see in an automobile that do make their way into that automobile. Generally it's apps and things like that, you get it, but there are some hardware changes and some other surprising things along the way too, right?
And the first thing I want to start off with, Scott, is talking about Toyota. I want to get that out of the way. So, first of all, full disclosure: I was there at CES partially in my role as host of Forward Thinking, which is sponsored by Toyota. This show is not sponsored by Toyota, but I just wanted to get that out of the way. So they had a presentation that they were calling Moving Safely into Future Mobility, and it was all really about artificial intelligence. Now, in 2015, it was announced that Toyota was going to fund two, well, at least to start with, two different research facilities, one near Stanford, one near MIT, dedicated to researching artificial intelligence, specifically in regards to the automotive industry, but beyond that as well. And then it was announced that they were going to put a billion dollars over the next five years into these two research facilities. And they're actually working on two different things. So the Stanford group is working on something where you are trying to eliminate uncertainty in AI.
And by that, what they mean is: Scott, you're driving your car down a highway at highway speeds, and you notice that the truck in front of you has got some barrels in its bed. And one of the barrels happens to fly out, and you have time to react, so you manage to swerve out of the way. You as a human being are able to recognize this, extrapolate from past experience, kind of know, because you know the car and know the road, you know sort of what you can do, what's within your abilities to do in reaction to something like that. The thing that the Stanford folks are working on is: how do you program that kind of ability into artificial intelligence? Because machines are really good at reacting in a specific way to specific scenarios. So, in other words, you could program in: all right, if a bicyclist swerves into your lane, then you slow down. Like, you have that if-this, then-that. But you can't anticipate everything, so this is like a split-second reaction.
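The if-this, then-that style of reaction described above can be sketched in a few lines. This is purely illustrative, not Toyota's actual system; the hazard names and responses are invented. The point is the weakness the Stanford researchers are described as tackling: any hazard that was never enumerated falls through to a default.

```python
# An illustrative rule table of the "if this, then that" kind
# described above (hazard names and responses are invented).
HAZARD_RULES = {
    "cyclist_swerves_into_lane": "slow_down",
    "car_ahead_brakes_hard": "brake_and_increase_gap",
    "pedestrian_steps_off_curb": "brake",
}

def react(hazard: str) -> str:
    """Return the scripted response, or admit the rule table has a gap."""
    # Anything not enumerated (say, a barrel flying off a truck bed)
    # falls through to the default: the system has no scripted answer.
    return HAZARD_RULES.get(hazard, "no_rule: uncertain")

print(react("cyclist_swerves_into_lane"))  # a covered case: slow_down
print(react("barrel_falls_off_truck"))     # an unanticipated case: no_rule: uncertain
```

A human driver generalizes from past experience to cover the unlisted cases; encoding that kind of flexibility is exactly what a fixed lookup like this cannot do.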
Everything that goes into a human split-second reaction is what they have to find some kind of way to develop for their own AI system, to recognize and react the proper way.

Well, here's the question, though. Does it react to preserve the lives of the people that are in the car? Does it react to preserve the lives of people that maybe are walking on the sidewalk?

This is the classic trolley problem.

Okay, yeah, that's right. So who does the system protect? Does it protect the driver of the vehicle, or does it protect the people surrounding it? And that's one of the questions that has to be asked, right? Knowing that there's going to be, ultimately, at some point, a scenario where a car will have to quote-unquote decide to do something that is going to put someone in harm's way, how do you determine how the car will make that decision?

And that's part of it, is just this idea that you can program AI to handle the mundane situations of driving
really well. We pretty much have that down, right? The manufacturers are essentially doing that now; they just can't legally say that you're allowed to operate that vehicle without a driver in the driver's seat, right? So under normal operating conditions, so a normal day, like in a temperate area, so we're not talking about weather conditions like ice or snow, we're not talking about, you know, a traffic system where maybe the traffic lights are on the blink, which could obviously cause issues with a car that is designed to wait until a green light comes up, that kind of stuff. But under normal operating conditions, they're very safe. The point that Toyota was making is that that's not the way the world works, and that you could argue that right now we have a million-mile safety record with autonomous cars, just because that's the number of miles autonomous cars have traveled collectively. But they argue that we really need a trillion-mile safety record to prove that autonomous cars are safe enough for us to hand over control to machines and not be at least partially in control.
Well, that's part of the reason we always see this decade-out date, where they say, we're gonna have this ready by such-and-such a year, or in some cases, we don't even plan to have anything fully autonomous until fifteen years from now, but we're gonna do these incremental steps along the way so that in ten years we're gonna have one that's pretty good, it's nearly there. And so that's exactly what Toyota is saying: that they want to focus on driver assist systems, where the AI can come in and help out if something extraordinary happens, or if something happens outside of your awareness, and help you avoid an accident, but it doesn't completely take over for the driver.

Now, check me if I'm wrong here, but I saw just a little video clip of a display that was happening at one of the Toyota booths, and I don't know if this is tied in with this exactly, but they had what looked like a fish tank. It was a big area with a table, and yeah, they had a lot of miniature, you know, scale model cars.
I don't know if they were Prius cars or what they were, but they were rolling around, and I guess when they initially kind of released all these cars, maybe ten or twelve cars, into this glass box, they were all bumping into each other, and it was just kind of chaos. And the cars were then learning how to drive around each other without colliding with objects that were in the display and with other vehicles. And by the end of the demonstration, or at least when they were showing this video clip, when they were recording this clip, the cars were not colliding at all. They weren't touching. They were, you know, sensing each other and moving to make, you know, some kind of avoidance maneuver in a smart way, so they didn't bump into another vehicle or another object or the wall or whatever. It was just a clever way to show that.
Yeah, it was neat, because again, you know, Toyota had a couple of vehicles on the show floor, but this was a way of showing off this technology in a live setting that still didn't require you to actually go outside and see a bunch of Priuses kind of pirouette around each other. Yeah, I know exactly what you're talking about, because I got a good look at this. So yeah, it's sort of a grid, and each car had its own route that it had to go along. And at first it was trying to go along this route, not really caring what the other cars in the area were trying to do, and that's where they were having these little collisions and stuff, and figuring things out. And when the learning algorithm kicked in, then they were able to start, you know, pausing to let another car go in front of them, or cross in front of them, before they went. And it became this very intricate, almost like a dance that you could watch. And they had a video streaming in the background explaining the whole process, that this was a machine
learning algorithm that would be applied on a much grander scale to technology that would find its way into cars down the road.

It's almost like a fish-school mentality, you know? Like, they all move one way together, and it's all fluid, and there's not really much of fish colliding all the time. You know, they all know where the other one is going; it's like maintaining that distance.

Yeah, it seems like that. It looks like that when you see it happening. Yeah, it's actually pretty cool to watch. Like, I got to watch it for a really long time as we were setting up a shot. But yeah, it was neat seeing how these were working. And in fact, a lot of the people from Toyota said, yeah, some of our competitors were coming in to make sure that it wasn't, like, remote-controlled vehicles, that it was all actually autonomous. And in fact, it was autonomous. It's just, like, little bitty model cars.

Yeah. You know, one thing on this: you sent me an article, at least, from the New York Times.
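The fish-school behavior described above can be sketched with a simple separation rule, as in boids-style flocking: each car nudges itself away from any neighbor that gets too close. This is a hand-written toy for illustration only; the Toyota demo reportedly used a learned policy, not a fixed rule like this, and the distances and gain here are arbitrary.

```python
# Toy "fish school" separation rule: steer away from neighbors
# closer than SAFE_DISTANCE, more strongly the closer they are.
from dataclasses import dataclass

SAFE_DISTANCE = 2.0  # arbitrary units for this sketch

@dataclass
class Car:
    x: float
    y: float

def separation_step(car: Car, others: list, gain: float = 0.5) -> Car:
    """Return `car` nudged away from any too-close neighbor."""
    dx = dy = 0.0
    for other in others:
        ox, oy = car.x - other.x, car.y - other.y
        dist = (ox * ox + oy * oy) ** 0.5
        if 0 < dist < SAFE_DISTANCE:
            # Unit vector away from the neighbor, scaled by the gain.
            dx += gain * ox / dist
            dy += gain * oy / dist
    return Car(car.x + dx, car.y + dy)

a, b = Car(0.0, 0.0), Car(1.0, 0.0)  # only 1.0 apart: too close
moved = separation_step(a, [b])
print(moved.x)  # negative: pushed away from b along -x
```

Run for every car on every tick, a rule this simple already produces the no-collision "dance" effect; the demo's learning angle was that the cars arrived at cooperative behavior like this on their own.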
Yeah, that mentions this, you know, Toyota investment of one billion dollars in artificial intelligence here in the United States. And it said, and I thought this was pretty interesting, that artificial intelligence technologies were disappointing for decades. And I've never really thought of them as disappointing. I mean, I see them incrementally growing, you know, getting better. But I never really saw them as being disappointing.

I think part of it is that there is the public expectation of what artificial intelligence is versus the actual definition of artificial intelligence. So if you look at artificial intelligence as being a very broad spectrum that encompasses a ton of stuff, then on the simplest level, it's the idea of sensing a change in the environment and being able to react to it. That's a type of artificial intelligence. And of course, the far end of the spectrum is human-like or superhuman intelligence, which obviously we have not really achieved. But, you know, a lot of people think that that superhuman or human-like intelligence is the definition of AI.
They don't necessarily realize that artificial intelligence means lots of different stuff.

Well, just to give you an example, I mean, this article in particular mentioned Siri as a tremendous advancement in artificial intelligence. Now, I know that you've probably talked about this at length on the show, but I can kind of understand it. That's like, finally, the general populace, I guess, is getting a chance to kind of toy with this a little bit and see what it can do, what it's capable of. And I know it's... I don't know, I have trouble calling it true artificial intelligence.

Yeah. No, it's voice recognition, which is already impressive, right? And then it's pairing that with an ability to execute on certain apps or do searches. But then you think about it: if you're gonna call Siri artificial intelligence, you might as well call a Google search artificial intelligence, because Siri is just really a new way to interface with a system that's been in place for several years.

Exactly.
Okay, so we're seeing more and more stuff like Siri implemented into automobiles. Voice recognition, we'll be talking a lot about that, and being able to control certain things in the car, which are, what you're getting at, far more advanced than they were even last year.

Yeah, it's crazy. Yeah, the interaction, the level of interaction. I mean, I gotta say, I've never really been all that impressed with any of these apps that you can have on your phone that maybe turn on the lights in your living room, or, you know, you can adjust your thermostat. Like, I sort of get it.

You say that, but until you convince your loved ones that your house is haunted because you are using the app while you're at work to turn the lights on and off, you really haven't lived.

I see, I see. So it's good for practical jokers too.

Yeah. No, I've been guilty of that. I've turned the lights off on my wife while I was at work.

That's a pretty funny trick, you know. Honestly, like, what we're going to cover here today, it goes far beyond that.
I mean, because again, like I said, I haven't really been all that, you know, wowed by this stuff up until this point. And now it seems like you can really talk to this thing and make it do what you really want. I mean, and there's far-reaching implications with that.

Yeah, I think it's really cool. I mean, that was one of the really big stories this year at CES: that this sort of connected car technology is really starting to make sense, right? It's gone beyond... it made me think of the way smart TVs were when they first came out. Smart TVs, when they first came out, you'd get a widget on your screen, and it would tell you, you know, what the temperature was outside and how likely it is to rain. And it wasn't very attractive, it wasn't very useful, and often it was difficult to navigate around these systems, because it was more like, hey, we can put internet content on a television.
Let's go 336 00:17:45,080 --> 00:17:47,440 Speaker 1: ahead and do that before we figure out what's the 337 00:17:47,480 --> 00:17:51,359 Speaker 1: best way to incorporate it so that the experience is 338 00:17:51,400 --> 00:17:53,320 Speaker 1: a positive one. And I think cars have had the 339 00:17:53,359 --> 00:17:55,280 Speaker 1: same thing. They're getting there now, I'll tell you, 340 00:17:55,359 --> 00:17:58,240 Speaker 1: because this whole Ford Alexa thing, I think, uh, 341 00:17:58,320 --> 00:18:00,119 Speaker 1: you've mentioned this is one of the topics 342 00:18:00,160 --> 00:18:01,440 Speaker 1: you wanted to talk about. You mind if we go 343 00:18:01,480 --> 00:18:03,320 Speaker 1: to that right now? All right. So Ford 344 00:18:03,680 --> 00:18:08,320 Speaker 1: is kind of working with Echo and Alexa, and, uh, 345 00:18:08,320 --> 00:18:11,640 Speaker 1: they're incorporating this into their already existing Sync program, 346 00:18:11,640 --> 00:18:13,359 Speaker 1: and I think they're at the third generation of the 347 00:18:13,359 --> 00:18:15,760 Speaker 1: Sync program for their cars. I don't know if Sync 348 00:18:15,840 --> 00:18:18,440 Speaker 1: program is the right way to say that. Platform? Platform, 349 00:18:18,480 --> 00:18:21,400 Speaker 1: that's a better way. So you can go to Ford 350 00:18:21,480 --> 00:18:23,040 Speaker 1: dot com and check out what Sync will do. But 351 00:18:23,080 --> 00:18:25,800 Speaker 1: I'll tell you, this next level, the way 352 00:18:25,800 --> 00:18:27,960 Speaker 1: they've integrated everything that Sync will 353 00:18:27,960 --> 00:18:31,320 Speaker 1: do with Echo and with Alexa. So that's like the, 354 00:18:31,680 --> 00:18:33,719 Speaker 1: is it a personal assistant, or what do they call 355 00:18:34,040 --> 00:18:37,959 Speaker 1: them now? They're calling it a voice concierge. Voice concierge.
356 00:18:38,040 --> 00:18:41,280 Speaker 1: So I was reading an article about this from Jalopnik, 357 00:18:41,760 --> 00:18:44,480 Speaker 1: and it's a car site, and one of the things, 358 00:18:44,480 --> 00:18:47,760 Speaker 1: and this impressed me, it's not just turn the lights 359 00:18:47,800 --> 00:18:50,200 Speaker 1: on and off or, you know, set the temperature. 360 00:18:50,200 --> 00:18:52,520 Speaker 1: It says, you know, from your 361 00:18:52,520 --> 00:18:55,280 Speaker 1: car on the way home, you can say, Alexa, turn 362 00:18:55,320 --> 00:18:57,480 Speaker 1: the kitchen lights on, open the garage door, heat the 363 00:18:57,520 --> 00:18:59,960 Speaker 1: oven to four hundred degrees, and turn on a baseball game. 364 00:19:00,040 --> 00:19:02,120 Speaker 1: So when you get home, all this stuff is ready 365 00:19:02,160 --> 00:19:04,879 Speaker 1: to go. It's all set to go. And Mercedes-366 00:19:04,920 --> 00:19:08,440 Speaker 1: Benz is working with Nest to do something very similar. 367 00:19:08,960 --> 00:19:11,840 Speaker 1: So other auto manufacturers are working with different companies to 368 00:19:11,920 --> 00:19:14,520 Speaker 1: do similar things. One of them is working with Microsoft 369 00:19:14,600 --> 00:19:18,280 Speaker 1: and has Cortana, which is Microsoft's version of a voice concierge, 370 00:19:18,400 --> 00:19:20,440 Speaker 1: incorporated. We'll get to that in a little bit too. 371 00:19:20,560 --> 00:19:23,280 Speaker 1: And you can set start times for 372 00:19:23,320 --> 00:19:25,280 Speaker 1: your car. So here's the usual routine. You know, you've 373 00:19:25,280 --> 00:19:27,760 Speaker 1: got the automatic start that, you know, the manufacturer builds 374 00:19:27,760 --> 00:19:30,040 Speaker 1: into the system.
That's one thing, right. But you 375 00:19:30,320 --> 00:19:33,199 Speaker 1: still, and I know this sounds kind of silly, but 376 00:19:33,520 --> 00:19:35,480 Speaker 1: you still have to, you know, go near the window, 377 00:19:35,480 --> 00:19:37,280 Speaker 1: I guess, and look at your car and push 378 00:19:37,320 --> 00:19:39,560 Speaker 1: the button, knowing, you know, the previous night you 379 00:19:39,600 --> 00:19:41,680 Speaker 1: turned the heat on high and the defroster on and 380 00:19:41,720 --> 00:19:43,639 Speaker 1: all that stuff is set. Or, you know, some 381 00:19:43,720 --> 00:19:46,720 Speaker 1: manufacturers probably have settings that automatically do that. But you 382 00:19:47,119 --> 00:19:49,080 Speaker 1: have some type of physical interaction you have to do 383 00:19:49,119 --> 00:19:50,440 Speaker 1: with your car in the morning. You have to remember, 384 00:19:50,440 --> 00:19:52,159 Speaker 1: I'm going to turn that on ten minutes before I 385 00:19:52,160 --> 00:19:55,800 Speaker 1: go out. Clear off the frost on the windshield, 386 00:19:56,119 --> 00:19:58,840 Speaker 1: make sure the car is a nice temperature before I 387 00:19:58,840 --> 00:20:02,160 Speaker 1: get in. Exactly, so it's about comfort. Right. So this system, 388 00:20:02,320 --> 00:20:06,320 Speaker 1: this new Ford Alexa ecosystem, will allow you 389 00:20:06,359 --> 00:20:09,040 Speaker 1: to set a time, like, on Monday morning, I 390 00:20:09,119 --> 00:20:11,280 Speaker 1: want the car to start at five a.m., I want 391 00:20:11,320 --> 00:20:14,680 Speaker 1: the interior temperature to be seventy-five degrees, um, I want, 392 00:20:14,960 --> 00:20:16,680 Speaker 1: you know, whatever. You can adjust that, you 393 00:20:16,680 --> 00:20:19,240 Speaker 1: can turn the radio station to whatever you want. And 394 00:20:19,280 --> 00:20:21,720 Speaker 1: you can program all that in days ahead of time.
395 00:20:21,760 --> 00:20:24,440 Speaker 1: It's like setting a schedule for your car. My question 396 00:20:24,560 --> 00:20:27,080 Speaker 1: with that, though, is what if things change and you 397 00:20:27,119 --> 00:20:28,840 Speaker 1: forget to update it? Then your car is running out 398 00:20:28,840 --> 00:20:31,240 Speaker 1: in the driveway for, you know, an hour and a 399 00:20:31,280 --> 00:20:33,520 Speaker 1: half or whatever before you remember that it's actually running. 400 00:20:33,520 --> 00:20:36,520 Speaker 1: I would imagine that, assuming you have this working with 401 00:20:36,600 --> 00:20:38,840 Speaker 1: the Echo, which is essentially a speaker that has a 402 00:20:38,880 --> 00:20:42,480 Speaker 1: microphone embedded in it so you can speak into it and 403 00:20:42,480 --> 00:20:45,240 Speaker 1: it can talk back to you, uh, I would imagine 404 00:20:45,280 --> 00:20:48,240 Speaker 1: a fail safe would be to have the speaker chime 405 00:20:48,560 --> 00:20:52,040 Speaker 1: or otherwise alert you that your car has been running 406 00:20:52,040 --> 00:20:55,040 Speaker 1: but hasn't been doing anything after a certain amount of 407 00:20:55,040 --> 00:20:57,720 Speaker 1: time has passed, maybe five minutes. That's, that would 408 00:20:57,720 --> 00:21:00,760 Speaker 1: be my solution to that problem. But it is 409 00:21:00,800 --> 00:21:03,080 Speaker 1: something that obviously you have to think about when you're 410 00:21:03,080 --> 00:21:08,120 Speaker 1: trying to implement these kinds of approaches in the automobile industry. 411 00:21:08,160 --> 00:21:10,600 Speaker 1: I mean, this is stuff that we've seen in smart 412 00:21:10,680 --> 00:21:13,680 Speaker 1: homes to some degree or another, and now we're seeing it 413 00:21:13,720 --> 00:21:18,280 Speaker 1: kind of creep into cars.
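The idle-car fail safe described here, where the speaker chimes if a remotely started car has sat running without moving for about five minutes, can be sketched in a few lines. This is purely illustrative: the function name, arguments, and threshold are assumptions based on the conversation, not any real Ford or Amazon API.

```python
# Hypothetical idle-car fail safe: alert the owner if a remotely
# started car has been running without moving for too long.
IDLE_THRESHOLD_SECONDS = 5 * 60  # the "maybe five minutes" suggested above

def should_alert(started_at: float, now: float, has_moved: bool,
                 threshold: float = IDLE_THRESHOLD_SECONDS) -> bool:
    """Return True when the smart speaker should chime a warning."""
    if has_moved:
        return False  # the driver got in and drove off; nothing to do
    return (now - started_at) >= threshold

# Example: car started at t=0 seconds, still parked 6 minutes later.
print(should_alert(0, 360, has_moved=False))  # True
```

A real system would feed `has_moved` from the car's telemetry rather than a flag, but the decision logic is this simple.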
You know, hopefully people are 414 00:21:18,320 --> 00:21:22,200 Speaker 1: taking lessons they've learned from the other implementations and trying 415 00:21:22,200 --> 00:21:25,680 Speaker 1: to figure out the best practices when applying it to cars. Yeah, sure. 416 00:21:25,680 --> 00:21:28,440 Speaker 1: And you know, you can schedule service reminders and things 417 00:21:28,480 --> 00:21:30,520 Speaker 1: like that. Now, that's not anything new, because cars can 418 00:21:30,520 --> 00:21:32,080 Speaker 1: do that kind of on their own. You can program 419 00:21:32,119 --> 00:21:34,240 Speaker 1: that in for just a date. But they can also 420 00:21:34,359 --> 00:21:37,760 Speaker 1: detect if there's, you know, um, something missing out 421 00:21:37,760 --> 00:21:39,880 Speaker 1: of your refrigerator. So let's say that you're at the 422 00:21:39,920 --> 00:21:42,199 Speaker 1: grocery store and you're in your connected car, and you've 423 00:21:42,200 --> 00:21:45,040 Speaker 1: got your connected home and your device that, you know, 424 00:21:45,080 --> 00:21:47,560 Speaker 1: interfaces between the two of them. You're in the grocery 425 00:21:47,560 --> 00:21:50,000 Speaker 1: store parking lot, though, and you don't remember if you 426 00:21:50,000 --> 00:21:53,880 Speaker 1: have milk. You can now check while 427 00:21:53,880 --> 00:21:55,800 Speaker 1: you're in the car at the grocery store 428 00:21:55,840 --> 00:21:57,440 Speaker 1: to see if you have milk in the fridge at 429 00:21:57,440 --> 00:22:00,880 Speaker 1: home, without having to have somebody there to say, yeah, 430 00:22:00,920 --> 00:22:02,640 Speaker 1: pick up a gallon of milk on the way home. Right. 431 00:22:02,720 --> 00:22:05,840 Speaker 1: So this does require, obviously, that you have various 432 00:22:05,840 --> 00:22:09,720 Speaker 1: smart appliances to network together. Right.
Yeah. If it's 433 00:22:09,760 --> 00:22:13,840 Speaker 1: just a regular old fridge, or a kegerator as 434 00:22:13,960 --> 00:22:16,399 Speaker 1: we might have back at home, it's not going 435 00:22:16,480 --> 00:22:20,159 Speaker 1: to be able to chime in. But yeah, it's this interconnectivity. 436 00:22:20,359 --> 00:22:22,879 Speaker 1: And in fact, that's one of the big assets that 437 00:22:22,920 --> 00:22:25,879 Speaker 1: Amazon has: the Echo can work with lots of 438 00:22:26,000 --> 00:22:30,560 Speaker 1: different home automation products, not just ones 439 00:22:30,600 --> 00:22:33,119 Speaker 1: from one company, which is a huge deal. Scott, I 440 00:22:33,119 --> 00:22:34,960 Speaker 1: can't tell you how big a deal this is when you 441 00:22:35,000 --> 00:22:38,480 Speaker 1: have a connected home. Like, in the early days, everyone 442 00:22:38,560 --> 00:22:40,960 Speaker 1: was trying to make sure their approach was going to 443 00:22:41,040 --> 00:22:44,640 Speaker 1: become the standard approach, which meant that there was no standard; 444 00:22:44,640 --> 00:22:47,560 Speaker 1: you had a bunch of proprietary approaches that wouldn't talk 445 00:22:47,600 --> 00:22:52,080 Speaker 1: to each other. So products like the Echo and some 446 00:22:52,200 --> 00:22:55,080 Speaker 1: apps that I've seen are finally getting to the point 447 00:22:55,200 --> 00:22:58,920 Speaker 1: where you can interface with different products from different companies, 448 00:22:59,160 --> 00:23:01,240 Speaker 1: so you don't have to buy in on just one 449 00:23:01,320 --> 00:23:05,720 Speaker 1: system. If you happen to like one manufacturer's refrigerator, let's 450 00:23:05,720 --> 00:23:09,120 Speaker 1: say you like Whirlpool and you really want the newest 451 00:23:09,160 --> 00:23:12,320 Speaker 1: smart Whirlpool fridge, which, by the way, I've seen and 452 00:23:12,400 --> 00:23:15,639 Speaker 1: it's pretty sweet.
If you wanted one of those, but 453 00:23:15,720 --> 00:23:18,760 Speaker 1: you wanted to have your other devices from a different manufacturer, 454 00:23:19,119 --> 00:23:21,560 Speaker 1: it would be difficult to have them all talking to 455 00:23:21,600 --> 00:23:24,639 Speaker 1: each other. But something like the Echo can act like 456 00:23:24,680 --> 00:23:27,280 Speaker 1: a hub, and then your car becomes an extension of 457 00:23:27,320 --> 00:23:29,520 Speaker 1: that hub, and that's where you can do things like 458 00:23:29,880 --> 00:23:32,720 Speaker 1: check to see, like, when was the last 459 00:23:32,760 --> 00:23:34,680 Speaker 1: time I bought milk? Oh, it was two weeks ago. Like, oh, 460 00:23:34,840 --> 00:23:37,320 Speaker 1: I gotta buy milk, you know. And that's the whole 461 00:23:37,320 --> 00:23:39,919 Speaker 1: reason we're talking about it today, because they're adding the 462 00:23:39,960 --> 00:23:42,160 Speaker 1: automobile to this whole mix. You know, 463 00:23:42,320 --> 00:23:45,440 Speaker 1: it's now completely mobile. You can really, 464 00:23:45,520 --> 00:23:48,840 Speaker 1: truly control your home from your car, and you can 465 00:23:48,880 --> 00:23:51,320 Speaker 1: do a lot of things in your car from home 466 00:23:51,359 --> 00:23:53,240 Speaker 1: as well. I mean, things that you just wouldn't 467 00:23:53,240 --> 00:23:55,440 Speaker 1: think you would normally need. And it's 468 00:23:55,440 --> 00:23:57,880 Speaker 1: all luxury. I love the idea of it too. Like, 469 00:23:58,440 --> 00:24:01,399 Speaker 1: some of it is definitely luxury, but some of 470 00:24:01,440 --> 00:24:06,520 Speaker 1: it is incredibly helpful for people who are both absent 471 00:24:06,560 --> 00:24:09,280 Speaker 1: minded and a little OCD. So.
I don't 472 00:24:09,280 --> 00:24:11,960 Speaker 1: know, Scott, if you've ever experienced this, but my wife 473 00:24:11,960 --> 00:24:15,400 Speaker 1: and I, when we go out anywhere, there will come 474 00:24:15,400 --> 00:24:18,159 Speaker 1: a time, usually when we're about a mile and a 475 00:24:18,200 --> 00:24:22,080 Speaker 1: half away from home, when the question, was the front 476 00:24:22,080 --> 00:24:24,879 Speaker 1: door locked, or did the garage door come all the 477 00:24:24,920 --> 00:24:28,520 Speaker 1: way down, or was the oven on, some variation 478 00:24:28,560 --> 00:24:31,280 Speaker 1: of that, will be asked. And then ultimately what has 479 00:24:31,320 --> 00:24:34,359 Speaker 1: to happen is we're on a return trip home before we 480 00:24:34,400 --> 00:24:37,480 Speaker 1: ever get to our destination. And Alexa, one of the 481 00:24:37,480 --> 00:24:39,600 Speaker 1: things it can do is it can tell you, and 482 00:24:40,359 --> 00:24:44,880 Speaker 1: presumably, at least in future implementations, it can even 483 00:24:44,920 --> 00:24:48,440 Speaker 1: correct things if they are incorrect. Like, if you say, hey, 484 00:24:48,520 --> 00:24:51,120 Speaker 1: is the front door locked? No, it's not, it's unlocked. 485 00:24:51,320 --> 00:24:53,280 Speaker 1: Can you lock the front door for me? Yes. Click. 486 00:24:53,400 --> 00:24:55,160 Speaker 1: And then you don't have to worry about it anymore. 487 00:24:55,320 --> 00:24:56,840 Speaker 1: Very smart. Now, I want to show you something, and 488 00:24:56,880 --> 00:24:59,280 Speaker 1: this is only for your benefit, not the listeners'. To 489 00:24:59,359 --> 00:25:01,679 Speaker 1: see this page, where I have the word yes with 490 00:25:01,720 --> 00:25:05,240 Speaker 1: two exclamation points, I have underlined the line that says, asking, 491 00:25:05,240 --> 00:25:08,840 Speaker 1: for example, whether the garage door is closed.
Every time 492 00:25:08,880 --> 00:25:10,960 Speaker 1: I leave my house, I look at the garage door, 493 00:25:11,160 --> 00:25:14,919 Speaker 1: make sure it's closed, and I whisper to myself, 494 00:25:15,320 --> 00:25:17,639 Speaker 1: the garage door is closed. Yeah. And it's like a 495 00:25:17,640 --> 00:25:19,680 Speaker 1: little routine. It's a pattern I have. And if I 496 00:25:19,680 --> 00:25:22,280 Speaker 1: ever leave without doing that, I have to 497 00:25:22,320 --> 00:25:24,760 Speaker 1: return and make sure that garage door is closed, because 498 00:25:24,760 --> 00:25:26,639 Speaker 1: I've got a lot of valuable stuff in there: my tools, 499 00:25:26,920 --> 00:25:28,879 Speaker 1: I've got my project car in there, who knows what 500 00:25:28,920 --> 00:25:31,520 Speaker 1: else is in there, bicycles, things like that. See, 501 00:25:31,520 --> 00:25:33,600 Speaker 1: for us, not only do we have stuff that's valuable 502 00:25:33,600 --> 00:25:35,680 Speaker 1: in our garage, but that's a direct route into our 503 00:25:35,720 --> 00:25:39,360 Speaker 1: house, and so it's where all our other 504 00:25:39,480 --> 00:25:41,800 Speaker 1: stuff happens to be. Yeah. And so, you mean 505 00:25:41,880 --> 00:25:43,320 Speaker 1: when you walked out the door, you didn't say the 506 00:25:43,359 --> 00:25:46,320 Speaker 1: garage door is locked? We actually say, yeah, we say, 507 00:25:46,440 --> 00:25:48,879 Speaker 1: we say door is down. That is the 508 00:25:49,520 --> 00:25:51,840 Speaker 1: key phrase. And I am not joking: 509 00:25:51,960 --> 00:25:55,840 Speaker 1: if we pull out of our home area and neither 510 00:25:55,880 --> 00:25:59,040 Speaker 1: of us has said door is down, there will come 511 00:25:59,040 --> 00:26:01,880 Speaker 1: a point where my wife says, did the garage door 512 00:26:01,920 --> 00:26:04,359 Speaker 1: go down?
And because I don't want to lie to 513 00:26:04,480 --> 00:26:06,520 Speaker 1: my wife, like, I want to get to where I'm going, 514 00:26:07,080 --> 00:26:08,760 Speaker 1: but I never want to get to where I'm going 515 00:26:08,760 --> 00:26:10,720 Speaker 1: more than I want to, you know, like, that's 516 00:26:10,760 --> 00:26:13,639 Speaker 1: never so important that I will lie. I'll say, I 517 00:26:13,680 --> 00:26:15,840 Speaker 1: don't know, and turn around and go look. I 518 00:26:15,840 --> 00:26:19,320 Speaker 1: am sometimes ridiculed for my whispering of the garage 519 00:26:19,320 --> 00:26:22,760 Speaker 1: door is down. It's when you say it, when 520 00:26:22,760 --> 00:26:26,200 Speaker 1: you say it, you've got that, like, all right, 521 00:26:26,359 --> 00:26:28,960 Speaker 1: that chapter is closed. Now a new chapter can begin. 522 00:26:29,040 --> 00:26:31,119 Speaker 1: This is the journey chapter. Isn't it funny? I know 523 00:26:31,240 --> 00:26:35,359 Speaker 1: people have this type of thing. It's a very low level 524 00:26:35,400 --> 00:26:37,239 Speaker 1: of OCD, I understand that, I get it. 525 00:26:37,240 --> 00:26:39,480 Speaker 1: But I know people have this for all kinds of 526 00:26:39,480 --> 00:26:42,960 Speaker 1: items in their house and their automobile. This really does 527 00:26:43,400 --> 00:26:46,000 Speaker 1: smooth out some of that. How helpful would that be 528 00:26:46,040 --> 00:26:48,680 Speaker 1: if you're ten miles away and you can check to 529 00:26:48,720 --> 00:26:52,960 Speaker 1: make sure that stupid garage door is down? Otherwise, right, 530 00:26:53,400 --> 00:26:56,600 Speaker 1: or you didn't leave the dishwasher on or something when 531 00:26:56,600 --> 00:26:58,840 Speaker 1: you're gone, because of your fear of, you know, flooding the 532 00:26:58,920 --> 00:27:02,440 Speaker 1: kitchen floor.
And then, and the best thing, Scott, the absolute 533 00:27:02,520 --> 00:27:07,439 Speaker 1: best thing: Alexa will never mock you for that. That is 534 00:27:07,480 --> 00:27:10,640 Speaker 1: probably the best part of the whole thing. Scott and 535 00:27:10,800 --> 00:27:14,120 Speaker 1: yours truly will talk more about high tech at high 536 00:27:14,160 --> 00:27:26,240 Speaker 1: speed after we come back from this quick break. So, 537 00:27:26,440 --> 00:27:29,680 Speaker 1: moving on to some of the other manufacturers that we saw, 538 00:27:29,960 --> 00:27:34,040 Speaker 1: we stopped by Volkswagen. And, uh, so of course Volkswagen 539 00:27:34,119 --> 00:27:37,879 Speaker 1: is obviously still in the wake of the PR disaster that 540 00:27:38,080 --> 00:27:42,399 Speaker 1: was the clean diesel scandal. So were they trotting 541 00:27:42,400 --> 00:27:46,119 Speaker 1: out their latest diesel technology? They, oddly enough, left that 542 00:27:46,320 --> 00:27:49,560 Speaker 1: off the show floor. An interesting move. Yeah. They instead 543 00:27:49,600 --> 00:27:53,440 Speaker 1: decided to show off an electric vehicle concept design called 544 00:27:53,480 --> 00:27:59,560 Speaker 1: the Budd-e. Yeah, I did that, I did that on 545 00:27:59,600 --> 00:28:01,639 Speaker 1: a previous CES episode and got a lot 546 00:28:01,720 --> 00:28:04,400 Speaker 1: of people saying, I know who Pauly Shore is, thank 547 00:28:04,440 --> 00:28:07,240 Speaker 1: you for the Pauly Shore reference. Um, but yeah, the 548 00:28:07,560 --> 00:28:11,159 Speaker 1: Budd-e electric vehicle concept, which, it's kind of funny, it 549 00:28:11,200 --> 00:28:15,320 Speaker 1: relates in a way to another car that we'll talk about, 550 00:28:15,359 --> 00:28:17,480 Speaker 1: the FFZERO1.
And I know that you and 551 00:28:17,600 --> 00:28:21,600 Speaker 1: Ben covered Faraday Future's FFZERO1 in an episode, 552 00:28:21,680 --> 00:28:25,440 Speaker 1: right? We did, we talked about it. So, in a similar 553 00:28:25,480 --> 00:28:29,159 Speaker 1: way to what that concept was, Volkswagen's idea was to 554 00:28:29,200 --> 00:28:33,040 Speaker 1: create a sort of modular approach to designing electric vehicles, 555 00:28:33,160 --> 00:28:36,440 Speaker 1: and so this was really just a representation of that idea. 556 00:28:36,880 --> 00:28:38,800 Speaker 1: That being said, it kind of had this sort of 557 00:28:39,480 --> 00:28:45,840 Speaker 1: quirky, retro-futuristic design to it, very angular, very, uh, 558 00:28:46,280 --> 00:28:48,920 Speaker 1: odd looking. But I really dug it. Do you think, 559 00:28:49,080 --> 00:28:50,560 Speaker 1: I mean, was it trying to play on the 560 00:28:50,560 --> 00:28:53,640 Speaker 1: old VW Microbus thing? I think so, a little bit, 561 00:28:53,680 --> 00:28:57,280 Speaker 1: but definitely in a way that was, like, it 562 00:28:57,360 --> 00:29:02,080 Speaker 1: was obviously not a, uh, you know, a recreation 563 00:29:02,280 --> 00:29:06,240 Speaker 1: of the Microbus just with new lighting and stuff. The 564 00:29:06,360 --> 00:29:10,080 Speaker 1: design itself had changed pretty dramatically. Okay. And, 565 00:29:10,120 --> 00:29:12,080 Speaker 1: you know, I heard one of the 566 00:29:12,120 --> 00:29:14,360 Speaker 1: biggest parts of this whole thing that made it interesting 567 00:29:14,480 --> 00:29:17,440 Speaker 1: was that, uh, it's not going to support any kind 568 00:29:17,480 --> 00:29:19,920 Speaker 1: of, it was designed without the intent to support 569 00:29:19,920 --> 00:29:23,400 Speaker 1: any kind of internal combustion engine.
So it's purely electric, 570 00:29:23,880 --> 00:29:26,680 Speaker 1: and they're going full on with this and saying this 571 00:29:26,720 --> 00:29:28,560 Speaker 1: is the way we're gonna build this vehicle, 572 00:29:28,680 --> 00:29:32,200 Speaker 1: this platform vehicle, uh, and it's gonna be customizable. Of course, 573 00:29:32,200 --> 00:29:33,880 Speaker 1: we're gonna be able to kind of change things around. 574 00:29:33,880 --> 00:29:37,360 Speaker 1: But they've got an idea that right now 575 00:29:37,960 --> 00:29:40,200 Speaker 1: it can use up to three different types of batteries. Right, 576 00:29:40,400 --> 00:29:43,880 Speaker 1: so there are three different types: there's cylindrical, pouch, and 577 00:29:44,000 --> 00:29:47,320 Speaker 1: prismatic cells. And I think right now they're only using 578 00:29:47,360 --> 00:29:50,240 Speaker 1: two of those; they're using prismatic and pouch cells at 579 00:29:50,240 --> 00:29:54,760 Speaker 1: this point. So they haven't, um, decided yet on the cylindrical, 580 00:29:54,800 --> 00:29:57,960 Speaker 1: but they say it gives them the flexibility to adapt 581 00:29:57,960 --> 00:30:01,040 Speaker 1: to whatever becomes the forerunner in that field. Right. So 582 00:30:01,360 --> 00:30:07,240 Speaker 1: with this approach, they can design any number of vehicle structures, 583 00:30:07,320 --> 00:30:10,640 Speaker 1: like whether it's a car or a van or whatever. 584 00:30:10,760 --> 00:30:14,560 Speaker 1: They can go with that because of this modular approach. Uh, 585 00:30:14,640 --> 00:30:17,360 Speaker 1: they just change out the size of the battery, the 586 00:30:17,400 --> 00:30:20,320 Speaker 1: capacity of the battery essentially, and the size of the 587 00:30:20,400 --> 00:30:24,800 Speaker 1: chassis obviously, but it gives them a common starting ground 588 00:30:25,080 --> 00:30:28,920 Speaker 1: for lots of different vehicle types.
And the battery 589 00:30:28,960 --> 00:30:32,520 Speaker 1: pack itself is flat and fits under the car, so 590 00:30:32,560 --> 00:30:36,800 Speaker 1: you get this nice flat interior, which made it incredibly 591 00:30:36,880 --> 00:30:39,600 Speaker 1: roomy on the inside. Not that we were allowed to 592 00:30:39,600 --> 00:30:41,800 Speaker 1: get in; no one was allowed to get very close 593 00:30:41,840 --> 00:30:44,560 Speaker 1: to this thing, at least at the time we were there. 594 00:30:44,680 --> 00:30:46,880 Speaker 1: It's surprising, isn't it, how much room that frees 595 00:30:47,040 --> 00:30:48,960 Speaker 1: up when it's a flat floor? Yeah, it is 596 00:30:48,960 --> 00:30:51,480 Speaker 1: phenomenal, where it's just like, you just look at it 597 00:30:51,480 --> 00:30:54,480 Speaker 1: and you think, wow, I would have so much 598 00:30:54,520 --> 00:30:56,920 Speaker 1: room to stretch out. And, you know, you can even 599 00:30:56,960 --> 00:31:01,280 Speaker 1: have luggage in the cabin along with passengers and it 600 00:31:01,400 --> 00:31:03,360 Speaker 1: wouldn't even get in the way. You get in and 601 00:31:03,400 --> 00:31:06,160 Speaker 1: it feels like an empty bus or something. Yeah, it's big. Yeah, 602 00:31:06,200 --> 00:31:10,080 Speaker 1: and the Budd-e concept car had two motors, 603 00:31:10,480 --> 00:31:12,440 Speaker 1: two electric motors, one for the front wheels, one for 604 00:31:12,520 --> 00:31:16,720 Speaker 1: the back. Um, it supposedly had a hundred-and-one-kilowatt-605 00:31:16,800 --> 00:31:20,920 Speaker 1: hour-capacity battery that would provide a driving range of 606 00:31:20,920 --> 00:31:23,880 Speaker 1: three hundred seventy-three miles, although when asked about it, 607 00:31:24,200 --> 00:31:28,840 Speaker 1: Volkswagen essentially said, actually, that's what we're working toward, that's 608 00:31:28,840 --> 00:31:31,600 Speaker 1: our goal.
We're not there yet. So they backed it 609 00:31:31,640 --> 00:31:34,480 Speaker 1: down a bit. Yeah. Most people say around three hundred 610 00:31:34,560 --> 00:31:37,440 Speaker 1: miles max, honestly. I saw a different number that said, 611 00:31:37,440 --> 00:31:40,480 Speaker 1: when they ran it through testing based on the U 612 00:31:40,600 --> 00:31:42,360 Speaker 1: S EPA cycle, so, you know, the way 613 00:31:42,400 --> 00:31:46,600 Speaker 1: they normally determine, for these electric vehicles, the comparison miles, 614 00:31:46,680 --> 00:31:49,160 Speaker 1: I guess the equivalent, they said the equivalent was going 615 00:31:49,200 --> 00:31:52,920 Speaker 1: to be two hundred thirty-three miles. But that's, you 616 00:31:52,920 --> 00:31:55,080 Speaker 1: know, I shouldn't even say per gallon, that's the range, 617 00:31:55,120 --> 00:31:57,440 Speaker 1: that's how long these batteries will last: two hundred thirty-three. 618 00:31:57,560 --> 00:32:00,800 Speaker 1: That's a significant drop from what they came out with. Yeah, 619 00:32:00,840 --> 00:32:03,440 Speaker 1: more than a little. Yeah, so when they say three seventy-three 620 00:32:03,480 --> 00:32:06,280 Speaker 1: and it goes to two thirty-three, that's significant. But what is interesting, and 621 00:32:06,280 --> 00:32:08,520 Speaker 1: they didn't back down on this, is that they can 622 00:32:08,640 --> 00:32:12,200 Speaker 1: charge most of the battery capacity in about fifteen minutes. And 623 00:32:12,200 --> 00:32:14,480 Speaker 1: that's pretty good. Yeah, if that holds, if 624 00:32:14,480 --> 00:32:17,480 Speaker 1: that sticks, and that's true, that's really good. That's a significant 625 00:32:17,480 --> 00:32:20,400 Speaker 1: step forward. It's good. I don't know if that's good enough.
626 00:32:20,480 --> 00:32:24,080 Speaker 1: Like Tesla's idea about having not just the super 627 00:32:24,080 --> 00:32:27,239 Speaker 1: fast charging stations, but also the option to switch out 628 00:32:27,280 --> 00:32:30,520 Speaker 1: a battery if you are on the go. That makes 629 00:32:30,560 --> 00:32:32,160 Speaker 1: a lot of sense for people who are used to 630 00:32:32,280 --> 00:32:34,240 Speaker 1: being able to pull into a gas station and refill 631 00:32:34,240 --> 00:32:36,160 Speaker 1: in a couple of minutes. It does, but come on, 632 00:32:36,200 --> 00:32:39,000 Speaker 1: gas stations, there are, yeah, sure, there are four 633 00:32:39,040 --> 00:32:42,680 Speaker 1: on every corner. Yeah, that's true. I don't mean, 634 00:32:42,960 --> 00:32:44,840 Speaker 1: I don't mean to shoot you down on that idea or anything. 635 00:32:44,880 --> 00:32:47,680 Speaker 1: It's just, quick-change stations, they can 636 00:32:47,720 --> 00:32:51,280 Speaker 1: be a lot fewer and farther between, I suppose, for 637 00:32:51,280 --> 00:32:55,000 Speaker 1: the battery swaps. The supercharger stations, however, 638 00:32:55,080 --> 00:32:58,800 Speaker 1: I think those have to be fairly common, because for a 639 00:32:58,880 --> 00:33:01,720 Speaker 1: Tesla right now, the charge times, if you have 640 00:33:01,960 --> 00:33:04,040 Speaker 1: a supercharger station, or I think that's what they call it, 641 00:33:04,040 --> 00:33:06,840 Speaker 1: a supercharger station, it takes you about one hour to 642 00:33:06,920 --> 00:33:09,120 Speaker 1: fully charge the battery, and that's the three-hundred-mile range. 643 00:33:09,440 --> 00:33:11,440 Speaker 1: But if you don't have a supercharger station, you're just 644 00:33:11,480 --> 00:33:15,040 Speaker 1: relying on plain old electricity from the wall plug.
Uh, 645 00:33:15,200 --> 00:33:17,200 Speaker 1: you get something like twenty-two miles of range for 646 00:33:17,280 --> 00:33:21,160 Speaker 1: every hour of charging. So that's a significant drop in, well, 647 00:33:21,200 --> 00:33:22,960 Speaker 1: it's an increase in the time it would 648 00:33:23,280 --> 00:33:27,000 Speaker 1: take to get somewhere. So, I mean, for VW to say 649 00:33:27,320 --> 00:33:30,240 Speaker 1: that much of the battery capacity in fifteen minutes, and that's of two 650 00:33:30,320 --> 00:33:33,600 Speaker 1: thirty-three miles, that's pretty good. Yeah, no, it's 651 00:33:33,760 --> 00:33:37,080 Speaker 1: definitely significant. Uh, and you know, they had some 652 00:33:37,120 --> 00:33:41,120 Speaker 1: other cool features on this, including big display screens 653 00:33:41,160 --> 00:33:44,239 Speaker 1: that would replace your typical dashboard, so it was like 654 00:33:44,400 --> 00:33:49,160 Speaker 1: a bunch of, almost like, tablet-style computers. Um, plus 655 00:33:49,200 --> 00:33:52,800 Speaker 1: gesture controls. Gesture controls were huge with cars 656 00:33:52,840 --> 00:33:54,719 Speaker 1: this year. That's where you don't even have to 657 00:33:54,720 --> 00:33:56,800 Speaker 1: touch the screen. You can move your hands at a 658 00:33:56,840 --> 00:34:00,560 Speaker 1: certain height and a camera will detect the gesture of your 659 00:34:00,560 --> 00:34:02,720 Speaker 1: hand and then interpret that as a command, so you 660 00:34:02,720 --> 00:34:07,600 Speaker 1: can swipe through options or select something by basically acting 661 00:34:07,640 --> 00:34:09,880 Speaker 1: like you're pushing a button in midair. It tends 662 00:34:09,920 --> 00:34:12,920 Speaker 1: to be like you're playing charades or something.
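The charging numbers tossed around in this stretch of the conversation can be put side by side. Using the figures as quoted, roughly 300 miles of range, about 22 miles of range added per hour on a wall outlet, and about an hour for a full charge at a Supercharger station, here is a back-of-the-envelope comparison. The rates are assumptions taken from the conversation, not official Tesla or VW specifications.

```python
# Back-of-the-envelope charge-time comparison using the figures
# quoted in the conversation (not official manufacturer specs).
RANGE_MILES = 300         # approximate full range discussed
WALL_MILES_PER_HOUR = 22  # range added per hour on a wall outlet

def hours_to_full(range_miles: float, miles_per_hour: float) -> float:
    """Hours of charging needed to add a full range's worth of miles."""
    return range_miles / miles_per_hour

wall_hours = hours_to_full(RANGE_MILES, WALL_MILES_PER_HOUR)
print(round(wall_hours, 1))  # 13.6 hours, versus roughly 1 hour quoted for a Supercharger
```

On these numbers, an overnight wall charge is a thirteen-plus-hour affair, which is why the hosts treat fast charging and battery swaps as the make-or-break features.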
Yeah, 663 00:34:12,960 --> 00:34:15,520 Speaker 1: you know, I saw that this is an infrared system, 664 00:34:15,600 --> 00:34:17,640 Speaker 1: so it's kind of a cost-cutting measure on their 665 00:34:17,640 --> 00:34:19,520 Speaker 1: part, because there are other ways to do it as well, 666 00:34:20,280 --> 00:34:22,480 Speaker 1: but this one, they claim they're gonna use infrared just 667 00:34:22,560 --> 00:34:25,600 Speaker 1: to keep costs down. Again, this is not really meant 668 00:34:25,640 --> 00:34:28,319 Speaker 1: to be a production model. It's strictly a show car. 669 00:34:28,360 --> 00:34:31,360 Speaker 1: It's just really to show off the technologies that 670 00:34:31,480 --> 00:34:35,320 Speaker 1: Volkswagen has been working with, many of which may find 671 00:34:35,360 --> 00:34:39,399 Speaker 1: their way into production vehicles in the future, but not 672 00:34:39,480 --> 00:34:43,360 Speaker 1: necessarily the BUDD-e itself. You're not gonna find the BUDD-e 673 00:34:44,239 --> 00:34:46,400 Speaker 1: on a show floor for you to purchase. 674 00:34:46,640 --> 00:34:48,640 Speaker 1: It's just a rolling test bed at this point. In fact, 675 00:34:48,719 --> 00:34:52,600 Speaker 1: I should say that I suspect at least a few 676 00:34:52,719 --> 00:34:55,839 Speaker 1: of the vehicles on the showroom floor at CES are 677 00:34:56,000 --> 00:34:58,759 Speaker 1: no more than a shell with nothing that actually makes 678 00:34:58,800 --> 00:35:01,520 Speaker 1: the thing go. I think Faraday's is like that, isn't it? 679 00:35:01,840 --> 00:35:04,520 Speaker 1: I'm trying to remember back to our program, and 680 00:35:04,560 --> 00:35:06,480 Speaker 1: I think that it's not really on a powered 681 00:35:06,560 --> 00:35:09,680 Speaker 1: platform at this point. It's all theoretical. Yeah.
I do 682 00:35:09,800 --> 00:35:11,719 Speaker 1: not know the answer to that, to be able to 683 00:35:11,760 --> 00:35:14,400 Speaker 1: say definitively one way or the other, but it would 684 00:35:14,400 --> 00:35:17,399 Speaker 1: not surprise me, I'll tell you. That's not anything brand new, 685 00:35:17,440 --> 00:35:20,360 Speaker 1: really, show cars. Even at the big auto shows in 686 00:35:20,400 --> 00:35:22,400 Speaker 1: the past, there have been many times when they have 687 00:35:22,440 --> 00:35:24,560 Speaker 1: to be pushed out on the stage, or they 688 00:35:24,560 --> 00:35:28,680 Speaker 1: have some really low-power module underneath. It's 689 00:35:28,680 --> 00:35:31,120 Speaker 1: not like what they claim is under the hood. Um, 690 00:35:31,160 --> 00:35:32,759 Speaker 1: you know, it's just something to get out on 691 00:35:32,800 --> 00:35:34,719 Speaker 1: the stage for the show, and that's about it. Yeah, 692 00:35:34,880 --> 00:35:36,640 Speaker 1: never look under the hood. I know that one 693 00:35:36,640 --> 00:35:39,040 Speaker 1: of the Toyota fuel cell vehicles was like that, in 694 00:35:39,160 --> 00:35:42,520 Speaker 1: that the body they were showing off was a concept, 695 00:35:43,200 --> 00:35:45,200 Speaker 1: but the car underneath it was not actually a fuel 696 00:35:45,239 --> 00:35:47,880 Speaker 1: cell vehicle, whereas the fuel cell vehicle they had on 697 00:35:48,000 --> 00:35:53,400 Speaker 1: display was using a modified, previously existing Toyota model 698 00:35:53,800 --> 00:35:56,479 Speaker 1: as the chassis.
So it's just one of those things 699 00:35:56,480 --> 00:35:58,680 Speaker 1: where it's like, we have the idea, we have 700 00:35:58,719 --> 00:36:01,440 Speaker 1: the technology, we just haven't gotten it into production yet, so we 701 00:36:01,480 --> 00:36:04,640 Speaker 1: don't have a model that actually works with the technology 702 00:36:04,680 --> 00:36:07,839 Speaker 1: we're talking about, fully formed. And that's not every case. 703 00:36:07,880 --> 00:36:11,040 Speaker 1: I mean, they've been running driving concept cars, or what 704 00:36:11,160 --> 00:36:13,680 Speaker 1: they called dream cars, for decades, of course, you know, 705 00:36:13,719 --> 00:36:16,359 Speaker 1: since the beginning. But there were always those cases where 706 00:36:16,400 --> 00:36:19,600 Speaker 1: they bring out some just crazy design that doesn't really 707 00:36:19,600 --> 00:36:21,719 Speaker 1: have anything under the hood or anything. And it 708 00:36:21,840 --> 00:36:23,640 Speaker 1: was kind of a big deal, again back in the 709 00:36:23,719 --> 00:36:28,160 Speaker 1: late eighties and early nineties, when companies started making running, 710 00:36:28,280 --> 00:36:33,160 Speaker 1: driving prototype vehicles rather than just shells, all these concept cars, 711 00:36:33,160 --> 00:36:35,320 Speaker 1: and it was kind of like, this thing really drives? 712 00:36:35,320 --> 00:36:37,480 Speaker 1: Like, I mean, I'm not 713 00:36:37,520 --> 00:36:39,640 Speaker 1: gonna say it was the first or anything, but I remember 714 00:36:39,680 --> 00:36:42,600 Speaker 1: hearing about the Viper, the Dodge Viper, and they're like, listen, it 715 00:36:42,600 --> 00:36:44,640 Speaker 1: actually drives, we can drive it on the Indianapolis 716 00:36:44,640 --> 00:36:48,080 Speaker 1: 500 track. Like, no way, that thing actually has 717 00:36:48,120 --> 00:36:50,520 Speaker 1: a V ten. And you know, they claimed it did, 718 00:36:51,000 --> 00:36:53,960 Speaker 1:
but it really did. And the Plymouth Prowler, and you know, 719 00:36:54,040 --> 00:36:56,440 Speaker 1: vehicles like that along the way, 720 00:36:56,840 --> 00:36:59,160 Speaker 1: all the cars that I thought were awesome when I 721 00:36:59,200 --> 00:37:01,680 Speaker 1: was younger. Yes, well, that's when it was kind of 722 00:37:01,719 --> 00:37:04,719 Speaker 1: the resurgence of, like, this thing actually does what 723 00:37:04,760 --> 00:37:08,320 Speaker 1: we say it does; it's not just a shell. Yeah. 724 00:37:08,320 --> 00:37:11,959 Speaker 1: There were some other cool technologies on display. To stick 725 00:37:12,000 --> 00:37:15,840 Speaker 1: with the connected car and gesture controls, BMW showed 726 00:37:15,880 --> 00:37:19,239 Speaker 1: off their i Vision Future Interaction platform, which again is 727 00:37:19,239 --> 00:37:22,920 Speaker 1: another networked car. All those commands that you would issue 728 00:37:22,960 --> 00:37:26,200 Speaker 1: to the car can be done by touchscreen or 729 00:37:26,320 --> 00:37:31,000 Speaker 1: by voice control or, again, by gestures. And the 730 00:37:31,040 --> 00:37:33,000 Speaker 1: steering wheel, actually. Did you see the steering wheel on 731 00:37:33,000 --> 00:37:36,160 Speaker 1: this thing, with the lights? Okay, so there are three 732 00:37:36,200 --> 00:37:40,239 Speaker 1: different modes of driving this vehicle. There's pure drive, where 733 00:37:40,360 --> 00:37:43,000 Speaker 1: the vehicle is under the control of a human driver, 734 00:37:43,680 --> 00:37:47,799 Speaker 1: at least presumably a human driver. A driver could be 735 00:37:47,840 --> 00:37:50,200 Speaker 1: a dog. Well, you would hope it would not be. 736 00:37:50,680 --> 00:37:52,879 Speaker 1: I suppose it could be at this point.
Then there 737 00:37:52,960 --> 00:37:55,759 Speaker 1: was assist mode, which was kind of where you have 738 00:37:55,920 --> 00:37:59,440 Speaker 1: that autonomous ability to kick into gear should something go 739 00:37:59,520 --> 00:38:03,120 Speaker 1: wrong, to protect the driver and the car. And then 740 00:38:03,160 --> 00:38:07,480 Speaker 1: you had pure autonomous mode, or auto mode, and the 741 00:38:07,520 --> 00:38:10,360 Speaker 1: steering wheel would light up to indicate which mode you 742 00:38:10,360 --> 00:38:12,319 Speaker 1: were in. So if you were in auto mode, the 743 00:38:12,360 --> 00:38:16,800 Speaker 1: steering wheel would be blue, pretty much from, 744 00:38:16,880 --> 00:38:21,040 Speaker 1: you know, the three and nine positions 745 00:38:21,120 --> 00:38:23,239 Speaker 1: up to the top. And then if you 746 00:38:23,239 --> 00:38:26,960 Speaker 1: wanted to switch over to pure drive, the lights would 747 00:38:26,960 --> 00:38:29,960 Speaker 1: start to fade down from the top to the sides, 748 00:38:30,680 --> 00:38:33,520 Speaker 1: so that it would be an alert saying, hey, you 749 00:38:33,520 --> 00:38:36,000 Speaker 1: should probably put your hands on the wheel now, because 750 00:38:36,360 --> 00:38:39,239 Speaker 1: here's where the blue light is. And then 751 00:38:39,680 --> 00:38:42,320 Speaker 1: before it would switch to pure drive mode, the lights 752 00:38:42,360 --> 00:38:47,040 Speaker 1: turn red to indicate, hey, seriously, take the wheel, I'm 753 00:38:47,040 --> 00:38:50,279 Speaker 1: not in control anymore. All right, let me tell you something. Yeah? 754 00:38:50,320 --> 00:38:53,239 Speaker 1: I got a major problem with this car. This is 755 00:38:53,320 --> 00:38:55,279 Speaker 1: kind of why you invited me here.
It was for one 756 00:38:55,320 --> 00:38:58,279 Speaker 1: of these moments, this moment right here, because this could 757 00:38:58,320 --> 00:39:00,560 Speaker 1: be a moment. Let me tell you, all right, this 758 00:39:00,640 --> 00:39:03,600 Speaker 1: is the, so, this is the BMW i8 they 759 00:39:03,600 --> 00:39:05,839 Speaker 1: have modified. There's no doors on this car. There's no 760 00:39:06,040 --> 00:39:07,960 Speaker 1: hood on this car, I mean roof on this car. 761 00:39:08,000 --> 00:39:11,279 Speaker 1: But inside is where all this technology happens, 762 00:39:11,320 --> 00:39:12,920 Speaker 1: and I get it, that's what it showcases. They want 763 00:39:12,920 --> 00:39:14,799 Speaker 1: you to have a clear, unobstructed view of that. So 764 00:39:14,840 --> 00:39:18,280 Speaker 1: I get it. What I almost cannot believe I'm hearing 765 00:39:18,719 --> 00:39:23,560 Speaker 1: from BMW, which, by the way, used to be called 766 00:39:23,719 --> 00:39:26,880 Speaker 1: the ultimate driving machine, right? Okay, the ultimate driving machine. 767 00:39:26,880 --> 00:39:30,800 Speaker 1: Now here's what this says. BMW's latest technology 768 00:39:30,880 --> 00:39:33,839 Speaker 1: reduces the amount of driver control to a minimum in 769 00:39:33,960 --> 00:39:38,160 Speaker 1: order to simplify driving, and driving can be stressful for some. 770 00:39:38,600 --> 00:39:41,160 Speaker 1: And that's when it goes into the three modes. Come on. 771 00:39:41,719 --> 00:39:43,359 Speaker 1: This is part of the stuff that they're, 772 00:39:44,320 --> 00:39:49,640 Speaker 1: you know, this is their message. I get it. Okay, 773 00:39:49,840 --> 00:39:54,120 Speaker 1: and we're all going there. Some of us are screaming. 774 00:39:54,120 --> 00:39:58,640 Speaker 1: But this company in particular, and I understand, I 775 00:39:58,719 --> 00:40:00,560 Speaker 1: understand that.
You know that they have to keep up, 776 00:40:00,560 --> 00:40:03,160 Speaker 1: that they have to remain relevant. I get that's why 777 00:40:03,160 --> 00:40:05,080 Speaker 1: they're doing it. So I'm not that old 778 00:40:05,120 --> 00:40:07,279 Speaker 1: that I don't understand that. I really do. They have 779 00:40:07,320 --> 00:40:09,880 Speaker 1: to remain relevant, they have to sell product, right? 780 00:40:10,239 --> 00:40:13,040 Speaker 1: So that's what they're doing. I get it. But for 781 00:40:13,080 --> 00:40:15,800 Speaker 1: a company that claims to be the ultimate driving machine, 782 00:40:15,840 --> 00:40:19,640 Speaker 1: and they are known for building cars that drivers really, 783 00:40:19,680 --> 00:40:22,920 Speaker 1: really enjoy. People pay a premium to drive a BMW 784 00:40:23,080 --> 00:40:26,160 Speaker 1: because of the experience. It's a driver's car, and there are 785 00:40:26,200 --> 00:40:29,560 Speaker 1: certain vehicles, you could expect that out of Porsche as well, 786 00:40:29,600 --> 00:40:31,640 Speaker 1: you know, where you expect a certain experience when you get 787 00:40:31,640 --> 00:40:34,640 Speaker 1: behind the wheel, and this is removing all that. You're 788 00:40:34,680 --> 00:40:36,879 Speaker 1: just saying, I want the brand name. 789 00:40:36,960 --> 00:40:39,160 Speaker 1: I want it to be this i8 or whatever platform 790 00:40:39,239 --> 00:40:41,560 Speaker 1: they put it in, but I don't really want to 791 00:40:41,600 --> 00:40:43,560 Speaker 1: do anything. I don't want to experience that anymore. I'd 792 00:40:43,640 --> 00:40:45,399 Speaker 1: rather just sit back and read a book, maybe, while 793 00:40:45,400 --> 00:40:49,200 Speaker 1: I go to work. Why not take the train instead? 794 00:40:49,239 --> 00:40:53,400 Speaker 1: It sounds to me like a glorious future.
But see, 795 00:40:53,520 --> 00:40:56,480 Speaker 1: the problem is your listeners have come to expect 796 00:40:56,480 --> 00:40:59,040 Speaker 1: this kind of rant from me, so that's why you get me on 797 00:40:59,080 --> 00:41:03,360 Speaker 1: the show. But honestly, really, I mean, really think about it. 798 00:41:03,400 --> 00:41:05,279 Speaker 1: I mean, this is supposed to be a driver's car. 799 00:41:05,400 --> 00:41:07,839 Speaker 1: It's an odd move from this manufacturer, 800 00:41:07,880 --> 00:41:10,080 Speaker 1: that's all I'm saying. Well, and to be fair, I 801 00:41:10,080 --> 00:41:12,680 Speaker 1: mean, they do still have the pure drive mode, where 802 00:41:12,960 --> 00:41:16,800 Speaker 1: I think that's what they're relying on to 803 00:41:16,880 --> 00:41:22,400 Speaker 1: continue their kind of slogan about the driving 804 00:41:22,440 --> 00:41:26,920 Speaker 1: experience, while still also investing in these technologies, because if 805 00:41:26,920 --> 00:41:30,400 Speaker 1: they don't really look into that, they will literally be 806 00:41:30,520 --> 00:41:33,200 Speaker 1: left behind. And I understand that not every vehicle is 807 00:41:33,239 --> 00:41:35,719 Speaker 1: going to receive this type of treatment. And they've got 808 00:41:35,760 --> 00:41:37,560 Speaker 1: the rest of the product line, and if you look 809 00:41:37,600 --> 00:41:40,279 Speaker 1: at BMW's page, you'll see that they have a lot 810 00:41:40,280 --> 00:41:42,759 Speaker 1: of different products that are in fact a lot of 811 00:41:42,760 --> 00:41:45,520 Speaker 1: fun to drive. All right, we've got a little bit more 812 00:41:45,560 --> 00:41:47,879 Speaker 1: to say on this topic. Before we get to that, though, 813 00:41:47,960 --> 00:41:59,640 Speaker 1: let's take another quick break.
Did you happen to see 814 00:42:00,120 --> 00:42:06,200 Speaker 1: the demonstration of their self-parking technology? All right, so, 815 00:42:07,480 --> 00:42:11,480 Speaker 1: so BMW has shown off self-parking a couple of times, 816 00:42:11,520 --> 00:42:14,120 Speaker 1: like, it's not like it's brand new, except that this 817 00:42:14,200 --> 00:42:17,160 Speaker 1: year they showed off that they had a new way 818 00:42:17,280 --> 00:42:20,040 Speaker 1: to interact with self-parking. Because usually you would get 819 00:42:20,040 --> 00:42:21,440 Speaker 1: out of your car, you might have a button on 820 00:42:21,480 --> 00:42:24,640 Speaker 1: your key fob that says park, and you would press 821 00:42:24,680 --> 00:42:26,600 Speaker 1: it and the car would pull itself into a parking spot. 822 00:42:26,719 --> 00:42:29,279 Speaker 1: Very impressive, you know. Yeah, and you can 823 00:42:29,360 --> 00:42:33,200 Speaker 1: even, maybe with something like Alexa, have voice control, where 824 00:42:33,239 --> 00:42:35,960 Speaker 1: you could say park the car and then jump out 825 00:42:36,000 --> 00:42:37,919 Speaker 1: and it would park the car. That would be cool too. 826 00:42:38,280 --> 00:42:43,080 Speaker 1: BMW decided to go the air traffic controller route, with 827 00:42:43,120 --> 00:42:46,320 Speaker 1: gesture controls. You can actually gesture to your car 828 00:42:47,160 --> 00:42:49,960 Speaker 1: to take a spot, and to 829 00:42:50,120 --> 00:42:53,600 Speaker 1: move the car while you're outside of it to go 830 00:42:53,640 --> 00:42:56,040 Speaker 1: and park. Really? Yeah, they actually showed it off.
They had 831 00:42:56,120 --> 00:42:59,360 Speaker 1: journalists sit in the passenger side, no one in the 832 00:42:59,400 --> 00:43:05,080 Speaker 1: driver's seat, and have a BMW representative standing there, gesturing to 833 00:43:05,200 --> 00:43:07,560 Speaker 1: the car and to the parking space, and the car 834 00:43:07,600 --> 00:43:09,759 Speaker 1: would maneuver over into the parking spot. I've got to ask 835 00:43:09,800 --> 00:43:11,560 Speaker 1: you this: do you have to have those flashlights with 836 00:43:11,560 --> 00:43:13,640 Speaker 1: the cones on the end to do this? You don't have 837 00:43:13,840 --> 00:43:16,920 Speaker 1: to have them, but I think it presents a certain 838 00:43:17,000 --> 00:43:20,880 Speaker 1: element of style. I absolutely would. Plus the giant ear 839 00:43:20,920 --> 00:43:24,560 Speaker 1: muffs, right? Exactly. You know, those BMW engines can get loud, 840 00:43:24,960 --> 00:43:27,640 Speaker 1: so you want to be able to protect your ears. 841 00:43:28,040 --> 00:43:31,120 Speaker 1: But yeah, so that's a shock to me. 842 00:43:31,160 --> 00:43:34,000 Speaker 1: I'll have to seek out a video of this, because 843 00:43:34,000 --> 00:43:36,400 Speaker 1: I've got to see somebody directing their car as if 844 00:43:36,400 --> 00:43:38,239 Speaker 1: it's an airplane into a parking spot. I think it was, 845 00:43:38,520 --> 00:43:40,480 Speaker 1: I want to say it was The Verge where I 846 00:43:40,520 --> 00:43:45,000 Speaker 1: saw the video demonstration of this particular technology. I should 847 00:43:45,040 --> 00:43:46,680 Speaker 1: go on to say I didn't get a chance to 848 00:43:46,719 --> 00:43:49,239 Speaker 1: see this personally. This was something that I saw after 849 00:43:49,280 --> 00:43:51,080 Speaker 1: I got back. This is one of the most frustrating 850 00:43:51,080 --> 00:43:54,320 Speaker 1: things about attending CES: you get to see a 851 00:43:54,320 --> 00:43:56,680 Speaker 1: lot of really cool stuff.
But when you come back, 852 00:43:56,719 --> 00:43:58,960 Speaker 1: you see all this really cool stuff that you didn't 853 00:43:59,120 --> 00:44:01,520 Speaker 1: have a chance to experience yourself. There's just no way. 854 00:44:01,560 --> 00:44:03,600 Speaker 1: There is absolutely no way. And like, 855 00:44:03,640 --> 00:44:06,880 Speaker 1: even if you spent every single second running from thing 856 00:44:06,960 --> 00:44:11,320 Speaker 1: to thing, you would still miss stuff. So next 857 00:44:11,520 --> 00:44:15,480 Speaker 1: we have Kia, which normally I think wouldn't have 858 00:44:15,560 --> 00:44:18,640 Speaker 1: even registered with me. But again, Kia, 859 00:44:18,840 --> 00:44:24,919 Speaker 1: jumping on this same sort of autonomous car bandwagon, 860 00:44:25,120 --> 00:44:27,960 Speaker 1: I guess you could call it, introduced its Drive 861 00:44:28,040 --> 00:44:33,319 Speaker 1: Wise platform. Yeah, on the Soul EV platform, right? So, 862 00:44:33,360 --> 00:44:36,040 Speaker 1: you know, it's in a product that they currently build, 863 00:44:36,680 --> 00:44:38,319 Speaker 1: which makes sense. It's not like they're bringing out some 864 00:44:38,360 --> 00:44:40,319 Speaker 1: concept vehicle that's a, you know, pie-in-the-sky 865 00:44:40,320 --> 00:44:42,279 Speaker 1: idea that maybe will never happen. They're saying, now we're 866 00:44:42,320 --> 00:44:45,480 Speaker 1: gonna put it into our actual product. Here's what 867 00:44:45,520 --> 00:44:48,160 Speaker 1: we have available right now that it works in. Yeah, 868 00:44:48,239 --> 00:44:51,680 Speaker 1: and so right now it's kind of an advanced 869 00:44:51,719 --> 00:44:56,160 Speaker 1: driver assistance system, or ADAS.
It's one of those where, 870 00:44:56,360 --> 00:44:58,600 Speaker 1: again, similar to what I was talking about with Toyota, 871 00:44:58,800 --> 00:45:00,640 Speaker 1: and with pretty much everybody, it's this 872 00:45:00,719 --> 00:45:05,080 Speaker 1: idea of a computer assisting a driver, normally just to 873 00:45:05,239 --> 00:45:09,120 Speaker 1: intervene when there's something that the driver has not noticed 874 00:45:09,400 --> 00:45:11,960 Speaker 1: or doesn't have the time to react to 875 00:45:12,360 --> 00:45:15,640 Speaker 1: himself or herself. So this is a partially autonomous system, 876 00:45:15,760 --> 00:45:18,920 Speaker 1: right? Because they've got two release dates. Yes, this was 877 00:45:19,000 --> 00:45:21,320 Speaker 1: the earlier one, the earlier one, which is scheduled 878 00:45:21,360 --> 00:45:26,160 Speaker 1: to debut in twenty twenty, and another full ten years beyond that, 879 00:45:26,239 --> 00:45:28,759 Speaker 1: twenty thirty, is when they're gonna come out with the 880 00:45:28,840 --> 00:45:33,040 Speaker 1: fully autonomous version. That's what they're projecting right now. I think, honestly, Scott, 881 00:45:33,080 --> 00:45:35,719 Speaker 1: if you nailed me down, 882 00:45:35,719 --> 00:45:37,640 Speaker 1: I'd have to say I think a lot of these 883 00:45:37,640 --> 00:45:41,920 Speaker 1: car companies are purposefully being conservative with those guesses. I 884 00:45:41,960 --> 00:45:46,000 Speaker 1: totally agree. I think certain companies that are 885 00:45:46,000 --> 00:45:51,600 Speaker 1: not car companies, like Google, are really pushing the envelope 886 00:45:51,640 --> 00:45:54,960 Speaker 1: as far as the autonomous car technology goes, to the 887 00:45:55,000 --> 00:45:59,920 Speaker 1: point where there will be enough demand from the consumers 888 00:46:00,000 --> 00:46:03,080 Speaker 1: for some of those dates to get moved up.
889 00:46:03,520 --> 00:46:05,200 Speaker 1: And I just think a lot of the car companies 890 00:46:05,239 --> 00:46:07,319 Speaker 1: are trying to do two things. One, they want to 891 00:46:07,360 --> 00:46:09,520 Speaker 1: make sure that they give themselves enough time to really 892 00:46:09,520 --> 00:46:12,239 Speaker 1: develop the technology so that the vehicles they put out 893 00:46:12,239 --> 00:46:17,319 Speaker 1: are safe. Nobody wants the autonomous car version of the 894 00:46:17,400 --> 00:46:20,640 Speaker 1: clean diesel scandal, because that's a version where people will 895 00:46:20,680 --> 00:46:24,480 Speaker 1: lose their lives. So obviously there's that. And I think 896 00:46:24,520 --> 00:46:28,040 Speaker 1: the other one is just that there's this fear, a 897 00:46:28,120 --> 00:46:31,479 Speaker 1: well-founded one, I would argue, that the autonomous car 898 00:46:31,560 --> 00:46:34,800 Speaker 1: future is also going to be the future where fewer 899 00:46:34,880 --> 00:46:38,400 Speaker 1: people own personal vehicles. Yeah, no argument from me on 900 00:46:38,440 --> 00:46:40,319 Speaker 1: those points. I think that's really gonna happen. I think 901 00:46:40,320 --> 00:46:42,000 Speaker 1: a lot of people are going to jump on this 902 00:46:42,040 --> 00:46:44,600 Speaker 1: and want that. And I think that manufacturers are kind 903 00:46:44,600 --> 00:46:47,839 Speaker 1: of hedging their bets a little bit in saying we're 904 00:46:47,840 --> 00:46:50,040 Speaker 1: gonna be fully autonomous by then. But I wouldn't be surprised if 905 00:46:50,080 --> 00:46:52,400 Speaker 1: that date moves up by five years. I mean, a 906 00:46:52,400 --> 00:46:54,879 Speaker 1: long, long way.
It's kind of that under-promise, over-deliver 907 00:46:54,920 --> 00:46:56,920 Speaker 1: thing, right? I mean, you don't want to say 908 00:46:57,040 --> 00:46:59,799 Speaker 1: that by such-and-such a year we're gonna have a fully autonomous line and it's 909 00:46:59,800 --> 00:47:02,480 Speaker 1: going to be fantastic, and then that year rolls around, you look 910 00:47:02,480 --> 00:47:04,880 Speaker 1: back at that news story, and then you, you know, 911 00:47:04,960 --> 00:47:07,439 Speaker 1: kind of press Kia and say, well, where are the fully 912 00:47:07,400 --> 00:47:09,840 Speaker 1: autonomous vehicles? And they have to kind of put their, 913 00:47:09,880 --> 00:47:14,040 Speaker 1: you know, head down and say, well, we didn't quite get there. Yeah, 914 00:47:14,120 --> 00:47:18,400 Speaker 1: you know, maybe in five more years. I think 915 00:47:18,440 --> 00:47:20,319 Speaker 1: part of it is also the idea that if 916 00:47:20,360 --> 00:47:22,400 Speaker 1: you go with the driver-assist route rather than the 917 00:47:22,400 --> 00:47:27,960 Speaker 1: fully autonomous route, you can keep that culture of personal 918 00:47:28,000 --> 00:47:32,600 Speaker 1: car ownership going longer, right? Because people will be interested 919 00:47:32,680 --> 00:47:35,600 Speaker 1: in driving these kinds of vehicles. It's when you 920 00:47:35,640 --> 00:47:38,560 Speaker 1: go the fully autonomous route where, one, vehicle prices get 921 00:47:38,600 --> 00:47:41,400 Speaker 1: so high that the average person starts to have trouble 922 00:47:41,440 --> 00:47:45,960 Speaker 1: affording one, and two, there's less need to have one 923 00:47:46,040 --> 00:47:50,040 Speaker 1: in densely populated areas. Obviously, in rural areas that's a 924 00:47:50,080 --> 00:47:53,480 Speaker 1: totally different story.
But in cities, where you could 925 00:47:53,600 --> 00:47:58,960 Speaker 1: presumably get a service going with electric, or not 926 00:47:59,000 --> 00:48:04,080 Speaker 1: even electric, but fully autonomous vehicles, then there's very 927 00:48:04,120 --> 00:48:06,640 Speaker 1: little reason to own your own vehicle. See, I totally 928 00:48:06,719 --> 00:48:09,120 Speaker 1: understand that there's a time and place for this type of thing, 929 00:48:09,160 --> 00:48:12,160 Speaker 1: and really, it does work in some situations. Other situations, 930 00:48:12,520 --> 00:48:14,600 Speaker 1: there's just no way that it would work. There are certain 931 00:48:14,640 --> 00:48:17,640 Speaker 1: people for whom this type of idea, this 932 00:48:17,719 --> 00:48:20,920 Speaker 1: car-sharing idea, and even EV 933 00:48:21,120 --> 00:48:23,839 Speaker 1: cars at this point, the way they are right now, 934 00:48:24,239 --> 00:48:27,400 Speaker 1: wouldn't work. But for a 935 00:48:27,440 --> 00:48:29,120 Speaker 1: lot of people they do, and it's a great thing. 936 00:48:29,200 --> 00:48:32,440 Speaker 1: And I get the move to autonomous and people's excitement 937 00:48:32,440 --> 00:48:34,879 Speaker 1: around it, but I don't think it's ever gonna 938 00:48:34,880 --> 00:48:39,360 Speaker 1: go completely one way or the other. Yeah, I used to be gung 939 00:48:39,440 --> 00:48:42,560 Speaker 1: ho autonomous, but I've eased off on that a little 940 00:48:42,560 --> 00:48:45,520 Speaker 1: bit, just because I recognize I was thinking of 941 00:48:45,520 --> 00:48:48,200 Speaker 1: it too narrowly, right? I'm thinking of it from my perspective, 942 00:48:48,280 --> 00:48:52,840 Speaker 1: my own personal experience of using vehicles. For me, it 943 00:48:52,880 --> 00:48:55,960 Speaker 1: makes perfect sense pretty much all the time.
But there 944 00:48:55,960 --> 00:49:00,080 Speaker 1: are plenty of outliers where you would say, all right, well, no, 945 00:49:00,640 --> 00:49:04,080 Speaker 1: personal ownership is never going to completely go away, not 946 00:49:04,160 --> 00:49:06,440 Speaker 1: as long as we still have people living outside of 947 00:49:06,560 --> 00:49:10,480 Speaker 1: urban areas, and presumably that's going to continue. But 948 00:49:10,719 --> 00:49:12,680 Speaker 1: to get back to Kia, I want to ask you this: 949 00:49:13,000 --> 00:49:19,240 Speaker 1: did you happen to look at their biometric control system? 950 00:49:19,280 --> 00:49:24,719 Speaker 1: It's a fingerprint scanner that identifies the driver, and then, 951 00:49:25,000 --> 00:49:28,520 Speaker 1: as you drive the vehicle, the vehicle begins to learn 952 00:49:28,760 --> 00:49:32,680 Speaker 1: your style, and it learns what temperature you like to 953 00:49:32,760 --> 00:49:35,320 Speaker 1: keep the car at, and what stations you tend to listen 954 00:49:35,360 --> 00:49:38,800 Speaker 1: to, and other ways that you actually handle 955 00:49:38,840 --> 00:49:41,279 Speaker 1: the vehicle while you're driving it, and it builds a 956 00:49:41,320 --> 00:49:46,160 Speaker 1: profile that's personalized to you. And then if someone else 957 00:49:46,239 --> 00:49:49,000 Speaker 1: drives the car and scans their fingerprint, they get their own profile. 958 00:49:49,200 --> 00:49:51,160 Speaker 1: And then every time you drive the car afterward, when 959 00:49:51,160 --> 00:49:54,640 Speaker 1: you scan, it knows it's you, and it automatically adjusts all those 960 00:49:54,640 --> 00:49:57,880 Speaker 1: settings so that you get the driving experience you want 961 00:49:58,000 --> 00:50:00,920 Speaker 1: and the other driver gets the driving experience they want. Now, 962 00:50:00,960 --> 00:50:03,640 Speaker 1: there have been very analog versions of this along the way.
963 00:50:03,680 --> 00:50:06,600 Speaker 1: You can say, this is driver one, driver 964 00:50:06,680 --> 00:50:09,000 Speaker 1: two, driver three, and hit the button, and the seat 965 00:50:09,080 --> 00:50:12,280 Speaker 1: changes position, the steering wheel changes position, the temperature controls, 966 00:50:12,280 --> 00:50:14,879 Speaker 1: the radio stations, all that. I get it. But this 967 00:50:14,960 --> 00:50:19,240 Speaker 1: is doing it with fingerprint technology. That's pretty interesting, 968 00:50:19,280 --> 00:50:20,640 Speaker 1: and you know, it kind of reminds me of the 969 00:50:20,680 --> 00:50:23,279 Speaker 1: way that smart houses can adapt to 970 00:50:23,320 --> 00:50:24,920 Speaker 1: you. You know, they know that at a 971 00:50:24,960 --> 00:50:26,440 Speaker 1: certain time you come home from work and you want 972 00:50:26,440 --> 00:50:29,200 Speaker 1: the temperature at a certain setting. You want this light on, 973 00:50:29,280 --> 00:50:31,640 Speaker 1: but not that one. And then later in the evening, 974 00:50:31,960 --> 00:50:33,799 Speaker 1: turn that one off, turn that one on. Right, 975 00:50:34,080 --> 00:50:36,480 Speaker 1: it starts to learn your patterns and things like that. 976 00:50:36,800 --> 00:50:38,960 Speaker 1: I can see these two things working together. It's like, 977 00:50:39,440 --> 00:50:40,919 Speaker 1: pretty soon you're just gonna be able to 978 00:50:40,960 --> 00:50:42,680 Speaker 1: walk through life without ever 979 00:50:42,760 --> 00:50:45,560 Speaker 1: having to push another button or flip another switch. Yeah, 980 00:50:45,640 --> 00:50:48,520 Speaker 1: I've always mentioned this as, you know, 981 00:50:48,719 --> 00:50:50,840 Speaker 1: it will come down to a conversation between you and 982 00:50:50,880 --> 00:50:54,959 Speaker 1: another person about, what's your reality like?
Because here, here's 983 00:50:55,000 --> 00:50:58,120 Speaker 1: what my reality is like, because everything is customizing itself 984 00:50:58,160 --> 00:51:03,040 Speaker 1: to my convenience, my comfort, my preferences. So what's yours like? 985 00:51:03,120 --> 00:51:05,000 Speaker 1: Because I bet yours is a little different from mine, 986 00:51:05,480 --> 00:51:09,040 Speaker 1: and I'm curious how you experience life compared to 987 00:51:09,080 --> 00:51:11,680 Speaker 1: how I experience life. That's gonna be a real conversation 988 00:51:11,760 --> 00:51:13,480 Speaker 1: in a few years. And it's strange, isn't it, that 989 00:51:13,520 --> 00:51:15,600 Speaker 1: you won't have the same environment 990 00:51:15,680 --> 00:51:17,839 Speaker 1: that you share, really, because everything will be so, 991 00:51:18,600 --> 00:51:21,839 Speaker 1: as you said, tailored, I guess, custom fit to you. 992 00:51:22,160 --> 00:51:24,239 Speaker 1: It's gonna be so different. It's really weird 993 00:51:24,280 --> 00:51:26,760 Speaker 1: to think, like, you won't have to, again, flip 994 00:51:26,760 --> 00:51:29,320 Speaker 1: the switch. You won't do anything manually, practically, anymore. 995 00:51:29,520 --> 00:51:32,280 Speaker 1: I do think smart thermostats are gonna have to start 996 00:51:32,320 --> 00:51:37,680 Speaker 1: seeking therapy for families like mine, where the preference is 997 00:51:38,000 --> 00:51:40,799 Speaker 1: a matter of a couple of degrees, and it's 998 00:51:40,840 --> 00:51:45,239 Speaker 1: hotly contested territory. Yeah, I can understand that. Ours is, 999 00:51:46,480 --> 00:51:49,040 Speaker 1: the switch on the side is almost worn out by 1000 00:51:49,040 --> 00:51:51,640 Speaker 1: the end of the season. Yeah, yeah, that's kind of 1001 00:51:51,680 --> 00:51:54,920 Speaker 1: the same for us.
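[Editor's sketch] The fingerprint-keyed profile idea the hosts describe boils down to a small settings store keyed by driver identity: first scan creates a profile, later scans recall it, and the car adjusts it as it learns. A minimal sketch in Python, with hypothetical field names and defaults rather than Kia's actual system:

```python
# Toy model of per-driver profiles keyed by a fingerprint ID.
# Field names and defaults are hypothetical, not Kia's actual design.
class DriverProfiles:
    def __init__(self):
        self._profiles = {}

    def settings_for(self, fingerprint_id):
        # First scan creates a default profile; later scans recall the same one.
        return self._profiles.setdefault(fingerprint_id, {
            "seat_position": 5,
            "temperature_f": 70,
            "radio_station": "news",
        })

    def learn(self, fingerprint_id, **updates):
        # The car adjusts the stored profile as it observes the driver.
        self.settings_for(fingerprint_id).update(updates)

car = DriverProfiles()
car.learn("driver_a", temperature_f=68, radio_station="jazz")
car.learn("driver_b", temperature_f=74)

print(car.settings_for("driver_a")["temperature_f"])  # 68
print(car.settings_for("driver_b")["temperature_f"])  # 74
```

Each driver's scan recalls their own settings without touching anyone else's, which is the whole "two drivers, two experiences, one car" point above.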
When we come back, Scott and 1002 00:51:55,200 --> 00:51:58,600 Speaker 1: I will end up concluding our discussion about high tech 1003 00:51:58,680 --> 00:52:12,239 Speaker 1: at high speed after these messages. So we kind of 1004 00:52:12,280 --> 00:52:14,680 Speaker 1: mentioned Faraday Future already, and you guys, like I said, 1005 00:52:14,719 --> 00:52:16,879 Speaker 1: did a full episode on it. Yeah, I mean, I've 1006 00:52:16,920 --> 00:52:18,480 Speaker 1: got a pile of notes 1007 00:52:18,520 --> 00:52:20,759 Speaker 1: here on this whole thing. So you've seen this 1008 00:52:20,840 --> 00:52:23,480 Speaker 1: in person. I haven't. I've only read about it, and, 1009 00:52:23,719 --> 00:52:25,759 Speaker 1: you know, kind of get the idea that some 1010 00:52:25,800 --> 00:52:28,719 Speaker 1: people thought it was really fantastic, and other reviewers went 1011 00:52:28,719 --> 00:52:31,120 Speaker 1: in and said, this is absolutely ridiculous. I don't know 1012 00:52:31,160 --> 00:52:33,360 Speaker 1: what they're doing bringing this car to the show because 1013 00:52:33,880 --> 00:52:35,959 Speaker 1: this has nothing to do with what the company really is, 1014 00:52:36,040 --> 00:52:38,239 Speaker 1: what the real message is here. Yeah, 1015 00:52:38,400 --> 00:52:40,160 Speaker 1: well, it did in a way, but it 1016 00:52:40,200 --> 00:52:42,640 Speaker 1: wasn't a good representation of what they're gonna do, right. 1017 00:52:42,680 --> 00:52:44,640 Speaker 1: So the representation is similar to what we were talking 1018 00:52:44,640 --> 00:52:48,360 Speaker 1: about with Volkswagen.
This modular approach to designing and 1019 00:52:48,400 --> 00:52:51,759 Speaker 1: building vehicles, where you have a basic design that you 1020 00:52:51,800 --> 00:52:54,759 Speaker 1: can then modify for whatever purpose you need, whether it's 1021 00:52:54,800 --> 00:52:59,240 Speaker 1: a subcompact car, a race car, an SUV, a truck, 1022 00:52:59,280 --> 00:53:03,000 Speaker 1: whatever it might be. You can adjust the foundation and 1023 00:53:03,000 --> 00:53:07,359 Speaker 1: then you have this modular battery system that allows you 1024 00:53:07,400 --> 00:53:10,640 Speaker 1: to add or remove batteries depending upon what vehicle it 1025 00:53:10,719 --> 00:53:13,680 Speaker 1: is that you're planning on building. So subcompact cars 1026 00:53:13,680 --> 00:53:16,440 Speaker 1: don't need as much battery power as, say, a race car. 1027 00:53:17,160 --> 00:53:19,279 Speaker 1: I think they called it strings of batteries, didn't they. 1028 00:53:19,480 --> 00:53:21,680 Speaker 1: You can adjust the size of the string 1029 00:53:21,719 --> 00:53:24,560 Speaker 1: of batteries, right. Yeah, and they had a cool video 1030 00:53:24,640 --> 00:53:26,799 Speaker 1: that sort of showed the concept from a very high level. 1031 00:53:27,600 --> 00:53:29,920 Speaker 1: But I like the idea a lot because it allows 1032 00:53:29,960 --> 00:53:32,879 Speaker 1: for rapid development and prototyping. I mean, the car 1033 00:53:33,000 --> 00:53:37,480 Speaker 1: that they had on display did not exist eighteen months ago. 1034 00:53:37,719 --> 00:53:39,560 Speaker 1: It's a killer car, by the way. I love it. 1035 00:53:39,560 --> 00:53:42,400 Speaker 1: It looks like it's going fast while it's sitting still. That's awesome. 1036 00:53:42,440 --> 00:53:45,319 Speaker 1: I really do like it. Good design.
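The "strings of batteries" scaling described above, adding or removing strings depending on the vehicle, amounts to a linear capacity calculation. The per-string capacity and string counts below are invented purely to illustrate the idea, not Faraday Future's real figures:

```python
# Assumed capacity contributed by each battery string (illustrative only).
KWH_PER_STRING = 3.0

# Invented string counts per vehicle class, to show the scaling idea:
# a subcompact gets fewer strings than a race car.
STRINGS_BY_VEHICLE = {
    "subcompact": 12,
    "suv": 24,
    "race car": 32,
}

def pack_capacity_kwh(vehicle_type: str) -> float:
    """Total pack capacity scales linearly with the number of strings installed."""
    return STRINGS_BY_VEHICLE[vehicle_type] * KWH_PER_STRING
```

The appeal for rapid prototyping is that only the string count changes between vehicle classes; the cell, string, and pack designs stay fixed.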
I, as 1037 00:53:45,360 --> 00:53:47,560 Speaker 1: soon as I saw it... Scott, you know, I'm not 1038 00:53:47,600 --> 00:53:50,319 Speaker 1: a car guy. That's not me. I wanted to get 1039 00:53:50,320 --> 00:53:51,920 Speaker 1: behind the wheel of this car when I saw it. 1040 00:53:52,040 --> 00:53:53,520 Speaker 1: It was just like... And you would be the only 1041 00:53:53,560 --> 00:53:55,759 Speaker 1: one able to do that because it's a single seat car. 1042 00:53:55,960 --> 00:53:59,480 Speaker 1: So, you know, to drum up excitement, of course, they 1043 00:54:00,000 --> 00:54:02,680 Speaker 1: brought out the race car version. Yeah, it was. It 1044 00:54:02,719 --> 00:54:06,880 Speaker 1: was essentially a way of getting attention, right. First of all, 1045 00:54:07,160 --> 00:54:09,919 Speaker 1: it's a brand new company. This company has not been around 1046 00:54:09,960 --> 00:54:12,880 Speaker 1: for very long, mostly mysterious. A lot of people had 1047 00:54:12,920 --> 00:54:15,040 Speaker 1: heard the name but didn't know a whole lot about it. 1048 00:54:15,040 --> 00:54:19,000 Speaker 1: It was getting most of its funding through China, 1049 00:54:19,120 --> 00:54:22,160 Speaker 1: it's headquartered in Los Angeles, and it's going to have 1050 00:54:22,320 --> 00:54:26,440 Speaker 1: a manufacturing facility in Las Vegas. Uh, these were like 1051 00:54:26,480 --> 00:54:28,880 Speaker 1: the basic things people knew, but no one had seen 1052 00:54:28,960 --> 00:54:31,800 Speaker 1: anything or really knew what was going on behind 1053 00:54:31,800 --> 00:54:34,680 Speaker 1: the scenes. Found out it was this approach to making 1054 00:54:34,680 --> 00:54:39,320 Speaker 1: electric vehicles. That's their specialty, uh, taking this modular design 1055 00:54:39,400 --> 00:54:42,319 Speaker 1: so they could build any type of electric vehicle. They 1056 00:54:42,360 --> 00:54:47,600 Speaker 1: have different powertrain, or drivetrain, uh, layouts.
They 1057 00:54:47,640 --> 00:54:50,080 Speaker 1: can do, so again, depending on what your car needs. 1058 00:54:50,120 --> 00:54:51,600 Speaker 1: If it doesn't need a whole lot of power, then 1059 00:54:51,680 --> 00:54:53,560 Speaker 1: it's got a very simple drivetrain. If it's gonna 1060 00:54:53,560 --> 00:54:56,160 Speaker 1: need a lot of torque for something where you're gonna 1061 00:54:56,160 --> 00:54:58,719 Speaker 1: be hauling things or you're gonna be driving like an 1062 00:54:58,800 --> 00:55:02,359 Speaker 1: SUV in certain areas, obviously they would have a different drive 1063 00:55:02,360 --> 00:55:05,600 Speaker 1: train than a subcompact car. Four motors instead of 1064 00:55:05,640 --> 00:55:08,520 Speaker 1: two or something like that. Yeah, yeah. They showed 1065 00:55:08,600 --> 00:55:10,640 Speaker 1: all the different variations, where, like, there was one where 1066 00:55:10,640 --> 00:55:12,520 Speaker 1: it's, like, one motor attached to one wheel, and I 1067 00:55:12,560 --> 00:55:16,440 Speaker 1: was like, huh, okay, that would be a little commuter car. 1068 00:55:17,000 --> 00:55:21,000 Speaker 1: They also had the ability. They said, well, in our designs, 1069 00:55:21,040 --> 00:55:22,600 Speaker 1: one of the things we wanted to work in from 1070 00:55:22,600 --> 00:55:24,920 Speaker 1: the beginning was that if we wanted to make it 1071 00:55:25,160 --> 00:55:28,640 Speaker 1: an autonomous vehicle, we could add that on to the 1072 00:55:28,640 --> 00:55:32,400 Speaker 1: frame as well. So in other words, it's not automatically 1073 00:55:32,440 --> 00:55:34,279 Speaker 1: going to be an autonomous vehicle, but they have the 1074 00:55:34,360 --> 00:55:38,359 Speaker 1: option to make, for any particular model they go into 1075 00:55:38,360 --> 00:55:41,840 Speaker 1: production with, an autonomous version of that car. Pretty simple 1076 00:55:41,840 --> 00:55:44,880 Speaker 1: thing in an all electric vehicle.
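The drivetrain-layout discussion above, one motor for a commuter car up to four for a torque-heavy SUV, boils down to picking a motor count from the vehicle's demands. This is a toy decision rule with made-up thresholds, not anything from the actual platform:

```python
# Toy rule for the "different drivetrain layouts" idea: pick a motor count
# from a (hypothetical) torque requirement. Thresholds are illustrative only.
def motors_for(torque_demand_nm: int) -> int:
    if torque_demand_nm < 200:
        return 1   # little commuter car: one motor on one wheel
    if torque_demand_nm < 500:
        return 2   # typical passenger car: one motor per axle
    return 4       # hauling or off-road SUV: one motor per wheel
```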
Really? Yeah, so really 1077 00:55:44,880 --> 00:55:48,440 Speaker 1: cool idea. Uh, and the race car obviously was just 1078 00:55:48,520 --> 00:55:51,480 Speaker 1: a way of saying, come and take a look at 1079 00:55:51,520 --> 00:55:53,880 Speaker 1: what we have to show you. And it worked for me. 1080 00:55:54,960 --> 00:55:58,000 Speaker 1: But the problem was, when they started interviewing the people 1081 00:55:58,040 --> 00:56:00,520 Speaker 1: that, you know, brought this technology out on stage, the 1082 00:56:00,560 --> 00:56:03,680 Speaker 1: ones that were conducting, you know, the press conference, when 1083 00:56:03,680 --> 00:56:06,040 Speaker 1: they really started drilling down and asking a lot of questions, 1084 00:56:06,040 --> 00:56:07,640 Speaker 1: here's the thing: they got a lot 1085 00:56:07,680 --> 00:56:09,920 Speaker 1: of, uh, well, we don't really know the answer to 1086 00:56:09,920 --> 00:56:11,879 Speaker 1: that yet, but just trust us, it's gonna work out, 1087 00:56:11,960 --> 00:56:14,160 Speaker 1: everything's gonna be fine, we know that we can 1088 00:56:14,280 --> 00:56:16,000 Speaker 1: achieve this, it's just right now we don't have the 1089 00:56:16,080 --> 00:56:18,680 Speaker 1: solution. That kind of answer. It's very much an 1090 00:56:18,680 --> 00:56:22,160 Speaker 1: Internet startup approach, and it's an Internet startup approach to 1091 00:56:22,400 --> 00:56:27,640 Speaker 1: an industry that has, uh, very deep roots and a 1092 00:56:27,800 --> 00:56:31,000 Speaker 1: very, very high barrier to entry. In fact, I would 1093 00:56:31,000 --> 00:56:35,680 Speaker 1: say Faraday Future would have had zero chance of success 1094 00:56:35,880 --> 00:56:39,320 Speaker 1: were it not for a little company called Tesla. Sure. Yeah, 1095 00:56:39,440 --> 00:56:41,760 Speaker 1: but you can look at other examples that the people 1096 00:56:41,840 --> 00:56:44,799 Speaker 1: immediately go to.
And we've heard from the smart listeners on 1097 00:56:44,840 --> 00:56:48,279 Speaker 1: Car Stuff as well. Um, you know, this looks an 1098 00:56:48,280 --> 00:56:51,680 Speaker 1: awful lot like something like an Elio, or, if 1099 00:56:51,680 --> 00:56:56,479 Speaker 1: you want to go back farther, uh, a Tucker or a Dale. 1100 00:56:56,760 --> 00:56:59,760 Speaker 1: Dale's a different scenario because that was just a complete scam 1101 00:56:59,760 --> 00:57:01,839 Speaker 1: all around right there. There's no way around that one. 1102 00:57:01,920 --> 00:57:05,160 Speaker 1: But the others, Tucker and, uh, Elio, and 1103 00:57:05,200 --> 00:57:08,160 Speaker 1: there's been others throughout time as well that have done this. 1104 00:57:08,200 --> 00:57:10,480 Speaker 1: But they bring a real product out and they show 1105 00:57:10,480 --> 00:57:12,239 Speaker 1: you the real product, and they say, all we need 1106 00:57:12,320 --> 00:57:15,480 Speaker 1: is X number of million dollars to get the 1107 00:57:15,560 --> 00:57:19,480 Speaker 1: first, you know, production run, first batch of production out there. 1108 00:57:19,520 --> 00:57:21,400 Speaker 1: And then from that point, you know, we can 1109 00:57:21,440 --> 00:57:23,920 Speaker 1: take them out and sell them. We 1110 00:57:23,960 --> 00:57:25,760 Speaker 1: can sell you a dealership and you can sell these 1111 00:57:25,760 --> 00:57:28,320 Speaker 1: products and then we can create more cars. And that's 1112 00:57:28,360 --> 00:57:30,600 Speaker 1: the way the whole thing works. But they've got to 1113 00:57:30,600 --> 00:57:33,720 Speaker 1: get over that initial hump. And this one, now, they've 1114 00:57:33,760 --> 00:57:37,320 Speaker 1: got huge backing, right. They've got a lot of money, yes, 1115 00:57:37,560 --> 00:57:41,000 Speaker 1: available to them. That doesn't necessarily guarantee success.
Anyone who 1116 00:57:41,040 --> 00:57:45,520 Speaker 1: has backed a Kickstarter for any kind of technology, uh, 1117 00:57:45,680 --> 00:57:48,440 Speaker 1: has probably had the experience at least once of backing 1118 00:57:48,520 --> 00:57:52,040 Speaker 1: something that never, ever came to fruition. And whether it 1119 00:57:52,280 --> 00:57:55,520 Speaker 1: was due to someone trying to pull a scam, or 1120 00:57:55,680 --> 00:58:00,560 Speaker 1: someone, uh, underestimating the challenges they would face taking an 1121 00:58:00,640 --> 00:58:05,360 Speaker 1: idea from the concept stage to production, or whatever it 1122 00:58:05,400 --> 00:58:08,040 Speaker 1: may be, you always encounter things that you did not 1123 00:58:08,160 --> 00:58:12,120 Speaker 1: anticipate when you first set out. Uh, a lot of 1124 00:58:12,120 --> 00:58:16,040 Speaker 1: people have experienced this disappointment of something that they backed 1125 00:58:16,280 --> 00:58:19,320 Speaker 1: not actually coming out, and that's a possibility. Uh, the 1126 00:58:19,400 --> 00:58:21,880 Speaker 1: nice thing is it's someone else's money in the case 1127 00:58:21,880 --> 00:58:24,200 Speaker 1: of Faraday Future, so I'm okay with this. That's true. 1128 00:58:24,240 --> 00:58:26,320 Speaker 1: I mean, like in the case of Elio Motors, that's 1129 00:58:26,480 --> 00:58:28,280 Speaker 1: the most recent one, you know, that we've 1130 00:58:28,320 --> 00:58:30,840 Speaker 1: been dealing with. People are sending 1131 00:58:30,920 --> 00:58:33,320 Speaker 1: in, you know, a hundred bucks, two hundred bucks, five hundred 1132 00:58:33,320 --> 00:58:35,360 Speaker 1: dollars or whatever.
It's kind of a down payment, a deposit, 1133 00:58:35,480 --> 00:58:37,320 Speaker 1: with the idea that that's starting the company and that, 1134 00:58:37,440 --> 00:58:40,040 Speaker 1: you know, when they do start producing them, they're gonna 1135 00:58:40,040 --> 00:58:41,680 Speaker 1: get the first pick. You know, they're gonna get the 1136 00:58:41,680 --> 00:58:44,520 Speaker 1: first production run. Uh, you know, they get little 1137 00:58:44,560 --> 00:58:46,600 Speaker 1: things along the way, you know, small incentives, I guess, 1138 00:58:46,640 --> 00:58:49,320 Speaker 1: for doing that, whether it's bumper stickers or T-shirts 1139 00:58:49,440 --> 00:58:54,440 Speaker 1: or something like that. Right. Um, but that company, the 1140 00:58:54,520 --> 00:58:56,360 Speaker 1: hurdle for that company is something like two hundred and 1141 00:58:56,360 --> 00:58:58,080 Speaker 1: fifty or three hundred million dollars, and that's what it 1142 00:58:58,080 --> 00:58:59,960 Speaker 1: would take to get that company to start producing cars 1143 00:59:00,080 --> 00:59:03,360 Speaker 1: right now. Probably some other things too at this point, 1144 00:59:03,400 --> 00:59:06,000 Speaker 1: because some time has passed. But this one, 1145 00:59:06,840 --> 00:59:09,000 Speaker 1: this is all being funded by one individual, and it's, 1146 00:59:09,080 --> 00:59:11,960 Speaker 1: I laugh when I see this, but it's 1147 00:59:11,960 --> 00:59:14,960 Speaker 1: the Chinese version of Steve Jobs. That's 1148 00:59:14,960 --> 00:59:17,960 Speaker 1: how he's described. Now, he's the founder of, 1149 00:59:18,720 --> 00:59:21,840 Speaker 1: uh, I guess, the Chinese version of Netflix over there, right. 1150 00:59:22,200 --> 00:59:23,840 Speaker 1: So he's got a lot of money, this guy, he's 1151 00:59:23,840 --> 00:59:26,360 Speaker 1: a billionaire.
But he's not going to throw all of 1152 00:59:26,400 --> 00:59:28,840 Speaker 1: his money behind this whole thing. Of course, he's gonna 1153 00:59:28,880 --> 00:59:31,280 Speaker 1: put some money behind it, but he's got deep pockets. 1154 00:59:31,840 --> 00:59:33,840 Speaker 1: Is this gonna make it? Because they're talking about putting 1155 00:59:33,880 --> 00:59:37,880 Speaker 1: in a factory, it's, again, a billion-dollar facility north 1156 00:59:37,880 --> 00:59:40,360 Speaker 1: of Las Vegas. They're gonna break ground on that, I 1157 00:59:40,400 --> 00:59:41,880 Speaker 1: believe it was at the end of this month, if 1158 00:59:41,880 --> 00:59:45,680 Speaker 1: not next month. Um, so it's happening soon. Do 1159 00:59:45,720 --> 00:59:47,560 Speaker 1: you really see this thing going anywhere? Do you think 1160 00:59:47,560 --> 00:59:49,400 Speaker 1: they're gonna do it? Are they gonna come out, you know, 1161 00:59:49,600 --> 00:59:51,200 Speaker 1: the week before and say, like, well, there's been a 1162 00:59:51,200 --> 00:59:53,240 Speaker 1: delay and we're gonna do it, um, at the end 1163 00:59:53,280 --> 00:59:57,680 Speaker 1: of this year? Uh, without having seen any of the 1164 00:59:57,760 --> 01:00:01,320 Speaker 1: actual inner workings of the company, it's really hard for 1165 01:00:01,360 --> 01:00:05,440 Speaker 1: me to say. I can say I want them 1166 01:00:05,440 --> 01:00:09,480 Speaker 1: to succeed so badly simply because I think it could 1167 01:00:09,480 --> 01:00:14,000 Speaker 1: be a truly disruptive approach to the automotive industry. And 1168 01:00:14,000 --> 01:00:18,360 Speaker 1: by disruptive, I don't mean like a dangerous thing or 1169 01:00:18,680 --> 01:00:22,360 Speaker 1: a destructive thing, but rather something that shakes stuff up 1170 01:00:22,520 --> 01:00:28,520 Speaker 1: enough that we start seeing crazy innovation across all manufacturers.
1171 01:00:28,600 --> 01:00:30,840 Speaker 1: That's what I want to see, and I want to 1172 01:00:30,880 --> 01:00:33,000 Speaker 1: see, I want to see all boats rise. Like, I 1173 01:00:33,000 --> 01:00:36,800 Speaker 1: don't have a grudge against any particular company. Um, I mean, 1174 01:00:38,520 --> 01:00:41,920 Speaker 1: I definitely wagged my finger at Volkswagen for the clean 1175 01:00:41,960 --> 01:00:45,240 Speaker 1: diesel thing, but, you know, that's more of 1176 01:00:45,320 --> 01:00:50,240 Speaker 1: an emotional reaction to that scandal. But I want 1177 01:00:50,240 --> 01:00:54,280 Speaker 1: to see everybody benefit, um, and I would hate to 1178 01:00:54,320 --> 01:00:57,480 Speaker 1: see this be an example of an Internet startup that 1179 01:00:57,680 --> 01:01:01,160 Speaker 1: got a lot of hype and then collapsed in on itself. Now, 1180 01:01:01,200 --> 01:01:03,480 Speaker 1: Ben and I had a similar thought on our 1181 01:01:03,520 --> 01:01:05,520 Speaker 1: show and we talked about it on Car Stuff. We said, 1182 01:01:06,000 --> 01:01:07,920 Speaker 1: you know, we really... we truly do want 1183 01:01:07,960 --> 01:01:10,640 Speaker 1: them to succeed. This would be fantastic for development. As 1184 01:01:10,640 --> 01:01:12,200 Speaker 1: you said, you know, it would push things 1185 01:01:12,200 --> 01:01:17,160 Speaker 1: along considerably. But with all the examples, the history that 1186 01:01:17,200 --> 01:01:19,400 Speaker 1: we've had, that we've seen, that we've talked about on 1187 01:01:19,400 --> 01:01:22,080 Speaker 1: our show so many times, it's like, you 1188 01:01:22,080 --> 01:01:26,600 Speaker 1: can remain hopeful but be skeptical too. Awesome. Yes, there's 1189 01:01:27,080 --> 01:01:29,680 Speaker 1: good reason to employ critical thinking.
Sure, yeah, yeah. I 1190 01:01:29,720 --> 01:01:31,800 Speaker 1: mean, this thing may or may not end up being 1191 01:01:32,000 --> 01:01:35,800 Speaker 1: kind of vaporware, but I hope not. I'm 1192 01:01:35,800 --> 01:01:37,720 Speaker 1: really hoping for the best in this. I want a 1193 01:01:37,840 --> 01:01:42,440 Speaker 1: ride. And I mean it, Faraday Future, if you're listening, I 1194 01:01:42,520 --> 01:01:45,400 Speaker 1: want a ride. Well, that's something we didn't mention. 1195 01:01:45,520 --> 01:01:48,280 Speaker 1: That's their race car version. It's like the top 1196 01:01:48,360 --> 01:01:51,160 Speaker 1: end thing that they can produce. But the reality 1197 01:01:51,280 --> 01:01:54,320 Speaker 1: is they're road testing a vehicle right now that no 1198 01:01:54,360 --> 01:01:57,280 Speaker 1: one's seen. They've got a secret car that's out 1199 01:01:57,320 --> 01:01:59,560 Speaker 1: there testing somewhere. So if you see a car go 1200 01:01:59,680 --> 01:02:02,520 Speaker 1: by that you don't recognize and there's no marking on it, 1201 01:02:02,720 --> 01:02:04,760 Speaker 1: that might be them. It could be them. So they're 1202 01:02:04,800 --> 01:02:08,200 Speaker 1: out there somewhere testing this platform, probably somewhere 1203 01:02:08,200 --> 01:02:11,840 Speaker 1: in California or Nevada. Yeah, it seems like those states 1204 01:02:11,840 --> 01:02:14,880 Speaker 1: are, you know, a little more loose with their, you know, 1205 01:02:15,480 --> 01:02:19,640 Speaker 1: allowances, I guess. Yeah, yeah. Um, yeah. I 1206 01:02:19,960 --> 01:02:21,880 Speaker 1: would be interested to see what they're gonna come up 1207 01:02:21,920 --> 01:02:24,200 Speaker 1: with for the production vehicle, and will it kind of 1208 01:02:24,240 --> 01:02:27,480 Speaker 1: be, like, a, what's that, sad trombone moment?
You know, 1209 01:02:27,600 --> 01:02:31,200 Speaker 1: when you see it, uh, versus the race car that 1210 01:02:31,240 --> 01:02:33,400 Speaker 1: they showed at the CES show. I mean, it's 1211 01:02:33,480 --> 01:02:35,360 Speaker 1: like, you know, it's good that they did 1212 01:02:35,400 --> 01:02:38,920 Speaker 1: that to get the excitement. They promised you this, but 1213 01:02:38,960 --> 01:02:41,439 Speaker 1: they gave you this. That kind of thing. Well, kind 1214 01:02:41,480 --> 01:02:44,120 Speaker 1: of, yeah. I mean, you're gonna 1215 01:02:44,120 --> 01:02:46,480 Speaker 1: be just a little bit disappointed, like, oh man, I 1216 01:02:46,560 --> 01:02:50,480 Speaker 1: was really hoping for that single-seater, the FFZERO1 concept, 1217 01:02:50,600 --> 01:02:53,040 Speaker 1: that car that probably would have cost around three hundred 1218 01:02:53,120 --> 01:02:56,200 Speaker 1: thousand dollars. It's like a Batmobile or something. It's so... 1219 01:02:56,840 --> 01:02:59,439 Speaker 1: it's pretty wicked. It is. The people on the floor, 1220 01:02:59,480 --> 01:03:01,640 Speaker 1: they were saying they preferred to think of it as 1221 01:03:01,680 --> 01:03:03,840 Speaker 1: the car out of Minority Report. Oh yeah, okay, I 1222 01:03:03,880 --> 01:03:06,240 Speaker 1: can see that a little bit. Yeah, all right. Um, 1223 01:03:06,280 --> 01:03:09,320 Speaker 1: you know, there's another one that you wanted to mention, 1224 01:03:09,840 --> 01:03:13,080 Speaker 1: the last one, the cherry on top of this sundae. Yeah, 1225 01:03:13,160 --> 01:03:16,040 Speaker 1: and this one leaves me with my head shaking. I 1226 01:03:15,600 --> 01:03:19,640 Speaker 1: understand what this, uh, this builder does. This 1227 01:03:19,680 --> 01:03:22,400 Speaker 1: is Rinspeed. Yes, Rinspeed. And the vehicle we're 1228 01:03:22,400 --> 01:03:25,920 Speaker 1: talking about, the concept at any rate, is the Etos. Yeah.
1229 01:03:26,200 --> 01:03:29,240 Speaker 1: And also we should mention the Etos is actually a 1230 01:03:29,240 --> 01:03:32,280 Speaker 1: BMW i8-based concept car, of course. Yeah, so 1231 01:03:32,840 --> 01:03:35,640 Speaker 1: the latest greatest from BMW. But what they've done with it, 1232 01:03:36,560 --> 01:03:38,640 Speaker 1: I don't know. I don't know if I understand everything 1233 01:03:38,640 --> 01:03:43,200 Speaker 1: that they're trying to achieve with this car. But I mean, 1234 01:03:43,280 --> 01:03:45,120 Speaker 1: Rinspeed has done a lot of cool concepts in 1235 01:03:45,160 --> 01:03:47,400 Speaker 1: the past, and, you know, sure, some have 1236 01:03:47,400 --> 01:03:50,480 Speaker 1: some questionable features and things. But when I look at 1237 01:03:50,520 --> 01:03:53,040 Speaker 1: this one, the Etos, yeah, it's got a lot of 1238 01:03:53,120 --> 01:03:57,280 Speaker 1: questionable features. So, yeah, let's talk about some of them. 1239 01:03:57,320 --> 01:04:00,960 Speaker 1: For one thing, the interface that they have is 1240 01:04:01,080 --> 01:04:04,000 Speaker 1: encased in two twenty-one-and-a-half-inch curved 1241 01:04:04,240 --> 01:04:10,240 Speaker 1: four K displays. Four K resolution, Scott. I mean, granted, 1242 01:04:10,360 --> 01:04:13,880 Speaker 1: you are pretty close to those displays when you're the driver, right? 1243 01:04:13,920 --> 01:04:15,880 Speaker 1: I mean, it's not that far away from you. And 1244 01:04:15,920 --> 01:04:20,400 Speaker 1: in fact, four K resolution is insane to me because 1245 01:04:21,160 --> 01:04:23,920 Speaker 1: those displays would need to be enormous and you'd have 1246 01:04:23,960 --> 01:04:28,960 Speaker 1: to be even closer to appreciate the sharpness in that resolution. 1247 01:04:29,360 --> 01:04:32,520 Speaker 1: I can't believe four K is necessary when ten eighty 1248 01:04:32,640 --> 01:04:36,960 Speaker 1: would have sufficed. However, that's nitpicking.
So let's say that 1249 01:04:36,960 --> 01:04:41,680 Speaker 1: you're driving this crazy concept, this variation on 1250 01:04:41,720 --> 01:04:46,720 Speaker 1: the BMW i8. Uh, you're driving around and 1251 01:04:46,880 --> 01:04:48,960 Speaker 1: you've got this nice steering wheel, it's got its own 1252 01:04:49,000 --> 01:04:51,000 Speaker 1: little panel right in the middle of it, and you 1253 01:04:51,080 --> 01:04:54,720 Speaker 1: think, I'm gonna hand over control to my car. You 1254 01:04:54,800 --> 01:04:58,520 Speaker 1: press a little button, the steering wheel folds in on itself, Scott, 1255 01:04:59,200 --> 01:05:01,920 Speaker 1: and then it retracts into the dashboard so it 1256 01:05:02,000 --> 01:05:05,080 Speaker 1: goes away. So, okay, that's one of the 1257 01:05:05,120 --> 01:05:07,520 Speaker 1: things that left me shaking my head. Like, most 1258 01:05:07,560 --> 01:05:11,040 Speaker 1: of these systems, most of them recognize the idea 1259 01:05:11,120 --> 01:05:14,040 Speaker 1: that there's going to be a point when human interaction 1260 01:05:14,160 --> 01:05:16,480 Speaker 1: is probably gonna be necessary, and we're gonna alert you 1261 01:05:16,560 --> 01:05:18,600 Speaker 1: with a chime or lights, as you said on the 1262 01:05:19,560 --> 01:05:23,320 Speaker 1: steering wheel of, again, the BMW. Right. Okay, 1263 01:05:23,720 --> 01:05:26,840 Speaker 1: most manufacturers still get that, that, you know, there's going 1264 01:05:26,880 --> 01:05:28,280 Speaker 1: to be a point where you're gonna want to take 1265 01:05:28,280 --> 01:05:30,720 Speaker 1: control, or the car will say, I don't know if 1266 01:05:30,720 --> 01:05:33,840 Speaker 1: I can handle this, grab the wheel. Not so 1267 01:05:33,880 --> 01:05:36,480 Speaker 1: with Rinspeed.
Yeah. So, you know, that's... I mean, I 1268 01:05:36,520 --> 01:05:39,760 Speaker 1: watched it happen in the video clip and it's 1269 01:05:39,760 --> 01:05:41,840 Speaker 1: not a fast motion for it to come back out. 1270 01:05:42,440 --> 01:05:48,880 Speaker 1: It's cool. It is, it's cool. It's interesting to watch. 1271 01:05:49,320 --> 01:05:52,000 Speaker 1: But the idea also of a folding steering wheel, that's 1272 01:05:52,000 --> 01:05:54,080 Speaker 1: another thing that makes me cringe just a little bit. 1273 01:05:54,120 --> 01:05:56,400 Speaker 1: You know, that's not a great design, really. Yeah, 1274 01:05:56,440 --> 01:05:59,600 Speaker 1: if you think of a wheel, think of it, uh, 1275 01:06:00,200 --> 01:06:03,480 Speaker 1: divide the wheel in half, and the top half and 1276 01:06:03,560 --> 01:06:07,640 Speaker 1: bottom half fold down along the column, the steering column, 1277 01:06:07,800 --> 01:06:10,360 Speaker 1: and the whole thing retracts back into the dashboard, 1278 01:06:10,440 --> 01:06:13,200 Speaker 1: so that you are freed up to read a book. 1279 01:06:13,440 --> 01:06:16,400 Speaker 1: Essentially, it looks like the passenger side 1280 01:06:16,440 --> 01:06:18,360 Speaker 1: of the vehicle at that point, really. It really does. 1281 01:06:18,440 --> 01:06:21,920 Speaker 1: And the two displays extend forward so that you can 1282 01:06:22,040 --> 01:06:25,320 Speaker 1: get more immersed in whatever, you know, entertainment is playing 1283 01:06:25,320 --> 01:06:27,000 Speaker 1: on your dashboard; you can get a lot closer to that. 1284 01:06:27,040 --> 01:06:30,480 Speaker 1: Four K is appreciated a lot more when 1285 01:06:30,480 --> 01:06:32,800 Speaker 1: it approaches you by five inches, when it's 1286 01:06:32,880 --> 01:06:36,400 Speaker 1: eight inches away from your eye.
So, uh, the 1287 01:06:36,720 --> 01:06:38,880 Speaker 1: other thing, now, this just kind of made me laugh 1288 01:06:38,880 --> 01:06:40,840 Speaker 1: in the video. I don't know why this was 1289 01:06:40,920 --> 01:06:43,200 Speaker 1: featured in here, and there's a couple of things more, 1290 01:06:43,240 --> 01:06:46,760 Speaker 1: but, um, when they were showing the autonomous capability... Yeah, 1291 01:06:46,960 --> 01:06:49,680 Speaker 1: they had the car at a track in Spain, a 1292 01:06:49,800 --> 01:06:53,840 Speaker 1: racetrack in Spain, and they clearly had a time, you know, 1293 01:06:53,920 --> 01:06:57,520 Speaker 1: a lap time, a lap time from a human driver, 1294 01:06:57,960 --> 01:06:59,480 Speaker 1: on the screen. So it's got, you know, an 1295 01:06:59,520 --> 01:07:02,640 Speaker 1: interface that allows you to record lap times, and it's really cool, 1296 01:07:02,680 --> 01:07:05,000 Speaker 1: a map of the track and everything, you know, 1297 01:07:05,080 --> 01:07:06,800 Speaker 1: kind of a best time with a little award, you know, 1298 01:07:07,280 --> 01:07:10,080 Speaker 1: a silver cup or whatever. It's almost equivalent to, like, a 1299 01:07:10,200 --> 01:07:12,320 Speaker 1: video game, except this is a real car that you're 1300 01:07:12,360 --> 01:07:14,480 Speaker 1: driving around a real track. Yeah. Sure, and we've seen 1301 01:07:14,520 --> 01:07:16,320 Speaker 1: that stuff, you know, in other cars. We get that, 1302 01:07:16,400 --> 01:07:18,120 Speaker 1: you know, they've got stuff like that, you know, they 1303 01:07:18,120 --> 01:07:21,760 Speaker 1: can measure times and speeds and all that. But this 1304 01:07:21,800 --> 01:07:24,800 Speaker 1: one was showing, like, okay, I'm gonna let 1305 01:07:24,800 --> 01:07:28,120 Speaker 1: the autonomous, uh, feature take over and drive around this 1306 01:07:28,200 --> 01:07:31,640 Speaker 1: racetrack in Spain.
And then that's when the driver decided 1307 01:07:31,640 --> 01:07:34,600 Speaker 1: to grab a book from what looks 1308 01:07:34,600 --> 01:07:37,360 Speaker 1: like a bookshelf on the passenger side. That thing was 1309 01:07:37,360 --> 01:07:40,080 Speaker 1: always loaded with books. So while they're driving around this 1310 01:07:40,160 --> 01:07:42,520 Speaker 1: racetrack in Spain, the driver's reading a book. And I 1311 01:07:42,520 --> 01:07:44,479 Speaker 1: know that's to prove a point, that he's not touching 1312 01:07:44,480 --> 01:07:47,560 Speaker 1: the wheel, that there's no interaction there. But the car, 1313 01:07:47,640 --> 01:07:51,080 Speaker 1: of course, bested the human driver's time. It was much faster, 1314 01:07:51,520 --> 01:07:54,160 Speaker 1: or a little bit faster anyways. But I just thought 1315 01:07:54,160 --> 01:07:56,600 Speaker 1: that was so odd to see him doing that, you know, 1316 01:07:56,640 --> 01:07:58,480 Speaker 1: like, to be on a racetrack of all places. I 1317 01:07:58,480 --> 01:08:00,400 Speaker 1: can get it if you're on a boring commute or 1318 01:08:00,440 --> 01:08:05,320 Speaker 1: so... Clearly he was trying to psych out the robot 1319 01:08:05,440 --> 01:08:08,760 Speaker 1: car to show how little interest he had in 1320 01:08:08,800 --> 01:08:11,200 Speaker 1: the robot car's performance. I thought you were gonna say other 1321 01:08:11,320 --> 01:08:14,120 Speaker 1: drivers he's competing against, like, I'm just gonna let it take over, 1322 01:08:14,520 --> 01:08:17,000 Speaker 1: like, I'm just gonna read my Kafka here. 1323 01:08:17,360 --> 01:08:19,360 Speaker 1: But did you get my point, though? I mean, of all 1324 01:08:19,360 --> 01:08:21,000 Speaker 1: places, you're gonna want to see... I mean, you're just 1325 01:08:21,000 --> 01:08:22,800 Speaker 1: gonna want to pay attention and watch what's going on 1326 01:08:22,840 --> 01:08:25,240 Speaker 1: because it's exciting, it's thrilling.
It's not like you're just going, 1327 01:08:25,360 --> 01:08:30,960 Speaker 1: you know, between downtown Atlanta and home. Sure, which, let's be fair, 1328 01:08:31,080 --> 01:08:36,280 Speaker 1: could sometimes be exciting. Uh. It also uses Microsoft's Cortana 1329 01:08:36,800 --> 01:08:39,960 Speaker 1: concierge service, so this is similar to Ford and Alexa. 1330 01:08:40,600 --> 01:08:43,360 Speaker 1: Uh, Cortana, I would argue... Well, I mean, I've 1331 01:08:43,439 --> 01:08:45,920 Speaker 1: used Cortana a little bit because I've got Windows ten 1332 01:08:46,080 --> 01:08:48,840 Speaker 1: on my computer at home and Cortana is incorporated into 1333 01:08:48,840 --> 01:08:51,120 Speaker 1: Windows ten. Sounds fancy. I've used it a little bit, 1334 01:08:52,080 --> 01:08:54,360 Speaker 1: but I'm still very much a keyboard and mouse kind 1335 01:08:54,360 --> 01:08:56,360 Speaker 1: of guy, so I don't tend to talk to my 1336 01:08:56,400 --> 01:08:59,800 Speaker 1: computer unless something has gone wrong and the words I'm 1337 01:08:59,800 --> 01:09:02,160 Speaker 1: saying are not words that the concierge is going to 1338 01:09:02,240 --> 01:09:04,559 Speaker 1: help me with. I talk to my laptop here at 1339 01:09:04,560 --> 01:09:07,000 Speaker 1: work all the time. And uh, wait, I'm glad it 1340 01:09:07,040 --> 01:09:09,720 Speaker 1: can't record. That's a good thing. Wait, it can't, 1341 01:09:09,760 --> 01:09:14,280 Speaker 1: can it? I don't think so, at least unless some 1342 01:09:14,400 --> 01:09:17,040 Speaker 1: malware has been put on there.
But the real reason 1343 01:09:17,120 --> 01:09:20,640 Speaker 1: we had to add this, besides all the ridiculous autonomous 1344 01:09:20,760 --> 01:09:25,720 Speaker 1: car features and the fact that, uh, Cortana is incorporated, 1345 01:09:26,040 --> 01:09:30,439 Speaker 1: is that this particular car also has a landing pad 1346 01:09:30,720 --> 01:09:34,120 Speaker 1: on the back of it for a DJI drone. Yeah, 1347 01:09:34,240 --> 01:09:37,240 Speaker 1: this was probably the biggest head scratcher for me. What 1348 01:09:37,360 --> 01:09:40,200 Speaker 1: is the point of that drone? Well, the drone can 1349 01:09:40,240 --> 01:09:43,080 Speaker 1: fly along while you're driving and take video of you 1350 01:09:43,240 --> 01:09:47,639 Speaker 1: driving your really kick ass BMW i8 concept car. Well, 1351 01:09:47,680 --> 01:09:49,559 Speaker 1: I agree, that is a cool thing to be able 1352 01:09:49,600 --> 01:09:52,160 Speaker 1: to do. I get it. You don't necessarily need to 1353 01:09:52,200 --> 01:09:55,439 Speaker 1: have a landing pad and a matching drone. I guess 1354 01:09:55,479 --> 01:09:57,400 Speaker 1: you could just get one at Toys R Us or something, 1355 01:09:57,760 --> 01:10:00,760 Speaker 1: or at the local hobby store. But they did 1356 01:10:00,760 --> 01:10:02,840 Speaker 1: show it do something that was kind of cool. Now, 1357 01:10:03,280 --> 01:10:06,080 Speaker 1: in the video, it's kind of a silly setup, right, 1358 01:10:06,120 --> 01:10:10,759 Speaker 1: I mean, it's a male and a female. The female's flying the drone 1359 01:10:10,800 --> 01:10:12,439 Speaker 1: in some kind of, it looks like they're in a warehouse 1360 01:10:12,520 --> 01:10:15,599 Speaker 1: or something. Yeah, it's a big circular room where there's 1361 01:10:15,640 --> 01:10:20,280 Speaker 1: a staircase that's on the internal curved wall 1362 01:10:20,800 --> 01:10:24,920 Speaker 1: that goes up.
So some very weird looking space where 1363 01:10:24,920 --> 01:10:26,479 Speaker 1: the car is parked in the center of it, but 1364 01:10:26,520 --> 01:10:28,599 Speaker 1: it's open to the air, you know, there's 1365 01:10:28,600 --> 01:10:31,360 Speaker 1: an open top. So the guy climbs back into the car, 1366 01:10:32,000 --> 01:10:35,280 Speaker 1: and the car alerts him, Hey, it's Valentine's Day. Wouldn't 1367 01:10:35,280 --> 01:10:37,120 Speaker 1: you like to get some roses for your wife? And 1368 01:10:37,120 --> 01:10:40,479 Speaker 1: then he just says the word roses, and that sends 1369 01:10:40,479 --> 01:10:43,200 Speaker 1: a signal to the drone that the woman is currently 1370 01:10:43,200 --> 01:10:46,759 Speaker 1: controlling with a remote control, and the drone 1371 01:10:46,840 --> 01:10:49,839 Speaker 1: ends up going into auto mode and flies off, sending 1372 01:10:49,880 --> 01:10:53,280 Speaker 1: the woman into a fit. A fit. Yes, she is 1373 01:10:53,360 --> 01:10:56,400 Speaker 1: quite upset that she has lost control of this, and 1374 01:10:56,479 --> 01:11:00,880 Speaker 1: she confronts her husband, who takes off his glasses, 1375 01:11:00,960 --> 01:11:05,080 Speaker 1: his sunglasses, puts one, uh, one stem of the sunglasses 1376 01:11:05,120 --> 01:11:08,200 Speaker 1: in his mouth and shrugs as if to say, hey, what 1377 01:11:08,240 --> 01:11:10,960 Speaker 1: are you gonna do? Things happen. And then they go 1378 01:11:11,200 --> 01:11:14,519 Speaker 1: down the stairs to get into their autonomous sports car 1379 01:11:14,560 --> 01:11:16,960 Speaker 1: of the future, when the drone returns with a bouquet 1380 01:11:17,000 --> 01:11:19,760 Speaker 1: of roses that it has picked up from a delivery service. 1381 01:11:19,800 --> 01:11:21,600 Speaker 1: It was like a, like a messenger drone.
Right, 1382 01:11:21,680 --> 01:11:26,040 Speaker 1: it went out and it sought the nearest florist, 1383 01:11:26,120 --> 01:11:27,920 Speaker 1: which was, I believe on the screen it said four 1384 01:11:27,960 --> 01:11:30,879 Speaker 1: point one miles away, something like a long, long distance, 1385 01:11:30,880 --> 01:11:33,120 Speaker 1: so it went out, got them. Or no, it was 1386 01:11:33,400 --> 01:11:35,479 Speaker 1: four minutes away. Four and a half minutes. Four and a 1387 01:11:35,479 --> 01:11:38,479 Speaker 1: half minutes, right. So he attaches a bouquet of roses 1388 01:11:38,800 --> 01:11:41,719 Speaker 1: and it brings them right back to her, and everything 1389 01:11:41,800 --> 01:11:44,000 Speaker 1: is just smoothed over. At that point she gives 1390 01:11:44,080 --> 01:11:46,719 Speaker 1: him a hug, her one foot comes off the ground, 1391 01:11:46,800 --> 01:11:50,479 Speaker 1: you know, in the classic hug pose. Truly, I 1392 01:11:50,520 --> 01:11:54,160 Speaker 1: had a tear at that point. I felt something. Um, yeah, 1393 01:11:54,200 --> 01:11:57,160 Speaker 1: it was. It was a funny video 1394 01:11:57,200 --> 01:12:00,680 Speaker 1: to watch. I actually really enjoyed it. Um. I mean, 1395 01:12:01,680 --> 01:12:05,599 Speaker 1: I think ridiculous is probably too strong a word. 1396 01:12:05,800 --> 01:12:10,360 Speaker 1: It's certainly whimsical. Let's say whimsical. That's nice. It's 1397 01:12:10,520 --> 01:12:13,040 Speaker 1: worth watching. It's interesting to see, 1398 01:12:13,479 --> 01:12:16,200 Speaker 1: and also as proof of concept of certain technologies, it's 1399 01:12:16,240 --> 01:12:19,080 Speaker 1: really cool. I mean, I agree with you.
I 1400 01:12:19,120 --> 01:12:22,160 Speaker 1: don't think that the autonomous approach, as awesome as 1401 01:12:22,200 --> 01:12:25,959 Speaker 1: this automated thing where the steering column retracts into the dashboard, 1402 01:12:26,000 --> 01:12:28,360 Speaker 1: as awesome, as visual as that is, I don't think 1403 01:12:28,360 --> 01:12:31,040 Speaker 1: it's terribly practical. For one thing, it's a couple of 1404 01:12:31,080 --> 01:12:33,840 Speaker 1: different points of failure in a system that doesn't need 1405 01:12:33,920 --> 01:12:37,160 Speaker 1: any points of failure. Right. If anything goes wrong, and 1406 01:12:37,200 --> 01:12:41,320 Speaker 1: then you switch to manual and some connection has 1407 01:12:41,320 --> 01:12:45,040 Speaker 1: been severed as a result of this process, that's bad. 1408 01:12:45,920 --> 01:12:49,080 Speaker 1: But I did think that the whole thing was very 1409 01:12:49,120 --> 01:12:52,839 Speaker 1: fun and I like the approach. Like, the forward thinking 1410 01:12:54,160 --> 01:12:59,080 Speaker 1: nature of the video was really cool. And forward thinking, 1411 01:12:59,120 --> 01:13:01,400 Speaker 1: it's like that tattoo you can see on your neck, you know, 1412 01:13:02,520 --> 01:13:05,599 Speaker 1: that's how they track me. Um. So, at any rate, 1413 01:13:05,680 --> 01:13:07,840 Speaker 1: this was a lot of fun to talk 1414 01:13:07,880 --> 01:13:10,800 Speaker 1: to you about, the technology that was on display at CES. Yes, 1415 01:13:11,320 --> 01:13:15,040 Speaker 1: obviously very different from the underlying technology that actually makes 1416 01:13:15,040 --> 01:13:19,240 Speaker 1: cars work. You know, we're not talking necessarily about motors 1417 01:13:19,360 --> 01:13:22,160 Speaker 1: or, you know, transmissions or that kind of stuff. This 1418 01:13:22,200 --> 01:13:24,120 Speaker 1: is all add on stuff.
It's like, how can I 1419 01:13:24,160 --> 01:13:28,400 Speaker 1: make an app, uh, make that system better? And again, 1420 01:13:28,840 --> 01:13:32,719 Speaker 1: like smart TVs, I think we're finally starting to see 1421 01:13:33,479 --> 01:13:37,400 Speaker 1: that idea mature to a point where it is something 1422 01:13:37,439 --> 01:13:41,120 Speaker 1: that could be marketable, as opposed to, well, we have 1423 01:13:41,200 --> 01:13:43,880 Speaker 1: the ability to throw this in there, let's just do it. 1424 01:13:44,680 --> 01:13:48,479 Speaker 1: I hope you enjoyed that classic episode from 2016. It was 1425 01:13:48,560 --> 01:13:52,120 Speaker 1: great having Scott on the show. Scott has moved on 1426 01:13:52,160 --> 01:13:56,519 Speaker 1: to other endeavors and actually I've completely lost touch with him. 1427 01:13:56,560 --> 01:13:59,800 Speaker 1: Hope he's doing okay. Love you, buddy. But yeah, if 1428 01:13:59,840 --> 01:14:02,080 Speaker 1: you have suggestions for topics that I should cover in 1429 01:14:02,160 --> 01:14:04,760 Speaker 1: future episodes of tech Stuff, please reach out and let 1430 01:14:04,800 --> 01:14:08,040 Speaker 1: me know. You can do so by downloading the iHeart 1431 01:14:08,120 --> 01:14:11,200 Speaker 1: Radio app and navigating over to tech Stuff by using the little 1432 01:14:11,200 --> 01:14:14,639 Speaker 1: search engine. You can then leave me a voice message. 1433 01:14:14,680 --> 01:14:16,840 Speaker 1: There's a little microphone icon; if you click on that, 1434 01:14:17,280 --> 01:14:19,719 Speaker 1: you can leave a message up to thirty seconds in length. 1435 01:14:20,479 --> 01:14:23,240 Speaker 1: Or if you prefer, you can navigate on over to 1436 01:14:23,280 --> 01:14:26,000 Speaker 1: Twitter and send me a message there. The handle for 1437 01:14:26,040 --> 01:14:29,559 Speaker 1: the show is tech Stuff HSW, and I'll 1438 01:14:29,600 --> 01:14:38,600 Speaker 1: talk to you again really soon.
Tech Stuff is 1439 01:14:38,640 --> 01:14:41,760 Speaker 1: an I Heart Radio production. For more podcasts from I 1440 01:14:41,880 --> 01:14:45,480 Speaker 1: Heart Radio, visit the I Heart Radio app, Apple Podcasts, 1441 01:14:45,600 --> 01:14:47,599 Speaker 1: or wherever you listen to your favorite shows.