1 00:00:04,400 --> 00:00:07,760 Speaker 1: Welcome to tech Stuff, a production from I Heart Radio. 2 00:00:11,800 --> 00:00:14,800 Speaker 1: Hey there, and welcome to tech Stuff. I'm your host, 3 00:00:14,880 --> 00:00:17,480 Speaker 1: Jonathan Strickland. I'm an executive producer with I Heart Radio 4 00:00:17,960 --> 00:00:21,119 Speaker 1: and how the tech are you today? I'm introducing a 5 00:00:21,239 --> 00:00:25,040 Speaker 1: new kind of tech Stuff episode. Now, listeners who have 6 00:00:25,120 --> 00:00:29,640 Speaker 1: been with me for a long, long time might remember 7 00:00:29,880 --> 00:00:34,840 Speaker 1: the old listener mail episodes, where I would say 8 00:00:34,960 --> 00:00:39,160 Speaker 1: "listener mail" in an extremely obnoxious way and we would 9 00:00:39,200 --> 00:00:43,560 Speaker 1: have a terrible sound effect play afterward, and we had 10 00:00:44,080 --> 00:00:48,440 Speaker 1: equal amounts of mail about listener mail, with about half 11 00:00:48,520 --> 00:00:52,440 Speaker 1: saying, "I hate it. I hate how you do it. 12 00:00:52,720 --> 00:00:55,080 Speaker 1: Stop doing it. I still want the listener mail episodes, 13 00:00:55,120 --> 00:00:57,680 Speaker 1: but stop introducing it the way you introduce it," and 14 00:00:57,720 --> 00:01:01,560 Speaker 1: the other half said, "I think it's hilarious. Keep doing it." 15 00:01:02,040 --> 00:01:05,200 Speaker 1: And eventually I just stopped, because, you know, I was 16 00:01:05,280 --> 00:01:08,720 Speaker 1: alienating like half the audience. But now we have a 17 00:01:08,720 --> 00:01:11,319 Speaker 1: new type of tech Stuff episode, which means we have 18 00:01:11,360 --> 00:01:15,919 Speaker 1: a new opportunity for a way to introduce these 19 00:01:15,959 --> 00:01:19,479 Speaker 1: types of episodes, and I'm calling this type of episode 20 00:01:20,440 --> 00:01:35,319 Speaker 1: a tech Stuff talk back. 
Now, for those of you 21 00:01:35,440 --> 00:01:38,680 Speaker 1: who have not heard, if you use the I Heart 22 00:01:38,760 --> 00:01:41,560 Speaker 1: Radio app, which is a free app you can download 23 00:01:41,560 --> 00:01:45,839 Speaker 1: to your phone and use to access the tech Stuff podcast, 24 00:01:46,160 --> 00:01:48,920 Speaker 1: you'll see there's a little microphone icon. It's both on 25 00:01:48,960 --> 00:01:51,800 Speaker 1: the main tech Stuff page and it's also on every 26 00:01:51,800 --> 00:01:55,120 Speaker 1: single episode that's in the app. If you click on 27 00:01:55,120 --> 00:01:58,480 Speaker 1: that icon, then you can leave a thirty second voice 28 00:01:58,520 --> 00:02:01,640 Speaker 1: message for us, and Tori and I are the only 29 00:02:01,720 --> 00:02:04,960 Speaker 1: two people who can listen to those messages. So you 30 00:02:04,960 --> 00:02:07,520 Speaker 1: can leave a message for the show in general, or 31 00:02:07,680 --> 00:02:10,480 Speaker 1: you can go into a specific episode and leave an 32 00:02:10,520 --> 00:02:14,320 Speaker 1: episode specific message. On the back end, we can see 33 00:02:15,120 --> 00:02:18,079 Speaker 1: what episode you clicked on and all that, so 34 00:02:18,240 --> 00:02:21,200 Speaker 1: it makes it really easy to figure out who is 35 00:02:21,240 --> 00:02:24,919 Speaker 1: talking about what. So listener Greg sent me a couple 36 00:02:25,080 --> 00:02:28,200 Speaker 1: of messages, including one that was just really friendly, saying 37 00:02:28,280 --> 00:02:30,359 Speaker 1: thank you for the show. Thank you, Greg, that was 38 00:02:30,400 --> 00:02:33,119 Speaker 1: really thoughtful of you. And he used the talk back 39 00:02:33,160 --> 00:02:35,359 Speaker 1: feature on the I Heart Radio app. So first off, 40 00:02:35,520 --> 00:02:37,880 Speaker 1: shout out to you, Greg, for being the first person 41 00:02:38,240 --> 00:02:40,919 Speaker 1: to use that feature to contact me. 
I am not 42 00:02:41,320 --> 00:02:46,680 Speaker 1: counting the test message I sent to myself. Also, 43 00:02:46,720 --> 00:02:49,440 Speaker 1: technically, using talk back gives me the right to use 44 00:02:49,600 --> 00:02:52,960 Speaker 1: the audio that is recorded. In other words, if 45 00:02:53,000 --> 00:02:56,080 Speaker 1: you leave a message, then technically, because of the 46 00:02:56,160 --> 00:02:59,760 Speaker 1: user agreements, I can use that audio. However, that's 47 00:02:59,800 --> 00:03:02,239 Speaker 1: not my style. I am not sure if Greg is 48 00:03:02,280 --> 00:03:04,640 Speaker 1: cool with me using his audio, so I'm just gonna 49 00:03:04,760 --> 00:03:07,880 Speaker 1: summarize what he asked. Now, if folks in the 50 00:03:07,880 --> 00:03:10,720 Speaker 1: future use the talkback feature and you want me to 51 00:03:10,800 --> 00:03:13,480 Speaker 1: include that audio in an episode, like you would like 52 00:03:13,560 --> 00:03:16,960 Speaker 1: it if I had that play and then answered your question, 53 00:03:17,360 --> 00:03:19,560 Speaker 1: just let me know. You can say "it's cool 54 00:03:19,560 --> 00:03:22,720 Speaker 1: to use this in your episode" or something along those lines. Now, 55 00:03:22,760 --> 00:03:25,480 Speaker 1: Greg's talk back came in before I set those expectations, 56 00:03:25,520 --> 00:03:28,119 Speaker 1: so I'm just going to assume by default he would 57 00:03:28,200 --> 00:03:31,240 Speaker 1: prefer I not include it. And that is because I 58 00:03:31,400 --> 00:03:34,960 Speaker 1: much prefer opt in systems, where you choose to be 59 00:03:35,080 --> 00:03:38,320 Speaker 1: part of it, rather than opt out systems, where you 60 00:03:38,360 --> 00:03:42,360 Speaker 1: expressly have to say, hey, don't use this. So 61 00:03:42,920 --> 00:03:46,480 Speaker 1: I would rather err on the side of opting in. Anyway. 
62 00:03:46,800 --> 00:03:49,640 Speaker 1: Greg has been listening to tech Stuff since twenty six, 63 00:03:50,400 --> 00:03:53,200 Speaker 1: and he had a couple of requests. For one, he 64 00:03:53,240 --> 00:03:56,200 Speaker 1: wanted to know if I had any direct personal experience 65 00:03:56,600 --> 00:04:00,760 Speaker 1: with riding around in Tesla vehicles, particularly with the full 66 00:04:00,800 --> 00:04:04,120 Speaker 1: self driving system turned on. And I can answer that 67 00:04:04,160 --> 00:04:07,520 Speaker 1: one right away. I do not have that experience. I 68 00:04:07,680 --> 00:04:12,200 Speaker 1: have a neighbor, my next door neighbor, who has a Tesla. 69 00:04:12,400 --> 00:04:15,600 Speaker 1: I mean, he lives literally right next to me. But 70 00:04:15,840 --> 00:04:17,960 Speaker 1: we are not in the "hey, can I hop in 71 00:04:18,000 --> 00:04:21,719 Speaker 1: your car" phase of our relationship. We're more in the "hey, 72 00:04:21,839 --> 00:04:25,920 Speaker 1: I accidentally got your mail again" phase of our relationship. 73 00:04:26,200 --> 00:04:28,839 Speaker 1: So I have not asked him to give me a 74 00:04:28,920 --> 00:04:32,040 Speaker 1: ride in his Tesla. I think that would probably 75 00:04:32,040 --> 00:04:35,120 Speaker 1: come across as very odd to him. But Greg also 76 00:04:35,160 --> 00:04:38,360 Speaker 1: wanted to know if full self driving is what Tesla 77 00:04:38,680 --> 00:04:43,000 Speaker 1: says it is. Now that question is actually more complicated 78 00:04:43,080 --> 00:04:45,720 Speaker 1: than it seems on the surface. So on the 79 00:04:45,720 --> 00:04:49,520 Speaker 1: face of it, you've got the term full self driving. Now, 80 00:04:50,160 --> 00:04:54,080 Speaker 1: to me, that implies that the vehicle does all of 81 00:04:54,120 --> 00:04:57,520 Speaker 1: the driving by itself. That's to me what full self 82 00:04:57,600 --> 00:05:01,960 Speaker 1: driving seems to imply. 
So if you told me that 83 00:05:02,080 --> 00:05:06,160 Speaker 1: this is a full self driving car, I would assume 84 00:05:06,480 --> 00:05:09,039 Speaker 1: that you were talking about at least a level three 85 00:05:09,160 --> 00:05:13,600 Speaker 1: autonomous vehicle. Now, as a reminder, there are six levels 86 00:05:13,680 --> 00:05:17,320 Speaker 1: of autonomous driving, as agreed upon by the Society of 87 00:05:17,400 --> 00:05:24,320 Speaker 1: Automotive Engineers. Levels zero through two cover driver assist features, 88 00:05:24,400 --> 00:05:26,880 Speaker 1: and they all require that a human behind the wheel 89 00:05:27,200 --> 00:05:31,520 Speaker 1: continue to monitor the driving environment at all times. That 90 00:05:31,680 --> 00:05:35,359 Speaker 1: is where Tesla's full self driving mode actually falls. It 91 00:05:35,440 --> 00:05:38,880 Speaker 1: falls in level two under some situations and level one 92 00:05:39,160 --> 00:05:43,560 Speaker 1: under other situations. A level three autonomous vehicle would do 93 00:05:43,760 --> 00:05:48,200 Speaker 1: all the monitoring of its environment by itself. Human override 94 00:05:48,200 --> 00:05:51,159 Speaker 1: would still be possible in a level three autonomous vehicle, 95 00:05:51,320 --> 00:05:54,080 Speaker 1: and in fact it might be required for edge cases 96 00:05:54,120 --> 00:05:58,880 Speaker 1: that pop up during driving or certain conditions. So this 97 00:05:58,960 --> 00:06:01,919 Speaker 1: means that a level three autonomous vehicle would still 98 00:06:02,080 --> 00:06:04,599 Speaker 1: need controls, like you would still need to have a 99 00:06:04,640 --> 00:06:07,599 Speaker 1: steering wheel and an accelerator and a brake that a 100 00:06:07,680 --> 00:06:10,960 Speaker 1: human could operate. 
If you did not have those controls, 101 00:06:11,480 --> 00:06:13,880 Speaker 1: that car would need to be level four or higher, 102 00:06:14,279 --> 00:06:17,680 Speaker 1: like level four or level five autonomous. In addition, a 103 00:06:17,800 --> 00:06:21,640 Speaker 1: level three autonomous vehicle would have conditional autonomy, so that 104 00:06:21,680 --> 00:06:26,000 Speaker 1: means it only operates in autonomous mode if all those 105 00:06:26,000 --> 00:06:30,760 Speaker 1: conditions are met. That might include things like weather conditions. 106 00:06:30,880 --> 00:06:35,080 Speaker 1: If weather is particularly nasty, like storming or something 107 00:06:35,080 --> 00:06:37,040 Speaker 1: where you have heavy, heavy rain, or maybe a very 108 00:06:37,040 --> 00:06:40,000 Speaker 1: dense fog, that might be a condition where the car 109 00:06:40,040 --> 00:06:43,919 Speaker 1: will not operate in autonomous mode. It might also 110 00:06:44,040 --> 00:06:46,560 Speaker 1: have a geofencing feature, meaning that you have to 111 00:06:46,600 --> 00:06:49,960 Speaker 1: operate within a certain radius, and if you get outside 112 00:06:49,960 --> 00:06:53,080 Speaker 1: of that radius of operation, the car will not work 113 00:06:53,160 --> 00:06:55,640 Speaker 1: in autonomous mode and you have to drive it manually. 114 00:06:56,040 --> 00:07:00,480 Speaker 1: So level three autonomy is conditional autonomy. Tesla's full self 115 00:07:00,560 --> 00:07:04,720 Speaker 1: driving mode does not get to level three yet. I 116 00:07:04,760 --> 00:07:08,240 Speaker 1: think Tesla refers to its mode as full self driving 117 00:07:08,279 --> 00:07:12,160 Speaker 1: because it handles two separate driving functions at the same 118 00:07:12,200 --> 00:07:15,840 Speaker 1: time under certain conditions. 
That is, it handles both the 119 00:07:15,880 --> 00:07:20,280 Speaker 1: acceleration and braking of the vehicle, as well as the 120 00:07:20,360 --> 00:07:24,680 Speaker 1: steering of the vehicle. A level one autonomous vehicle only 121 00:07:24,720 --> 00:07:29,040 Speaker 1: handles one driving operation at a time. It can handle acceleration 122 00:07:29,120 --> 00:07:32,760 Speaker 1: slash braking or steering, but not both. So a car 123 00:07:32,880 --> 00:07:36,560 Speaker 1: that has lane correction, for example, will guide a vehicle 124 00:07:36,760 --> 00:07:39,080 Speaker 1: that's drifting out of a lane of traffic to go 125 00:07:39,160 --> 00:07:43,800 Speaker 1: back into the lane of traffic. That's one job: steering. 126 00:07:44,360 --> 00:07:47,800 Speaker 1: Or a car that has adaptive cruise control will change 127 00:07:47,840 --> 00:07:50,560 Speaker 1: the speed of the vehicle's travel based upon the speed 128 00:07:50,640 --> 00:07:53,320 Speaker 1: of traffic around it. So if the car in front 129 00:07:53,360 --> 00:07:57,080 Speaker 1: of you slows down, your car's adaptive cruise control will 130 00:07:57,120 --> 00:08:01,360 Speaker 1: slow your car down too, automatically. Full self driving takes 131 00:08:01,400 --> 00:08:05,040 Speaker 1: on both of those functions, at least in highway operation, 132 00:08:05,600 --> 00:08:09,640 Speaker 1: and I'm guessing that's why Tesla called it full self driving: 133 00:08:09,680 --> 00:08:14,560 Speaker 1: because it controls both the acceleration and deceleration as well 134 00:08:14,600 --> 00:08:17,160 Speaker 1: as the steering. 
You as a driver don't have to 135 00:08:17,160 --> 00:08:20,880 Speaker 1: do anything when in full self driving mode other than 136 00:08:21,600 --> 00:08:25,160 Speaker 1: maintain supervision of the vehicle and be ready to intervene 137 00:08:25,320 --> 00:08:30,040 Speaker 1: at any given moment, so this is still a very 138 00:08:30,120 --> 00:08:33,880 Speaker 1: highly conditional mode. Full self driving mode is meant for 139 00:08:34,080 --> 00:08:37,240 Speaker 1: highway use. You can activate it when you're taking an 140 00:08:37,280 --> 00:08:40,160 Speaker 1: on ramp to a highway, and the full self driving 141 00:08:40,160 --> 00:08:43,160 Speaker 1: system (this part of full self driving is in 142 00:08:43,240 --> 00:08:46,080 Speaker 1: beta testing, by the way) can then 143 00:08:46,120 --> 00:08:50,360 Speaker 1: navigate your car through highway traffic, including 144 00:08:50,440 --> 00:08:53,440 Speaker 1: lane changes to go to the appropriate exit, and it 145 00:08:53,440 --> 00:08:57,120 Speaker 1: can even navigate through interchanges, so you can go from 146 00:08:57,120 --> 00:09:01,160 Speaker 1: one highway to a different highway and continue on to 147 00:09:01,280 --> 00:09:04,920 Speaker 1: the appropriate exit for wherever 148 00:09:04,960 --> 00:09:08,440 Speaker 1: you're going. At that point, you would then resume manual 149 00:09:08,559 --> 00:09:13,440 Speaker 1: steering of the vehicle. There are also some non highway 150 00:09:13,520 --> 00:09:17,000 Speaker 1: modes that you can use in full self driving. There's 151 00:09:17,000 --> 00:09:20,720 Speaker 1: an auto park feature, which means you can, you know, 152 00:09:20,840 --> 00:09:23,600 Speaker 1: have your car park itself. It will guide the Tesla 153 00:09:23,640 --> 00:09:26,319 Speaker 1: to parallel park itself or to park in a perpendicular 154 00:09:26,360 --> 00:09:29,960 Speaker 1: space automatically. 
And there's kind of the reverse of that. 155 00:09:30,080 --> 00:09:34,079 Speaker 1: There's the Summon feature, which, as you might imagine, summons 156 00:09:34,200 --> 00:09:37,200 Speaker 1: your Tesla so that your car moves to where you are, 157 00:09:37,320 --> 00:09:40,600 Speaker 1: kind of like an automated valet. And there's a mode 158 00:09:40,600 --> 00:09:43,640 Speaker 1: that will automatically identify traffic lights and stop signs, which 159 00:09:43,679 --> 00:09:46,720 Speaker 1: will guide your car to stop appropriately on city streets. 160 00:09:46,800 --> 00:09:51,400 Speaker 1: So that would allow for acceleration and deceleration on city streets, 161 00:09:51,400 --> 00:09:54,840 Speaker 1: but not steering. So as of right now, auto steer 162 00:09:55,000 --> 00:09:57,760 Speaker 1: on city streets is not an option. It is listed 163 00:09:57,880 --> 00:10:01,000 Speaker 1: as an upcoming option, but it's 164 00:10:01,040 --> 00:10:04,200 Speaker 1: not there yet. So on city streets, full self driving 165 00:10:04,240 --> 00:10:07,280 Speaker 1: really reverts down to a level one autonomous mode of 166 00:10:07,320 --> 00:10:12,760 Speaker 1: operation, because it handles acceleration and deceleration but not the steering, 167 00:10:13,200 --> 00:10:16,920 Speaker 1: and in all modes, Tesla drivers are expected to supervise 168 00:10:16,960 --> 00:10:20,960 Speaker 1: the operation of the vehicle at all times. All right, 169 00:10:21,480 --> 00:10:24,040 Speaker 1: when we come back after this quick break, I'll wrap 170 00:10:24,160 --> 00:10:35,240 Speaker 1: up with Greg's question here. 
So before the break, we 171 00:10:35,240 --> 00:10:39,800 Speaker 1: were talking about how Tesla drivers have to maintain supervision 172 00:10:39,840 --> 00:10:42,360 Speaker 1: of their vehicle even in full self driving mode, which 173 00:10:42,679 --> 00:10:46,680 Speaker 1: keeps full self driving at level two autonomous operation, at 174 00:10:46,760 --> 00:10:50,560 Speaker 1: least in highway mode. In city street mode, you're still 175 00:10:50,600 --> 00:10:53,760 Speaker 1: at level one, and Tesla, for the record, 176 00:10:53,960 --> 00:10:56,480 Speaker 1: is upfront about this. So my beef with 177 00:10:56,559 --> 00:11:01,080 Speaker 1: Tesla is that the names that it gives these features, 178 00:11:01,120 --> 00:11:04,360 Speaker 1: like autopilot, which is the more basic set of driver 179 00:11:04,480 --> 00:11:08,280 Speaker 1: assist features, and full self driving, those names, at least 180 00:11:08,320 --> 00:11:13,040 Speaker 1: to me, on the surface, create a bit of misdirection. 181 00:11:13,120 --> 00:11:15,280 Speaker 1: If you actually go 182 00:11:15,360 --> 00:11:18,800 Speaker 1: to Tesla's site and you read up on autopilot and 183 00:11:18,840 --> 00:11:21,679 Speaker 1: full self driving, you would quickly understand the limitations of 184 00:11:21,720 --> 00:11:25,760 Speaker 1: those systems and the expectations Tesla has for its drivers, 185 00:11:25,800 --> 00:11:28,680 Speaker 1: and you can probably suss out why Tesla chose those 186 00:11:28,760 --> 00:11:31,800 Speaker 1: names in the first place. But I say, on the 187 00:11:31,840 --> 00:11:36,200 Speaker 1: surface level, those terms are misleading at best. I think 188 00:11:36,200 --> 00:11:41,240 Speaker 1: they set particular expectations that aren't realistic, and I think 189 00:11:41,240 --> 00:11:45,720 Speaker 1: those unrealistic expectations have factored into some truly tragic accidents. 
190 00:11:46,440 --> 00:11:50,520 Speaker 1: So is Tesla responsible for those high profile accidents that 191 00:11:50,559 --> 00:11:54,439 Speaker 1: we have seen, some of which led to fatalities? That, 192 00:11:54,520 --> 00:11:57,439 Speaker 1: to me, is a very difficult question to answer. So, 193 00:11:57,480 --> 00:12:00,520 Speaker 1: on the one hand, the company communicates to drivers the 194 00:12:00,559 --> 00:12:04,960 Speaker 1: requirement of actively supervising the vehicle, not relying on it 195 00:12:05,000 --> 00:12:07,160 Speaker 1: as a self driving solution. They even go so far 196 00:12:07,200 --> 00:12:10,800 Speaker 1: as to outright say it is not an autonomous vehicle. 197 00:12:11,880 --> 00:12:15,520 Speaker 1: On the other hand, the marketing of these features, to me, 198 00:12:16,160 --> 00:12:20,520 Speaker 1: suggests that the vehicle is capable of handling everything by itself. 199 00:12:21,120 --> 00:12:24,839 Speaker 1: Full self driving seems to indicate that the car will 200 00:12:24,880 --> 00:12:29,720 Speaker 1: do everything for you. I don't think that Tesla should 201 00:12:29,720 --> 00:12:31,600 Speaker 1: be totally off the hook here. I do think that 202 00:12:31,640 --> 00:12:37,120 Speaker 1: Tesla drivers bear most of the responsibility for accidents. I 203 00:12:37,160 --> 00:12:39,880 Speaker 1: do not want to give the implication that I think 204 00:12:39,920 --> 00:12:44,040 Speaker 1: Tesla is fully responsible for any accident of a Tesla 205 00:12:44,160 --> 00:12:47,160 Speaker 1: vehicle that was operating in full self driving mode. I 206 00:12:47,200 --> 00:12:50,280 Speaker 1: don't think that's true. 
I think that the drivers bear 207 00:12:50,520 --> 00:12:55,280 Speaker 1: most of that responsibility, largely because Tesla does communicate what 208 00:12:55,360 --> 00:12:58,559 Speaker 1: the limitations of those modes are to drivers before they 209 00:12:58,600 --> 00:13:01,040 Speaker 1: can activate them. Whether the drivers read it or 210 00:13:01,040 --> 00:13:03,480 Speaker 1: not is another question. It can be like one of 211 00:13:03,480 --> 00:13:07,280 Speaker 1: those end user license agreements that everybody, with a few 212 00:13:07,320 --> 00:13:10,280 Speaker 1: exceptions, just skips over so that they can click the 213 00:13:10,360 --> 00:13:12,200 Speaker 1: I Accept at the end of it and move on 214 00:13:12,280 --> 00:13:15,280 Speaker 1: with their lives. Only a few people ever bother 215 00:13:15,360 --> 00:13:18,920 Speaker 1: reading those, and I worry that Tesla drivers can sometimes 216 00:13:18,960 --> 00:13:22,240 Speaker 1: be the same way with the warnings and limitations of 217 00:13:22,400 --> 00:13:26,080 Speaker 1: things like autopilot and full self driving. So it's hard 218 00:13:26,120 --> 00:13:29,959 Speaker 1: for me to remove all accountability from the company itself. 219 00:13:30,000 --> 00:13:34,480 Speaker 1: I do think Tesla bears at least some responsibility, perhaps 220 00:13:34,520 --> 00:13:39,000 Speaker 1: for poor communication of what these modes do, simply 221 00:13:39,040 --> 00:13:44,640 Speaker 1: because the names are so evocative of things that 222 00:13:44,760 --> 00:13:49,559 Speaker 1: the modes just can't do. Anyway, as to whether 223 00:13:49,840 --> 00:13:52,440 Speaker 1: full self driving does what Tesla says it can do, 224 00:13:53,679 --> 00:13:57,040 Speaker 1: I think, yeah, it does, if you actually read what 225 00:13:57,240 --> 00:14:01,120 Speaker 1: Tesla says the mode does. You have to take 226 00:14:01,160 --> 00:14:03,120 Speaker 1: that time to do it. 
And if you don't do 227 00:14:03,200 --> 00:14:06,080 Speaker 1: it and you just rely on the feeling you get 228 00:14:06,120 --> 00:14:08,280 Speaker 1: from the name itself, you are going to walk away 229 00:14:08,280 --> 00:14:11,680 Speaker 1: with the wrong impression. But at the heart of the matter, 230 00:14:11,880 --> 00:14:15,360 Speaker 1: Tesla does pretty much, you know, spell out what 231 00:14:15,520 --> 00:14:18,640 Speaker 1: the mode can and cannot do. So would I be 232 00:14:18,760 --> 00:14:23,720 Speaker 1: comfortable riding with someone who activated Tesla's full self driving mode? 233 00:14:23,760 --> 00:14:26,240 Speaker 1: Let's say I get in my next door neighbor's Tesla. 234 00:14:26,800 --> 00:14:28,920 Speaker 1: He takes me out on the highway and turns on 235 00:14:28,960 --> 00:14:32,920 Speaker 1: full self driving. Well, if I had previously ridden with 236 00:14:32,960 --> 00:14:34,560 Speaker 1: the guy and I knew that he was a really 237 00:14:34,640 --> 00:14:36,920 Speaker 1: good driver, and that he was paying attention and that 238 00:14:37,360 --> 00:14:42,280 Speaker 1: he's a responsible driver, then I'd probably feel comfortable, because 239 00:14:42,280 --> 00:14:46,239 Speaker 1: I know that he would be continuously monitoring the situation 240 00:14:46,280 --> 00:14:51,920 Speaker 1: and ready to step in should anything unexpected happen. However, 241 00:14:51,960 --> 00:14:54,400 Speaker 1: if the driver was someone that I felt was, you know, 242 00:14:55,120 --> 00:14:59,760 Speaker 1: vaguely irresponsible and was clearly leaving everything up to the 243 00:14:59,800 --> 00:15:02,400 Speaker 1: car, I absolutely would not feel comfortable at all, 244 00:15:02,520 --> 00:15:04,560 Speaker 1: and I would be questioning every decision I had made 245 00:15:04,560 --> 00:15:07,200 Speaker 1: in my life that led me to sit in that vehicle. 246 00:15:07,560 --> 00:15:10,640 Speaker 1: I would be terrified. 
So I think the last several 247 00:15:10,720 --> 00:15:15,280 Speaker 1: years have shown us that Tesla's particular approach to self 248 00:15:15,360 --> 00:15:18,520 Speaker 1: driving technology has a really long way to go. It 249 00:15:18,640 --> 00:15:22,160 Speaker 1: is tragic that in the process of learning about those 250 00:15:22,200 --> 00:15:26,240 Speaker 1: gaps in capability, people have lost their lives in accidents. 251 00:15:26,280 --> 00:15:30,640 Speaker 1: That is truly terrible. Obviously, it would be a lot 252 00:15:30,720 --> 00:15:34,960 Speaker 1: better if any company would discover limitations like that through 253 00:15:35,000 --> 00:15:39,240 Speaker 1: controlled tests that didn't endanger human lives or the lives 254 00:15:39,280 --> 00:15:41,160 Speaker 1: of, you know, other people who are on the road 255 00:15:41,400 --> 00:15:45,240 Speaker 1: and aren't even involved in the vehicle itself. You could 256 00:15:45,880 --> 00:15:48,800 Speaker 1: make the argument that the way Tesla has rolled out 257 00:15:48,800 --> 00:15:56,120 Speaker 1: the features has inspired unrealistic expectations and has created situations 258 00:15:56,120 --> 00:16:01,720 Speaker 1: where people have abused the technology. Now, that's how 259 00:16:01,800 --> 00:16:05,120 Speaker 1: I feel, but I also acknowledge that other folks might 260 00:16:05,240 --> 00:16:09,360 Speaker 1: strongly disagree with me and say that the drivers should 261 00:16:09,400 --> 00:16:14,560 Speaker 1: shoulder full responsibility for their actions. 
I don't quite feel 262 00:16:14,600 --> 00:16:17,960 Speaker 1: that way, because I feel like if someone is selling 263 00:16:18,000 --> 00:16:20,360 Speaker 1: you something and they call it one thing, and it's 264 00:16:20,440 --> 00:16:22,760 Speaker 1: only by reading the fine print that you realize it's 265 00:16:22,800 --> 00:16:27,200 Speaker 1: not exactly what they're calling it, some of 266 00:16:27,240 --> 00:16:30,800 Speaker 1: that responsibility has to fall on the vendor for 267 00:16:30,840 --> 00:16:34,200 Speaker 1: kind of misrepresenting the product. That's just how I 268 00:16:34,240 --> 00:16:38,120 Speaker 1: feel, though. That's my own opinion. But thanks to Greg 269 00:16:38,160 --> 00:16:40,840 Speaker 1: again for his request. He also sent a follow up 270 00:16:40,840 --> 00:16:43,560 Speaker 1: request asking me to do a full episode treatment of 271 00:16:43,600 --> 00:16:47,080 Speaker 1: Onkyo. I talked about how the company that created 272 00:16:47,080 --> 00:16:50,200 Speaker 1: the Onkyo brand has gone out of business. The 273 00:16:50,240 --> 00:16:53,920 Speaker 1: brand itself lives on. A few people on Twitter 274 00:16:53,960 --> 00:16:57,200 Speaker 1: have similarly asked me to do a full episode on Onkyo, 275 00:16:57,280 --> 00:16:59,520 Speaker 1: so I'm going to be tackling that very soon. Keep 276 00:16:59,560 --> 00:17:01,800 Speaker 1: an ear out for it. And if you would like 277 00:17:01,880 --> 00:17:03,960 Speaker 1: to leave me a message that would become a future 278 00:17:04,000 --> 00:17:06,800 Speaker 1: tech Stuff talk back episode, like I said, just go 279 00:17:07,040 --> 00:17:10,960 Speaker 1: download the I Heart Radio app on your smartphone and navigate 280 00:17:11,000 --> 00:17:13,520 Speaker 1: to tech Stuff. You'll see that there's that little microphone 281 00:17:13,720 --> 00:17:16,920 Speaker 1: icon both on the main page and within each episode. 
282 00:17:17,160 --> 00:17:20,200 Speaker 1: You click on that, you record your thirty second message. Remember, 283 00:17:20,280 --> 00:17:23,960 Speaker 1: let me know if you want that audio to 284 00:17:24,000 --> 00:17:28,640 Speaker 1: actually play within an episode, and then I will take 285 00:17:28,680 --> 00:17:31,600 Speaker 1: care of it from there. It's a great little feature 286 00:17:31,640 --> 00:17:34,600 Speaker 1: and I'm really enjoying how it's working out so far, 287 00:17:34,800 --> 00:17:36,719 Speaker 1: and I'm looking forward to hearing from more of you 288 00:17:36,800 --> 00:17:40,199 Speaker 1: in the future. Otherwise, if you just want to 289 00:17:40,200 --> 00:17:42,359 Speaker 1: get in touch with me on Twitter, you can still 290 00:17:42,440 --> 00:17:45,479 Speaker 1: totally do that too. The handle for the show is 291 00:17:45,680 --> 00:17:48,719 Speaker 1: tech Stuff H S W, and I'll talk to you 292 00:17:48,760 --> 00:17:58,560 Speaker 1: again really soon. Tech Stuff is an I Heart Radio production. 293 00:17:58,800 --> 00:18:01,600 Speaker 1: For more podcasts from I Heart Radio, visit the I 294 00:18:01,720 --> 00:18:04,960 Speaker 1: Heart Radio app, Apple Podcasts, or wherever you listen to 295 00:18:05,000 --> 00:18:10,440 Speaker 1: your favorite shows.