Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks and iHeartRadio, and I love all things tech. I mentioned a few weeks ago that I was planning a suite of episodes about driverless cars, including the history of developing them, the challenges we face in implementing them, the potential benefits that autonomous vehicles could have, and how long it might be before we see truly driverless cars deployed on a wide-scale basis, also whether or not they'll hit the road before they're ready. Now, today we're going to start that journey by talking about a time when the idea of driverless cars was mostly science fiction. Also, I think it's important to understand that the development of self-driving cars is not a linear story with a solid narrative. It's not "A leads to B leads to C." A lot of people have worked on developing the technology independently, sometimes from very different perspectives and philosophies, but all heading towards a common goal.
These projects can sometimes overlap each other in time without having any direct connection between them, so you could have independent studies happening throughout the world. So I'm just gonna cover some of the big, really important ones, but just know there were a lot of people working on this at various times throughout our history. I'm gonna stick as close to a chronological timeline as I can, but it will involve jumping around a little bit in time, just because otherwise I'd be saying, "meanwhile, blah blah blah was happening, and then so-and-so was doing this," and it would just turn into a really hackneyed kind of radio mystery. I don't want that to happen. So picking a starting point is actually pretty darned tricky too, because the technology of autonomous cars spans lots of different fields. You could go into all sorts of technologies and talk about the development and how the evolution of those technologies eventually found their way into autonomous cars, and if I dug too deep in that rabbit hole, this series would last maybe thirty episodes.
We'd be talking about the development of the automobile, we'd be talking about the development of cameras. I kind of like how Wired did it, in a piece that was published in twenty sixteen titled, quote, "A Brief History of Autonomous Vehicle Technology," end quote. And as that name suggests, that article went beyond self-driving cars, and the first item on that list actually dates all the way back to fifteen hundred or thereabouts, and a famous turtle named Leonardo. I'm sorry, Tari is telling me that's wrong. Leonardo was not actually a turtle, but apparently some sort of renowned Renaissance artist and proto-scientist. Okay, Leonardo da Vinci. Okay, all right, I was way off base there. Thanks, Tari. Anyway, the story of his invention comes to us courtesy of the Codex Atlanticus, folio eight twelve r, in which da Vinci describes the creation of a cart that would use springs and gears. The springs would store energy, and the gears and the cogs would drive the wheels and steering mechanisms, so that this cart, once wound up and quote-unquote programmed, could travel and steer on its own power.
So you would change these settings around, and that would allow the cart to not just travel across a distance, but also turn left or right, depending upon the settings that you had selected. I imagine this was largely done through various gears that, once one hit a certain point of its revolution, would engage the steering mechanism, but honestly, I don't know the full details. If da Vinci ever built one of these for realsies, the working model did not survive the passage of time, but modern enthusiasts have attempted to recreate this self-driving cart, which we now believe was intended for use in theatrical productions. It wasn't meant to do practical work like manual labor or anything like that. It was meant to be part of a show: you would have this cart go across, perhaps as an effect; maybe it's carrying some sort of piece of scenery or an actor or something. But da Vinci's drawings and designs were not step-by-step instructions, and so any recreation of this invention has to be done with a healthy dose of interpretation on the part of the builders.
They have to start kind of filling in the gaps and making, you know, best guesses. But some people, such as the staff at Wired, point to this as an early example of a vehicle that can drive itself. Though, to be fair, da Vinci's design required some humans to preset the device so that it would do what it was supposed to do. So you couldn't just, you know, tell the cart "I need you to cross upstage left." You would actually have to quote-unquote program it. Now, no discussion about the early days of driverless cars would be complete without acknowledging the accomplishments of someone known as Francis P. Houdina. He is credited in various articles as being an electrical engineer for the Army, and once he left the service, he founded a company called the Houdina Radio Control Company. I'll have more to say about Francis in a second, but let's get to the heart of the matter as far as autonomous cars go. Back in nineteen twenty-five, the Houdina Radio Control Company engaged in a major publicity stunt.
It happened in July, when a car dubbed the Linrrican Wonder, though sometimes it is credited as the American Wonder, roamed the streets of New York City without a driver behind the wheel. In fact, there was no one in the car whatsoever. There was someone standing on the sideboard of the car, the running board, so they were perched on the outside where they could reach in and grab the wheel, but no one was actually sitting in the driver's seat, and no one was apparently in control, at least not directly behind the wheel. But as the name of the company would indicate, the Linrrican Wonder was a radio-controlled car, like a toy R/C car, just instead of being a toy, it was a full-sized automobile. Circuits connecting to motors would control the movements of the gearshift, the accelerator, the brakes, the steering wheel. Actual control of the vehicle came from an apparatus inside a following car. So you'd have a car operated by a regular human driver sitting in the driver's seat.
In the passenger seat would be the operator for the radio-controlled car that would travel in front of them. So the car was not truly driverless, because the driver did exist; it's just the driver was in another vehicle. While researching this episode, I found a New York Times article that described the event, and it doesn't sound like it was a complete success. From that article, these are quotes: "A loose housing around the shaft to the steering wheel in the radio car caused the uncertain course as the procession got under way; as John Alexander of the Houdina Company, riding in the second car, applied the radio waves, the directing apparatus attached to the shaft in the other automobile failed to grasp it properly. As a result, the radio car careened from left to right down Broadway, around Columbus Circle and south on Fifth Avenue, almost running down two trucks and a milk wagon, which took to the curbs for safety. At Street, Houdina lunged for the steering wheel, but could not prevent the car from crashing into the fender of an automobile filled with cameramen.
It was at Fort Street that a crash into a fire engine was barely averted. The police advised Houdina to postpone his experiments, but after the car had been driven up Broadway, it was once more operated by radio along Central Park drives." And, uh, here's that bit about Francis Houdina I was mentioning earlier. You may have noticed that the name bears some passing resemblance to that of Harry Houdini, who was alive at that same time. He was the famous escape artist and magician. Now, according to the story, and there are at least some court documents to back this up, the post office was sometimes in the habit of delivering some of the mail meant for the Houdina Company to Houdini the magician, including bills. And Houdini was not crazy about getting the bills from some other company, and he felt that it was encroaching upon his name. So he marched on over to the Houdina Radio Company headquarters, and a scuffle broke out after he started raising a fuss.
The head of the office, a guy named George Young, filed charges against Houdini for disorderly conduct, but on the day that Houdini was to appear in court, Young failed to show up, and so all charges were dismissed. Now, there's a guy named Dean Carnegie who posted a few years ago that he had been contacted by the son of the person who called himself Francis Houdina, and that Houdina was a pseudonym. And further, he says that the son of Houdina revealed that this scuffle was all just a publicity stunt, that Houdini had thought it up and they had all worked on it together. I do find it odd to have gone through all this trouble to take on a pseudonym, establish a company, create a demonstration of a radio-controlled car, and all in an effort to set up a big PR stunt with a magician. It seems to me like that's an awful lot of trouble to go through before you get to a point where you can hold this PR stunt. But I guess if someone was going to do it, it would be Houdini.
I am highly skeptical that it was in fact a publicity stunt, only because, as I say, it's an awful lot of trouble to go through in order to do it, and it wouldn't necessarily bring good publicity, I would imagine. I just thought I would share the story because when I was doing the research, it kept popping up. Now, this Houdina radio-controlled vehicle was not the only one of its kind. There were some other people who also built full-size radio-control apparatus for cars. There were a few in the twenties and thirties, and in general people would start calling these phantom cars, because they appeared to be driven by an invisible phantom, but all of them were actually controlled by a remote driver. Those controls might not resemble those of a typical automobile, but they would end up controlling motors that would affect the automobile's regular operations. And according to various accounts, some of these would be controlled by follow cars, sometimes by someone on the street, sometimes by someone following in a low-flying aircraft. It all just depends upon the account.
In nineteen thirty-nine, at the New York World's Fair, General Motors presented an exhibit called Futurama. And no, it was not an animated series by Matt Groening; it was something else. It was a vision of the far-off future of nineteen sixty, and the General Motors exhibit was a ride, essentially. Visitors would get on these cars, these chair cars, and be pulled through this exhibit, where there was an enormous scale model. One part of this vision, which encompassed lots of different thoughts about the future, was the Motorway of the Future, and the vision included a sort of driver-assist system in cars. They described an electromagnetic braking system that would engage if one car were to get too close to the car ahead of it, and cars would be traveling down lanes that would have raised walls on either side, sort of like guiding slots. So it sounded like GM was pretty sure cars would still be under the control of human beings even in the far-off future world of nineteen sixty, but there would be some automated elements that would find their way into vehicle operation.
And these days we would say that that vision of the future was pretty much on the money, except for the part about it being nineteen sixty, because it took a little longer than that. Norman Bel Geddes, who designed this exhibit for General Motors, described a couple of possible methods for controlling traffic in the future using radio waves. In one, he envisioned a system that would include numerous broadcast towers along the side of the road, and so cars would maintain contact with these broadcast towers. But in another version, he suggested the possibility of an electrical conductor embedded in the road itself for direct control, and that would be a method that a lot of different people would look into over the next few years. Today we would say that such an enormous system is probably unlikely because of the huge investment it would require and the infrastructure that would come along with it. But back in the nineteen thirties, the US highway system was still developing; it was still being built across the country, so it was probably seen as more of a possibility.
After all, we were already connecting distant parts of the country to each other, so couldn't we just go to the extra effort to wire all of that in some way? I have more to say about this version of autonomous cars and how that evolved into what we think of as an autonomous car today, but first let's take a quick break to thank our sponsor. In nineteen thirty-six, a magazine called Modern Mechanics published an article about a different method for autonomous control of a vehicle. This version would include building cars that had special photocells on them to detect specific frequencies of light. The car itself would project this light; it would have a projector of some sort on the front of the car, and then in the road itself would be steel mirrors that could reflect the light back at the car, and the photocells would pick up that light. The article pointed out that cars were already approaching the limits of human reflexes, this idea that we were starting to drive faster than we could react.
Um, I guess, uh, you know, good old Jack of the Pork Chop Express would say he never, he never drives faster than he can see. But we know that human reflexes have their limits. So this article mainly focused on the link between the photocells and the steering mechanism for the car. That leaves a lot of questions unanswered, such as how the car would accelerate or brake. Would there still be a human responsible for those operations, and the only thing that would be taken off the plate of the human driver is steering the car? One also wonders exactly how any one car would deal with the presence of other cars on the road that are similarly outfitted, right? So if you have a whole bunch of cars that use this technology, and it's dependent upon light being reflected back at them, what happens when you start getting the signals from one car picked up by another? Or what happens if you're driving on a really sunny day, or in really bad weather like fog or rain? But the point is, people were already thinking about alternatives to radio controls even in the thirties.
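At its core, that photocell scheme is a feedback loop: sense how far the car has drifted from the reflective strip, then steer against the drift. Here's a minimal toy sketch of that idea in Python; the function names, the gain value, and the numbers are my own illustrative assumptions, not anything from the 1936 article.

```python
# Toy simulation of the "follow the reflected light" steering idea:
# photocells report a signed lateral offset from the mirrored strip,
# and a simple proportional controller steers to shrink that offset.
# All names and constants here are illustrative assumptions.

def photocell_offset(car_x: float, strip_x: float) -> float:
    """Signed lateral error the photocells would report (meters)."""
    return car_x - strip_x

def steer(offset: float, gain: float = 0.5) -> float:
    """Proportional steering correction: turn against the offset."""
    return -gain * offset

def simulate(start_x: float, strip_x: float = 0.0, steps: int = 20) -> float:
    """Run the loop for a few ticks and return the final offset."""
    x = start_x
    for _ in range(steps):
        x += steer(photocell_offset(x, strip_x))  # correction each tick
    return x

final = simulate(start_x=2.0)
print(final)  # offset decays toward the strip at x = 0
```

The open questions the host raises map directly onto this loop: glare, fog, or another car's projector would corrupt the `photocell_offset` reading, and nothing here handles speed, so braking and acceleration would still fall to the human driver.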
In nineteen fifty-three, Arthur Mac Barrett created what he called a driverless vehicle. He mounted a wire in the ceiling of a warehouse, and he had a specially outfitted vehicle, in this case a towing tractor, that could follow that wire, and in later versions he would bury the wire within the floor of a facility. And he called this system the Guide-O-Matic. Now, this was not meant for cars necessarily, but it was based on similar principles to some of the more sci-fi ideas that were proposed in the nineteen thirties. So we started seeing it actually being put to use in the fifties, and in fact this kind of system is still used to this day, but obviously with much more sophisticated equipment than what was available in the fifties. In nineteen fifty-eight, Disney Studios produced a segment titled Magic Highway USA as part of The Wonderful World of Disney, and the piece included some humorous gags about what the future of driving could be, just based off jokes really, but after that was a more measured look, a more thoughtful look at the future of driving.
It still had some pretty far-out ideas in it. Now, it did include some practical stuff that has in fact come to pass. For example, the idea that signage is going to need to be larger and simpler so that motorists traveling at great speed can read and understand the road signs. In general, that did come to pass. It also predicted a road system that would be able to retain heat to keep it dry even in snowstorms. That has not happened. It does sound a lot like some of the smart road systems that were being peddled, no pun intended, around a few years ago. You may remember those: the smart highways that were supposed to be made up of photocells, solar cells essentially, and they could soak up light, generate electricity, and even be warmed so that snow and ice wouldn't form on them. Those have not really panned out so well, at least not in any widespread application.
However, some of the other bits in that Magic Highway section included predictions similar to GPS and rear-mounted cameras on cars, so we do have those, although the one that was proposed in the piece was more of a full-time rear-view camera, so instead of having a rear-view mirror, you would have a rear-view screen that you would consistently look to for information about what's going on behind the car. Some of the other predictions did not pan out in any way, shape, or form, such as tunneling by atomic energy. Yeah, the actual special suggested that we use an enormous, like, atomic ray cannon, essentially, that would melt a hole through the side of a mountain when we needed to build a tunnel for a highway. That clearly has not happened.
But the special then goes on to suggest that in the future we'll get in our family vehicles and, with a push of a few buttons, which the special depicts as physical slider controls, kind of like you would see on a stereo or a soundboard, we would select our destination, and an electronic system incorporating the vehicle and the road itself, so it would be a system that has both internal components in the car and external components in the environment, would take care of everything else. So again, this vision hinges on that sort of smart highway concept, the idea that a lot of this work is being done by the infrastructure, not just the vehicle. Disney was just one company to promote this kind of idea. America's Electric Light and Power Companies ran an advertisement in the Saturday Evening Post in the nineteen fifties with an illustration that showed the stereotypical nineteen fifties American family, depicted as it was at that time in the medium, which is to say it was a white upper-class, or upper-middle-class at least, family.
There was a father, mother, son, and daughter, 330 00:20:29,280 --> 00:20:32,680 Speaker 1: so that kind of stereotypical family. All four of those 331 00:20:32,720 --> 00:20:35,280 Speaker 1: family members are inside a car that has sort of 332 00:20:35,320 --> 00:20:37,720 Speaker 1: like that big glass bubble kind of approach, sort of 333 00:20:37,720 --> 00:20:40,240 Speaker 1: what you would see in something like the Jetsons, and 334 00:20:40,280 --> 00:20:43,760 Speaker 1: they're all facing inward toward each other. A couple of 335 00:20:43,840 --> 00:20:47,440 Speaker 1: them are playing dominoes, they're having a conversation. No one's having 336 00:20:47,440 --> 00:20:50,720 Speaker 1: to drive, right? The car itself is doing it. And 337 00:20:50,760 --> 00:20:53,320 Speaker 1: the ad talks about how the electric age will lead 338 00:20:53,320 --> 00:20:56,760 Speaker 1: to automation and efficiency in all sorts of areas, including 339 00:20:56,800 --> 00:21:00,679 Speaker 1: stuff like flat TV screens and vehicles controlled by quote 340 00:21:00,760 --> 00:21:05,800 Speaker 1: electronic devices embedded in the road end quote. Now keep 341 00:21:05,800 --> 00:21:08,000 Speaker 1: in mind again this is in the post World War 342 00:21:08,040 --> 00:21:11,760 Speaker 1: two era. This is an era in which America's industry 343 00:21:11,840 --> 00:21:16,280 Speaker 1: was a key component of national identity. It was part 344 00:21:16,359 --> 00:21:18,840 Speaker 1: of what people thought of when they were asked the 345 00:21:18,920 --> 00:21:22,160 Speaker 1: question, what is it to be American? Industry and innovation 346 00:21:22,240 --> 00:21:28,240 Speaker 1: were very much important components of that identity. And 347 00:21:28,280 --> 00:21:33,720 Speaker 1: these weren't just concepts. These weren't just artists and advertisers saying, 348 00:21:34,040 --> 00:21:36,920 Speaker 1: let's come up with some sort of wild idea.
There 349 00:21:36,920 --> 00:21:40,479 Speaker 1: were engineers who were actively building cars and test roads 350 00:21:40,760 --> 00:21:44,840 Speaker 1: to work out the actual details. Joseph Bidwell and Lawrence 351 00:21:44,840 --> 00:21:48,560 Speaker 1: Hafstad, who were researchers with General Motors, outfitted a nineteen 352 00:21:49,240 --> 00:21:51,960 Speaker 1: fifty-eight Chevrolet with pickup coils to work with a road that 353 00:21:52,040 --> 00:21:55,240 Speaker 1: had embedded electrical wire in it. The coils were connected 354 00:21:55,240 --> 00:21:57,560 Speaker 1: to motors that could adjust the car's steering so that 355 00:21:57,680 --> 00:22:00,159 Speaker 1: it could continue to follow the wire below, very much 356 00:22:00,240 --> 00:22:03,600 Speaker 1: like the guidance system that I talked about earlier. Meanwhile, 357 00:22:04,040 --> 00:22:07,359 Speaker 1: over at RCA, another smarty pants was working 358 00:22:07,400 --> 00:22:11,720 Speaker 1: on this challenge. This would be Vladimir Zworykin, whom some 359 00:22:11,880 --> 00:22:14,639 Speaker 1: of you may know as one of the pioneers 360 00:22:14,640 --> 00:22:18,439 Speaker 1: who played a really big part in the development of television. 361 00:22:18,720 --> 00:22:21,560 Speaker 1: In fact, depending upon whom you ask, it was Zworykin, 362 00:22:21,720 --> 00:22:26,199 Speaker 1: not Farnsworth, who was the pioneer of TV. But honestly, 363 00:22:26,240 --> 00:22:28,520 Speaker 1: it's a very complicated story, and I've talked about it before 364 00:22:28,520 --> 00:22:30,280 Speaker 1: on Tech Stuff, so I'm not gonna go into it here, 365 00:22:30,320 --> 00:22:35,520 Speaker 1: but back to driverless cars.
His concept included embedding circuits 366 00:22:35,520 --> 00:22:39,400 Speaker 1: in the roads that would be able to sense vehicles magnetically, 367 00:22:40,040 --> 00:22:43,359 Speaker 1: and his vision had the circuits identifying the speed and 368 00:22:43,520 --> 00:22:47,679 Speaker 1: position of vehicles, which would provide information to a centralized 369 00:22:47,680 --> 00:22:51,000 Speaker 1: system that could then send out instructions to specific cars 370 00:22:51,359 --> 00:22:54,080 Speaker 1: in order to manage traffic. And his idea turned out 371 00:22:54,119 --> 00:22:58,439 Speaker 1: to be impractical for widespread deployment for autonomous cars. However, 372 00:22:59,200 --> 00:23:02,520 Speaker 1: it did become sort of the foundation for car sensing 373 00:23:02,600 --> 00:23:06,320 Speaker 1: loops that are under many intersections. They're used to help 374 00:23:06,320 --> 00:23:09,960 Speaker 1: control traffic lights. Those loops can detect if there's 375 00:23:09,960 --> 00:23:13,280 Speaker 1: a vehicle on top of them through electromagnetic induction 376 00:23:13,720 --> 00:23:16,239 Speaker 1: and thus send a signal to the traffic lights that 377 00:23:16,560 --> 00:23:20,160 Speaker 1: they should switch over soon so that they change the 378 00:23:20,200 --> 00:23:23,960 Speaker 1: direction of traffic. That's pretty cool. A key component in 379 00:23:24,040 --> 00:23:27,080 Speaker 1: many of these concepts was that the system for control 380 00:23:27,560 --> 00:23:31,520 Speaker 1: lay outside of the vehicle itself. It required some sort 381 00:23:31,560 --> 00:23:35,680 Speaker 1: of larger centralized system to handle things, and the cars 382 00:23:35,760 --> 00:23:38,800 Speaker 1: would respond to commands from that system.
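Just to illustrate the induction-loop idea in code: a wire loop under the pavement loses a bit of inductance when a car's metal body sits over it, and the controller watches for that drop. This is only a sketch; the names, values, and thresholds here are invented for illustration, not taken from any real controller.

```python
# Sketch of an inductive-loop vehicle detector. The baseline and threshold
# values are made-up illustrative numbers, not real controller constants.

BASELINE_INDUCTANCE = 100.0  # microhenries, loop with no vehicle present
DETECT_THRESHOLD = 3.0       # drop (in microhenries) that counts as a vehicle


def vehicle_present(measured_inductance: float) -> bool:
    """A car's metal body lowers the loop's inductance; a big enough
    drop from the baseline means a vehicle is sitting over the loop."""
    return (BASELINE_INDUCTANCE - measured_inductance) >= DETECT_THRESHOLD


def light_should_switch(side_street_readings: list[float]) -> bool:
    """Ask the signal controller to change the light once a car has been
    detected for several consecutive readings (a simple debounce)."""
    if len(side_street_readings) < 3:
        return False
    return all(vehicle_present(r) for r in side_street_readings[-3:])
```

The debounce is the interesting part: a single noisy reading shouldn't flip a traffic light, so the sketch waits for three detections in a row before signaling.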
A car might 383 00:23:38,880 --> 00:23:41,520 Speaker 1: have some components aboard it to help with this, but 384 00:23:41,640 --> 00:23:44,359 Speaker 1: for the most part, the important elements were external to 385 00:23:44,400 --> 00:23:47,560 Speaker 1: the vehicle. So why was that? Why were we thinking 386 00:23:48,200 --> 00:23:53,040 Speaker 1: outside the car? Well, keep in mind that before nineteen forty-seven we 387 00:23:53,160 --> 00:23:57,160 Speaker 1: did not have transistors, so electronics were very large and bulky, 388 00:23:57,320 --> 00:24:01,520 Speaker 1: and even then the transistor was not a practical component 389 00:24:01,560 --> 00:24:04,080 Speaker 1: that you would incorporate into a finished product. So it 390 00:24:04,119 --> 00:24:07,680 Speaker 1: would be a few years before transistors would really play 391 00:24:07,720 --> 00:24:11,479 Speaker 1: an important role in consumer technology, and miniaturization was just 392 00:24:11,560 --> 00:24:14,800 Speaker 1: getting started in the fifties and sixties, so computers were 393 00:24:15,000 --> 00:24:18,320 Speaker 1: enormous machines that would take up at least a desk, 394 00:24:18,800 --> 00:24:23,200 Speaker 1: but sometimes an entire room. So driving, because it's such 395 00:24:23,200 --> 00:24:27,320 Speaker 1: a complicated task, meant it wasn't really practical to create 396 00:24:27,359 --> 00:24:30,840 Speaker 1: a fully autonomous car. The computer you would need to 397 00:24:31,000 --> 00:24:34,480 Speaker 1: calculate all the different decisions that would be made in 398 00:24:34,600 --> 00:24:37,199 Speaker 1: order to drive a vehicle would be bigger than the 399 00:24:37,280 --> 00:24:39,879 Speaker 1: car was.
It made more sense to look outside the 400 00:24:39,960 --> 00:24:42,359 Speaker 1: vehicle for the components that would be needed for a 401 00:24:42,440 --> 00:24:46,600 Speaker 1: driverless automobile and send commands to a car that would 402 00:24:46,600 --> 00:24:49,480 Speaker 1: be more like a dumb terminal would be for a supercomputer. 403 00:24:50,119 --> 00:24:54,359 Speaker 1: Experts recognized the potential for autonomous systems. In particular, many 404 00:24:54,400 --> 00:24:58,160 Speaker 1: engineers believed a good system would save lives and prevent injuries. 405 00:24:58,400 --> 00:25:01,200 Speaker 1: As we became accustomed to traveling at higher speeds, 406 00:25:01,560 --> 00:25:03,960 Speaker 1: there was a legit fear that people were driving too 407 00:25:04,080 --> 00:25:06,960 Speaker 1: quickly to be able to react safely in the event 408 00:25:07,040 --> 00:25:11,400 Speaker 1: of an emergency. In nineteen sixty, Norbert Wiener, a mathematician 409 00:25:11,400 --> 00:25:13,200 Speaker 1: at MIT, also known as the father 410 00:25:13,359 --> 00:25:17,480 Speaker 1: of cybernetics, said, quote, by the time we are able 411 00:25:17,520 --> 00:25:20,280 Speaker 1: to react to our senses and stop the car which 412 00:25:20,280 --> 00:25:23,320 Speaker 1: we are driving, it may already have run head on 413 00:25:23,400 --> 00:25:26,720 Speaker 1: into a wall end quote. He was advocating for some 414 00:25:26,760 --> 00:25:29,440 Speaker 1: sort of feedback system that could react in a fraction 415 00:25:29,440 --> 00:25:31,760 Speaker 1: of the time humans can. And he had a point.
416 00:25:32,160 --> 00:25:35,760 Speaker 1: Reaction times can average between a hundred fifty milliseconds and 417 00:25:35,840 --> 00:25:40,040 Speaker 1: three hundred milliseconds, depending upon the stimuli, and that's 418 00:25:40,520 --> 00:25:43,040 Speaker 1: pretty darn fast. A hundred fifty milliseconds is not 419 00:25:43,080 --> 00:25:45,520 Speaker 1: a lot of time, so that's a pretty fast reaction time. However, 420 00:25:46,200 --> 00:25:48,120 Speaker 1: let's say that you're driving in a car that's going 421 00:25:48,200 --> 00:25:52,000 Speaker 1: sixty miles per hour, or around ninety-seven kilometers per hour. That 422 00:25:52,080 --> 00:25:55,040 Speaker 1: means you're traveling at eighty-eight feet per second. Even if 423 00:25:55,080 --> 00:25:59,240 Speaker 1: your reflexes are on the fast side, that hundred fifty milliseconds 424 00:26:00,119 --> 00:26:03,560 Speaker 1: means you travel thirteen feet before you'd even start 425 00:26:03,600 --> 00:26:07,880 Speaker 1: to do anything. You would see something happen, 426 00:26:08,080 --> 00:26:10,600 Speaker 1: and by the time you were able to start touching 427 00:26:10,640 --> 00:26:15,280 Speaker 1: the brake, you'd have already traveled thirteen feet. Also, if 428 00:26:15,320 --> 00:26:18,600 Speaker 1: you're traveling at sixty, you're in a vehicle. The vehicle has 429 00:26:18,600 --> 00:26:21,240 Speaker 1: a pretty hefty mass. You've got a lot of inertia 430 00:26:21,359 --> 00:26:24,000 Speaker 1: to deal with, too. You're not gonna stop on a dime. 431 00:26:24,160 --> 00:26:27,320 Speaker 1: It's gonna take you time and therefore distance to stop.
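The arithmetic in that passage checks out, and it's simple enough to verify with a few lines of code. This is just the unit conversion and the reaction-distance multiplication, nothing more.

```python
# Quick check of the reaction-distance figures: 60 mph works out to
# exactly 88 feet per second, and a 150 ms reaction time at that speed
# covers about 13 feet before the driver even touches the brake.

MPH_TO_FTPS = 5280 / 3600  # feet per mile divided by seconds per hour


def reaction_distance_ft(speed_mph: float, reaction_s: float) -> float:
    """Distance covered during the driver's reaction time, in feet."""
    return speed_mph * MPH_TO_FTPS * reaction_s


speed_ftps = 60 * MPH_TO_FTPS            # 88.0 ft per second at 60 mph
fast_driver = reaction_distance_ft(60, 0.150)   # about 13.2 ft
slow_driver = reaction_distance_ft(60, 0.300)   # about 26.4 ft
```

And that's only the reaction distance; actual braking distance on top of that depends on the vehicle's mass, tires, and road surface, which is the inertia point the episode makes next.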
432 00:26:27,880 --> 00:26:31,480 Speaker 1: So the thought was if we could build out vehicles 433 00:26:31,520 --> 00:26:35,600 Speaker 1: that could react for us much more quickly than we 434 00:26:35,760 --> 00:26:40,320 Speaker 1: ourselves can react, and that these vehicles could monitor conditions 435 00:26:40,359 --> 00:26:42,920 Speaker 1: that surround the car at all times, not just what's 436 00:26:42,920 --> 00:26:45,520 Speaker 1: happening at whatever you happen to be focusing on at 437 00:26:45,560 --> 00:26:49,760 Speaker 1: that moment. Wouldn't that be great? And we will revisit 438 00:26:49,800 --> 00:26:52,160 Speaker 1: that idea and a couple of episodes. When we start 439 00:26:52,200 --> 00:26:56,120 Speaker 1: talking about the arguments for autonomous cars, we'll also talk 440 00:26:56,160 --> 00:26:59,440 Speaker 1: about the arguments against them. I've got a lot more 441 00:26:59,480 --> 00:27:02,560 Speaker 1: to say about these early concepts and autonomous cars, but 442 00:27:02,640 --> 00:27:05,560 Speaker 1: first let's take another quick break to thank our sponsor. 443 00:27:13,160 --> 00:27:15,400 Speaker 1: You know it would be great if our cars could 444 00:27:15,400 --> 00:27:18,639 Speaker 1: watch after us. But the researchers, engineers, and mechanics of 445 00:27:18,720 --> 00:27:23,159 Speaker 1: the sixties, we're running into huge design challenges and progress 446 00:27:23,280 --> 00:27:26,720 Speaker 1: was pretty slow. Money for autonomous systems was running low 447 00:27:27,000 --> 00:27:30,639 Speaker 1: as well, as the automotive industry began to dedicate funds 448 00:27:30,680 --> 00:27:34,240 Speaker 1: toward developing technology that would help mitigate human error. 
So, 449 00:27:34,280 --> 00:27:37,639 Speaker 1: in other words, the tech to take human error out 450 00:27:38,040 --> 00:27:41,800 Speaker 1: of the equation, that is, to make driverless cars, was 451 00:27:42,000 --> 00:27:46,520 Speaker 1: really complicated and beyond our ability to realize at 452 00:27:46,560 --> 00:27:51,639 Speaker 1: that time. So instead companies shifted to, well, human error 453 00:27:51,720 --> 00:27:53,600 Speaker 1: is going to happen. We can't take it out of 454 00:27:53,640 --> 00:27:56,840 Speaker 1: the equation, so let's figure out how to have human 455 00:27:56,920 --> 00:28:00,560 Speaker 1: error make the least negative impact possible. So for that 456 00:28:00,640 --> 00:28:04,200 Speaker 1: reason we saw money instead being dedicated to the development 457 00:28:04,240 --> 00:28:08,320 Speaker 1: of other technologies, stuff like seat belts, airbags, anti lock 458 00:28:08,400 --> 00:28:11,480 Speaker 1: brake systems, and it would stay that way throughout the 459 00:28:11,520 --> 00:28:14,920 Speaker 1: nineteen sixties and nineteen seventies. It really wasn't until the 460 00:28:15,000 --> 00:28:18,879 Speaker 1: nineteen eighties that we started seeing serious work and experimentation 461 00:28:18,960 --> 00:28:24,120 Speaker 1: in driverless systems going again for land based vehicles. Keep 462 00:28:24,119 --> 00:28:28,760 Speaker 1: in mind, we had had things like, you know, automated 463 00:28:28,840 --> 00:28:34,320 Speaker 1: pilots for a long time; that was a relatively simple 464 00:28:35,000 --> 00:28:39,200 Speaker 1: problem to solve compared to cars. Cars continued to be 465 00:28:39,280 --> 00:28:42,880 Speaker 1: a difficult problem. Now, one of the engineers who did 466 00:28:43,480 --> 00:28:46,320 Speaker 1: very important work in the nineteen eighties was a guy 467 00:28:46,400 --> 00:28:50,760 Speaker 1: named Ernst Dickmanns from Germany.
He ran a lab at 468 00:28:51,440 --> 00:28:56,200 Speaker 1: Bundeswehr University in Munich, Germany, and he started out 469 00:28:56,440 --> 00:29:00,880 Speaker 1: as an aerospace engineer, so super smart guy, but he 470 00:29:00,920 --> 00:29:03,920 Speaker 1: had ambitions to work on creating a way for vehicles 471 00:29:04,000 --> 00:29:06,600 Speaker 1: to be able to see their surroundings and then react 472 00:29:06,640 --> 00:29:09,440 Speaker 1: to them. His work would provide the foundation for tons 473 00:29:09,600 --> 00:29:13,959 Speaker 1: of innovation in dynamic computer vision. So in the nineteen eighties, 474 00:29:14,160 --> 00:29:16,440 Speaker 1: he and his research team took a van that was 475 00:29:16,520 --> 00:29:20,320 Speaker 1: manufactured by Mercedes Benz and began to customize it for 476 00:29:20,440 --> 00:29:24,080 Speaker 1: driverless operation. Now, according to Dickmanns, the university sort of 477 00:29:24,080 --> 00:29:26,960 Speaker 1: just let him do this because he had a reputation 478 00:29:26,960 --> 00:29:29,560 Speaker 1: for being brilliant, so they said, well, he's a smart guy, 479 00:29:29,640 --> 00:29:32,320 Speaker 1: let him do what he does. So his team refitted 480 00:29:32,320 --> 00:29:35,240 Speaker 1: this van with various systems that would be able to 481 00:29:35,280 --> 00:29:39,880 Speaker 1: control steering, acceleration, braking. They also outfitted the vehicle with 482 00:29:39,920 --> 00:29:43,719 Speaker 1: a computer system to process information and then sensors and 483 00:29:43,760 --> 00:29:48,000 Speaker 1: cameras to gather information. So you have the sensors and 484 00:29:48,040 --> 00:29:50,760 Speaker 1: cameras that bring in data, send it to a computer.
485 00:29:50,880 --> 00:29:54,240 Speaker 1: The computer processes the data and then sends commands 486 00:29:54,600 --> 00:29:58,240 Speaker 1: to the various control systems to change the behavior of 487 00:29:58,280 --> 00:30:03,440 Speaker 1: the vehicle. That's your basic concept behind the modern autonomous car. 488 00:30:04,000 --> 00:30:07,240 Speaker 1: So they incorporated technology that could detect the steering angle, 489 00:30:07,800 --> 00:30:12,880 Speaker 1: brake pressure, temperature, acceleration in both lateral and longitudinal directions, 490 00:30:12,920 --> 00:30:15,520 Speaker 1: and more. The camera was actually a pair of 491 00:30:15,560 --> 00:30:19,320 Speaker 1: cameras mounted on swivels that could move along two axes 492 00:30:19,800 --> 00:30:22,240 Speaker 1: in order to focus on specific points within the field 493 00:30:22,240 --> 00:30:25,960 Speaker 1: of view. And they called the experiment VaMoRs, 494 00:30:26,040 --> 00:30:30,000 Speaker 1: that's with a big V, big M, 495 00:30:30,040 --> 00:30:34,960 Speaker 1: big R, so alternating caps and lowercase. Sticking a 496 00:30:35,040 --> 00:30:38,840 Speaker 1: camera on a car is one thing, right, anyone can 497 00:30:38,880 --> 00:30:42,760 Speaker 1: really do that. Teaching a computer to interpret images from 498 00:30:42,800 --> 00:30:46,240 Speaker 1: that camera is another thing entirely, and in the nineteen 499 00:30:46,280 --> 00:30:49,760 Speaker 1: eighties it normally would take a computer several minutes to 500 00:30:49,880 --> 00:30:54,120 Speaker 1: analyze a single image in any meaningful way, and even 501 00:30:54,160 --> 00:30:57,840 Speaker 1: that was fairly limited compared to what we can do today. 502 00:30:57,920 --> 00:31:00,640 Speaker 1: So to be useful in a driving scenario, you needed something 503 00:31:00,760 --> 00:31:03,640 Speaker 1: dramatically better than that.
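The sense, process, command loop described there can be sketched in a few lines. To be clear, this is a toy illustration of the loop's shape, not Dickmanns' actual system; the sensor fields, the gain value, and the proportional-steering rule are all invented for the example.

```python
# Toy sketch of a sensors -> computer -> control-systems loop.
# All field names and constants here are illustrative inventions.

from dataclasses import dataclass


@dataclass
class SensorFrame:
    """One tick of sensor data coming into the computer."""
    lane_offset_m: float  # how far the car sits from lane center, meters
    speed_mps: float      # current speed, meters per second


@dataclass
class Command:
    """What the computer sends back out to the control systems."""
    steering_correction: float  # positive steers one way, negative the other
    throttle: float             # 1.0 = accelerate, 0.0 = coast


def process(frame: SensorFrame) -> Command:
    """The 'computer' step: steer back toward lane center, proportional
    to how far off we are, and hold speed below a target."""
    GAIN = 0.5
    TARGET_SPEED = 25.0  # m/s
    return Command(
        steering_correction=-GAIN * frame.lane_offset_m,
        throttle=1.0 if frame.speed_mps < TARGET_SPEED else 0.0,
    )


def control_step(frame: SensorFrame) -> Command:
    # One tick: sensors in, commands out. In a real vehicle this runs
    # continuously at a fixed rate.
    return process(frame)
```

The point is the data flow, not the control law: sensors produce a frame, the computer turns it into commands, and the actuators carry them out, exactly the three-part split the episode describes.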
A computer would have to analyze 504 00:31:03,680 --> 00:31:07,160 Speaker 1: many images per second, like ten images per second, not 505 00:31:07,360 --> 00:31:10,760 Speaker 1: one image every ten minutes. So how do you fix 506 00:31:10,880 --> 00:31:15,560 Speaker 1: that problem? That's an enormous challenge. Well, Dickmanns' solution was 507 00:31:15,640 --> 00:31:19,040 Speaker 1: to limit what the car was actually looking at, and 508 00:31:19,120 --> 00:31:22,000 Speaker 1: so he took human eyesight as kind of a source 509 00:31:22,040 --> 00:31:26,160 Speaker 1: of inspiration. You see, we're only really able to focus 510 00:31:26,160 --> 00:31:30,200 Speaker 1: on a relatively small part of our vision. Everything else 511 00:31:30,240 --> 00:31:32,760 Speaker 1: that's in our field of view is there, but it's 512 00:31:32,760 --> 00:31:36,720 Speaker 1: not really in focus. So we concentrate on whatever we 513 00:31:36,800 --> 00:31:39,880 Speaker 1: have deemed to be important at that moment. It might 514 00:31:39,920 --> 00:31:42,680 Speaker 1: be traffic ahead of us, or an incoming soccer ball 515 00:31:42,760 --> 00:31:45,000 Speaker 1: kicked at our heads, or whatever it may be. So 516 00:31:45,080 --> 00:31:49,240 Speaker 1: Dickmanns thought, hey, if I limit what the computer system 517 00:31:49,320 --> 00:31:52,800 Speaker 1: is focused on and I let it ignore everything else, 518 00:31:53,520 --> 00:31:55,880 Speaker 1: then I limit the amount of data that needs to 519 00:31:55,880 --> 00:31:59,680 Speaker 1: be processed and everything speeds up as a result. So 520 00:31:59,760 --> 00:32:03,480 Speaker 1: he focused on finding shortcuts, such as programming the computer 521 00:32:03,560 --> 00:32:07,480 Speaker 1: to only really look at stuff like road markings and 522 00:32:07,520 --> 00:32:11,200 Speaker 1: to ignore other things.
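That focus-of-attention trick is easy to illustrate: instead of scanning the whole image, scan only the small window where the road markings are expected. This is a toy sketch of the general idea, not Dickmanns' actual algorithm; the image format, window, and brightness threshold are made up for the example.

```python
# Toy region-of-interest search: only pixels inside the window are examined,
# which is what makes the per-frame work small enough to run quickly.
# Image format (list of pixel rows) and threshold are invented for the sketch.


def find_marking_in_window(image, row_range, col_range, bright=200):
    """Scan only the region of interest for a bright lane-marking pixel.
    Returns the (row, col) of the first hit, or None if nothing is found."""
    for r in range(*row_range):
        for c in range(*col_range):
            if image[r][c] >= bright:
                return (r, c)
    return None


# A 6x8 "image": mostly dark road pixels, one bright marking pixel at (4, 5).
frame = [[20] * 8 for _ in range(6)]
frame[4][5] = 255

# Search only the lower window, where the lane marking should appear.
hit = find_marking_in_window(frame, (3, 6), (2, 8))
```

The speedup in the sketch comes purely from how few pixels are touched: the window is a fraction of the frame, and everything outside it is ignored, just as the episode describes.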
His work was dedicated to creating 523 00:32:11,280 --> 00:32:14,840 Speaker 1: an early driverless system that could function on an empty 524 00:32:14,880 --> 00:32:17,680 Speaker 1: stretch of road in the early days, so it wasn't 525 00:32:17,720 --> 00:32:21,320 Speaker 1: really important to worry about other things that your average 526 00:32:21,440 --> 00:32:25,000 Speaker 1: driver would have to worry about, like other vehicles or obstacles. 527 00:32:25,360 --> 00:32:28,240 Speaker 1: He was just concentrating on how can I make a 528 00:32:28,400 --> 00:32:32,920 Speaker 1: system that will reliably follow a road without having a 529 00:32:33,000 --> 00:32:36,320 Speaker 1: driver behind the wheel? He was building the foundational blocks 530 00:32:36,440 --> 00:32:41,120 Speaker 1: at that time. Dickmanns also sped up the computation process 531 00:32:41,160 --> 00:32:44,080 Speaker 1: by limiting the need for the computer to save images, 532 00:32:44,520 --> 00:32:47,480 Speaker 1: so it was really just analyzing and responding to each 533 00:32:47,520 --> 00:32:50,920 Speaker 1: image and then conveniently kind of forgetting about them. His 534 00:32:51,000 --> 00:32:54,479 Speaker 1: techniques paid off with some early demonstrations, but they relied 535 00:32:54,520 --> 00:32:59,160 Speaker 1: heavily on predictable and reliable components like those road markings. 536 00:32:59,160 --> 00:33:02,200 Speaker 1: But if the road markings were obscured or if 537 00:33:02,200 --> 00:33:04,880 Speaker 1: they were absent, then the car would start to drift 538 00:33:04,960 --> 00:33:08,440 Speaker 1: out of its lane. It didn't know quote unquote where 539 00:33:08,440 --> 00:33:11,240 Speaker 1: it was supposed to be, and it might just continue 540 00:33:11,240 --> 00:33:15,560 Speaker 1: to wander on whatever steering direction it was in before 541 00:33:15,720 --> 00:33:18,520 Speaker 1: it lost track of the road markings.
And we're talking 542 00:33:18,560 --> 00:33:22,600 Speaker 1: about a five ton van that his team had been testing, 543 00:33:22,720 --> 00:33:26,800 Speaker 1: so that's potentially a real danger. Now, to be fair, 544 00:33:27,200 --> 00:33:30,520 Speaker 1: they were testing it on unoccupied stretches of road, so 545 00:33:31,160 --> 00:33:34,840 Speaker 1: at least the potential for a catastrophe was severely limited. 546 00:33:35,400 --> 00:33:39,400 Speaker 1: They worked on, like, the unopened stretches of the Autobahn, 547 00:33:39,520 --> 00:33:44,520 Speaker 1: for example, so it was safer than I'm 548 00:33:44,520 --> 00:33:48,360 Speaker 1: making it sound, especially since this was new road, 549 00:33:48,960 --> 00:33:53,440 Speaker 1: so the times when the road markings were not 550 00:33:53,800 --> 00:33:58,480 Speaker 1: detectable were rare. Dickmanns' work would become a part of 551 00:33:58,520 --> 00:34:02,280 Speaker 1: a huge project much bigger than just autonomous cars, called 552 00:34:02,320 --> 00:34:06,480 Speaker 1: the Eureka Framework, and Eureka is still around today, but 553 00:34:06,640 --> 00:34:09,480 Speaker 1: this was sort of a European Union kind of 554 00:34:09,520 --> 00:34:13,840 Speaker 1: thing before there was a European Union, so Eureka 555 00:34:13,920 --> 00:34:17,719 Speaker 1: is a pan-European research and development funding organization. It's 556 00:34:17,760 --> 00:34:22,359 Speaker 1: meant to make sure that European nations remain competitive with 557 00:34:22,520 --> 00:34:26,320 Speaker 1: other countries, namely countries like Japan and the United States. 558 00:34:27,000 --> 00:34:30,120 Speaker 1: For several years, Dickmanns and his team were working on 559 00:34:30,200 --> 00:34:35,720 Speaker 1: refining this driverless car technology, and that culminated in demonstrations 560 00:34:35,760 --> 00:34:41,720 Speaker 1: that happened in nineteen ninety-four, and one happened in Paris, France.
561 00:34:42,280 --> 00:34:45,239 Speaker 1: Dickmanns had a huge challenge ahead of him. Daimler, with 562 00:34:45,280 --> 00:34:48,640 Speaker 1: whom he was working, wanted his team to equip passenger 563 00:34:48,719 --> 00:34:54,520 Speaker 1: cars, Daimler passenger cars, with this driverless technology, and they 564 00:34:54,560 --> 00:34:58,720 Speaker 1: intended for the team to have one of these cars 565 00:34:58,840 --> 00:35:02,680 Speaker 1: navigate a three lane highway in public traffic, being able 566 00:35:02,719 --> 00:35:07,520 Speaker 1: to make automated lane changes and everything while carrying real, 567 00:35:07,680 --> 00:35:10,680 Speaker 1: live human passengers, and his team would only have a 568 00:35:10,719 --> 00:35:15,280 Speaker 1: couple of years to accomplish this goal before this demo. 569 00:35:15,960 --> 00:35:19,760 Speaker 1: After a brief consideration, Dickmanns agreed to this challenge 570 00:35:19,800 --> 00:35:23,640 Speaker 1: and he got to work, and in October of nineteen ninety-four, his team 571 00:35:23,680 --> 00:35:26,760 Speaker 1: picked up several important people at the Charles de Gaulle 572 00:35:26,840 --> 00:35:30,680 Speaker 1: Airport and took them to a highway and then flipped 573 00:35:30,800 --> 00:35:34,760 Speaker 1: the automobiles, not literally, but they flipped them to autonomous mode. 574 00:35:35,480 --> 00:35:38,239 Speaker 1: And both of the cars that were used in this 575 00:35:38,280 --> 00:35:41,799 Speaker 1: demonstration still had human drivers sitting in the driver's seat. 576 00:35:41,920 --> 00:35:44,480 Speaker 1: They still had their hands on the wheel, but they 577 00:35:44,520 --> 00:35:47,160 Speaker 1: weren't putting any pressure on the wheel. They weren't turning 578 00:35:47,160 --> 00:35:49,200 Speaker 1: the wheel.
They just had their hands there in case 579 00:35:49,360 --> 00:35:54,000 Speaker 1: something should happen, so they would occasionally take their hands 580 00:35:54,000 --> 00:35:56,239 Speaker 1: away from the wheel to show that the cars were 581 00:35:56,239 --> 00:35:59,279 Speaker 1: in fact driving themselves and that they were just there 582 00:35:59,360 --> 00:36:04,080 Speaker 1: for safety's sake. His team took an altered vehicle on 583 00:36:04,120 --> 00:36:07,799 Speaker 1: an autonomous trip from Bavaria to Denmark; that's more than 584 00:36:07,840 --> 00:36:11,760 Speaker 1: one thousand miles or seventeen hundred kilometers. The car reached 585 00:36:11,840 --> 00:36:14,160 Speaker 1: speeds of up to a hundred nine miles per hour 586 00:36:14,520 --> 00:36:17,800 Speaker 1: or a hundred seventy five kilometers per hour, so pretty 587 00:36:17,840 --> 00:36:21,960 Speaker 1: impressive for an autonomous car. Now, despite these remarkable achievements, 588 00:36:22,280 --> 00:36:25,480 Speaker 1: the technology was still too primitive for widespread use. It 589 00:36:25,520 --> 00:36:29,200 Speaker 1: depended heavily on predictable factors. Anything outside of that was 590 00:36:29,239 --> 00:36:33,040 Speaker 1: more of a challenge, particularly obstacle detection. They didn't build 591 00:36:33,040 --> 00:36:35,560 Speaker 1: it to be consumer friendly, so you had these 592 00:36:35,640 --> 00:36:39,799 Speaker 1: large computer systems that were inside the vehicles themselves. So 593 00:36:39,840 --> 00:36:43,120 Speaker 1: it was an exciting advancement in autonomous car technology, but 594 00:36:43,160 --> 00:36:46,279 Speaker 1: it wasn't far enough along for consumer or practical use, 595 00:36:46,800 --> 00:36:48,319 Speaker 1: and so the world would have to wait a bit 596 00:36:48,360 --> 00:36:51,319 Speaker 1: longer for tech to evolve to give autonomous cars another go.
597 00:36:52,080 --> 00:36:56,120 Speaker 1: And this is related to a concept called AI winter, 598 00:36:56,640 --> 00:36:59,560 Speaker 1: which is tied into the idea of hype cycles. And 599 00:36:59,640 --> 00:37:02,920 Speaker 1: AI winter is named that way because it's considered to 600 00:37:02,960 --> 00:37:06,719 Speaker 1: be the funding equivalent of a nuclear winter, and it 601 00:37:06,800 --> 00:37:11,000 Speaker 1: describes the time when there's a growing reluctance to fund 602 00:37:11,040 --> 00:37:15,200 Speaker 1: AI projects. Generally speaking, this is how the pattern tends 603 00:37:15,239 --> 00:37:18,880 Speaker 1: to play out. You get some super smart people making 604 00:37:18,920 --> 00:37:23,040 Speaker 1: some cool advances in artificial intelligence, and those advances may 605 00:37:23,080 --> 00:37:27,240 Speaker 1: one day have practical application in numerous technologies, but early 606 00:37:27,320 --> 00:37:31,440 Speaker 1: on we're talking about truly experimental work that's exciting but 607 00:37:31,600 --> 00:37:35,920 Speaker 1: not necessarily practical at the moment. However, word of that 608 00:37:36,040 --> 00:37:40,280 Speaker 1: work gets around. Maybe the company sponsoring the research releases 609 00:37:40,280 --> 00:37:43,440 Speaker 1: a big press release that implies breakthroughs are closer than 610 00:37:43,480 --> 00:37:46,520 Speaker 1: they really are. Maybe the media picks up the story 611 00:37:46,560 --> 00:37:50,240 Speaker 1: and they run with it. Enthusiasm among the general populace grows, 612 00:37:50,640 --> 00:37:54,320 Speaker 1: and funding gets easier to secure. 
But as time passes 613 00:37:54,680 --> 00:37:57,520 Speaker 1: and it becomes clear that in reality, these sorts of 614 00:37:57,560 --> 00:37:59,680 Speaker 1: things take a lot of time, and they take a 615 00:37:59,680 --> 00:38:01,280 Speaker 1: lot of work, and they take a lot of money 616 00:38:01,560 --> 00:38:05,800 Speaker 1: to make progress in fields like artificial intelligence, then people 617 00:38:05,880 --> 00:38:09,080 Speaker 1: get less enchanted. Typically, starting with the media, you get 618 00:38:09,120 --> 00:38:12,080 Speaker 1: these stories that are the equivalent of where's my flying car? 619 00:38:12,600 --> 00:38:15,000 Speaker 1: And the narrative changes from think about how 620 00:38:15,120 --> 00:38:18,080 Speaker 1: cool the future is going to be to why 621 00:38:18,160 --> 00:38:22,080 Speaker 1: isn't the future here already? So enthusiasm for the field drops, 622 00:38:22,239 --> 00:38:25,760 Speaker 1: and then funding drops, and that in turn sets back 623 00:38:25,840 --> 00:38:29,640 Speaker 1: the field even further, which delays any other big breakthroughs 624 00:38:29,640 --> 00:38:33,560 Speaker 1: in the process. Eventually, this part of the cycle comes 625 00:38:33,600 --> 00:38:36,359 Speaker 1: to an end if you're lucky, and then enthusiasm can 626 00:38:36,440 --> 00:38:41,520 Speaker 1: begin to build again. AI has experienced several of these cycles, 627 00:38:41,600 --> 00:38:43,800 Speaker 1: and we've also seen the same thing in other fields 628 00:38:43,800 --> 00:38:47,759 Speaker 1: as well. Virtual reality is a field that leaps to mind.
629 00:38:48,640 --> 00:38:52,080 Speaker 1: Dickman's work was really exciting, and it had even survived 630 00:38:52,120 --> 00:38:55,680 Speaker 1: one AI winter in the late nineteen eighties uh and 631 00:38:55,800 --> 00:38:58,719 Speaker 1: got all the way through the mid nineties, But at 632 00:38:58,760 --> 00:39:02,120 Speaker 1: that time the funding was really becoming scarce and his 633 00:39:02,360 --> 00:39:04,399 Speaker 1: work had really gone about as far as it could 634 00:39:04,480 --> 00:39:08,120 Speaker 1: go based upon the sophistication of technology at the time, 635 00:39:08,719 --> 00:39:10,920 Speaker 1: and so it kind of came to an end, and 636 00:39:11,000 --> 00:39:14,160 Speaker 1: his pioneer work was largely forgotten for many years. In 637 00:39:14,200 --> 00:39:17,000 Speaker 1: our next episode, we'll look at the resurgence of interest 638 00:39:17,040 --> 00:39:20,400 Speaker 1: in autonomous cars and how the US Department of Defense 639 00:39:20,440 --> 00:39:23,440 Speaker 1: got involved. But for now we're going to conclude this. 640 00:39:23,920 --> 00:39:26,600 Speaker 1: If you guys have any suggestions for future episodes of 641 00:39:26,640 --> 00:39:29,480 Speaker 1: Tech Stuff, or you've got any stories you want to 642 00:39:29,520 --> 00:39:32,719 Speaker 1: share about autonomous cars, or maybe there's some guests I 643 00:39:32,719 --> 00:39:35,160 Speaker 1: should have on the show. Anything like that, let me know. 644 00:39:35,280 --> 00:39:38,279 Speaker 1: Send me an email. The addresses tech stuff at how 645 00:39:38,360 --> 00:39:41,120 Speaker 1: stuff works dot com. You can go to tech stuff 646 00:39:41,200 --> 00:39:43,840 Speaker 1: podcast dot com. That's our website with all the information 647 00:39:43,880 --> 00:39:45,359 Speaker 1: about the show. 
and other ways to get in touch 648 00:39:45,400 --> 00:39:47,759 Speaker 1: with me. Don't forget to go to our store over 649 00:39:47,840 --> 00:39:51,720 Speaker 1: at t public dot com slash tech stuff. Every purchase 650 00:39:51,760 --> 00:39:53,640 Speaker 1: you make goes to help the show, and we greatly 651 00:39:53,680 --> 00:39:57,320 Speaker 1: appreciate it. And I'll talk to you again really soon. 652 00:40:02,800 --> 00:40:05,239 Speaker 1: For more on this and thousands of other topics, 653 00:40:05,280 --> 00:40:11,880 Speaker 1: visit how stuff works dot com.