Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech, and it's time for another classic episode of TechStuff. This episode originally published in May of two thousand ten, and I wanted to publish it today, March eighth, because it's International Women's Day, and I thought, what more appropriate way to celebrate International Women's Day than by celebrating one of the most remarkable women in tech history, Ada Lovelace. So this episode has the title "Was Ada Lovelace the First Computer Programmer?", which is particularly fascinating because she lived at a time before computers. Chris Pollette and I talk about Ada, and you can hear how much we fell in love with her as we did our research. I hope you enjoy this classic episode. We got two requests in the space of a week, which is not a big surprise considering the subject of this podcast. It's a listener mail [unclear]. Yes.
Speaker 1: So this comes from Bridget and Adam. I'll read Bridget's first. Bridget is from Australia, but I'm not going to try and do an Australian accent, because whenever I do, I sound like a New Zealander who suffered massive head trauma. So here is Bridget's email: "Good day, Chris and Jonathan. I've been spending some time lately looking to inspirational people in hope of finding a suitable name for my soon-to-be-born child. Such searching brought me to Ada Lovelace, otherwise known as the Mother of Coding. I've done a little search into Ada and found that there's some discussion as to whether she deserves this moniker. Was Ada Lovelace the first computer programmer, and therefore a worthy namesake for my future daughter? Let me know what you think. Cheers, Bridget." And Adam's was: "I recently learned a little about Ada Lovelace, the first woman to write an algorithm that would be read by a computer, and thought it would make a great podcast. I love the show. Keep up the amazing work. Can you also do a show on the LHC, please? Cheers. (Insert beer clink sound here.)"
Speaker 1: Alright, Bridget and Adam, this is our podcast about Ada, not about the LHC. Um, Jonathan, we can't do this podcast. What do you mean we can't do this podcast? It's already been done. I mean, Stuff You Missed in History Class has already done a whole podcast. There's a podcast called Stuff You Missed in History Class? There is. It's wonderful. Is that the one with Katie and Sarah? It is indeed, and they talked about Ada Lovelace already. They did, and they did it really well. You know what we should do? What's that? We should just have their podcast play and we'll sign off. All right, we'll just insert their podcast here. And then... no, we can't. We can't do that. Besides, they specifically mentioned us in that podcast. Well, maybe what we should do then is talk specifically about her computer programming expertise and how she managed to do that, considering she lived in the eighteen hundreds. Yeah, you would think. She lived before computers. How could she have written a computer programmer? Program, rather. Not that she wrote a programmer. It's been a long week. Well, we're gonna do this on Friday.
Speaker 1: Actually, clearly not, but we're going to tell you how she wrote a computer program. First of all, Bridget, I'm going to get this out of the way: congratulations on your child. And also, Ada is more than worthy, I would say. In fact, I kind of fell in love with this lady the more I read about her. Actually, her first name wasn't Ada. No, it was Augusta. Augusta Ada Byron. Yeah, Augusta Ada Byron, daughter of Lord Byron the poet. Yes. She was born December eighteen fifteen, the daughter of Lord George Gordon Byron and Annabella Milbanke Byron. Um, actually, her parents married on the second of January, eighteen fifteen, but were separated by January sixth, eighteen sixteen. Yeah, so the marriage lasted a full year and a week and a half, just long enough to, uh, to have the first computer programmer born to them. Right. Um, so yes, their marriage was not a happy one, her parents'. And, uh, in fact, young Ada was never to meet her father. Once her parents were separated, she lived with her mom, and her mom had decided that Ada did not really need to have the distractions of poetry.
Speaker 1: She thought that Byron's rather unpredictable personality, let's call it that, shall we, um, was due to his romanticism and his obsession with poetry. Yeah, let's just... And Annabella, the mother, felt that such qualities were not really admirable. She didn't want her daughter to have the same sort of personality and, uh, wanton lifestyle as the father had. So, um, so she thought, well, what's the least poetic thing I can push my daughter into? I happen to be an amateur mathematician. Let's push her into mathematics. Yeah. Actually, uh, as a matter of fact, I found out that Lord Byron had referred to the wife of his very brief marriage, um, he called Annabella the Princess of Parallelograms. Yes. That was at least poetic. It was not meant to be a compliment. Nonetheless, it does illustrate that she had a mathematical bent herself. And what I find interesting is that not only were Lord Byron's poetical genes evident later in Ada's life, but she actually ended up being sort of a blend of both of her parents, as is appropriate in this case. Yeah.
Speaker 1: Yeah, and Ada herself received a wonderful, wonderful title, given to her by Charles Babbage, whom we will discuss at length in a little bit: uh, the Enchantress of Numbers, which I think is an amazing, amazing phrase and very fitting as well. So Ada grows up, um, with some of the best tutors that you can imagine. During this time she studies mathematics and is absolutely fascinated with, um, the subject of mathematics, and is incredibly adept, an amazing student. In fact, the more we researched Ada, the more I realized, anyway, that she was phenomenally more intelligent than I am. I mean, I can't even really compare, um. She was able to understand algorithms that completely baffle me, and she was able to really study them in a way that she found fascinating. I find them perplexing and maddening. She found it as having its own kind of poetry. Um.
Speaker 1: And in a way, when you think about it, well, this kind of makes sense. You know, when you really look at algorithms, we're talking about things like number theory and how the universe sort of works, like how things kind of fit together. And we express this more often than not through mathematical equations and algorithms and things of that nature. And she was able to see that kind of stuff. I can understand the underlying concept, but when you get beyond that, it just... I feel like I'm a fish out of water. Yeah, I understand. Well, let's see. Um, somebody else who was fascinated with her would be William King. Yes. He was so fascinated that he married her? Well, the first William King, who was, who was her tutor? I found this interesting, I mean, the two different William Kings. Well, no, actually, I did mean her future husband. But it was really funny, because it confused me for a second when I was doing the research.
Speaker 1: I said, William King was her tutor, and then it turns out there was a William King, not the one she married, as her tutor, who was apparently immediately feeling out of his depth as he talked to her. He realized that she had a much more innate grasp of mathematics than he did, so he, he actually bowed out very quickly. He was one of the many, many. But yeah, less than a year later, apparently, uh, Ada married the other William King, who, um, was the eighth Baron King and was made an earl in... ninety-eight? So that's when she became... Eighteen thirty-eight, dude. Sorry. So, yes, he was made an earl in eighteen thirty-eight, and that's when she became the Countess of Lovelace. Yes. So she's usually just referred to as Ada Lovelace. Uh, now, Ada continued her, her almost-obsession with mathematics throughout her life, which unfortunately was tragically short. Ada passed away after contracting cancer. Um, I think she was thirty-seven. It was eighteen fifty-two; she died in November of eighteen fifty-two. Right.
Speaker 1: Uh, so, but during her life she ended up encountering lots of remarkable people, including, you know, the author Charles Dickens, who became a close friend. One of her other friends was Charles Babbage. Yeah, she met, she met Babbage, who was the Lucasian Professor of Mathematics at Cambridge. She met him when she was just seventeen, um, which is pretty interesting. Eighteen thirty-three was around when that happened. Chris and I have more to say about Ada Lovelace in just a moment, but first let's take a quick break. Should we get into the, the time when she was talking at a party with Babbage about this new machine, um, he had come up with? This thing, the Analytical Engine. Yes. All right, well, let's, let's backtrack just a touch before we get into the Analytical Engine. That was not the first machine that Babbage had proposed. No, no, not at all. As a matter of fact, we brought it up before in a podcast from the past. Right, that was a fun podcast. Um, but yes, this was a more recent one.
Speaker 1: And as we talked about on that podcast, um, Babbage was having difficulty getting funding for these amazing machines, because people just didn't get it. Babbage was able to get some funding for his first, uh, machine, which was called the Difference Engine. Yes, it was different from the other one. Right, it was a little simpler than his idea for the Analytical Engine. Right. Now, the Difference Engine, he managed to get some money to fund it, but the process of building it was a very long, laborious process. They had to actually machine these parts by hand and try and fit it all together. And, uh, he kind of ran out of money before he ran out of machine. The machine was not done yet. And, um, it was in the process of this whole construction phase that he got the idea for the analytical machine, which was even more ambitious than the Difference Engine. Right. So the Analytical Engine was going to be, uh, more complex and be able to do more than the Difference Engine, which you could kind of say was essentially a giant calculator. Right. The Analytical Engine was more like a very primitive computer.
Speaker 1: And as a matter of fact, at that time, that whole time thing, the fact that it was taking a long time to build, did not help him when he was seeking funding for the Analytical Engine. Right. There were two things, two things that kind of plagued him when he was trying to get some money. One was that he had not finished the Difference Engine, and that was kind of what he was being paid for in the first place. So his funders were saying, until you build this other machine you promised us several years ago, we ain't giving you no more money. Yeah. And then the other part of... they probably didn't say it like that. They probably said it with an English accent. It's probably, "Till you go and finish that, we ain't giving you no more money." They're apparently, apparently all played by Dick Van Dyke. So anyway, at any rate, so Babbage is in a tight spot. But he comes up with this idea of the Analytical Engine, and of course he's very passionate about it, so he's blabbering on and on about it at parties.
Speaker 1: Yes. Then you have young Ada Lovelace, who overhears such talk and thinks this sounds absolutely fascinating. And not only does she think it's interesting, she immediately sees the potential to use such a device far beyond even Babbage's, uh, concepts. Babbage is thinking, well, this would allow you to create an engine that would be able to generate the numbers that you would find in a logarithmic table. Yes. Because until that point, you pretty much had to come up with those figures by doing the calculations all yourself. And these calculations were pretty complex, and it was easy to make a mistake along the way, which would of course affect all of your figures from that point on. Uh, and he, he just, he was sitting down one day and he was thinking, what if I could... what if there were a machine, some steam-powered machine, that could generate these numbers so I wouldn't have to? And then I could, I could generate them as far out as I wanted to. Uh, and I wouldn't have to worry about error, because the machine would just be following the same algorithm over and over and over again.
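For readers curious what "the same algorithm over and over" actually was: the Difference Engine tabulated polynomial values by the method of finite differences, using nothing but repeated addition, which is exactly what a crank-driven mechanism can do. A minimal sketch in modern Python, purely our illustration (the function name `tabulate` and the x-squared example are not from the episode):

```python
def tabulate(initial_differences, steps):
    """Tabulate a polynomial the way a difference engine does.

    initial_differences is [f(0), delta-f(0), delta-squared-f(0), ...].
    Each 'turn of the crank' produces the next table value using
    additions only -- no multiplication anywhere.
    """
    diffs = list(initial_differences)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # Each difference absorbs the one below it, in order,
        # so diffs[0] becomes the next function value.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# f(x) = x^2 has initial differences 0, 1, 2:
squares = tabulate([0, 1, 2], 5)  # [0, 1, 4, 9, 16]
```

Seeded once with the right starting differences, the loop never makes an arithmetic slip, which is precisely the selling point Babbage had in mind for error-free tables.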
Speaker 1: Yes. Well, Ada thought of that, and she even went further. She said that you could potentially use mathematics to represent other things, like text, or images, or even music. She had essentially foreseen computers. This remarkable woman was able to look at this machine, which really was meant to be able to run algorithms so that you could generate more mathematical figures, mainly in the pursuit of mathematics itself and things like number theory, um, and she was able to see even grander uses. Which, to me, is, it's one of those discoveries that I just think, before that time, no one had ever really even considered this, and then she just comes up with it, just by looking at this thing and seeing its potential. Yes. That's where I'm like, okay, this woman was way above and beyond smarter than I am. All right, stop geeking out for a second. Okay, I'm sorry. I will, I will give you a quote from her. As a matter of fact, she, uh, she compared it to Jacquard's loom, which, if you will remember, we've mentioned a couple of times on the podcast.
Speaker 1: I believe. Um, this was a loom that was invented by, uh, Monsieur Jacquard, and basically made a lot of people unhappy, because it used punch cards to automate parts of the weaving process. You could put in a pattern, a card for a pattern, and the loom would be able to weave that pattern into the fabric. Well, she said, um: "We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves." So see, there you go, there's that whole poetry thing. She's, you know, that's just in there. Yeah. Well, and like I said, even if you, even if you ignore the language, and Ada was very gifted with words, just as she was with mathematics, um, the fact that she could see the poetry in, in math is, again, very phenomenal to me. Well, I meant that she was making connections between something that was completely, well, not completely, but in a wide way, it was, it was not very directly related to this analytical engine. Uh, also you might notice, um, she sort of foresaw the use of punch cards, uh, to be used for programs.
Speaker 1: So she's already thinking in a programmatic sense. Yeah. Actually, Babbage himself talked a little bit about punch cards when he wrote about his Analytical Engine. Um, yeah, and in his sense, he was talking about the use of punch cards for two purposes. And we've talked about this. Babbage also, we, we shouldn't, no, we shouldn't leave him out of this amazing innovation as well. Babbage was also amazing in his ability to foresee the future as far as computers are concerned. Now, granted, his devices were all mechanical as opposed to electrical. Yes. But the principles of electronic computing are based very firmly on Babbage's discoveries. Um, Babbage foresaw the use of punch cards, using a few different kinds of punch cards. One would be a set of instructions, and the other would represent the constants and variables of whatever formula you're plugging in. Right. So one is the program, and the other is the information that you plug into the program to get a result. Exactly. The same sort of thing that we see in microprocessors today. What Babbage was doing was, was the, the precursor to the microprocessor.
Speaker 1: It's just, his was a macro processor, because it was enormous and weighed tons and tons, if he had ever managed to actually build it. Not the size of that silicon wafer. Yeah, he never. He never did build the Analytical Engine. He did, he realized during his lifetime that it was not going to happen, and I'm sure it was a massive disappointment to him. But they have been made since. Yes, there was one created, almost like an art project, in the early nineties, and, um, I think it's in a museum now. Right? Yeah. Actually, I think, I think there may be two, to be honest. I think it's one of the things that I ran across a mention of as I was looking specifically for information about Ada Lovelace, so I didn't follow it. But yeah, I think, I think I saw that there were two in existence now that had been created just because you can. And, and Babbage actually wrote that the Analytical Engine would eventually contain an apparatus for printing on paper, or, if required, up to two copies printed out on paper. Ahead of the iPad. I'm kidding. There's software for that.
Speaker 1: It would have a means for producing a stereotype mold of the tables or results it computes, and it would have a mechanism for punching on blank pasteboard cards or metal plates the numerical results of any of its computations. So, in other words, you would read it by looking at a punch card. You would find the results of whatever it was that you were trying to, uh, to calculate. And his, his method of designating a punch card was actually pretty simple. Each punch card had, um, several columns of holes, or columns where you could punch a hole, and ten rows. And if you punched the top hole, that would be a one. If you punched the top two, that would be a two. If you punched the top three, that would be a three. So this isn't binary, you see what I'm saying. Yes. So it was a very simple way. You would look at the punch card and you would say, all right, well, the first number is a three, because the first three holes are punched. That kind of thing. That made it pretty easy to read.
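That decimal column scheme is simple enough to mimic in a few lines. This is just a sketch of the encoding as described above, with hypothetical helper names (`punch_column`, `read_column`) of our own invention:

```python
def punch_column(digit):
    """Encode one decimal digit as a ten-row card column:
    the top `digit` positions are punched (True), the rest blank."""
    assert 0 <= digit <= 9
    return [row < digit for row in range(10)]

def read_column(column):
    """Read a digit back: it is simply the count of punched holes."""
    return sum(column)

# One column per digit, so the number 305 becomes three columns:
card = [punch_column(int(d)) for d in "305"]
number = int("".join(str(read_column(col)) for col in card))  # 305
```

Note how unlike binary, where ten rows could encode 0 through 1023, this unary-per-column scheme spends ten rows on a single digit, but a human (or a simple mechanism) can read it at a glance.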
Yes, But again Babbage was 326 00:19:26,080 --> 00:19:28,240 Speaker 1: just thinking in terms of numbers. Lovelace was the 327 00:19:28,240 --> 00:19:31,840 Speaker 1: one who was thinking in terms of graphics, music, that 328 00:19:31,920 --> 00:19:34,880 Speaker 1: kind of thing. We have a bit more to say 329 00:19:34,920 --> 00:19:38,280 Speaker 1: about the enchantress of numbers, but before we get into that, 330 00:19:38,359 --> 00:19:50,760 Speaker 1: let's take another quick break. Lovelace comes up with a 331 00:19:50,840 --> 00:19:54,800 Speaker 1: kind of a test. She she writes out a program, 332 00:19:54,880 --> 00:20:01,080 Speaker 1: essentially, based on Babbage's uh design for the analytical engine. Now, 333 00:20:01,119 --> 00:20:04,040 Speaker 1: this engine again did not physically exist at this point, 334 00:20:04,320 --> 00:20:07,440 Speaker 1: in fact, it never existed during his life, and 335 00:20:07,440 --> 00:20:13,159 Speaker 1: Lovelace predeceased Babbage. So Lovelace looks at this 336 00:20:13,240 --> 00:20:17,000 Speaker 1: design and she says, you know what, let's just take 337 00:20:17,640 --> 00:20:22,879 Speaker 1: one uh one mathematical algorithm, and I will design a 338 00:20:22,960 --> 00:20:27,760 Speaker 1: program for this engine that would fulfill this algorithm. So 339 00:20:27,800 --> 00:20:32,880 Speaker 1: she decides to create a program that would generate Bernoulli numbers. 340 00:20:33,560 --> 00:20:35,640 Speaker 1: I would like to explain to you what a Bernoulli 341 00:20:35,760 --> 00:20:39,840 Speaker 1: number is. Honestly, I would like to, but I'm an 342 00:20:39,880 --> 00:20:44,439 Speaker 1: English major, and no, seriously, I looked at Bernoulli numbers.
343 00:20:44,440 --> 00:20:48,080 Speaker 1: I looked up five or six different explanations, and really 344 00:20:48,119 --> 00:20:50,960 Speaker 1: it's just a it's a it's a level of mathematics 345 00:20:51,080 --> 00:20:53,520 Speaker 1: with which I am not comfortable. So I cannot even 346 00:20:53,600 --> 00:20:59,119 Speaker 1: explain, um. They're generated through the through a simple algorithm, 347 00:20:59,160 --> 00:21:03,200 Speaker 1: relatively simple algorithm, and um, Lovelace was able to create 348 00:21:03,359 --> 00:21:06,840 Speaker 1: a program that would have generated Bernoulli numbers through the 349 00:21:06,880 --> 00:21:11,200 Speaker 1: analytic engine had it ever been built. So I would 350 00:21:11,200 --> 00:21:13,000 Speaker 1: say that, yes, you can call her the first computer 351 00:21:13,040 --> 00:21:17,159 Speaker 1: programmer definitely. So um, yeah, it's I admit it's been 352 00:21:17,200 --> 00:21:19,520 Speaker 1: a long time since, uh, since I took calculus, 353 00:21:19,680 --> 00:21:23,159 Speaker 1: more than twenty years now. But yeah, the Bernoulli numbers 354 00:21:23,160 --> 00:21:28,040 Speaker 1: were named for Jacob Bernoulli, who published, actually, the work 355 00:21:28,080 --> 00:21:31,120 Speaker 1: was published after his death. It was published in seventeen thirteen, 356 00:21:31,600 --> 00:21:34,320 Speaker 1: um, and that was in the 357 00:21:34,800 --> 00:21:37,439 Speaker 1: Ars, I hope I probably am not pronouncing this right, 358 00:21:37,640 --> 00:21:42,640 Speaker 1: Ars Conjectandi, or yeah, anyway, anyway, it was published 359 00:21:42,640 --> 00:21:45,520 Speaker 1: by Mr Bernoulli, who was one of several in his 360 00:21:45,600 --> 00:21:49,320 Speaker 1: family to work with math, um.
But the Bernoulli numbers 361 00:21:49,320 --> 00:21:51,879 Speaker 1: are very very important because they can be used in 362 00:21:51,920 --> 00:21:55,639 Speaker 1: a lot of different ways related to number theory and 363 00:21:55,680 --> 00:21:59,040 Speaker 1: trigonometric functions as well. But yes, number theory, I mean 364 00:21:59,640 --> 00:22:02,480 Speaker 1: we're talking about a lot of pure mathematics here. Yeah, 365 00:22:02,600 --> 00:22:05,160 Speaker 1: it basically has to deal with consecutive integers 366 00:22:05,240 --> 00:22:08,760 Speaker 1: and the way the sums of powers are calculated. Yeah, 367 00:22:08,840 --> 00:22:14,000 Speaker 1: I read that, and um, sure, yeah. Also I should 368 00:22:14,000 --> 00:22:16,879 Speaker 1: also point out before anyone writes in, uh, he was 369 00:22:16,920 --> 00:22:18,919 Speaker 1: not the first. He was not the only person to 370 00:22:19,040 --> 00:22:21,760 Speaker 1: discover this principle. Well, I mean this is a time 371 00:22:21,800 --> 00:22:24,960 Speaker 1: of people who were discovering things at the same time, 372 00:22:25,040 --> 00:22:28,560 Speaker 1: right at the same time. There were two different forms. 373 00:22:28,920 --> 00:22:32,320 Speaker 1: I want to say it was a Japanese scholar who 374 00:22:32,359 --> 00:22:36,480 Speaker 1: discovered it, and also his work was published after he 375 00:22:36,560 --> 00:22:40,399 Speaker 1: passed away, and it was published in seventeen twelve, one 376 00:22:40,480 --> 00:22:44,480 Speaker 1: year before. But they probably discovered it around the same time. Yeah, 377 00:22:44,520 --> 00:22:47,440 Speaker 1: because this was actually almost ten years I think after 378 00:22:47,680 --> 00:22:51,480 Speaker 1: Bernoulli's death, so it would have been concurrently, simultaneously, 379 00:22:52,080 --> 00:22:56,040 Speaker 1: concurrently. Sorry, that would have been really redundant.
Well, 380 00:22:56,400 --> 00:22:59,000 Speaker 1: it's hard to say he was first, but they were 381 00:22:59,040 --> 00:23:01,560 Speaker 1: around the same time. I'm just saying, you know, in 382 00:23:01,600 --> 00:23:05,920 Speaker 1: the immediately preceding years we have the calculus being conceived of. 383 00:23:06,000 --> 00:23:08,160 Speaker 1: It's fast. It must have been a really heavy time 384 00:23:08,200 --> 00:23:11,760 Speaker 1: for mathematicians. Oh sure, and uh so yeah, I mean 385 00:23:12,119 --> 00:23:15,320 Speaker 1: the fact that that Lovelace was able to, you know, 386 00:23:15,400 --> 00:23:18,879 Speaker 1: she she knew of course about this, um, the algorithm 387 00:23:18,880 --> 00:23:22,840 Speaker 1: to generate Bernoulli numbers and was able to program it. 388 00:23:23,680 --> 00:23:25,399 Speaker 1: You know, this this is all more or less a 389 00:23:25,400 --> 00:23:29,240 Speaker 1: thought experiment, because again, nothing existed upon which 390 00:23:29,240 --> 00:23:31,359 Speaker 1: she could run this program. But she was able to 391 00:23:31,400 --> 00:23:34,520 Speaker 1: create a program that would have generated Bernoulli numbers based 392 00:23:34,560 --> 00:23:37,480 Speaker 1: upon the way that the analytical engine would have worked. 393 00:23:38,640 --> 00:23:43,919 Speaker 1: So the fact that, one, she understood this, which all 394 00:23:44,400 --> 00:23:46,800 Speaker 1: by itself is pretty amazing to me, because I mean 395 00:23:47,040 --> 00:23:50,560 Speaker 1: in the sense that I find it completely incomprehensible. Two, 396 00:23:50,720 --> 00:23:53,440 Speaker 1: she was able to write a program for something that 397 00:23:53,560 --> 00:23:57,239 Speaker 1: only existed in theory, I mean, and she had 398 00:23:57,280 --> 00:23:59,159 Speaker 1: a lot of influence with Babbage.
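For listeners curious about the numbers themselves: Lovelace's Note G laid out the analytical engine's operations for computing them, and what follows is only a modern sketch of the same mathematical object, not her program. The Bernoulli numbers can be generated from the standard recurrence B_0 = 1 and, for m ≥ 1, the sum over j from 0 to m of C(m+1, j)·B_j equals 0 (the convention where B_1 = -1/2; the function name is invented for this example):

```python
from fractions import Fraction
from math import comb


def bernoulli_numbers(n):
    """Return [B_0, ..., B_n] as exact fractions.

    Uses the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1,
    with B_0 = 1, i.e. B_m = -(sum_{j=0}^{m-1} C(m+1, j) * B_j) / (m + 1).
    """
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / Fraction(m + 1))
    return B
```

Running this gives B_1 = -1/2, B_2 = 1/6, B_4 = -1/30, and so on, with every odd-indexed value after B_1 equal to zero, which is what ties them to the sums-of-powers formulas mentioned above.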
The two of them 399 00:23:59,240 --> 00:24:03,880 Speaker 1: together really kind of shaped the analytical engine, and they 400 00:24:03,880 --> 00:24:07,600 Speaker 1: would find errors in each other's work. So it was 401 00:24:07,640 --> 00:24:10,440 Speaker 1: like, Babbage would make mistakes because he's human and 402 00:24:10,560 --> 00:24:13,119 Speaker 1: Lovelace would find them. And sometimes Lovelace would make mistakes 403 00:24:13,119 --> 00:24:15,080 Speaker 1: and Babbage would find them. They had a very long 404 00:24:15,119 --> 00:24:20,640 Speaker 1: history of correspondence, um, and also a web comic. Yes, yeah, 405 00:24:20,680 --> 00:24:24,000 Speaker 1: we have to mention the Lovelace and Babbage web comic. 406 00:24:24,520 --> 00:24:28,160 Speaker 1: Oh, guys, do a search on the web 407 00:24:28,240 --> 00:24:31,520 Speaker 1: for the Lovelace and Babbage web comic because it is phenomenal. 408 00:24:32,359 --> 00:24:35,600 Speaker 1: I think it's a it's a very playful, tongue in 409 00:24:35,680 --> 00:24:38,600 Speaker 1: cheek tribute to these two individuals. But I think it's 410 00:24:38,640 --> 00:24:41,600 Speaker 1: also, you can tell it's it's made out of love. 411 00:24:42,280 --> 00:24:44,440 Speaker 1: I mean, that kind of effort has to go into it. 412 00:24:44,480 --> 00:24:49,720 Speaker 1: And it's great art. It's great writing. Um. It 413 00:24:49,920 --> 00:24:52,879 Speaker 1: kind of picks up on the premise of Lovelace and 414 00:24:52,920 --> 00:24:56,919 Speaker 1: Babbage becoming kind of like crime fighters, using computation 415 00:24:57,160 --> 00:25:00,560 Speaker 1: in the analytical engine to to defeat crime and 416 00:25:00,560 --> 00:25:04,199 Speaker 1: solve mysteries, which sort of sounds 417 00:25:04,680 --> 00:25:06,639 Speaker 1: like it should be a Hanna-Barbera show or something, 418 00:25:06,880 --> 00:25:09,920 Speaker 1: kind of, but the art's better.
So yeah, no, it's 419 00:25:09,960 --> 00:25:13,840 Speaker 1: really great stuff. I definitely recommend it. And you know, you 420 00:25:13,920 --> 00:25:18,239 Speaker 1: know why we got these these emails so close together, right, 421 00:25:18,560 --> 00:25:22,920 Speaker 1: why is that? It's because of Ada Lovelace Day. Ah. Yeah. Now, 422 00:25:22,960 --> 00:25:25,520 Speaker 1: the very first Ada Lovelace Day was held in 423 00:25:25,600 --> 00:25:28,719 Speaker 1: March two thousand nine, and they had another one this 424 00:25:28,800 --> 00:25:34,120 Speaker 1: year, again in March. And you can find information about Ada 425 00:25:34,160 --> 00:25:37,119 Speaker 1: Lovelace Day on Facebook, on the web in general. 426 00:25:37,560 --> 00:25:41,080 Speaker 1: There's a Twitter feed called Finding Ada, and Ada 427 00:25:41,240 --> 00:25:44,800 Speaker 1: is A-D-A, so it's all one word, findingada. Um, 428 00:25:44,840 --> 00:25:46,920 Speaker 1: and they try and get people to sign a pledge 429 00:25:47,000 --> 00:25:50,399 Speaker 1: to blog about Ada Lovelace and kind of increase public 430 00:25:50,440 --> 00:25:53,120 Speaker 1: awareness of who this woman was and what she accomplished 431 00:25:53,600 --> 00:25:59,320 Speaker 1: and how really amazing, you know, she was. And um, 432 00:25:59,440 --> 00:26:04,480 Speaker 1: if you look at contemporary records of Lovelace, uh, 433 00:26:04,520 --> 00:26:06,879 Speaker 1: it's, for me, it's a little discomforting 434 00:26:07,000 --> 00:26:11,280 Speaker 1: because it's almost dismissive.
It's like she's amazing despite 435 00:26:11,320 --> 00:26:13,840 Speaker 1: the fact that she's a woman, I mean, which is 436 00:26:13,960 --> 00:26:18,760 Speaker 1: of course indicative of the general philosophy of the era, right, 437 00:26:19,280 --> 00:26:22,760 Speaker 1: but I mean it's, you know, I ignore that because 438 00:26:23,119 --> 00:26:27,400 Speaker 1: this woman was just phenomenal, period, absolutely brilliant. Yeah yeah, 439 00:26:27,440 --> 00:26:29,840 Speaker 1: And I should point out, too, that that's not the 440 00:26:29,840 --> 00:26:33,280 Speaker 1: only time she's been honored, um, you know, and recognized 441 00:26:33,320 --> 00:26:35,200 Speaker 1: for her work. Uh, as a matter of fact, 442 00:26:35,359 --> 00:26:38,479 Speaker 1: oddly enough, the United States Department of Defense honored her 443 00:26:38,600 --> 00:26:44,879 Speaker 1: with her own programming language in nine. So 444 00:26:44,960 --> 00:26:47,160 Speaker 1: she's I think she's fascinating enough that she just sort 445 00:26:47,200 --> 00:26:49,080 Speaker 1: of keeps popping up in history from time to time. 446 00:26:49,119 --> 00:26:51,239 Speaker 1: People get fascinated and want to learn more about her, 447 00:26:51,280 --> 00:26:55,520 Speaker 1: and with every reason, she's absolutely brilliant. Anyone, anyone who 448 00:26:55,560 --> 00:26:59,159 Speaker 1: has a computer science background has heard of her just 449 00:26:59,240 --> 00:27:03,320 Speaker 1: from their studies. But yeah, I can't help but 450 00:27:03,520 --> 00:27:07,560 Speaker 1: feel that had she not had cancer, had she been 451 00:27:07,600 --> 00:27:11,240 Speaker 1: able to to live on and continue her work, um, 452 00:27:11,280 --> 00:27:13,960 Speaker 1: that possibly the era of computers would have come a 453 00:27:13,960 --> 00:27:17,600 Speaker 1: little faster. Now.
The main thing that would have 454 00:27:17,600 --> 00:27:20,720 Speaker 1: had to have happened was that the combination of Lovelace 455 00:27:20,760 --> 00:27:24,320 Speaker 1: and Babbage's work together would have to convince people 456 00:27:24,440 --> 00:27:29,359 Speaker 1: to invest in completing the analytical engine. Um, because of 457 00:27:29,400 --> 00:27:32,399 Speaker 1: course they didn't have the resources at their disposal to 458 00:27:32,400 --> 00:27:35,520 Speaker 1: create an electrical computer. It would still 459 00:27:35,520 --> 00:27:39,080 Speaker 1: have been a mechanical instrument, and who knows how sophisticated 460 00:27:39,240 --> 00:27:41,800 Speaker 1: it ultimately would have been. It may be that her 461 00:27:42,080 --> 00:27:46,520 Speaker 1: vision of mathematics representing music and graphics and that 462 00:27:46,560 --> 00:27:49,440 Speaker 1: sort of thing would have taken longer and possibly taken a totally 463 00:27:49,440 --> 00:27:53,000 Speaker 1: different form factor than the analytical engine. But we might 464 00:27:53,040 --> 00:27:55,640 Speaker 1: compute completely differently than we do now. Yeah, who knows. 465 00:27:55,680 --> 00:27:57,440 Speaker 1: It could have been a very steampunk 466 00:27:58,119 --> 00:28:03,360 Speaker 1: kind of future. Right. Well, I hope we did justice 467 00:28:03,600 --> 00:28:07,159 Speaker 1: to uh Ada Lovelace again. And if you want to 468 00:28:07,160 --> 00:28:09,600 Speaker 1: know more about her as a person, definitely check out 469 00:28:10,280 --> 00:28:12,439 Speaker 1: Stuff You Missed in History Class. Yes, they do, 470 00:28:12,680 --> 00:28:14,920 Speaker 1: they do an excellent job.
Yeah, I listened 471 00:28:14,960 --> 00:28:18,119 Speaker 1: to it before we did this podcast, and uh, 472 00:28:18,119 --> 00:28:20,040 Speaker 1: Katie and Sarah really do a great job at 473 00:28:20,280 --> 00:28:22,439 Speaker 1: giving an idea of what her life was like, and 474 00:28:22,720 --> 00:28:26,280 Speaker 1: especially her relationship with her mother, which was a very 475 00:28:26,320 --> 00:28:31,640 Speaker 1: complex relationship. Um, and sometimes combative, but it's a it's 476 00:28:31,680 --> 00:28:35,680 Speaker 1: an interesting story, kind of tragic ultimately, but she definitely helped 477 00:28:35,760 --> 00:28:40,480 Speaker 1: shape the history of computers. And that wraps 478 00:28:40,560 --> 00:28:43,240 Speaker 1: up this classic episode of tech Stuff. I hope you 479 00:28:43,360 --> 00:28:46,520 Speaker 1: enjoyed it. If you really like the story of Ada Lovelace, 480 00:28:46,680 --> 00:28:49,680 Speaker 1: you gotta check out our store over at t public 481 00:28:49,840 --> 00:28:53,920 Speaker 1: dot com slash tech stuff. We have the Code Like 482 00:28:53,960 --> 00:28:57,040 Speaker 1: a Girl T-shirt that has Ada Lovelace on it. The 483 00:28:57,080 --> 00:29:00,360 Speaker 1: shirt has an illustration of Ada Lovelace. It's one of 484 00:29:00,400 --> 00:29:03,320 Speaker 1: my favorite shirts. Um. I own a couple of them. 485 00:29:03,360 --> 00:29:06,600 Speaker 1: In fact, I bought them. In fact, I loved it 486 00:29:06,640 --> 00:29:08,760 Speaker 1: so much I purchased it, didn't tell anyone, 487 00:29:08,800 --> 00:29:10,440 Speaker 1: and then everyone said, hey, we could have sent you 488 00:29:10,960 --> 00:29:13,520 Speaker 1: a version of that if you wanted. But no, I 489 00:29:13,520 --> 00:29:18,120 Speaker 1: really felt like that was something important. And uh, 490 00:29:18,280 --> 00:29:20,760 Speaker 1: I love that particular shirt. You should go check that out.
491 00:29:20,800 --> 00:29:22,640 Speaker 1: Even if you don't buy one, go at least take 492 00:29:22,640 --> 00:29:24,120 Speaker 1: a look at it, because I'm really proud of that 493 00:29:24,160 --> 00:29:28,120 Speaker 1: particular product. If you guys have any suggestions for future 494 00:29:28,200 --> 00:29:31,440 Speaker 1: episodes of tech Stuff, send me an email. The address 495 00:29:31,520 --> 00:29:34,640 Speaker 1: is tech Stuff at how stuff works dot com. Don't 496 00:29:34,680 --> 00:29:37,200 Speaker 1: forget to visit our website that's over at tech stuff 497 00:29:37,280 --> 00:29:39,960 Speaker 1: podcast dot com. You can find us on social media 498 00:29:40,040 --> 00:29:42,000 Speaker 1: over there as well, and I'll talk to you again 499 00:29:42,400 --> 00:29:50,400 Speaker 1: really soon. For more on this and thousands of 500 00:29:50,440 --> 00:30:00,120 Speaker 1: other topics, because at how stuff works dot com you