Speaker 1: Ridiculous History is a production of iHeartRadio. Welcome back to the show, Ridiculous Historians. Thank you, as always, so much for tuning in. Shout out to our super producer, Max Williams, and a very special shout out to our guest super producer, returning again, the one, the only, Mr. Lowell Brillante. I'm Ben, you're Noel, and this is part two of a two-part series, isn't it? It is indeed part two of Ada Lovelace and her journey into mathematics and computation and, you know, standard bearing, trailblazing, all that good stuff. Where do we leave off, Ben? If I'm not mistaken, she had just kind of returned to the fray of computer science after a brief stint in the domestic realm. Yes, just a few months after the birth of her third child back in eighteen thirty-nine, Ada decided to dive back into the world of numeracy and the world of mathematics. And it's interesting, because we mentioned in part one (we'll pause, everybody, listen to part one, do do do do do do do do, and you're back) that she met the man who would become her mentor, Charles Babbage. And just before she got married in eighteen thirty-five, so a year before then, in eighteen thirty-four, Babbage had finally begun to mess around with something he called not the Difference Engine, but the Analytical Engine. This is the old trope you might hear of punch cards being used in the first computers, right? That's right. It was the earliest form of the kind of large mainframe-type computer systems that continued to use punch card systems well into the eighties, I believe. You know, big, massive, room-filling systems that used these punch cards to instruct the computer on what to do. I'm a little hazy on exactly how that works, it's all kind of magic to me in some ways, but essentially the punch cards are the instructions.
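To make that last point a little more concrete, here is a minimal sketch in Python of the "cards are the instructions" idea. The four-hole patterns and the operation names are invented purely for illustration; real card formats, from Jacquard's loom cards to later IBM punch cards, each defined their own encodings.

```python
# Toy illustration of punch cards as instructions: each card row is a
# pattern of holes ('O') and blanks ('.'), and the pattern is what
# selects an operation. This 4-hole encoding is made up for the example.

OPS = {
    (1, 0, 0, 0): "LOAD",      # bring a number into the machine
    (0, 1, 0, 0): "ADD",
    (0, 0, 1, 0): "MULTIPLY",
    (0, 0, 0, 1): "PRINT",
}

def read_card(rows):
    """Decode each punched row into the operation its hole pattern selects."""
    for row in rows:
        pattern = tuple(1 if ch == "O" else 0 for ch in row)
        yield OPS.get(pattern, "NO-OP")

card = [
    "O...",
    ".O..",
    "..O.",
    "...O",
]

print(list(read_card(card)))  # ['LOAD', 'ADD', 'MULTIPLY', 'PRINT']
```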
Speaker 1: In this situation, it was very rudimentary tasks and computations that could multiply and divide numbers, and other more simplistic data-related tasks, using these punch cards as the programming instructions. Really quickly, though: do you remember, back in probably the nineties, maybe the early to mid nineties, there used to be a chain of computer stores called Babbage's? Was it? You know, I keep thinking about that. I couldn't remember whether they were computer stores or music stores. And I'm pretty sure you're right there. They were computer stores, if I'm not mistaken. Babbage's ultimately became GameStop. Babbage's was the initial, you know, concept, and then it gradually mutated into the GameStop and Movie Stop and all of the various Stops. But I might be mistaken. I think that's right, though. It was a very popular chain of stores in the eighties and nineties, named after Charles Babbage. I just always thought it was a funny word when I was a kid. Didn't a founding partner of that, or one of the early investors, turn out to be Ross Perot? Which is a deep cut for a lot of people. He was one of the politicians in recent US history who actually got close to being a member of, or candidate for, a viable third political party. Spoiler: it didn't happen. No. And he was also one of the, you know, more fledgling candidates who was quite fun in his portrayal on Saturday Night Live. I think it was Dana Carvey that did Ross Perot, and he did the whole "Can I finish? Can I finish? Can I finish?" bit. Anyway, it was a thing. But it's true. So Babbage at this point has started to develop this system called the Analytical Engine. Initially it was just a concept. It was these sketches of this massive machine that literally used mechanical pieces, mechanical components. Like cogs? Very much so. And again, they filled up rooms, or would ultimately go on to fill up rooms.
Speaker 1: What would become of this? Lovelace was essentially acting as a consultant on this project. She seemed to understand the idea behind it, or the potential behind it, maybe even more so than Babbage himself. Yeah, and this is a theme that we saw earlier in part one of this series. She saw the potential for this to be a representation of what she called poetical science, which goes back to the earlier conflict with her mother about, you know, math versus poetry. Please don't let my daughter turn into Lord Byron, because he's, you know, kind of out there. So Babbage goes to Turin to promote his work, to promote his ideas, and while he's promoting this, he realizes he needs a lot of financial support to make this trip happen. And Lovelace had already served as the primary interpreter of these drawing-board ideas, essentially. And to be clear, it's the building of the machine that requires a lot of money, so this is kind of a pitch trip for him as well. That's where he meets a mathematician with the wonderful name Luigi Federico Menabrea, and this guy says: all right, your ideas check out, Charlie. I'm going to write a paper about this machine. And the paper gets published in October of eighteen forty-two in a Swiss journal. Lovelace translates it from French, and then, while she's translating, she also adds her own notes. And the paper itself that this guy Luigi writes is about eight thousand words long. Not bad. But the translation and the commentary that Lovelace creates comes in at about twenty thousand words. As Charles Babbage put it, "the notes of the Countess of Lovelace extend to about three times the length of the original memoir." And he's, again, very, very impressed with the quality of her thought and her scholarship. Yeah, it's true, and he certainly sees that in her.
Speaker 1: And despite, you know, it being a very chauvinistic and patriarchal world, the society in general, there was also just the idea that men were the only people who had ideas worth paying attention to. And you needed to get this Luigi Federico Menabrea to cosign on it. It wasn't enough for Lovelace to have done it, even though she obviously could have done it herself, and she clearly massively improved upon Menabrea's ideas by a significant margin. But her notes were published, you know, her translation and the notes were published, in eighteen forty-three. And according to Lovelace scholars and historians, this was her most important contribution to the burgeoning field of computer science, which, really, they weren't even calling it that yet. Actually, our good friend Alex Williams, who composed our theme: his father and mother are both computer scientists, and I briefly was talking to his dad about Lovelace, and he just mentioned how, you know, they were still using punch card systems when he was coming up in the computer science world, and how early on there was actually a programming language named after Lovelace. It was called Ada, and it was used by military computer scientists for military systems. But, you know, it's similar to the whole Hidden Figures film, this group of women who were very responsible for cracking a lot of the science behind space travel but were largely relegated to kind of the shadows at the time. And this is very much the case for Lovelace. She definitely didn't get her due until much, much, much later, but she was clearly one of the most important figures in the development of this technology, more so even, maybe, than the guy who kind of envisioned it. Yeah. And a quick note, this will be fun for any fellow Ridiculous Historians who are also fans of etymology.
Speaker 1: The original noun "computer" described a person, not a machine: from the sixteen forties, one who calculates, a reckoner, which I always thought was so fascinating. You're right, there she is, working against tremendous misogyny, as we said earlier. In fact, when her translation and her notes are published in eighteen forty-three, the authorship is attributed to just a series of initials: A.A.L. And this is kind of a kick in the pants, because it is still considered one of her greatest contributions to the field. And she walks through not just the possibilities of how it could work, but she also walks through the context leading to its creation. And she specifically shouts out the Jacquard loom, which was a silk-weaving machine that could create images using a chain of punched cards. And Lovelace, how poetically scientific is this, she says: look, folks, this engine that Babbage has made is doing the same thing, but it's weaving mathematical problems. It's weaving patterns; they're just numerical. She also wrote how it could work in a specific test case, like: here's an example of how you would use these punch cards to create a long sequence. And this outline that she thought up, that she created, is considered to be the very first computer program. And she has this beautiful line where she says, "the science of operations, as derived from mathematics more especially, is a science of itself, and has its own abstract truth and value." Chef's kiss. Oh, indeed. Just back to the Jacquard loom really quickly: that was developed by Joseph Marie Jacquard. And to me, this is a really great way to kind of understand somewhat how punch cards work. You know, I think it was clearly a next-level use of this technology when it started to be used for more difficult, you know, algorithms and mathematical equation solving and things like that.
Speaker 1: But it's sort of almost like a player piano or a music box, where, like, if you have a music box, you create the programming, or the instructions, for which individual musical tines will be plucked to create a melody, by punching out sequences of notes, like on a scale on a staff, on a roll of paper. In the same way, with a player piano, you feed these rolls of punched-out paper into it, and that blocks the keys from being played when they shouldn't be played and allows the keys to be played when they are supposed to be played. So to me, that's sort of a rudimentary understanding of the basis of how punch cards work. But any computer science folks out there, please correct me if that's absolutely off base. That's just kind of how I understand it. But she wanted to take it even further. She wanted to weave what are called Bernoulli numbers, the sequences of Bernoulli numbers. And that's exactly what she did. That's what that quote is describing, and that would have been considered the first computer program. You're absolutely right, Ben.
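Since the Bernoulli numbers are the payload of what is often described as the first computer program (the table of operations in Lovelace's Note G), a short sketch may help show what the sequence actually is. To be clear, this is not a transcription of her diagram, which was laid out as numbered operations for the Analytical Engine; it is just the standard recurrence for the same numbers, written in Python for illustration.

```python
# Generate Bernoulli numbers from the standard recurrence
#   sum_{j=0}^{m} C(m+1, j) * B_j = 0   (m >= 1), with B_0 = 1,
# using exact fractions. (This convention gives B_1 = -1/2.)

from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the list [B_0, B_1, ..., B_n]."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        # C(m+1, m) * B_m = -sum_{j<m} C(m+1, j) * B_j, and C(m+1, m) = m + 1
        acc = sum(Fraction(comb(m + 1, j)) * B[j] for j in range(m))
        B.append(-acc / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```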
Speaker 1: And James Essinger, in his biography of Lovelace, wrote: "Ada here is seeking to do nothing less than invent the science of computing, and separate it from the science of mathematics. What she calls 'the science of operations' is indeed in effect computing." At a time when they didn't have a name for it, when they didn't even understand the full ramifications of what this would ultimately be, you know, the Pandora's box, for good or bad, that this would open up, she was very aware of that very high-level stuff that wouldn't even come into play for many, many years. Much more so, I think, than Babbage himself. Yeah, a language of the world, a language that describes reality, she said. So, to put a fine point on it: at one point she writes that this science "constitutes the language through which alone we can adequately express the great facts of the natural world." So this is figuring out how to translate reality, and that's an amazing thing. At that level, we are getting to what Arthur C. Clarke famously refers to in his quote about the difference between science and magic, right? Any sufficiently advanced science may as well be magic. And this is really what she's doing. If you look at something called the Ada Initiative, which is a nonprofit organization that arranges conferences and training programs to elevate women working in STEM, you'll see the executive director, Valerie Aurora, says that Babbage possessed this technical ingenuity, this inventiveness, but in his partnership with Lovelace, she is the one who propelled his invention into the world of computing, because she was the one to see the true potential. There's an excellent article that talks about this, "Ada Lovelace, the First Tech Visionary" by Betsy Morais, which is in The New Yorker. And unlike, sad to say, many other male inventors or scientists, Babbage had absolutely no problem hyping and supporting Lovelace and her work and her ability as a computer scientist. He called her Lady Fairy, he called her the Enchantress of Numbers, and Ada leaned into this. She once called herself the high priestess of Babbage's engine, which is cool. It sounds like it could be really fun to be a fly on the wall in their conversations. I agree, and I think it's neat. You know, we're going to see this relationship sort of change a little bit. And it does appear Babbage may have been difficult in some other respects, but he was pretty progressive given the time and all of the, you know, other stuff that we've been talking about, the kind of patriarchal societal model and the misogyny that existed. But, you know, it's not like he suppressed her.
Speaker 1: He definitely associated her with these improvements in the paper, and he wasn't trying to, like, brush her aside. But I don't know, we'll just see. Let's see how things progress. So Lovelace does become fascinated with the potential for the Analytical Engine and these algorithms, which, I don't even know if they were calling them that at the time, but they essentially are what we would call algorithms today: these pieces of code that are designed to do a particular task. Unfortunately, she didn't continue working with Babbage, or wasn't more involved. She was sort of relegated to the sidelines a little bit. I think that's maybe what I'm getting at: he definitely listened to her, he hyped her up, and he, you know, gave her a megaphone and a platform to kind of try out some of her ideas, but it ultimately was his thing. Right. Yeah, and this is where we see an unfortunate complication in their collaboration. So in August of eighteen forty-three, Lovelace writes a letter to Babbage, and she says: hey, why don't you let me help you by putting me in charge of anything related to the Analytical Engine project that requires us influencing important people. Kind of like: let me go talk to these folks, let me go help get some approval, some funding, get some other assistance. And this was a pretty long letter. Historians still don't know why, but Babbage, in very terse terms, rejected her offer. You'll see that the best guess people tend to have is that he approved of her work in publicizing his engine, but with this particular project he felt uncomfortable, for some reason, about letting her be involved in the project itself. Does that mean maybe he thought she might outshine him? Did his misogyny come into play? I don't know. But ever since that point, due in part to this conflict they had (oh, they stayed friends, by the way)...
Speaker 1: Partly, yeah. Because of that conflict, you'll find some folks say that she was not a programmer at all, but I tend to disagree. Well, yeah, it's interesting. I think maybe if she had been more involved in the project itself, instead of just kind of relegated to interpreting and adding some more clarity around that original paper, then I think she would have become more influential in actual programming. She certainly had the chops and the imagination, which is really what that poetical science concept is all about: kind of combining those two things. It was Babbage himself who created what could essentially be called the first operating system, or operating instructions, for the system. She may well have done a better job, you know. It seems like it. But I understand what you're saying, Ben, and I kind of agree that maybe he was worried she might have stolen his thunder a little bit. The thing is, not to beat a dead horse here, she is the one who sort of understood the potential for this stuff more than Babbage. She understood that it could do things other than just simple calculations. She understood the more modern concept of computing, in that it can be multipurpose, it can have, like, systems running in tandem that can do multiple things. The idea of, you know, apps, or individual pieces of software that can run at the same time. Certainly a more rudimentary version of that concept, but definitely approaching it in a really important way. Babbage believed that these machines would be exclusively used for computing numbers. But she had more of a vision. She believed there could be music involved. She envisioned word processing. She envisioned the idea of them being used to make sound, whether that's along the lines of the music box kind of version I was describing before, which is very similar to what MIDI ultimately is. MIDI is a computer language that is basically a set of instructions different electronic devices can understand, telling them what to do to play different notes.
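That MIDI comparison is easy to make concrete, because a MIDI message really is just a few instruction bytes. Here is a minimal sketch in Python of the standard "note on" and "note off" message layout; the particular channel, note, and velocity values are arbitrary choices for the example.

```python
# MIDI as instruction bytes: a "note on" message is a status byte
# (0x90 plus the channel number) followed by a note number and a velocity.
# Middle C is note 60 in the MIDI numbering.

def note_on(channel, note, velocity):
    """Build the 3-byte MIDI 'note on' message."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    """Build the matching 3-byte 'note off' message (release velocity 0)."""
    return bytes([0x80 | channel, note, 0])

# Three bytes telling channel 1 to start sounding middle C, fairly loud:
msg = note_on(0, 60, 100)
print(msg.hex())  # '903c64'
```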
Speaker 1: She almost foresaw something like that, and processing photographs and things like that. All of these modern concepts, she already kind of had her head around them. Yeah, it's enormously prescient. This level of accuracy in a prediction is somewhat rare, and it should be acknowledged as such. I also, and I'm not sure the right way to say this, I vibe with these kinds of predictions. And it's interesting to me that there are still people who are skeptical. Folks who say that this person's role in history is overblown are met with responses from other historians who have studied Ada Lovelace their entire lives, and those folks say: what's happening now is there are still people in the modern day who are trying to discredit her achievements. And they say, look, this doesn't just happen to prominent historical figures like Ada Lovelace. This happens to women working in tech today, many of whom are in the audience now. And thank you, by the way, for tuning in, as always, Ridiculous Historians. People are starting to ask, they're saying, like, well, what happened after this? Because, as we found, you know, history textbooks sometimes teach us that people exist for one great moment, that it's that one great speech they gave, that one eureka second, that defines their life. But that's not true. People are people, just like everyone else. They have their ups and downs. And some folks say that because Ada hit this wall in the world of tech, it may have led her into the gambling halls. And we do know that in the eighteen forties she did pick up a gambling habit, and some people speculate that this gambling habit forced her to secretly pawn the family's precious stones. I was going to say family jewels, but that feels like it's more of a euphemism. I think it is.
Speaker 1: But it's a euphemism for a reason. I think it comes from situations like that: it's the prized possession of, like, you know, your legacy, kind of. And yeah, when you're that low, that you have to resort to that, it's definitely a sad place to be. And she, you know, reportedly lost thousands of pounds betting on horses at the Epsom Derby. She had relationships, like people who get deep into the gambling world often do, with con men and loan-sharky types, you know, of the time, who were essentially using her, for her smarts, for some sort of mathematical way of predicting horse races. And I think we all know that that's not a thing. Maybe there's, you know, ways of calculating odds and stuff like that, and, like, algorithms that can kind of help guide you in the right direction. But I always think of the sports almanac from Back to the Future Part II. You know, if somebody had the ability to predict the outcome of sports games, there would be no rhyme or reason to betting, would there? There would be no need for it anymore, unless that secret was, you know, kept just by Biff from the past, who is now a Donald Trump-esque figure living in his high-rise, keeping the sports almanac under lock and key. And unfortunately, that's not what happened with Ada. We know their friendship continued. We know that she met other notable figures, like Charles Dickens. Yes, that Charles Dickens. He was friends with Babbage; he met Lovelace through him. Before we move on from gambling, there's this really cool, conspiratorial book-passing story between her and Babbage, but no one's actually found the book. It'd be great to see what that program was, though, wouldn't it, if they actually were able to do it? The book being some secret code or some secret algorithm that could predict these outcomes. Yeah.
Speaker 1: So the idea, and you can read some of this in a book called Lady Byron and Her Daughters, is that about once a week, during the height of her gambling, Lovelace and Babbage would exchange a book that was believed to contain a program that attempted to predict those results. But at the same time she's struggling with this gambling, she's also struggling with her health, which had been a problem at different points in her adulthood. It seemed to take a turn for the worse in the eighteen forties. She suffered from uterine cancer, and in August of eighteen fifty-two, Charles Dickens, who by this point is, like, super famous, visits her while she is ailing, and she asks him to read some of his work to her. And so he reads a scene from a novel called Dombey and Son. Dombey and Son was published in eighteen forty-eight, and the scene he reads, spoiler alert, I guess, is the part where the six-year-old character Paul Dombey dies. It's kind of somber stuff. Three months after that, Ada Lovelace herself would pass away, on November twenty-seventh, eighteen fifty-two. Let's rewind ever so slightly, well, quite a ways, to her birth and the situation with her father, Lord Byron. I think we made this pretty clear, but I just want to emphasize that they never had a relationship at all. He was kind of this, like, you know, specter in her life who loomed large in his own right. She was fascinated with him from afar, and with his poetry, but for all intents and purposes, her mother's mission to keep their daughter from becoming like her lecherous father worked. She very much led a life much more in the image of her mom. But it's interesting that she still had that fascination and kind of kept up with him.
Speaker 1: But, like you said, Ben, on his deathbed Lord Byron expressed some regrets, you know, over not having had a relationship with his daughter. He seemed to be aware of her reputation and legacy, and he seemed to respect her. And upon her death, as she had requested, she was buried in the Byron family plot, or vault rather, which was inside the Church of St. Mary Magdalene in a very small English town called Hucknall, and her coffin was placed right beside her father's, Lord Byron's. And she too passed away at a very early age. She was only thirty-six. I believe it was the same exact age at which Lord Byron died. Same exact age, yeah. She was close to turning thirty-seven, but they were the same age when they passed away. And her mother built a memorial for her that included a sonnet that Ada Lovelace had written herself, and that, I believe, was a beautiful thing for her to do, given that she had, you know, spent so much of her daughter's childhood trying to convince the kid not to write poetry. So she passes away. How did the world deal with it when this occurred? Well, it may not surprise many of us in the audience today to note that no one really acknowledged her contributions for almost a hundred years. Like, nobody really paid attention to her awesome, insightful notes on the Analytical Engine when they were first published. It wasn't until an author named B. V. Bowden published a book called Faster Than Thought: A Symposium on Digital Computing Machines that people started paying attention to Lovelace's tremendously, tremendously insightful work again. Picture her like a mathematical Nostradamus, if Nostradamus ever actually made any successful predictions, which he didn't. Not so much. But Lovelace absolutely did.

Speaker 1: And even in the fifties, you know, like I said when I was talking to Alex and Max's dad, the machines that existed were still very much along the lines of what Babbage and Lovelace had kind of worked on together, and still not even close to approaching the potential that she saw. That wouldn't really happen until you started getting into the personal computer era, when individuals could start using them for more creative endeavors. At least that part of her predictions, you know: the idea of using them for music, and using them for multitasking and things, and being more of a staple of everyday life, instead of being relegated to these massive server rooms and being so prohibitively expensive that only organizations like the U.S. Department of Defense could afford to house them. Which is exactly what they did in the nineteen seventies. They spent billions of dollars embedding these computer systems with code, a coding language that they dubbed Ada, and it changed the trajectory of computer science in a very meaningful way. Yeah, it was actually one of the most expensive coding projects ever. I think it was the most expensive coding project up to that time. In the nineteen seventies, the folks at the Department of Defense in the US look around and they say: we are spending billions, in nineteen-seventies dollars, so even more money today, on making computing systems, and these computing systems can't talk with each other. They each have their own language. We're building a weird Tower of Babel. So instead they say: let's consolidate all our military computing, let's save a little scratch, let's make a language all of these machines can speak together. And when they arrive at this idea, which they do follow through with, that's when that language comes in.
Speaker 1: There's a U.S. Navy commander named Jack Cooper who, when they're pitching names for it, says: well, how about we name it Ada, in honor of Ada Lovelace? This is in nineteen seventy-nine, and everybody looked around and said, yeah, that's an awesome idea. So good for them. Also, Ada is still used across the world today for air traffic control, railroad transit, rockets, even some satellites, and military weapons. We don't know how she would feel about, you know, having a language named after her used in warfare. But it's still around, though it's definitely less popular now. It's fallen out of favor because technology is always progressing so quickly. But, you know, nuclear plants and military operations sometimes do tend to use outdated code or computers because they need to be proprietary, so they're harder to hack, harder to break into. Correct, correct, for security. And then another point would be predictability: if there are bugs in the program, they've been identified, you know, in the intervening decades. So this is a testament to Ada Lovelace's life and work, and it's a testament that not a lot of people can claim, you know what I mean. Everybody gets a statue, right, if you're a leader of old or a prominent figure. But to get a whole new kind of language, a computer language, named after you? That's pretty impressive. That's up there with having your face on the money. I would say even more important. I think so too. Quick footnote: there is an Ada Lovelace Day. In two thousand nine, a British social media guru named Suw Charman-Anderson, in order to encourage young women to go into STEM fields and all of that, decided to create an Ada Lovelace Day, which this year fell on October eleventh. It's weirdly not Ada Lovelace's birthday, nor is it the day that she died. It's just the second Tuesday in October, because it was convenient.

Speaker 1: And one thing I just wanted to say, because I thought it was really funny. I think I mentioned in the first episode a YouTube talk I listened to, or we listened to, about a book that was written to commemorate the two hundredth anniversary of her birth. It had to do with Oxford University Press, and I'm sorry, I'm spacing on the name of it right now. But in the presentation, one of the authors mentioned that, when they had access to all these letters she wrote, it turned out she wasn't very good at keeping track of dates. Like, she oftentimes would date one letter, say, Sunday, November the second, and the next one would be dated Monday, November the second. And the punchline of the bit, as the author told it: it turns out that date was actually a Tuesday. So I think it's fitting that this date is a little bit arbitrary. I love a Lovelace Day. Right, right, right. There we go, that's poetic as well. One thing we do want to end on, which we teased earlier in part one: when we say Lovelace was prescient and had a sharp mind for future trends, we need to talk a little bit about the concept of machine consciousness, sometimes called artificial intelligence, or AI. Where do you think she fell on the sides of this argument, Ridiculous Historians? Well, she was against it. She said, "The Analytical Engine has no pretensions whatsoever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis, but has no power of anticipating any analytical relations or truths." And so you have to wonder. You have to wonder how long that prediction, if it can be interpreted as a prediction, will hold. Because, I don't know, the whole time we're talking about this, I kept wondering what she would think now, what she would think of the latest innovations.
Speaker 1: Everybody has a surveillance device, or, you know, the majority of people in the developed world have a surveillance device, which we just call a smartphone so it'll seem a little more comfortable. There are entire multibillion-dollar industries based on predicting people's behavior, right, based on their past activities, their location, the folks they roll with. So it's quite possible. I don't know, what do you think she would think about the modern world of computing? I think, with that big old brain of hers, she'd probably be able to predict where we'd be heading, you know, another fifty, a hundred years from now. It just seemed like she really was possessed of a certain prescient ability to kind of see through the veil, into what was next. So I think it would be very interesting to have a conversation with her, like, Bill and Ted style. I do want to point out one thing, too. I want to make sure I didn't come off sounding as though the Difference Engine or the Analytical Engine were successfully built and existed back then. It was all conceptual. It was all these papers and these analyses of the concept of these machines. Babbage, also partly because he was a difficult dude, never actually got the project off the ground fully. But in two thousand two, the London Science Museum, under the supervision of Doron Swade, actually built a fully working, full-size Difference Engine. It took seventeen years, it is absolutely massive, and it represents the completion of this whole legacy that we're talking about. Obviously, the concepts went on to feed computing moving forward, but the original invention, you know, never actually was created until two thousand two. And folks, I think you can agree with me, Noel.

Speaker 1: Well, I think you guys will also agree that maybe the Difference Engine would have been built in Charles Babbage's lifetime if he had just let someone else talk to the people who could have paid for it. So I think that was a misstep on his part. You can't put that on Ada, man. That's all on you, Charlie. Yes. And, you know, this story is equally inspiring and equally heartbreaking, because we've all heard the old adage: you never get the flowers while you can still smell them. And we, and by we I mean humanity, owe Ada Lovelace a tremendous debt. As we said in the beginning of this series, if you have a smartphone, if you have a computer, heck, if you're listening to this podcast, then there's a very strong argument that you personally owe a little bit of gratitude to the one and only Ada Lovelace. Indeed. And I certainly learned a lot. I knew a little bit about Ada and her legacy, and a little bit about Babbage, but not nearly as much as we were able to dig into in this two-part episode. I hope you all have learned something too, and we are going to call it a day. This is one for the history books. As always, thanks to super producer Max Williams. Special thanks to our returning guest producer, Lowell Brillante. Noel, can you believe he stuck with us for a second one? I can't, but I am forever in his debt for his patience and producing acumen. And also, don't forget to check out Lowell's amazing podcast, Prodigy, available now on the iHeartRadio app, Apple Podcasts, wherever you get podcasts. Wherever you get. That's true, that's true. And thanks, of course, also to Alex Williams, who, rumor has it, may be returning to the show sooner than later. Act surprised. Big thanks to Christopher Hasiotis, big, big thanks to Eves Jeffcoat, and, of course, Noel, big thanks to you. And I think we can both say big thanks to Gabe Luzier. And, don't cut this part out:
Speaker 1: I keep wanting to say "the one and only." It just feels so good to say that. True, it feels good, and it's true. These are all, like, irreplaceable originals, all the folks we work with. So thanks to all of our one-and-onlies, and to you, my friend. Back at you. We'll see you next time, folks. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.