Speaker 1: Welcome to TechStuff, a production of iHeartRadio's How Stuff Works. Hey there, everybody, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. And it is time for another classic episode of TechStuff. The episode you are about to hear originally published way back in October 2012. It is called "How DNA Computers Work," and Chris and I dive into the weird, wild world of computers based on DNA. Hope you guys enjoy. We were going to share some twisted logic with you today. Yes, we wanted to talk about deoxyribonucleic acid computers. D-N-A, DNA computers. And what is a DNA computer? What would it be? Because we're really in the very early stages of using DNA for the purposes of a computer. But what would a DNA computer be? Why would we even use DNA? And what the heck is this DNA stuff, anyway?
Speaker 1: Well, you know, I've got a USB port in the back of my head. So yeah, he also woke up one day and he was in a giant battery and he had to get out. Turns out Chris is the One, and I'm definitely not. We've got Agent Smith showing up every other day at the office, and we're like, he's not here today, he's teleworking, and it is irritating. But anyways, a glitch in the Matrix. DNA. So DNA is important stuff. I mean, this is a molecule that contains information, and collectively this information makes organisms what they are. Yes, and so biologically, DNA is used to store information, and that is really the key there. You know, saying, wait a minute, if DNA stores information for organisms, could we use DNA to store information for other purposes? But to really explain this: DNA, it's that double helix molecule. You've probably seen illustrations of it. You may have built a model of it.
Speaker 1: If you are in school, you may be studying this so much that with the terms I'm going to use, you're thinking, wow, he's really glossing over this. But that's because this is TechStuff, not Stuff to Blow Your Mind, so we're not going to go too deep into the cellular biology aspect of DNA. Yes, and if you're looking to have your mind blown, I'm sorry, you've come to the wrong place. Right now. DNA has a lot of instructions in it. Yes, as it turns out, it's a very tiny molecule with a very large capacity for carrying information. Yeah, if you were to actually stretch out a DNA molecule and lay it lengthwise, it would end up taking much more space than it typically does, because it has this twisted three-dimensional structure. Hence my earlier dumb joke. Right. So this twisted structure actually allows this very dense storage medium to exist in a relatively small volume of space. Yeah, because you've twisted it.
Speaker 1: And you know, it's the whole thing about conserving surface area and all that great stuff that all my biologist friends go on and on and on about, and then I end up wandering away. Um. But DNA has, among many other attributes, pairs of bases that pair up, and the sequence of those determines what information is stored in that strand of DNA. Okay, so those four bases: you have adenine, cytosine, guanine, and thymine, and usually we just call those A, C, G, and T. And the way that those are sequenced, like I said, within a strand of DNA determines the type of information that that DNA holds. And it's that that forms the basis of the idea of using a DNA computer, because in our classic computer model, we've got computers quote unquote thinking in binary, right? Zeros and ones.
Speaker 1: The approach now is to associate certain of those bases with zeros and the others with ones, the idea being that way you could sequence, down the length of a strand of DNA, these zeros and ones. You encode a strand of DNA that way, and then you would decode it. You would read back those base pairings, and that would determine whether each pair was a zero or a one, and then you would decode that into binary language, and thus you would get back to whatever information you originally stored onto the DNA. This makes it sound pretty simple, but this is high-tech science stuff right now. Now, granted, it's high-tech science stuff that we have made huge advances in over the last two decades. Really, so things that were seen as practically impossible two decades ago are things that we do, not quite routinely, but with a greater ease than we could have expected. Yeah, over the course of the last few decades. Um, it's the kind of thing that when people see the double helix, it's familiar.
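The encode-then-decode idea described here can be sketched in a few lines of Python. This is just an illustration of the scheme the hosts describe, one base pair per bit; the choice of which pairing means 0 and which means 1 is arbitrary, not any particular lab's actual code.

```python
# One base pair per bit: here the A-T pairing stands for 0 and the
# G-C pairing stands for 1. We only write one strand of the helix,
# since the opposite strand is determined by the pairing rules.

BIT_TO_BASE = {"0": "A", "1": "G"}
BASE_TO_BIT = {"A": "0", "T": "0", "G": "1", "C": "1"}

def encode(bits: str) -> str:
    """Turn a bit string into a DNA strand, one base per bit."""
    return "".join(BIT_TO_BASE[b] for b in bits)

def decode(strand: str) -> str:
    """Read a strand back into bits by classifying each base's pairing."""
    return "".join(BASE_TO_BIT[base] for base in strand)

text_bits = format(ord("A"), "08b")   # "01000001", the letter A in ASCII
strand = encode(text_bits)            # "AGAAAAAG"
assert decode(strand) == text_bits    # round trip recovers the data
```

Reading either strand of the pair gives the same bits back, since both bases in a pairing map to the same bit.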
Speaker 1: Um, you know, it's high-tech science, but it's in our public consciousness too. It's in our DNA. There you go. The fact that that's a slang term, you know, for something; when you say it, you're basically saying it's deeply ingrained in your personality, or whatever you're saying that about. Um, you know, it's certainly something that we're all familiar with now, but only a few decades ago, you know, it was completely foreign to us. Yeah. So let's do a quick rundown of the history of our knowledge about DNA, because clearly DNA has existed for millions of years, but we've only really been aware of it since, well, we knew something about it back in eighteen sixty-nine. Yes, when Friedrich Miescher, who was, thank you, he was a biologist from Switzerland, and he was looking at something pretty darn gross. He was looking at bandages that had pus on them, and he isolated DNA from the pus on the bandages, and he thought that perhaps this stuff, these nucleic acids, and DNA is a nucleic acid...
Speaker 1: He thought that perhaps this stuff might contain information in it that would determine why stuff is the way it is. So, genetic information. He thought that it probably did contain that information, but there was no way for him to be able to confirm it. He could not point to anything and say, see, I'm right. So that had to wait for future scientists to really dive into it. Not the pus, that would be gross, but to really dive into the information and study it and figure out more details. So in nineteen forty-four, some scientists at Rockefeller University, including Oswald Avery, showed that DNA taken from a bacterium could make a non-infectious type of bacteria become infectious bacteria. So the thought was that there must be some information from this nucleic acid taken from one type of bacteria that could transfer properties to a different bacteria that otherwise would not have that infectious property. But what does it do? Yes, that's kind of what everyone was saying. Well, there's some sort of information-holding material here.
Speaker 1: We don't really understand the mechanism by which it stores information, nor how it imparts that information or replicates it. We didn't know that at the time. And then in nineteen fifty-two, Alfred Hershey and Martha Chase showed that to make new viruses, a bacteriophage virus injected DNA into the host cell, which was important because previously it was thought that perhaps it happened through protein exchange. But instead of protein exchange, it was DNA exchange. So that showed, yes, there's something in this. This DNA is what is important. And then along came Watson and Crick. Yes, James D. Watson and Francis Crick. Yeah. It was clear that people were already onto something. Hershey and Chase had something there, and it was only a year later when Watson and Crick, you know, made their announcement that they had discovered the structure of DNA. Right, and so this is when we started to really learn how DNA, you know, forms, and what shape it takes, and why that's important.
Speaker 1: And so once we learned all that, we began to see that these base pairings I was talking about, we learned that they pair in very specific ways. You know, I mentioned there are the four different bases, the A, C, G, T. Well, half of those, A and G, are called purines. C and T are pyrimidines. I'm glad you took that part. Yeah, me too. You know, way back when, I was actually really good at biology, but man, that was a few decades ago. So anyway, purines and peri... perim... pyrimidines. Look, I can't even do it now. Still, glad you took that. They bond together, right. So you don't get two purines bonding together, and you don't get two pyrimidines bonding together. And to be even more specific, A and T will bond together, and C and G will bond together. All right, so that means that you're not going to get a strand of DNA where A and C, or A and G, are paired together. It does not happen. Right, structurally that doesn't happen.
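The pairing rule just described, A only with T and C only with G, means that one strand completely determines its partner. A small sketch of that rule, purely for illustration:

```python
# Strict base pairing: A bonds only with T, C bonds only with G.
# Given one strand, the opposite strand is fully determined.

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Return the strand that would pair with `strand`, base by base."""
    return "".join(COMPLEMENT[base] for base in strand)

def is_valid_pairing(strand1: str, strand2: str) -> bool:
    """True only if every position is an A-T or C-G pairing."""
    return len(strand1) == len(strand2) and all(
        COMPLEMENT[a] == b for a, b in zip(strand1, strand2)
    )

assert complement("ACGT") == "TGCA"
assert is_valid_pairing("ACGT", "TGCA")       # legal pairings only
assert not is_valid_pairing("ACGT", "GGCA")   # an A-G pairing never happens
```

It is exactly this all-or-nothing matching that lets strands "find" their partners in a test tube.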
Speaker 1: That also dictates the rationale behind using these pairings as zeros and ones, because you can either have the A-T pairing or the C-G pairing. Right, so that lets you say, okay, well, that's binary. You just designate that one pairing means zero and the other pairing means one. Um, if that weren't the case, if we could have multiple pairings, like if A could pair with G instead of just A and T, then you would say, all right, well, now we've got a system that goes beyond binary, which, in theory, if you completely changed the way computers work, would mean that you could dramatically increase parallel processing, because you could designate things. It would almost be like the
It would almost be like the 179 00:12:32,720 --> 00:12:37,640 Speaker 1: cubits of a quantum computer, where you know the basic 180 00:12:37,679 --> 00:12:41,160 Speaker 1: explanation as a cubit represents both a zero and a 181 00:12:41,240 --> 00:12:45,160 Speaker 1: one and all values in between in superposition of one another, 182 00:12:46,160 --> 00:12:50,240 Speaker 1: and that if you have enough cubits, you can perform 183 00:12:50,400 --> 00:12:56,200 Speaker 1: a massive parallel processing problem all at the same time 184 00:12:56,320 --> 00:13:00,800 Speaker 1: because those that that one group of cubits is behaving 185 00:13:00,840 --> 00:13:04,640 Speaker 1: as if it's uh, you know, a huge number of 186 00:13:05,720 --> 00:13:09,480 Speaker 1: traditional bits. I think it's important to remember too that 187 00:13:09,559 --> 00:13:14,160 Speaker 1: no matter how many bases DNA has, they all belong 188 00:13:14,240 --> 00:13:16,480 Speaker 1: to us. Oh I knew it. I knew it. I 189 00:13:16,640 --> 00:13:18,760 Speaker 1: was like, oh, I's gonna do an all your base 190 00:13:18,840 --> 00:13:21,120 Speaker 1: I belonged to us if someone set us up the bomb, 191 00:13:22,040 --> 00:13:24,880 Speaker 1: so well, it could be Actually, if you if you 192 00:13:25,480 --> 00:13:28,959 Speaker 1: were trying to if those pairs become corrupted, they will 193 00:13:29,000 --> 00:13:33,000 Speaker 1: not work and uh and a cell can die. Actually, 194 00:13:33,000 --> 00:13:35,080 Speaker 1: we're getting a lot of this information to from our 195 00:13:35,160 --> 00:13:37,640 Speaker 1: our excellent article on how stuff Works dot com about 196 00:13:37,720 --> 00:13:41,199 Speaker 1: how DNA works. It gets into a whole lot more detailed, right, Yeah, 197 00:13:41,200 --> 00:13:43,200 Speaker 1: if you want to learn more about and and it's 198 00:13:43,320 --> 00:13:45,600 Speaker 1: very accessible. It's a very accessible article. 
Speaker 1: So if you're curious, you know, you've always heard about DNA, and you've heard about DNA testing, and you know about chromosomes and genes, but beyond that you're kind of confused, I highly recommend you read How DNA Works at HowStuffWorks.com. We also have an article on how DNA computers work, which is pretty interesting because it's talking about an earlier era of DNA computers, but recent developments have really brought to light some interesting new technologies and new use cases for DNA, and we'll get into those in a second. Yeah, it's funny that you say that, because I'm sure this is futuristic enough that people are saying, what are you talking about, new developments? We haven't heard of a DNA computer before. But yeah, that's not really surprising. This is the kind of thing, like quantum computing, where they've been working on it for some time, but it's not at a point where they can really, you know, put something on a shelf and go, look at this.
Speaker 1: Yeah, yeah, where people really take notice of it. In general, this is all stuff that's taking place in universities and research facilities, and, you know, most of these machines that are being made now, these implementations of using DNA for digital information, are really in the prototype stage. But the technology that allows us to create these machines is becoming more and more sophisticated and less expensive, which of course is key. And Gordon Moore explained that back in nineteen sixty-five when he did his paper about cramming more components onto an integrated circuit. His point was not just that technology was advancing to a point where we could shrink stuff down and fit twice as many components onto a square inch of silicon as we could a year ago. It was also that the manufacturing process was becoming efficient enough and cheap enough that that made sense. So, same sort of thing here. Well, all right, so we've determined that DNA contains information, and because of its very structure, it can contain a lot of information in a small volume.
Speaker 1: And then, remember, it was the fifties, the early fifties, when we started to really understand what DNA was and how it formed and its structure and everything like that. It wasn't until nineteen ninety-four that a man named Leonard Adleman came up with this idea. He sort of introduced the idea of using DNA to solve math problems. And it was essentially this idea of coding DNA as if it were a strip of binary code. And so he took this idea and he sort of ran with it. He began to formulate an idea about how to create an experiment that could show that this would work. And it's funny, because we're talking about a DNA computer, but if you read about the experiment, it sounds more like someone in a chemistry lab mixing various chemical compositions together and then coming up with a solution at the end of it. And it turns out that this is a computational solution, not just a chemical solution. I see what you did there. Yeah, a little word play there. Yeah, it's a little incredible. So he, yeah, he dissolved my objections.
Speaker 1: Chris and I have more to say about DNA in just a moment, but first let's take a quick break to thank our sponsor. Let me read, I'll read the steps from our article on DNA computers, because I want to explain how this early, early, early implementation of a DNA computer played out, and it's kind of amazing. All right, here are the steps. Number one: strands of DNA represent the seven cities. Now, when it says seven cities in here, what he was doing was he was trying to solve something called the traveling salesman problem, also known as the directed Hamiltonian path problem. The idea being that you're supposed to find the shortest route between a group of cities, and it could really be any number of cities, but you have to only go through each city one time. Um, and it becomes more complex. This is why this is such a fascinating problem. As Jonathan pointed out to me right before, he reminded me that this is something that quantum computing is fascinated with, because this is such a, I don't know what you call it, thorny, a thorny problem.
Speaker 1: So it was that problem that he wanted to work on, and he chose, I believe, seven cities. He said that was the benchmark he wanted to do. And see, this is an interesting problem for computers, because think about it: you've got seven cities, you can only travel through each city once, and you have to find the most efficient pathway to go. Well, the way a computer would do this, generally speaking, is to start going through every single possible permutation of that trip, going from city to city, and determining which of those is the most efficient by the end of it, by comparing them all, which can take ages. And of course, as you add more cities, as you add complexity to the problem, it creates an exponentially more difficult problem for the computer to solve. You know, I don't think it's that unlike trying to crack a password. Yeah, in the, you know, other references we've made to these, again, parallel processing.
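The brute-force approach described here, trying every ordering of the cities and comparing them all, looks something like this in Python. The four city names and distances are made up for illustration; the point is the n-factorial blowup as cities are added.

```python
# Brute-force route search: score every possible ordering of the
# cities and keep the shortest. With n cities there are n! orderings,
# which is why this gets out of hand so quickly.
from itertools import permutations

DIST = {  # symmetric distances between four hypothetical cities
    ("A", "B"): 3, ("A", "C"): 5, ("A", "D"): 4,
    ("B", "C"): 2, ("B", "D"): 6, ("C", "D"): 1,
}

def dist(a, b):
    """Look up a distance in either direction."""
    return DIST.get((a, b)) or DIST[(b, a)]

def route_length(route):
    """Total distance of visiting the cities in this order."""
    return sum(dist(a, b) for a, b in zip(route, route[1:]))

def shortest_route(cities):
    """Check all orderings; each city is visited exactly once."""
    return min(permutations(cities), key=route_length)

best = shortest_route(["A", "B", "C", "D"])
print(best, route_length(best))
```

Four cities means 24 orderings to check; ten cities already means over three and a half million, which is the exponential wall the hosts are describing.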
That's another reason why quantum computers are very scary for anyone who's in cryptography, who wants to create good encryption, because they're talking about using parallel processing to attack, you know, do a brute force attack on a password. You can really reduce the amount of time it would take you to crack a password. Like, a password that would probably take you thousands of years in classic computer time might only take an hour using a quantum computer, because it's using that parallel approach. So just remember, quantum computing is the cure for the common code. Man, what is it with you today? I don't know. Chris is in a mood. Anyway, all right, getting back to this set of steps. All right, so Adleman creates strands of DNA that represent the seven cities. And so it's these A-T and C-G pairings, and these various sequences represent each city and possible flight path. He then took the molecules, these strands of DNA, and mixed them in a test tube, and some of the strands of DNA stuck together in a chain.
Each of those chains of strands represented a potential answer to that question: which route is the most efficient? Within a few seconds, all of the possible combinations of DNA strands were created in the test tube, and then Adleman eliminated the wrong molecules through chemical reactions, which left behind only the flight paths that connect all seven cities. So here he was doing chemistry and looking at molecules, and it was biological chemistry, because he was using organic DNA, and trying to come up with the answer that way, which is pretty interesting to me. I mean, that sounds so different from the way we think of computing today, where you're using microprocessors and, you know, a user interface, looking at a screen. This guy was using test tubes and molecules. Um, and he was actually thinking at the time that DNA computing was going to be the future, because it packs so much information in such a small form factor, and it's plentiful. Yes, because there's a lot of life out there, and organic life relies on DNA heavily.
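The generate-then-filter process described here can be mimicked loosely in software: generate every candidate chain (in the test tube this happens massively in parallel), then filter out the chains that aren't valid answers, the way Adleman's chemical reactions did. The four-city flight graph below is hypothetical, not Adleman's actual seven-city instance.

```python
# Generate-and-filter, in the spirit of Adleman's experiment:
# step 1 forms every possible chain of cities at once, and step 2
# discards chains that are not valid routes.
from itertools import product

CITIES = ["A", "B", "C", "D"]
FLIGHTS = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "C"), ("B", "D")}

# Step 1: every possible chain of four city strands.
candidates = product(CITIES, repeat=len(CITIES))

# Step 2: filter. Keep chains that start at A, end at D, visit every
# city exactly once, and use only flight paths that actually exist.
answers = [
    chain for chain in candidates
    if chain[0] == "A" and chain[-1] == "D"
    and len(set(chain)) == len(CITIES)
    and all((a, b) in FLIGHTS for a, b in zip(chain, chain[1:]))
]
print(answers)
```

The difference, of course, is that the loop above checks candidates one at a time, while in the test tube every chain forms simultaneously, which is the whole appeal.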
There's 333 00:21:47,359 --> 00:21:49,760 Speaker 1: some that rely on RNA, but we're not going to 334 00:21:49,800 --> 00:21:54,760 Speaker 1: go into that. But anyway, a great amount of organic 335 00:21:54,840 --> 00:21:56,720 Speaker 1: life out there has lots and lots of DNA, so 336 00:21:56,880 --> 00:22:00,800 Speaker 1: we've got plenty of materials to work from. 337 00:22:01,240 --> 00:22:04,800 Speaker 1: What's interesting is that since that time, when his first 338 00:22:04,880 --> 00:22:09,200 Speaker 1: experiments were showing the viability of a DNA computer, our 339 00:22:09,280 --> 00:22:14,800 Speaker 1: ability to synthesize DNA has improved to the point 340 00:22:14,880 --> 00:22:18,600 Speaker 1: where organic DNA is not really what we care about anymore. 341 00:22:19,520 --> 00:22:22,800 Speaker 1: We can synthesize DNA in the lab and just make 342 00:22:22,840 --> 00:22:26,040 Speaker 1: it ourselves, so we don't have to harvest it. 343 00:22:27,600 --> 00:22:30,159 Speaker 1: As Chris was saying in the pre show, you know, 344 00:22:30,600 --> 00:22:33,760 Speaker 1: it would be a totally different world if you realized 345 00:22:33,760 --> 00:22:35,639 Speaker 1: that your computer was running out of memory, so you 346 00:22:35,800 --> 00:22:38,400 Speaker 1: chucked another hamster into your machine so that you could 347 00:22:38,640 --> 00:22:40,879 Speaker 1: finish whatever it was you were doing. That was a 348 00:22:40,920 --> 00:22:44,000 Speaker 1: particularly gory idea. Well, we didn't know, but yeah, I 349 00:22:44,119 --> 00:22:46,760 Speaker 1: left out the part about the grinding noises, you know, 350 00:22:47,400 --> 00:22:52,200 Speaker 1: and fur flying out the back. Yeah, and I 351 00:22:52,280 --> 00:22:57,960 Speaker 1: thought that was my contribution.
Um, yeah, the University of Rochester, 352 00:22:58,080 --> 00:23:02,600 Speaker 1: there were some researchers there who found ways to use DNA 353 00:23:02,800 --> 00:23:08,200 Speaker 1: to create logic gates, 354 00:23:08,840 --> 00:23:13,240 Speaker 1: and that's something we've touched on on several occasions, 355 00:23:13,280 --> 00:23:18,680 Speaker 1: but those logic gates are basically key to classic computing. Yeah, 356 00:23:18,760 --> 00:23:22,080 Speaker 1: this is what allows 357 00:23:22,119 --> 00:23:25,200 Speaker 1: the computer to dictate how information moves through it so 358 00:23:25,320 --> 00:23:28,480 Speaker 1: that it has any meaning. You know, the logic gates 359 00:23:28,960 --> 00:23:33,360 Speaker 1: essentially dictate whether the zero or one that goes into 360 00:23:33,400 --> 00:23:35,359 Speaker 1: the gate comes out as a zero or a one on the 361 00:23:35,400 --> 00:23:39,400 Speaker 1: other side. Usually it's a pair: if it's 362 00:23:39,400 --> 00:23:41,600 Speaker 1: a zero and a one going into one side of 363 00:23:41,600 --> 00:23:42,960 Speaker 1: the gate, is what comes out going to be a one or a zero? 364 00:23:43,040 --> 00:23:45,600 Speaker 1: And it all depends on the type of gate it is. 365 00:23:45,840 --> 00:23:48,320 Speaker 1: And of course you can link a bunch of 366 00:23:48,359 --> 00:23:51,800 Speaker 1: gates together to create all sorts of different outcomes depending 367 00:23:51,880 --> 00:23:54,880 Speaker 1: upon what the input is. This is all very important 368 00:23:55,080 --> 00:23:58,280 Speaker 1: for classical computing. So getting to that step of being 369 00:23:58,320 --> 00:24:00,200 Speaker 1: able to build logic gates out of DNA 370 00:24:00,600 --> 00:24:02,960 Speaker 1: was pivotal if you want to be able to 371 00:24:03,040 --> 00:24:07,600 Speaker 1: eventually build a true DNA computer.
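The two-bits-in, one-bit-out behavior described above, and the way chaining gates builds bigger circuits, can be sketched in a few lines. This is a software sketch of ordinary Boolean gates, not of the DNA chemistry itself; the half adder is a standard example of linking gates together.

```python
# Each gate maps a pair of input bits to a single output bit.
def AND(a, b):  return a & b
def OR(a, b):   return a | b
def XOR(a, b):  return a ^ b
def NAND(a, b): return 1 - (a & b)

# Chaining gates creates new outcomes: a half adder uses XOR for the
# sum bit and AND for the carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

Whether the gate is realized in silicon transistors or in DNA strands, the input/output table is the same; that is why building gates out of DNA was the pivotal step.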
And again, this is, 372 00:24:08,359 --> 00:24:12,280 Speaker 1: you know, you compare the components of a DNA computer 373 00:24:12,440 --> 00:24:17,200 Speaker 1: to those of an inorganic computer. And we have, 374 00:24:17,720 --> 00:24:21,879 Speaker 1: as Jonathan pointed out, Gordon Moore's famous 375 00:24:21,920 --> 00:24:27,240 Speaker 1: prediction that transistors would double in number per square 376 00:24:27,280 --> 00:24:31,720 Speaker 1: inch of silicon, back in the original prediction, 377 00:24:31,800 --> 00:24:34,760 Speaker 1: every, you know, certain period of time, 378 00:24:34,800 --> 00:24:37,200 Speaker 1: which again has changed, you know, year, year and a half, 379 00:24:37,280 --> 00:24:40,760 Speaker 1: two years. The thing is, we're talking about a 380 00:24:41,240 --> 00:24:44,359 Speaker 1: flat piece of silicon. And we've also talked about how 381 00:24:44,440 --> 00:24:48,840 Speaker 1: hard drives, the classical hard drive, you know, have 382 00:24:49,160 --> 00:24:51,159 Speaker 1: so much information on them. It's in 383 00:24:51,200 --> 00:24:56,200 Speaker 1: a flat plane. We've talked about electronic memory and how, 384 00:24:56,760 --> 00:24:59,119 Speaker 1: you know, this information is getting stored. But we've 385 00:24:59,480 --> 00:25:03,480 Speaker 1: basically been talking two dimensional. And a long time 386 00:25:03,520 --> 00:25:06,600 Speaker 1: ago we talked about processors and how, at some point, 387 00:25:06,840 --> 00:25:10,600 Speaker 1: due to the limitations of physics, 388 00:25:10,720 --> 00:25:13,800 Speaker 1: electrons will begin to tunnel through layers of the 389 00:25:13,880 --> 00:25:18,000 Speaker 1: material used to create transistors, basically making them ineffective.
So 390 00:25:18,080 --> 00:25:23,560 Speaker 1: at some point, theoretically, the traditional transistor chip is going 391 00:25:23,640 --> 00:25:26,359 Speaker 1: to be so full that you cannot fill it anymore 392 00:25:26,440 --> 00:25:29,840 Speaker 1: without having serious electrical problems. So they were talking about 393 00:25:29,880 --> 00:25:34,040 Speaker 1: going into 3D processors. Well, DNA kind 394 00:25:34,080 --> 00:25:37,320 Speaker 1: of goes around that problem, or is a natural, if 395 00:25:37,359 --> 00:25:39,840 Speaker 1: you will, solution. Hey, for once, that wasn't a pun 396 00:25:40,080 --> 00:25:46,240 Speaker 1: intended, because DNA is volumetric. It can 397 00:25:46,480 --> 00:25:50,040 Speaker 1: fit, because of its natural characteristics. It doesn't have 398 00:25:50,200 --> 00:25:53,960 Speaker 1: to be in a two dimensional flat shape. You don't 399 00:25:54,000 --> 00:25:56,520 Speaker 1: have to stretch out the helix and stick it on 400 00:25:56,600 --> 00:26:01,560 Speaker 1: a piece of silicon or whatever to make it work. 401 00:26:01,840 --> 00:26:05,400 Speaker 1: And that gives computing so much more 402 00:26:05,600 --> 00:26:10,880 Speaker 1: advantage to move to a DNA based existence, right. Yeah. 403 00:26:10,920 --> 00:26:14,639 Speaker 1: The challenge is building the 404 00:26:14,680 --> 00:26:19,960 Speaker 1: equipment that allows you to sequence and decode that information, 405 00:26:20,160 --> 00:26:23,720 Speaker 1: because that's where the bottleneck is 406 00:26:23,840 --> 00:26:27,200 Speaker 1: right now. It's not simple. Yeah, we 407 00:26:27,280 --> 00:26:29,680 Speaker 1: have to get there.
Yeah. But once we get to 408 00:26:29,760 --> 00:26:33,800 Speaker 1: a point where we're able to construct the DNA and 409 00:26:34,480 --> 00:26:36,080 Speaker 1: lay it out in such a way where we're able 410 00:26:36,080 --> 00:26:39,080 Speaker 1: to pack in all that information, and then we have 411 00:26:40,040 --> 00:26:43,560 Speaker 1: the companion devices that can decode that and make it 412 00:26:43,680 --> 00:26:47,440 Speaker 1: meaningful to a computer, again, then you're talking about some 413 00:26:48,359 --> 00:26:53,800 Speaker 1: huge leaps in storage capacity. One gram of DNA 414 00:26:53,880 --> 00:26:57,560 Speaker 1: can store up to four hundred and fifty five 415 00:26:58,000 --> 00:27:03,720 Speaker 1: billion gigabytes of data, which is about a hundred billion 416 00:27:03,880 --> 00:27:07,600 Speaker 1: DVDs' worth of information. Yeah, yeah. As a matter of fact, 417 00:27:07,840 --> 00:27:10,720 Speaker 1: the article that sort of turned me 418 00:27:10,800 --> 00:27:13,960 Speaker 1: onto this idea was something that my friends Kim and 419 00:27:14,040 --> 00:27:16,360 Speaker 1: Tim pointed out to me in The Guardian, 420 00:27:17,119 --> 00:27:20,199 Speaker 1: which really wasn't that long ago, August two thousand twelve. 421 00:27:20,760 --> 00:27:25,240 Speaker 1: They started talking about how books had been encoded in 422 00:27:25,400 --> 00:27:30,399 Speaker 1: DNA, and that got me to thinking and 423 00:27:30,480 --> 00:27:33,520 Speaker 1: to suggesting this to Jonathan as a potential topic, because 424 00:27:33,560 --> 00:27:37,200 Speaker 1: it's fascinating that DNA, something so small, 425 00:27:37,560 --> 00:27:40,280 Speaker 1: can hold that much information.
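The gigabytes-to-DVDs comparison above is easy to sanity check. This is only a back-of-envelope calculation: it assumes decimal units and the 4.7 GB capacity of a single-layer DVD, neither of which the quoted figure specifies.

```python
# Back-of-envelope check of the figures quoted above (decimal units assumed).
grams = 1
gb_per_gram = 455e9        # 455 billion gigabytes per gram, as quoted
dvd_capacity_gb = 4.7      # assumed single-layer DVD capacity

dvds = grams * gb_per_gram / dvd_capacity_gb
print(f"{dvds:.2e} DVDs")  # on the order of 10**11, i.e. roughly 100 billion
```

455e9 / 4.7 comes out near 97 billion, so "about a hundred billion DVDs" is consistent with the quoted per-gram capacity.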
Yeah, and it's funny, because 426 00:27:40,359 --> 00:27:44,399 Speaker 1: the story talks about how Professor George Church 427 00:27:45,280 --> 00:27:50,959 Speaker 1: led this project, and he teaches, 428 00:27:51,240 --> 00:27:55,840 Speaker 1: he teaches at Harvard, but not just Harvard, it's Harvard 429 00:27:55,960 --> 00:27:59,439 Speaker 1: Medical School. This is one of those weird things, 430 00:28:00,119 --> 00:28:06,800 Speaker 1: that this overlaps science, computer science, and medicine. Yeah, 431 00:28:06,840 --> 00:28:09,920 Speaker 1: so you've got, I'm sorry, physical science and medical science. 432 00:28:10,000 --> 00:28:12,960 Speaker 1: Let's say that, right. No, no, that's fine. That's 433 00:28:12,960 --> 00:28:18,640 Speaker 1: computer science and medical science. It's multidisciplinary, obviously, 434 00:28:18,800 --> 00:28:24,119 Speaker 1: just like nanobiology or nanotechnology is a multidisciplinary approach. So 435 00:28:24,440 --> 00:28:29,239 Speaker 1: is this DNA computer or DNA storage idea. We've got 436 00:28:29,280 --> 00:28:31,840 Speaker 1: a little bit more about DNA ahead of us, and 437 00:28:31,960 --> 00:28:34,280 Speaker 1: before we get to that, let's take another quick break. 438 00:28:42,360 --> 00:28:47,440 Speaker 1: So what Professor Church did was they decided to 439 00:28:47,760 --> 00:28:52,040 Speaker 1: take a book that was about five point two seven 440 00:28:52,280 --> 00:28:58,440 Speaker 1: megabits of digital space once you convert it into digital information, 441 00:28:59,240 --> 00:29:04,479 Speaker 1: and to encode that as DNA. 442 00:29:05,360 --> 00:29:11,080 Speaker 1: They didn't do it just once.
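What "encode that as DNA" means in practice can be sketched with a toy scheme. This is illustrative only: it maps two bits to each base (00→A, 01→C, 10→G, 11→T), a common textbook convention, and is not necessarily the scheme Church's team actually used; the point is just the round trip from bytes to a base sequence and back.

```python
# Illustrative two-bits-per-base mapping; not Church's actual encoding.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a strand of bases, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Turn a strand of bases back into the original bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"tech stuff")
assert decode(strand) == b"tech stuff"
print(strand)
```

Under this scheme every byte becomes four bases, so a 5.27-megabit book would come out to a few million bases of synthetic DNA.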
They decided to duplicate 443 00:29:11,160 --> 00:29:16,560 Speaker 1: it a few times, seventy billion times, seventy billion 444 00:29:16,800 --> 00:29:21,280 Speaker 1: copies of this book, which, according to an article in 445 00:29:21,400 --> 00:29:24,560 Speaker 1: Extreme Tech, prompted them to joke that it made it 446 00:29:24,640 --> 00:29:27,720 Speaker 1: the best selling book of all time. Yes, and 447 00:29:27,960 --> 00:29:31,720 Speaker 1: the seventy billion copies totaled about forty four 448 00:29:32,040 --> 00:29:37,680 Speaker 1: petabytes of data. So that is slightly larger 449 00:29:37,760 --> 00:29:40,040 Speaker 1: than the NAS I have attached to my 450 00:29:40,160 --> 00:29:43,200 Speaker 1: network at home. Yeah, yeah, forty four petabytes. That's 451 00:29:43,520 --> 00:29:46,920 Speaker 1: an incredible amount of information. It's also quite a bit 452 00:29:47,080 --> 00:29:50,880 Speaker 1: smaller than my NAS. Yeah. So when 453 00:29:50,920 --> 00:29:55,920 Speaker 1: you think about it, the promise of DNA 454 00:29:56,000 --> 00:30:00,360 Speaker 1: is that with a relatively small amount of DNA 455 00:30:01,320 --> 00:30:04,920 Speaker 1: you could store the sum total of all human knowledge 456 00:30:05,360 --> 00:30:10,480 Speaker 1: in a very tiny compartment, relatively speaking, a tiny compartment. 457 00:30:11,160 --> 00:30:14,680 Speaker 1: And if you're able to use that same sort 458 00:30:14,840 --> 00:30:21,720 Speaker 1: of capacity in a processing way as opposed 459 00:30:21,720 --> 00:30:24,320 Speaker 1: to just storage... storage is great. I mean, that's fantastic. 460 00:30:24,480 --> 00:30:29,840 Speaker 1: This project was really showing how using 461 00:30:29,920 --> 00:30:33,800 Speaker 1: DNA is great for archival purposes, if you want to 462 00:30:33,920 --> 00:30:37,920 Speaker 1: store information for longevity's sake.
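The forty-four-petabyte figure can be checked against the other numbers quoted above. This is only an order-of-magnitude sanity check: whether the quoted figure uses decimal or binary units isn't stated, so both are shown.

```python
# Rough check: a 5.27-megabit book copied seventy billion times.
book_bits = 5.27e6
copies = 70e9
total_bytes = book_bits * copies / 8

print(f"{total_bytes / 1e15:.1f} PB (decimal)")   # ~46 PB
print(f"{total_bytes / 2**50:.1f} PiB (binary)")  # ~41 PiB
```

Either way, the result lands in the same ballpark as the quoted forty-four petabytes, so the figures are consistent with one another.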
And another point about that, 463 00:30:38,160 --> 00:30:42,480 Speaker 1: yeah, is that here's an issue 464 00:30:42,560 --> 00:30:46,880 Speaker 1: that we have with storing information. The way we access 465 00:30:46,920 --> 00:30:51,520 Speaker 1: information changes over time, and there 466 00:30:51,520 --> 00:30:55,240 Speaker 1: are multiple problems here. Sometimes the way we store information, 467 00:30:55,480 --> 00:30:59,560 Speaker 1: we store it on a medium that can decompose, which 468 00:30:59,640 --> 00:31:04,480 Speaker 1: means that as time passes, the likelihood that that data 469 00:31:04,600 --> 00:31:09,000 Speaker 1: is intact decreases. So let's say, like a book. Okay, 470 00:31:09,440 --> 00:31:13,880 Speaker 1: books are susceptible to lots of different environmental factors that 471 00:31:14,000 --> 00:31:18,400 Speaker 1: can make them impossible to read. So as time goes by, 472 00:31:19,000 --> 00:31:24,120 Speaker 1: a book's ability to preserve the information decreases, particularly depending 473 00:31:24,200 --> 00:31:26,880 Speaker 1: upon its environment. Yeah. And one of the things 474 00:31:26,960 --> 00:31:30,640 Speaker 1: that's funny to me about this is, and I'll keep 475 00:31:30,680 --> 00:31:32,960 Speaker 1: this short, but it's funny to me that, in 476 00:31:33,040 --> 00:31:38,720 Speaker 1: a way, the increase in technology has only 477 00:31:39,440 --> 00:31:41,880 Speaker 1: increased the rate of data rot, as some people call it. 478 00:31:41,960 --> 00:31:45,080 Speaker 1: Because you think about something like the Rosetta Stone and 479 00:31:45,200 --> 00:31:48,760 Speaker 1: how long ago that was chiseled, but it's still there 480 00:31:48,880 --> 00:31:51,560 Speaker 1: because, hey, you know, it's stone.
Now, if you 481 00:31:51,680 --> 00:31:54,400 Speaker 1: left it out in the elements, eventually the writing 482 00:31:54,480 --> 00:31:57,280 Speaker 1: on it will wear away due to the effects of erosion. 483 00:31:57,400 --> 00:32:01,360 Speaker 1: But that's longer lived than, say, paper, which could 484 00:32:01,400 --> 00:32:05,480 Speaker 1: be eaten by weevils or could be affected by mold 485 00:32:05,560 --> 00:32:09,280 Speaker 1: or mildew, or even water or fire. You 486 00:32:09,360 --> 00:32:12,080 Speaker 1: know, there are many things, acid in the paper. 487 00:32:12,320 --> 00:32:15,120 Speaker 1: But that would be longer lived than, say, 488 00:32:15,600 --> 00:32:18,960 Speaker 1: a magnetic storage medium, which may only live a 489 00:32:19,040 --> 00:32:23,280 Speaker 1: few decades. Yeah, because with magnetic storage, eventually 490 00:32:23,720 --> 00:32:27,600 Speaker 1: that magnetic property starts to kind of fade and 491 00:32:27,760 --> 00:32:30,840 Speaker 1: the data gets corrupted. Yeah, and I've had CDs and DVDs 492 00:32:30,960 --> 00:32:34,680 Speaker 1: that I burned a few years ago that are 493 00:32:34,720 --> 00:32:38,880 Speaker 1: starting to show signs of deterioration. And I'm thinking, all 494 00:32:38,960 --> 00:32:41,160 Speaker 1: this futuristic stuff... it's kind of funny. The stuff that's 495 00:32:41,240 --> 00:32:43,560 Speaker 1: chiseled in stone is still there. Well, and on top 496 00:32:43,640 --> 00:32:46,680 Speaker 1: of all that, besides the fact that you've got these media, 497 00:32:46,720 --> 00:32:50,920 Speaker 1: these media that can degrade over time... 498 00:32:51,800 --> 00:32:54,719 Speaker 1: magnetic is definitely more susceptible, I would say, than 499 00:32:54,760 --> 00:32:58,560 Speaker 1: optical storage. But both can degrade and both 500 00:32:58,600 --> 00:33:01,240 Speaker 1: are susceptible to damage.
I mean, just about everything is. 501 00:33:01,680 --> 00:33:07,120 Speaker 1: But the other problem is that we move away 502 00:33:07,280 --> 00:33:11,160 Speaker 1: from those older forms of media and eventually we get 503 00:33:11,200 --> 00:33:14,040 Speaker 1: to a point where nothing we have can read what 504 00:33:14,320 --> 00:33:18,400 Speaker 1: we used to use, or if you do have something 505 00:33:18,480 --> 00:33:21,080 Speaker 1: that can read it, it's a legacy system. So, like, 506 00:33:21,200 --> 00:33:24,920 Speaker 1: keeping old computers around simply to read those documents, right, 507 00:33:25,000 --> 00:33:26,960 Speaker 1: like anything that's on an old five and a 508 00:33:27,080 --> 00:33:31,280 Speaker 1: quarter inch diskette from the early days of the personal computer. 509 00:33:31,960 --> 00:33:34,960 Speaker 1: You know, and while I still have some, I would wager 510 00:33:35,040 --> 00:33:38,800 Speaker 1: that most people do not have easy access to such 511 00:33:38,840 --> 00:33:42,240 Speaker 1: a disk drive. You know, especially if you're just 512 00:33:42,360 --> 00:33:44,120 Speaker 1: kind of an average user and you've gone out and 513 00:33:44,160 --> 00:33:45,960 Speaker 1: you're like, oh, I want a new laptop. Again, 514 00:33:46,160 --> 00:33:47,920 Speaker 1: if you buy a new laptop today, you might not 515 00:33:48,000 --> 00:33:51,120 Speaker 1: even have an optical drive, which means that you 516 00:33:51,160 --> 00:33:53,960 Speaker 1: could come across records of information that you have no 517 00:33:54,080 --> 00:33:56,160 Speaker 1: way of accessing, because you do not have the tech 518 00:33:56,440 --> 00:33:59,840 Speaker 1: capable of accessing it.
Well, DNA is a 519 00:34:00,080 --> 00:34:05,160 Speaker 1: basic building block of organic life, and so the idea 520 00:34:05,280 --> 00:34:09,200 Speaker 1: is that because it's something so basic, we will always 521 00:34:09,280 --> 00:34:12,640 Speaker 1: have the ability, assuming that, you know, we don't 522 00:34:12,719 --> 00:34:16,200 Speaker 1: have some sort of post apocalyptic event, well, an apocalyptic 523 00:34:16,280 --> 00:34:20,440 Speaker 1: event that then leads to post apocalyptic events, then 524 00:34:20,560 --> 00:34:22,759 Speaker 1: we should be able to have equipment that can read 525 00:34:22,840 --> 00:34:25,320 Speaker 1: this same information. Hey, do you have the instructions on 526 00:34:25,400 --> 00:34:27,000 Speaker 1: how to read DNA? Yeah, I saved it on that 527 00:34:27,120 --> 00:34:32,439 Speaker 1: magnetic... Now, here in Atlanta, we're used to post 528 00:34:32,480 --> 00:34:35,560 Speaker 1: apocalyptic events because we've got zombies. Yes, you may have 529 00:34:35,640 --> 00:34:38,120 Speaker 1: seen, if you've watched, the documentary The Walking Dead as 530 00:34:38,160 --> 00:34:42,200 Speaker 1: seen on TV. So, um, yeah, the idea was 531 00:34:42,280 --> 00:34:46,719 Speaker 1: that DNA does not degrade over time. Well, 532 00:34:47,160 --> 00:34:49,560 Speaker 1: it takes a much longer time than something like a 533 00:34:49,600 --> 00:34:53,680 Speaker 1: paper book. Right. So you're not worried about it degrading. 534 00:34:53,800 --> 00:34:55,719 Speaker 1: I mean, when I say it doesn't degrade over time, 535 00:34:56,080 --> 00:34:59,200 Speaker 1: we're talking generations here, hundreds of thousands of years. 536 00:34:59,280 --> 00:35:04,160 Speaker 1: So yes, eventually it will degrade, 537 00:35:04,560 --> 00:35:08,879 Speaker 1: but for the foreseeable future it won't. It takes 538 00:35:08,960 --> 00:35:10,800 Speaker 1: up far less space.
We don't have to worry so 539 00:35:10,960 --> 00:35:14,080 Speaker 1: much about not being able to access the information anymore 540 00:35:14,200 --> 00:35:17,920 Speaker 1: because, again, it's the basic building block; we will presumably 541 00:35:18,400 --> 00:35:21,879 Speaker 1: still be interested in DNA in the future. In fact, 542 00:35:22,239 --> 00:35:25,640 Speaker 1: we'll become increasingly interested as we learn more about how 543 00:35:25,760 --> 00:35:29,520 Speaker 1: to tweak DNA to do things like fight 544 00:35:29,600 --> 00:35:35,480 Speaker 1: off illnesses, and other scientific applications of that knowledge. So 545 00:35:36,960 --> 00:35:38,839 Speaker 1: that was kind of the whole point, that it's 546 00:35:38,880 --> 00:35:41,279 Speaker 1: great for archival, and for that reason it's 547 00:35:41,360 --> 00:35:45,840 Speaker 1: a more permanent solution in multiple ways. 548 00:35:46,560 --> 00:35:49,480 Speaker 1: And that's really where the focus is in the 549 00:35:49,560 --> 00:35:52,600 Speaker 1: recent articles that we've been reading, although there's still obviously 550 00:35:52,719 --> 00:35:55,160 Speaker 1: quite a bit of development in the research about 551 00:35:55,280 --> 00:36:00,360 Speaker 1: building a true DNA computer that would have an 552 00:36:00,440 --> 00:36:03,839 Speaker 1: incredibly small form factor. I mean, you're talking about 553 00:36:04,600 --> 00:36:07,400 Speaker 1: DNA being only a couple of atoms across, and 554 00:36:08,440 --> 00:36:13,839 Speaker 1: this is some small stuff. I mean, we could theoretically 555 00:36:14,000 --> 00:36:19,840 Speaker 1: have a DNA computer capable of performing huge calculations and 556 00:36:19,960 --> 00:36:22,600 Speaker 1: storing an enormous amount of data in a tiny, tiny 557 00:36:22,680 --> 00:36:26,000 Speaker 1: form factor.
It would be amazing if we could look 558 00:36:26,040 --> 00:36:29,200 Speaker 1: into the future, maybe, I don't know, twenty, fifty years, 559 00:36:29,280 --> 00:36:32,000 Speaker 1: something like that, where perhaps we have reached the point 560 00:36:32,040 --> 00:36:38,000 Speaker 1: where this technology is viable and reproducible and economical, where 561 00:36:38,239 --> 00:36:41,840 Speaker 1: we could see it in applications that the average 562 00:36:41,880 --> 00:36:45,120 Speaker 1: consumer could actually access. It wouldn't just be the realm of 563 00:36:45,200 --> 00:36:48,160 Speaker 1: the scientific community or the research community. It would also 564 00:36:48,280 --> 00:36:51,080 Speaker 1: be within our grasp. Because then, can you imagine, you 565 00:36:51,120 --> 00:36:55,680 Speaker 1: could have a smartphone that could literally contain all the 566 00:36:55,880 --> 00:37:01,040 Speaker 1: data that we have ever generated, ever, since the dawn 567 00:37:01,280 --> 00:37:04,880 Speaker 1: of man, on your phone. I was waiting for you 568 00:37:04,960 --> 00:37:07,880 Speaker 1: to go, all the data. No, that was it, just 569 00:37:08,080 --> 00:37:10,359 Speaker 1: all of... all the data. Well, all the data 570 00:37:10,520 --> 00:37:15,360 Speaker 1: we have access to. It's astounding to 571 00:37:15,480 --> 00:37:20,440 Speaker 1: think of something so common that has been with 572 00:37:20,640 --> 00:37:25,480 Speaker 1: us for so long being an answer, and a fairly easy 573 00:37:26,120 --> 00:37:28,120 Speaker 1: answer, to a lot of these problems. I mean, like 574 00:37:28,200 --> 00:37:30,680 Speaker 1: I said, it's not easy to get there, but the 575 00:37:30,760 --> 00:37:33,640 Speaker 1: idea is like, really? Just DNA? As it turns out.
576 00:37:33,680 --> 00:37:36,879 Speaker 1: You know, they've been using synthetic DNA to run 577 00:37:36,960 --> 00:37:40,919 Speaker 1: these experiments, and there are some drawbacks, one of which 578 00:37:41,560 --> 00:37:44,279 Speaker 1: is it can't be rewritten. That is true. So once 579 00:37:44,360 --> 00:37:47,279 Speaker 1: you write that data... that's another reason why people 580 00:37:47,280 --> 00:37:50,319 Speaker 1: are talking about it for archival purposes. Once you write the data, 581 00:37:50,520 --> 00:37:54,680 Speaker 1: that's it. Now, granted, you're talking about a construct that's 582 00:37:54,719 --> 00:37:58,239 Speaker 1: so small that you could keep doing that indefinitely and 583 00:37:58,360 --> 00:38:00,560 Speaker 1: not have to worry about taking up too much space. 584 00:38:01,600 --> 00:38:04,120 Speaker 1: But that's just the way they're thinking of it right now. Right. 585 00:38:04,280 --> 00:38:07,080 Speaker 1: But you know, you can't always think 586 00:38:07,160 --> 00:38:11,080 Speaker 1: that way, because someday that will catch up to you. 587 00:38:11,640 --> 00:38:14,920 Speaker 1: Apparently that might be when we're actually saying, hey, 588 00:38:15,000 --> 00:38:16,640 Speaker 1: we finally got a plan on how to get off 589 00:38:16,680 --> 00:38:20,160 Speaker 1: this rock because the Sun's gonna swallow us up in 590 00:38:20,239 --> 00:38:24,799 Speaker 1: another million years. That would never happen, by the way. 591 00:38:24,920 --> 00:38:27,319 Speaker 1: Don't write in to me and explain to me why 592 00:38:27,440 --> 00:38:29,799 Speaker 1: that would be ridiculous. I understand. I was just using 593 00:38:29,880 --> 00:38:32,799 Speaker 1: that as an example.
Well, and the other 594 00:38:32,920 --> 00:38:36,719 Speaker 1: thing is, um, you know, and yes, I realize that 595 00:38:36,840 --> 00:38:41,359 Speaker 1: you could destroy DNA, but 596 00:38:41,840 --> 00:38:47,000 Speaker 1: thinking about how the sensitive information can't be erased... then 597 00:38:47,640 --> 00:38:50,759 Speaker 1: you would need to keep up with it. Let's say 598 00:38:50,760 --> 00:38:53,240 Speaker 1: you had a DNA drive, like you have a flash 599 00:38:53,360 --> 00:38:56,400 Speaker 1: drive, to carry back and forth with you, and 600 00:38:56,520 --> 00:38:59,800 Speaker 1: it gets lost, and it had, I don't know, important 601 00:38:59,840 --> 00:39:04,800 Speaker 1: and sensitive documents related to national security, or, you know, 602 00:39:04,920 --> 00:39:10,600 Speaker 1: the secret copy of your unpublished book, and 603 00:39:10,680 --> 00:39:13,280 Speaker 1: somebody else runs across it and makes billions of dollars 604 00:39:13,320 --> 00:39:16,080 Speaker 1: off of it because they found it. You 605 00:39:16,160 --> 00:39:19,200 Speaker 1: can't remotely wipe that information. I don't know how you 606 00:39:19,239 --> 00:39:24,200 Speaker 1: would do that without physically destroying the material. 607 00:39:24,560 --> 00:39:28,439 Speaker 1: So that's sort of a minor drawback, really, 608 00:39:28,480 --> 00:39:31,200 Speaker 1: but it's something very different from the 609 00:39:31,320 --> 00:39:34,120 Speaker 1: media that we typically talk about. So clearly, in that case, 610 00:39:34,160 --> 00:39:36,000 Speaker 1: you would be talking about, all right, well, now we've 611 00:39:36,040 --> 00:39:39,240 Speaker 1: got this incredible archival ability; now we have to figure 612 00:39:39,280 --> 00:39:43,319 Speaker 1: out a way of securing it. Oh well, don't see that.
Well, 613 00:39:43,560 --> 00:39:46,279 Speaker 1: and this brings me to my brilliant science fiction idea, 614 00:39:47,200 --> 00:39:49,279 Speaker 1: which I said in the pre show. I said, 615 00:39:49,320 --> 00:39:52,839 Speaker 1: if someone steals this, I will find you. See, 616 00:39:52,880 --> 00:39:54,880 Speaker 1: that was my, like, shout out to 617 00:39:55,000 --> 00:39:59,319 Speaker 1: you. No, no, I'm sharing it, because if someone out 618 00:39:59,320 --> 00:40:02,319 Speaker 1: there makes this, I want a cut. So here's 619 00:40:02,320 --> 00:40:05,560 Speaker 1: the sci fi idea, guys. You have a character who 620 00:40:06,200 --> 00:40:09,080 Speaker 1: is just an ordinary guy or girl, you know, someone 621 00:40:09,280 --> 00:40:12,200 Speaker 1: who is going through life, and they've got the same 622 00:40:12,320 --> 00:40:15,719 Speaker 1: sort of challenges and problems and joys and despairs as 623 00:40:15,760 --> 00:40:19,040 Speaker 1: all the rest of us. But then suddenly they notice 624 00:40:19,120 --> 00:40:22,160 Speaker 1: that they're being watched and people are closing in on them, 625 00:40:22,200 --> 00:40:24,439 Speaker 1: and they don't know why, because they're just a normal person, 626 00:40:24,560 --> 00:40:26,800 Speaker 1: and so they're trying to get away, and it turns 627 00:40:26,880 --> 00:40:31,240 Speaker 1: out they find out that they themselves are a synthetic 628 00:40:31,680 --> 00:40:34,520 Speaker 1: life form.
They were built in a lab from the 629 00:40:34,640 --> 00:40:38,880 Speaker 1: ground up, and in fact, their DNA contains this incredibly 630 00:40:39,000 --> 00:40:43,960 Speaker 1: important information. Encoded into this person's very being is a 631 00:40:44,120 --> 00:40:48,000 Speaker 1: secret message of such import that various forces are closing 632 00:40:48,040 --> 00:40:50,799 Speaker 1: in on them, determined to get hold of this person, 633 00:40:51,320 --> 00:40:53,279 Speaker 1: lop off a finger, and figure out what the heck 634 00:40:53,400 --> 00:40:55,759 Speaker 1: is going on. And so the character has to go 635 00:40:55,880 --> 00:40:59,239 Speaker 1: through this incredible series of adventures in order to figure it out. 636 00:40:59,480 --> 00:41:01,880 Speaker 1: It's kind of a journey of self discovery as well 637 00:41:01,920 --> 00:41:04,719 Speaker 1: as protection, and there's a whole, like, hero arc, and 638 00:41:05,320 --> 00:41:07,680 Speaker 1: the credits are great and Bruce Willis stars, and I 639 00:41:07,800 --> 00:41:13,799 Speaker 1: want a cut. I've got data under my skin, are 640 00:41:14,000 --> 00:41:17,040 Speaker 1: in it and through it. So, guys, yeah, that was... 641 00:41:17,760 --> 00:41:19,480 Speaker 1: I'm sure someone's gonna write in and say, yeah, that 642 00:41:19,600 --> 00:41:21,880 Speaker 1: was a great story when so and so wrote it 643 00:41:23,440 --> 00:41:26,560 Speaker 1: years ago. I'd want to read it. Yeah, yeah, I 644 00:41:26,600 --> 00:41:29,239 Speaker 1: have no illusions that someone has not already come 645 00:41:29,320 --> 00:41:31,560 Speaker 1: up with that idea. But if they haven't, and 646 00:41:31,640 --> 00:41:33,319 Speaker 1: you guys think that's a great idea and you want 647 00:41:33,320 --> 00:41:35,120 Speaker 1: to go out and make it, remember, I want 648 00:41:35,200 --> 00:41:38,680 Speaker 1: credit and some money, or at least a sandwich.
Come on, 649 00:41:39,840 --> 00:41:43,439 Speaker 1: writers gotta eat. All right, fascinating stuff, though. It's 650 00:41:43,560 --> 00:41:45,279 Speaker 1: the kind of thing that I would never have thought 651 00:41:45,360 --> 00:41:48,319 Speaker 1: to do, so, I mean, I'm blown away by that. Yeah, 652 00:41:48,440 --> 00:41:51,160 Speaker 1: it's a pretty fascinating subject. And like we said, 653 00:41:51,239 --> 00:41:53,680 Speaker 1: we have some great articles on HowStuffWorks. 654 00:41:53,719 --> 00:41:55,839 Speaker 1: You can actually go and check those out and read 655 00:41:55,920 --> 00:41:58,840 Speaker 1: up on DNA and DNA computers, and, you know, like 656 00:41:58,920 --> 00:42:01,480 Speaker 1: I said, there are the articles in The Guardian as 657 00:42:01,560 --> 00:42:05,400 Speaker 1: well as other places that are talking about this storage 658 00:42:05,800 --> 00:42:09,680 Speaker 1: medium, and it blows my mind. Hey, guys, hope you 659 00:42:09,880 --> 00:42:13,239 Speaker 1: enjoyed that classic episode of tech Stuff. Always a joy 660 00:42:13,400 --> 00:42:16,640 Speaker 1: to revisit the old episodes we did in the past. 661 00:42:16,880 --> 00:42:20,560 Speaker 1: It's also interesting that I decided to throw this one 662 00:42:20,680 --> 00:42:23,480 Speaker 1: in there, because obviously a lot has happened since two 663 00:42:23,520 --> 00:42:25,680 Speaker 1: thousand twelve, so I may have to do a full 664 00:42:25,920 --> 00:42:29,520 Speaker 1: update episode about DNA computers in the near future. 665 00:42:29,760 --> 00:42:32,000 Speaker 1: If you're interested in hearing such a thing, let me know. 666 00:42:32,320 --> 00:42:34,520 Speaker 1: The email address for the show is tech Stuff at 667 00:42:35,000 --> 00:42:38,040 Speaker 1: how stuff works dot com, or jump on over to 668 00:42:38,320 --> 00:42:42,200 Speaker 1: the website, that's techstuff podcast dot com.
That's where 669 00:42:42,200 --> 00:42:44,920 Speaker 1: you're gonna find an archive of all of our previous episodes, 670 00:42:45,360 --> 00:42:48,719 Speaker 1: as well as our presence on social media, and you'll 671 00:42:48,800 --> 00:42:51,680 Speaker 1: also find a link to our online store, where every 672 00:42:51,719 --> 00:42:53,440 Speaker 1: purchase you make goes to help the show, and we 673 00:42:53,640 --> 00:42:57,560 Speaker 1: greatly appreciate it, and I'll talk to you again really soon. 674 00:43:01,960 --> 00:43:04,120 Speaker 1: Tech Stuff is a production of iHeartRadio's How 675 00:43:04,200 --> 00:43:07,560 Speaker 1: Stuff Works. For more podcasts from iHeartRadio, visit 676 00:43:07,640 --> 00:43:10,640 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 677 00:43:10,760 --> 00:43:12,120 Speaker 1: listen to your favorite shows.