Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

And I'm Joe McCormick, and it's Saturday. What does that mean, Robert?

It means it's time to open the vault. It looks like it's already open, so let's get in there. There's a lot of P in here.

There's also a lot of NP in here. Perhaps you can explain this to everybody.

Smooth transition, Robert, very smooth. I like it. This is the episode on the P versus NP problem, a classic, fascinating, vexing problem in logic and math and computer science. And in this episode we tackle the question of what it actually means to solve a problem. This originally aired on April twelve, and we're bringing it back for you now, so we hope you enjoy our exploration of the P versus NP problem.

Welcome to Stuff to Blow Your Mind from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

And I'm Joe McCormick. And today we're gonna be taking a look at an issue in computer science. Funny enough, I'd say that's not one of the sciences we dip into very frequently on this podcast.

Yes, and really, we should probably just remind everyone to stick with us, trust us on this one. Don't be scared off by the computer science thing. Don't be scared off by the P versus NP thing. It's all gonna make a type of sense at the end, hopefully.

But I'm wondering if maybe we should dip into computer science more often, or at least wherever we can find a way to make the contents of it reasonably concrete. Because, let's be honest, as we've discovered in researching this episode, it's very abstract, very difficult, and a lot of times hard to come up with ways of explaining that make sense just talking about it, without visual aids or without watching programs execute as an example.
Yeah, it's definitely one of those topics where... it's a swimming pool of a topic in which there is no gradual deepening from the kiddie area to the deep end. It's just shallow, and at times it feels too shallow, and then you're immediately out of your depth.

Yeah. But if you're thinking, computer science, really, does that fit with the show? Hold on for a second, because I think it does. Computer science to me is a fascinating subject, and it's not just limited to how computers work. So my advice is, when you think about the idea of computer science, forget the computer sitting in front of you. That's not all it is. Computer science is really something more akin to the philosophy of logic, understanding the underlying sorcery of how logic and math work in the universe we inhabit, and especially the science of how problems are solved.

And certainly when you start bringing math into the equation here as well, we're talking about essentially the very fabric of the universe.

We're talking about either the fabric of the universe, either the way the universe works, or this perfect creation that humans have come up with that so accurately describes how the universe works, and that is pretty mind-blowing territory.

Well, either way you look at it, there is something mystical about the math that we walk on every day, that makes up the fabric and the logic, the math beneath our feet. You know, if you go with the Max Tegmark idea of the mathematical universe. Some people don't like this idea because, like, I don't understand what that means, but at least it's a very intriguing idea. I think his idea is that the underlying basis of all reality is not just described by math, but is math, and the universe is a mathematical object.
But we're gonna get back to this idea of problem solving, because today we want to focus on algorithms and on the inherent logic of problem solving in our universe, with some attention to a special example of one really interesting outstanding problem in computer science, and that's the P versus NP issue. If you've never heard of this before, don't worry. We'll explain what the terms mean in a simplified manner. And at this point I also do want to give a shout out to our listener Jim in New Jersey, who has been encouraging me over email to tackle this issue for a while despite all the challenges, and has also sent some really helpful guides and explainers on some stuff. He learned about this when he was in graduate school.

Yeah, indeed. And I think this is great too, because this episode is coming on the heels of, first of all, the Wicked Problems episode that came out a few weeks ago, as well as the more recent Cargo Cults episode, in which we discuss some outside context problems a little bit as well. So it's perfectly fitting that we would discuss another problem.

Well, what does it mean inherently to solve a problem? If you get into the theory of problem solving, what does this process look like?

Well, when it comes down just to the basics, and this also kind of gets into the whole Wicked Problems area of, like, what's missing when you don't have everything you need to solve a problem: for a real problem, you have to be able to, of course, define what the problem is. A lot of attempts fail right there.

Yeah, you've got to be able to say, this is the thing, you know. And then you have to be able to measure your success and check the solution. So you essentially have to be able to say, hey, this is wrong because of X.
And then you figure out what X is and see if the equation balances out. It sounds pretty simple, but like I said, as we discussed in Wicked Problems, it can be very difficult to do, especially in, you know, very complex social situations, when you're dealing with certainly some of the larger problems that we're going to talk about here.

Or even if you want to go into the simplest level. Well, I mean, depending on what you would call simple. In fact, what we're going to be getting into today is directly referred to as complexity theory. So maybe it's not so simple, but at least simple in terms of not involving phenomena in the real world, just math, just math and logic and true versus untrue and algorithms. So I think it's time to pull back the curtain a little bit and reveal some of the deep weirdness of the nature of algorithms and problem solving in our universe.

So let's look at this P versus NP problem. This is something that comes from two of the great minds of the twentieth century, Kurt Gödel and John von Neumann. In nineteen fifty-six, Kurt Gödel, who was a mathematician and logician, wrote a letter to John von Neumann which kicked off this quest to solve one of the biggest questions in computer science, the P versus NP issue. Now, who were these guys? Both were titans of the twentieth century in terms of math, logic, and computers.

Well, Gödel is probably most famous for his first incompleteness theorem, and this states that any adequate axiomatizable theory, that means a theory that's based on self-evident but unprovable axioms, is incomplete or inconsistent.

Yeah, Gödel's whole incompleteness theorem set, a couple of his incompleteness theorems, essentially amounts to the idea that any mathematical system that makes sense will have some statements that are true yet impossible to prove.
It's sort of the idea that you can't ever know everything about a self-consistent system.

Yeah, and the implication here, according to theoretical physicist and mathematician Freeman Dyson, who is also quite a giant in the field, is that mathematics is inexhaustible: no matter how many problems we solve, we'll inevitably encounter more unsolvable problems within the existing rules.

I take comfort in that measure of futility.

Yeah. But there's also John von Neumann, the recipient of the letter. And von Neumann, I don't know what you've heard about him, but I'd say he's often considered one of the most intelligent people who ever lived, that we know about at least. And so maybe we call him a mathematician and a physicist, but he made contributions to numerous fields. He was a modern Da Vinci, kind of, you know, a polymath. And that includes computer science; for example, the von Neumann architecture in the history of computer design, which is basically a way of controlling the interaction between processing operations, the CPU, and the memory of a computer. And this letter in nineteen fifty-six from Gödel to von Neumann started this process of looking into the question of whether P does or does not equal NP.

Now, like I said, we're about to explain what all the terms here mean. But I do want to note at the outset of this explanation that, you know, on this show we always try to do our best to present our subjects accurately, but at the same time be understandable to the average person. And this P versus NP issue in complexity theory is probably the most difficult and abstract subject I've ever tried to cover on a podcast. So we'll have to do our best to explain the issue and its implications without losing you in asphyxiating clouds of abstraction.
Yeah, I mean, basically the HowStuffWorks mission overall is to demystify science, and topics like this can be a bit difficult because you don't want to, through the explanation, just mystify it even more for the average listener.

Exactly right. So this is necessarily going to involve a lot of simplified versions of principles. We won't be able to go down and explore all of the complex details behind these principles, but we hope that you computer scientists and mathematicians out there will not be too scandalized or think we're doing violence to your subject. Anyway, here we go. So we've got to start with the concept of algorithms. What is an algorithm? Well, I'd say an algorithm is a self-contained list of instructions to solve a problem. You've got a goal, and then you make a step-by-step list of things to do that gets you to the goal. A common example within a computer program would be a subroutine designed to sort a list of things. That's an algorithm.

Yeah, and you know, algorithms are something we encounter on a daily basis, especially online. I mean, Facebook, Google, both of these depend on ever-changing algorithms to decide what you see and don't see in your feeds and in your search results.

Yeah, and I think that's a great example of how complex algorithms can get. You've got the simple sorting algorithm on one hand, and then you've got the stuff that decides whether you only see political articles you agree with or whether you sometimes see stuff that's going to make you mad. So, when you're designing algorithms in a computer science arena, or really to solve any problem, but we're mostly gonna be talking about computer programs, you compare how much time it takes to solve a problem with an algorithm, given the scope of the problem. So this is usually expressed in terms of inputs versus time. So I want to give a quick example with sorting.
Like I said, say you're given a spreadsheet that includes a list of all the James Bond movies that exist currently, in a random order, and you've got to write a computer program that sorts all of those James Bond movie titles into a list in the order they came out. How would you do that? Now, there are a lot of ways you actually could approach the problem, and they don't all take the same amount of time. Some are much more efficient than others. Here's one example. You could create an algorithm that goes like this. Step one: rearrange the entire list at random. Step two: check each movie in the list to see if it came out before the next movie in the list. If the answer is yes all the way down the line, then the list is sorted correctly and you're done. If not, start over and rearrange it entirely randomly. Now, given enough time and a small enough data set, this algorithm will eventually finish by blind luck. It is just brute force, burning through computer resources wastefully in order to eventually solve the problem by blind luck. But there are also much more efficient ways you could go about it. For example, you could go down the list comparing each movie to the next, and if the second movie came out before the first, you switch their order on the list, and then go on like that, and you do that until the list is sorted. But some problems are inherently a lot harder than others, and there aren't any algorithmic shortcuts like that that we know about. We don't know of any easy way to solve them. The only thing we know how to do is that stupid brute force method where you just wastefully burn through computer resources until it's solved by time and force.
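For the code-inclined, here's a minimal Python sketch of the two approaches just described: the shuffle-and-check brute force method and the more efficient compare-and-swap method. The titles and years are illustrative placeholders, not a complete Bond filmography.

```python
import random

# Illustrative data; a real list would hold every Bond film and release year.
movies = [("Goldfinger", 1964), ("Dr. No", 1962), ("Thunderball", 1965)]

def is_sorted(items):
    """Check each movie against the next one, all the way down the line."""
    return all(items[i][1] <= items[i + 1][1] for i in range(len(items) - 1))

def shuffle_sort(items):
    """Brute force: rearrange at random, check, and start over until done."""
    items = list(items)
    while not is_sorted(items):
        random.shuffle(items)
    return items

def swap_sort(items):
    """More efficient: swap adjacent out-of-order pairs until none remain."""
    items = list(items)
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(items) - 1):
            if items[i][1] > items[i + 1][1]:  # second movie came out first
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
    return items

print(swap_sort(movies))  # Dr. No, Goldfinger, Thunderball
```

On three titles either version finishes instantly; the difference only bites as the list grows, which is exactly the inputs-versus-time comparison described above.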
And in fact, I'd like to make a comparison here, between the efficient algorithm versus brute force methods and what you might see in animals in the wild using intelligence to solve a problem. So, like, if you're hunting another animal, you could use a brute force method of just running after the animal until it is tired, or until your muscles have allowed you to catch it, and then killing it with the strength of your muscles. That's sort of the brute force method. Or you could set a trap, or you could build a weapon. These are shortcuts that make the process of hunting a lot more efficient.

Now here's where we get to our main terms in this discussion, P and NP. P is going to stand for polynomial time, and NP stands for nondeterministic polynomial time. You don't really need to remember that for the purpose of this discussion, because we're gonna make it a lot simpler.

Yeah, I mean, this is one of the problems with the topic: just the basic idea here of P and NP is so dry and unrelatable. But allow us to explain.

Yeah, okay. So the real difference has to do with processes of solving problems on a deterministic Turing machine, which is equivalent to the kind of computer you'd be using right now, you know, any device you have, versus a hypothetical nondeterministic machine, which in theory you could say works by magically guessing the answers to questions and then just checking to see if the magic guess is correct. But, like we said, we don't want to get too bogged down in all those details. So here's the simplified version. P is the set of all problems that can be solved by an algorithm quickly or easily or efficiently. This is the list of easy problems in computer science. NP, on the other hand, stands for answers that can be easily checked by a computer once you have them, but they can't necessarily be solved easily.
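To put that solve-versus-check asymmetry in code, here's a small Python sketch using subset sum, a standard textbook NP problem that is not from the episode itself: finding a subset of numbers that adds up to a target can require brute force over every subset, while checking a proposed answer is a single sum.

```python
from itertools import combinations

numbers, target = [3, 34, 4, 12, 5, 2], 9  # illustrative values

def solve(nums, goal):
    """Solving from scratch: brute force over all 2^n subsets."""
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == goal:
                return subset
    return None

def check(certificate, goal):
    """Checking a proposed answer (a 'certificate') is nearly instant."""
    return sum(certificate) == goal

answer = solve(numbers, target)       # exponential work in the worst case
print(answer, check(answer, target))  # (4, 5) True; verification is easy
```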
Yeah. So it's difficult to place this in a non-mathematical context, but one way I like to think about this is in terms of written reviews for albums, for musical albums. Because a good music review is difficult to write, and hard to find.

And hard to find, yeah. Like, in my own experience, I mean, I write reviews of stuff from time to time, and that alone is challenging enough for me. But yeah, even when you're looking at the major publications, it's hard to find one that feels just right.

The average reader, though, can swiftly judge to what extent they agree with the author, obviously, to what extent the author is just blowing smoke. We've all read those music reviews where you get the sense that the author is really using the music as an excuse to sort of write his or her own poetry there on the page, as opposed to actually describing what the music is like. But the reader knows. So the reader can look at the material and either believe them or not. Either you buy into their opinion or you don't.

Yeah. So you don't have the algorithm internally to efficiently write this piece of writing yourself, but you know it when you see it.

Exactly. Yeah, it's kind of like pornography in that sense, right? You might not be able to define clearly the difference between art and pornography, but you know it when you see it. Once you have that answer certificate there, you can check it, and by golly, it checks out.

And I like that, because it also gives a whole new meaning to the P and to the NP.

Now, there are a couple of other terms that matter here. There's NP-hard, and these are problems that are at least as hard as any NP problem, essentially. And then there's also NP-complete, which is a big issue in this arena, and these are problems that are both NP and NP-hard, so you can check the answer once you have it in a reasonable amount of time, and they're NP-hard.
Now, the interesting thing about NP-complete problems is that it has been proved in the literature that if you have an algorithm that can efficiently solve one NP-complete problem, it can be transposed to solve all of them. These problems reduce to each other. So if you can solve one NP-complete problem in a reasonable amount of time, you have found the master key, and the universe kind of shrinks in response to this. So a classic example of an NP problem is the prime factorization problem that we use in encryption on the Internet. Again, we don't want you to get lost too much here, so here's the simple version, with smaller numbers than usual. Let's say I just throw out a random number, and let's say it's a number of, I don't know, what's a good one? Skulls in a pile. So let's say I give you a number. Let's say there are seven hundred twenty-one thousand, four hundred twenty-one skulls in a pile. Now I tell you this number is the product of two prime numbers of skulls in a pile, but I don't tell you what they are. Now, how could you figure out what those two prime numbers of skulls in a pile are? For a computer, with numbers this small, this wouldn't be all that big a deal, but we're gonna have to extrapolate to much bigger numbers. But for you, this would be really annoying to figure out, right? Because there's no simple, efficient way to do it. You'd pretty much have to get out a huge list of all the prime numbers between zero and seven hundred twenty-one thousand, four hundred twenty-one and start multiplying them together to see if they give you the right answer.

Yeah, and in this situation, I imagine it's like the opening scenes of Terminator, and I'm probably already pretty distracted by the pyramids of bone and the Hunter-Killers.

The Hunter-Killers, exactly. I'm not gonna have time for all this prime number nonsense.
Now, since we've solved P equals NP at this point, they don't have rubber skin anymore. They figured out how to get through the problem to make the bio suits. So yeah, you're in a real rush here. But anyway, back to the problem.

Yeah, there's just no fast, simple, easy way to solve this. And if you use numbers large enough, this type of problem is excruciatingly slow even for computers to conquer by brute force. But let's say I told you the two prime numbers are seven hundred and fifty-seven and nine hundred and fifty-three skulls in a pile. It would be trivially easy for you to check and see if that's the correct solution. You just have to multiply them and see if you get the right answer, and that takes almost no time at all. And in fact, I'd really only need to tell you one of the numbers, because you already know what they're supposed to multiply to, so you could just divide by the one number. So here's an example: this is a problem that, if you're going to try to solve it starting with no information, is just going to take you ages. It's going to be impossible. But if you already know a selected answer to test, you can see if that answer is right.

Yeah, I mean, this brings to mind... it's a bit like trying to crack a four-number code on a simple combination lock, right? And I'm talking about a human doing this, not a computer. So it would take me, as a human, quite a while to test out all ten thousand possible solutions, but no time at all to check a solution that someone else had provided me. So, you know, I'd be there all day just putting in each one of those ten thousand solutions, but I can easily put in, you know, three three six six and see if that is correct.
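Here's a minimal Python sketch of the skull-pile example, using the numbers from the episode: recovering the two primes takes trial division over many candidates, while checking a proposed pair is one multiplication.

```python
def factor_by_brute_force(n):
    """Solving: trial-divide by every candidate up to the square root of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d  # found the two factors
        d += 1
    return None  # n itself is prime

def check_answer(n, p, q):
    """Checking: a single multiplication confirms or refutes the answer."""
    return p * q == n

print(factor_by_brute_force(721421))   # the slow direction: (757, 953)
print(check_answer(721421, 757, 953))  # the fast direction: True
```

With a six-digit number the slow direction still finishes in a blink on modern hardware; with the hundreds-of-digits numbers used in real encryption, it doesn't.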
You found a really interesting request for help on this subject, didn't you?

Oh, yeah, yeah. Because I wanted to check to make sure that my math was right on how many possible combinations there were. I was pretty sure it was the ten by ten by ten by ten thing. But I did one of those searches to just see people asking math questions online, and I found one where this user apparently had a combination lock, or maybe it was like a contractor's box, you know, like at a house where you have a real estate agent. Yeah, and she says it's like a thirty-dollar box, so she didn't want to just have it cut into and ruined in order to get the keys out. But she didn't know what the combo was, so she wanted to just enter all the possible combinations to get it open. And here's the caveat, though: she knows that no number was used more than once, and that cut it down significantly, to just more than five thousand.

Yeah, it was like five thousand forty or something. Yeah.

And the person who supplied the answer on this forum included a list of all of them for her convenience. And I don't know how that came out. I wonder if she then took that list and just painstakingly spent the time to try each one out, or if she decided, you know, actually, inputting all those numbers is not worth the thirty dollars that I would save by keeping the box intact.

That sounds like a fun Saturday, you know, on the porch in front of the box with a case of beer. Hopefully it's nice weather. But anyway, so yeah, these are problems that can potentially be brutally hard to solve, but they're easy to check once you have a certificate of the answer.
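As a quick sanity check on that math, a couple of lines of Python reproduce both counts: ten thousand four-digit codes in general, and five thousand forty once no digit can repeat.

```python
from itertools import permutations, product

# Every four-digit code, digits 0-9: 10 * 10 * 10 * 10 possibilities.
print(len(list(product(range(10), repeat=4))))  # 10000

# The forum poster's constraint, no repeated digits, prunes the space.
print(len(list(permutations(range(10), 4))))    # 5040, i.e. 10 * 9 * 8 * 7
```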
Another classic and often-cited example of an NP-complete problem is the traveling salesman problem. Now, this I think is exciting, because now we have something with more of an anthropomorphic name. It seems to imply a scenario. It's like the Infinity Hotel.

Well, I want to change it up, and I want to call this the nationwide infection problem. So imagine that you are the vanguard of an alien species that has come to Earth, and you want to land in a country, say the United States, and infect at least one person in every township and municipality in this country with one of your larvae. But you've got limited time to do it, okay? You know, no dilly-dallying around. So what you're looking for is a route that you can plan out on your alien equivalent of Google Maps directions, or, you know, your Apple directions device, that will tell you how to go to every single city and township in the country, only one time each, in the shortest route possible.

Okay, that's not easy. If you just had four or five cities, this wouldn't be such a big deal for a computer to figure out. Once you start adding hundreds or thousands of cities, how is it going to figure this out? The only way we know of is back to brute force. It could try one method: well, you go to city A, and then city B, and then city C, and then D, and go all the way around the country and see how long that takes. And then it could try again with first city B and then A, and then the same from there. And so you end up getting these exponentially multiplying combinations. It is just going to take massive amounts of time and computing power to figure out what is actually the shortest trip.

Now, you might already see an issue with including this problem within NP, and I actually read an interesting blog post by somebody writing for IBM about how, under certain conditions, this problem actually isn't in NP, depending on how you define what you're checking for. Like, if you're just looking to check that a given route is a correct solution, that it visits every city only once, then it is easy to check. You can check that very quickly.
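Here's a minimal Python sketch of that contrast for the traveling salesman setup, with a made-up four-city distance table: finding the shortest route means trying every ordering, which multiplies factorially as cities are added, while checking that a given route visits every city exactly once, and under some mileage, is quick.

```python
from itertools import permutations

# Hypothetical symmetric distances between four cities (illustrative numbers).
dist = {("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
        ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30}

def d(x, y):
    return dist[(x, y)] if (x, y) in dist else dist[(y, x)]

def tour_length(route):
    """Total mileage of a route visiting each city once (no return leg)."""
    return sum(d(a, b) for a, b in zip(route, route[1:]))

def brute_force_shortest(cities):
    """Solving: try every ordering; n! routes as the city list grows."""
    return min(permutations(cities), key=tour_length)

def check_route(route, cities, limit):
    """Checking: visits every city exactly once, under the mileage limit."""
    return sorted(route) == sorted(cities) and tour_length(route) <= limit

cities = ["A", "B", "C", "D"]
best = brute_force_shortest(cities)
print(best, tour_length(best), check_route(best, cities, limit=80))
```

Four cities means only twenty-four orderings; forty cities means more orderings than any computer could ever enumerate, which is why the only known general method here is brute force.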
This comes down to accurately defining the problem.

Right, exactly. Now, if you're looking to check that it visits every city only once under a certain mileage, that's also easy to check. You just see that it visited every city once and see how long it took. But if you're looking to verify whether a solution is in fact the shortest of all possible routes, that's not easy to check, because you'd essentially have to do the entire brute force method and compare it to every other possibility. All possible routes have to be considered. So if that's what you're going for, it's not hard to solve but easy to check; it's hard to solve and hard to check.

But here's the big problem with the P versus NP issue. We know that P problems are a subset of NP problems. But what if the subset is actually the same as the set? Meaning, what if all NP problems are actually P problems? Meaning, what if all problems where we can check the answer are actually problems we can solve efficiently, and we just haven't figured out how to solve them efficiently yet?

Okay, or we haven't developed the machines that can do it.

Yeah, yeah. So, is that possible? And that's actually probably the single biggest open question in computer science today: is P equal or not equal to NP? Are they or are they not equivalent sets?

Now, the obvious answer is no, right? That seems intuitive. That seems to be the answer that feels most in keeping with our understanding of, like, the limits of human ability, the limits of human knowledge, and just sort of the fabric of our universe.

Yeah, and so most computer scientists and mathematicians, I think, agree that the more likely answer to this unsolved question is that P does not equal NP. I found one poll that was taken... it was more than ten years ago. I don't know if things have changed much since then.
But in two thousand two, University of Maryland computer scientist William I. Gasarch did a poll of colleagues in complexity theory, and here's what he found. Out of this poll of colleagues, sixty-one thought that P did not equal NP. Nine thought that P did equal NP, and some of those said they answered that way basically just to be contrarian, or to continue to encourage people to research the possibility. And then several other colleagues either offered no opinion or offered sort of complex answers that weren't yes or no. So you can obviously see that the opinion that P does equal NP, or the prediction that that will be what's eventually proved, is the minority. It's not what we would tend to think is the more likely possibility.

Okay, it's the more outsider consideration here.

So let's assume for a second that that is the case, that one day some amazing mathematician or computer scientist, somebody, comes along and they figure out a way to prove that P does not equal NP. Proofs like this happen in math and computer science all the time. You might be wondering how that could be proved, but people figure out ways to demonstrate logically that something is true like this. So let's say it's demonstrated that P does not equal NP. What are the implications? Well, mostly, I'd say not much changes. This is sort of the obvious conclusion. It's the one that wouldn't surprise us. In other words, all of our brute force problems remain brute force problems. But if this is the case, it would still be useful to know.

It would be useful, but people wouldn't start floating into the sky. The great old ones wouldn't come back. It would just be business as usual for most people.

Yeah. So we'd have a proof, so people could stop trying to solve it, and we'd be able to use the fact that P is not equal to NP as an assumption for other work in mathematics and computer science.
So we just move on with our lives, basically.

Essentially. But here's where things get interesting. What would the implications be if P does equal NP?

Well, right off the top of my head, of course, what comes to mind is the use of encryption that we've already talked about. Like, that's really our most everyday interaction with the idea of P and NP.

Yeah, I mean, why is it not easy for me to get into those photos you have on your phone, whatever they are? Well, it's because we use these encryption methods. And almost all current encryption methods would be subject to this. They just depend on the fact that you don't have, you know, tons of supercomputers and time to sit around trying to brute-force crack into people's junk. But if you were able to reduce those problems to essentially easily solvable problems, problems that could be solved in regular polynomial time, the P class, then suddenly, yeah, bye-bye encryption, essentially. I mean, we can't know for sure exactly how big a deal this would be in terms of applied sciences and technology, but the likely implication seems to be that any job currently hindered by the limits of brute force computation would be revolutionized. So yeah, there's the prime factorization issue that feeds into encryption. And the informal way of summing this up is that, you know, if our present methods of encryption and data security are like a plastic diary lock on our information, every hacker on Earth might have access to a pair of fourteen-inch bolt cutters if P equals NP. On the other hand, and this would be a positive, it could also mean that research projects that rely on brute force computation could potentially see huge leaps forward. For example, one that I've seen mentioned and read about before is protein folding simulation. Have you ever read about this?

Uh, yeah, a little bit.
I think I attended a discussion on the topic a few years back.

Yeah. So this research involves going through permutations of different ways of folding proteins in a computer simulation, and it could help cure diseases and treat medical conditions if we learned the right things about the behavior of how protein molecules fold up on one another and behave in the body. But simulating all of these folding permutations is a brute force computing project. So if this actually reduced to a P problem, we might be able to hasten research that saves lives, maybe cure cancer. Who knows.

Okay, so we're looking at a world where maybe we have to go back to using just normal mail instead of email. But on the other hand, maybe we cure cancer.

Or, I mean, digital security maybe could still be a thing if people just come up with another method. It's just that our current methods would possibly become obsolete.

Yeah, and then the new method will be a sealed envelope under your bed or a chest buried in your backyard.

Yeah, that's where you'd have to keep everything. You'd have to physically meet up with everybody to trade information and agree on a password in person.

You know, that would be interesting, like trying to imagine a digital civilization that suddenly has to become a non-digital civilization, but wants to keep everything operating more or less as it did when it was digital. You know, like they still want to use Tinder, but they can no longer use a true digital version of it. What does that even consist of?

Oh wow, yeah, that's fascinating. But of course the changes wouldn't just be in applied sciences and technology. One of the interesting things about this is multiple experts have commented that if we live in a P equals NP universe, we have been sorely mistaken about what reality is like. If this is the universe we inhabit, in fact, it is quite different than we thought.
Speaker 1: And one thing I want to quote is from Scott Aaronson, who offers this as, quote, a physical philosophical argument against P equals NP. He says, quote: If P equals NP, then the world would be a profoundly different place than we usually assume it to be. There would be no special value in creative leaps, no fundamental gap between solving a problem and recognizing the solution once it's found. Everyone who could appreciate a symphony would be Mozart, everyone who could follow a step-by-step argument would be Gauss, everyone who could recognize a good investment strategy would be Warren Buffett. It's possible to put the point in Darwinian terms: if this is the sort of universe we inhabited, why wouldn't we already have evolved to take advantage of it? That's a really interesting point. But at the same time, he's framing it as one among a list of arguments he gives that P probably does not equal NP. This seems to be very much an argument in favor of there being no equality, right. But you could also look at it as an interesting comment on how different the world would be from how we assume it is, if this were in fact the case. And it's worth noting that we shouldn't assume that just because it doesn't feel like we live in the P-equals-NP world, P equals NP is necessarily false. Our intuitions about what's possible in the math and problem-solving space have turned out to be very wrong in the past, and sometimes long-standing problems in math and computer science are solved or proved in ways that seem extremely peculiar, yet you can't deny the result. I can't help but circle back around to the earlier opinion we mentioned, attributed to Freeman Dyson, that mathematics is inexhaustible. If P equals NP, is the universe really inexhaustible? Yeah.
Speaker 1: And if P equals NP, does that mean that there is, in a sense, a universal algorithm out there? I mean, there is a theory of everything within our mathematical universe, as Max Tegmark argues in the mathematical universe hypothesis we mentioned earlier. Tegmark even goes so far as to predict that a mathematical proof for a theory of everything could eventually fit on a T-shirt. Well, I would kind of like to wear the universe. Yeah, I mean, that's an interesting thought on its own. And Tegmark's theories, as I think I said earlier in this episode, I find very interesting, even if I'm not qualified to know whether they're really rigorous physics. I read that book, Our Mathematical Universe, and I found it amazingly stimulating. He talks about different levels of multiverse realities and what they each imply, and he gives, at least to the layperson, a very reasonable-sounding explanation of how these are natural conclusions from what we know about physics. But I do want to use what you said as a jumping-off point to take a broader view of the algorithmic nature of reality. But first we're going to take a quick break to hear from the sponsor of this episode. Everybody, in this day and age you know the importance of having a professional-looking website. That's how you represent yourself in the world at large. Come on, you can't just have a GeoCities page hanging out there with all your dancing baby GIFs. That's right, that's not gonna fly. The problem is, of course, most of us don't have the coding expertise to go out and make one of these things, and we don't have the money to throw at some big fancy, bigwig website designer. So what do you do? You sign up for Squarespace, is what you do.
Speaker 1: Because Squarespace has the easy-to-use tools and the interface that you need, everything at your disposal to knock out that professional-looking website. Yeah, it'll look great and it won't be scary. They take away all of the gears and the creepiness of designing a website and make it super intuitive, super easy. You have what you need and you can make it yourself in no time. And hey, if you want to use our offer code, you can go to squarespace dot com and enter MINDBLOWN to get money off your first purchase and a free domain when you sign up today. So go check out squarespace dot com, special code MINDBLOWN, and get started making that awesome website. So whatever the solution to P versus NP turns out to be, I think one thing that's very interesting about it is the idea that this problem in computer science runs under the skin of everything that exists. It's not something that people just made up. This is talking about a fact about the universe that would be a fact about the universe whether we were here to discuss it or not. Yeah, I mean, it's difficult to get outside of the language of the situation when we're talking about problem solving, because we can't help but think about a human mind trying to solve a problem. But in a sense, problem solving takes place not only at the human level; it takes place at the animal level, as a particular entity tries to navigate a world of fixed and moving objects, generally to acquire some goal, to acquire food or a mate. You could probably even extrapolate as far as to say that an object obeying gravity is engaging in a sort of non-mental problem solving, in a limited way.
Speaker 1: I guess what I'm just trying to drive home here is that as we continue to talk about problem solving, try not to think about it only within the human realm, because as we're about to see, it gets well outside of it. Remove the consciousness from it and retain only the teleology. Exactly: there are steps toward a purpose, but you don't have to know what the purpose is, and you don't even have to realize you're taking steps. Now, a great example of this is the slime mold. Slime molds don't have brains. They consist of a single cell containing millions of nuclei, and they form a network of protoplasmic tubes to creep toward a food source along the shortest path; that's essential here. It sends out limbs to find food, and when it finds a food source, it spreads over it, secretes its digestive enzymes, and has its meal. Yeah, it's pretty great. Essentially The Blob, right. And when it doesn't find food, the limb dies and retreats back. So in this way it creates a network for transporting nutrients and chemicals for intercellular communication. And that method allows them to perform such feats, generally in lab environments, as solving a maze, a straight-up maze like you would put a mouse in. And when presented with a miniaturized Earth environment, they can recreate some of the great trade routes of the world and some of the great highway systems. They can model cancer growth. Let me go into a little detail about the Silk Road thing, because this gets into exactly what we were talking about earlier: an algorithm attempting to plot a course and having to hit all those stops, the traveling salesman problem we were discussing earlier. Okay. So computer scientist Andrew Adamatzky from the University of the West of England took a globe.
Speaker 1: Okay, and you can do this at home, probably, I guess, if you have access to the materials. He took a globe and he coated it with agar; this, of course, is the stuff in a petri dish that bacteria grow on. Yeah. And then what he did is he removed the agar from the areas over the ocean, so it's just covering the continents and the land at this point. And then he placed oat flakes at the locations of twenty-four different major cities on the globe, so that's the food source. Okay. Then he introduced the slime mold. All right. He did this thirty different times, and each time the slime mold conquered the world in a slightly different way, establishing trade routes between the various oat cities. Yeah, and there are pictures out there. This is pretty remarkable, and maybe a little bit scary, because you see these tendrils just spreading out all over the world. It managed to plan out engineering projects that humans can only dream of right now, like transatlantic bridges. Obviously we're not going to do that. I don't think it was ever able to really conquer the Pacific; the Pacific was just too great a distance without agar there for it to grow on. But it also managed to recreate the Silk Road, as well as the modern Asian Highway network, which consists of about eighty-seven thousand miles of roads running between thirty-two countries. So it provides a great example of, not a problem-solving intelligence, but an algorithmic problem-solving organic system; the brute force version of its route-finding task is sketched below. Of course, this does raise the specter of the question: how do you tell the difference between an algorithm and intelligence?
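For a sense of why that route-finding feat matters, here is a minimal brute force traveling salesman sketch in Python. The city names and coordinates are made up for illustration.

```python
from itertools import permutations
from math import dist

# Hypothetical coordinates for a handful of "oat cities".
cities = {"A": (0, 0), "B": (3, 1), "C": (1, 4), "D": (5, 2), "E": (2, 2)}

def tour_length(order):
    """Total length of a closed tour visiting cities in the given order."""
    legs = zip(order, order[1:] + order[:1])
    return sum(dist(cities[a], cities[b]) for a, b in legs)

# Brute force: fix the starting city and try every ordering of the rest,
# which is (n - 1)! candidate tours.
start, *rest = cities
best = min((tuple([start, *p]) for p in permutations(rest)), key=tour_length)
print(best, round(tour_length(best), 2))
```

With five cities that is only twenty-four tours to check; with the twenty-four cities on Adamatzky's globe it is 23!, roughly 2.6 × 10**22, which is why an organism that cheaply approximates good routes is so striking. As for telling an algorithm from an intelligence: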
Speaker 1: Unless you want to be anthropomorphic and say, well, intelligence is consciousness and the ability to love. Yeah, and then we start gazing down that abyss, right. Yeah, we get back to some of the problems we've discussed in relation to AI in the past. How do you know when you've created it, if you can't really say what it is? It becomes this problem: we can't even define the problem, so how can we come up with the solution or check the solution? Yeah. And it's funny that we were talking earlier about sets that sort of recursively consume one another: one set is within another set, but then the second set is also within the first. Now, we just gave an example of how nature can be like an algorithm, but you can also say that plenty of algorithms in computer science have been essentially derived from nature. Yeah. And there's a great example of this with ants. So, ant colonies. We've discussed ants here on the podcast before, and I'm sure we'll cover them again in the future. They're complex societies, and we see plenty of examples in which colonies accomplish complex tasks that exceed the individual capacities of a single ant. They work together and they're able to solve problems, and this is the mound mind. Yeah, exactly, it's the emergent intelligence of the group, the swarm intelligence. This is of particular note to computer programming, as we see how their self-organizing capacities and distributed organization enable them to solve difficult optimization and distributed control problems. Okay. So a Stanford study looked at how harvester ants determine how many foragers to send out. So they're sending out a raiding party, right, which is more complex than you might think, because you get down to the basics: how many do you send out? What's the wait time?
Speaker 1: And then they compared this to the manner in which a search engine brings back search results. Specifically, we're talking about Transmission Control Protocol, or TCP, which, if you're like me, is mostly something you run into when you have to adjust your internet situation at home. Yeah. But anyway, it's essentially an algorithm that manages data congestion on the Internet. So they compared the two, and they combined the two, and they created the anternet. That's internet, except instead of "inter" you have "ant." So it's a TCP-influenced algorithm that accurately matched ant behavior in the experiment; it's an example of ants reaching the same place as our computer programming. So basically, they created a TCP program that accurately predicted how the ants were going to act. Yeah. We've definitely talked on the other podcast that I do with Jonathan Strickland and Lauren Vogelbaum, Forward Thinking, about biomimetic robotics, about ways that robots can be inspired by the ways animals move, especially in mobile robotics. How do you get a swarm of things to behave as a group correctly? I'd imagine this would be a very good example of how to control them. Yeah, you see swarm organisms used in various AI programs. I believe at Georgia Tech in Atlanta they've used bees a lot. Oh yeah, yeah. And more recently, a two thousand fourteen study from the Institute of Pharmaceutical Sciences in Zurich explored the possibility of using ant algorithms to search for new composite agents in the development of new pharmaceuticals. And this gets down to the whole situation we were discussing earlier: do you take a brute force approach to finding out which connections, amid all these possible connections, are the ideal ones to then test and explore?
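The TCP mechanism that comparison leans on is, at heart, a feedback loop: increase the send rate while acknowledgments come back, cut it sharply when they stop, much as the colony throttles foragers by the rate of returning ants. Here is a toy sketch of that additive-increase/multiplicative-decrease idea; the congestion model is invented for illustration and is not a real network stack.

```python
import random

def ack_received(rate: float) -> bool:
    """Made-up stand-in for the network: sends fail more often as rate climbs."""
    return random.random() > rate / 50.0

# Additive increase, multiplicative decrease (AIMD), the classic TCP rule.
rate = 1.0  # packets per tick, or foragers per minute in the ant reading
for tick in range(40):
    if ack_received(rate):
        rate += 1.0   # success: probe gently for more capacity
    else:
        rate /= 2.0   # loss: back off sharply
    print(f"tick {tick:2d}: rate = {rate:5.1f}")
```

Run it and the rate saws up toward the made-up capacity and back down again, the same kind of self-regulating behavior the study compared to the ants.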
Speaker 1: So they've looked into the possibility of using an ant-based algorithm to essentially, you know, magically guess those composite agents that deserve further examination. Well, one thing these examples in nature make me think about is an idea that really intrigues me, and that's the question of whether evolution by natural selection can itself be best interpreted as an algorithm. Because think about it: it's kind of an iterate-and-test style algorithmic procedure. Let me give you a list of steps. Step one: randomly introduce a change, like a mutant allele, into the genome; you vary one gene from the existing model, so you've randomized it. Step two: test against the baseline performance of the standard allele in the environment. Or, you know, the individual steps would be: attempt to copy; if copying succeeds, return to the first step; if copying fails, return void. It's almost like a computer program; a minimal sketch of that loop follows below. Yeah, I think there's a very strong case to be made there. I mean, earlier I said that an object obeying gravity is in a way following a certain algorithm, kind of problem solving, so I think this fits the bill as well. Yeah, and I certainly didn't come up with this idea. I've read about it in the philosopher Daniel Dennett, who advocated this point of view in his nineteen ninety-five book Darwin's Dangerous Idea, which was in large part about the implications of evolution beyond just explaining the diversity of species on Earth: evolution as a sort of principle that extends even beyond biology, this universal driving force that drives design through design space, the way he would explain it. But of course, not everybody agrees with that interpretation, or thinks that an algorithm is a good way of thinking about evolution.
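Taking the steps above literally, here is a minimal mutate-and-test loop in Python. The bit-string genome and the fitness function are stand-ins invented for illustration, not a model of real genetics.

```python
import random

TARGET = [1] * 20  # hypothetical ideal genome, so "fitness" has something to measure

def fitness(genome):
    """Stand-in environment: count positions that match the arbitrary target."""
    return sum(g == t for g, t in zip(genome, TARGET))

genome = [random.randint(0, 1) for _ in TARGET]
for generation in range(200):
    mutant = genome[:]
    i = random.randrange(len(mutant))
    mutant[i] ^= 1                        # step one: vary a single random gene
    if fitness(mutant) >= fitness(genome):
        genome = mutant                   # step two: the better copy carries on
print(fitness(genome), "of", len(TARGET))
```

Each pass varies one gene and keeps whichever version tests better, which is the whole loop the discussion describes; everything wasteful about it shows up in the follow-up question below.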
Speaker 1: But I've got another follow-up question that's kind of interesting to me. If evolution is an algorithm, is it an efficient algorithm or a brute force algorithm? Oh, now that's a good question. I mean, it seems to me like it would be a brute force algorithm, right? You're trying any number of combinations, just brute force combinatorics. You're trying something: here's a pair of genes, here's an allele, does that work? No? Okay, throw it in the trash. It seems very wasteful. Yeah, here's a lizard with spots, here's one without spots. Which one gets eaten? Which one continues? That sounds like brute force: you're just throwing out all the possible prototypes and seeing what happens. And I think this would actually be one way of framing the difference between the standard scientific materialist view of evolution, which, as best I can tell, would be the brute force method, and some types of belief in intelligent design. Right. So if you are somebody who accepts the evidence for evolution and common descent but simply believes the process was guided in one way or another, that aliens or a god or some other powerful supernatural or otherworldly technological force interfered with evolution, maybe reached in to cause specific mutations to the terrestrial gene space at key points, then that sounds like it would be optimizing the algorithm, right? Like somebody's introducing artificial efficiency. But then again, I'd be interested in hearing from the evolutionary biologists and geneticists out there in the audience on this one. If you accept the idea that natural selection is like an algorithm, is it a brute force algorithm unless you go to the intelligent design hypothesis? Or are there other ways of thinking of the algorithm as in some way optimized by material circumstances?
Speaker 1: And since you did mention God or the gods here: if the god in this scenario is a force coming from outside our universe, then perhaps our world is a P-does-not-equal-NP universe, but the realm of the gods is a P-equals-NP universe. Well, that's an interesting way of putting it, because then they can knock it out right there. Gods, they're basically limitless. Well, it's infinite possibility. I mean, the whole realm of mathematics and logic, in many ways, often signals to me a sort of underlying hint of infinite possibilities, but also infinite constraints at the same time. Mathematics is both infinite power and ultimate helplessness. It's the power to accomplish anything and the inevitability of being thwarted and destroyed by processes beyond your control. It just sort of makes everything, good and bad, possible. Well, it sounds like you're talking about the gods again. Could be, yeah. All right, so there you have it: P, NP, P versus NP, P equals NP, P does not equal NP. Hopefully at this point, even if you didn't know what any of this stuff was about beforehand, you have a much better grasp on the idea. You can at least see why it's a topic that people continue to discuss and even argue about. But if you want to get into the actual details of it, there are plenty of good resources out there on the internet, if you're a math, computer science, and logic inclined person with a good abstract mind for that kind of thing. Either way, I do want to remind you to always think about the algorithmic nature of the ground beneath your feet and the laws that govern the way everything around you works, the logic of reality. Is there a problem-solving process inherent to everything that's going on around you all the time? I don't know.
Speaker 1: It's a strange specter of an idea to keep in the back of your head at all times. Yeah. Again, like when a walnut falls out of a tree, there's certainly an algorithm at play, right, as to how exactly it's going to make it to the ground, which branches it's going to hit. I guess that depends on your perspective. Is there a goal? Is there something happening, or did something just happen? I don't know, man, it's pretty far out. Well, let's not get too much into weird stoner territory here at the end. I do want to say, right here at the end, if you are somebody who's involved in mathematics or computer science and you would like to write in to tell us about one of the more detailed or complex aspects of this that we didn't get to, like we said, we gave you the very simple version, please write in, and we'd love to share your thoughts with the rest of you guys. Yeah, and I'd also love to hear from anyone who's read some science fiction that definitely weighs in on this. You know, I was trying to think of specific examples from Iain M. Banks's Culture books, because the Culture has these Minds, these AIs that are incredibly powerful, but for the life of me I can't remember exactly where they, or indeed where Banks's universe, weighs in on the P and NP spectrum. Hey, but in the meantime, if you want to check out this and other pieces of content, head on over to stuff to Blow your Mind dot com. That's the mothership. That's where we have all the articles, that's where we have blog posts, we have links out to social media, we have some videos. Be sure to go there.
Speaker 1: Hey, and if you listen to us on iTunes, Spotify, Google Play, however you get your podcasts, if it's possible to do so, leave us a nice review there. Give us a little boost in the algorithm that ultimately determines our fate. You can tweak that algorithm; you can reach out as an outside force, like a god, and shift things in our favor. So we invite you to do that. And as always, if you'd like to get in touch with us, and we really hope you do, you can email us at blow the mind at how stuff works dot com. For more on this and thousands of other topics, visit how stuff works dot com.