Hey, Jorge, do you think science is good at predicting the future? It doesn't do very well with predicting the weather, does it? Well, here in California it just says sunny every day. Yeah, I guess some states are easier to predict than others. All right, but let me try my hand at it. I have a prediction of the future. You're a psychic now and a physicist? Well, I predict that you cannot go a whole episode without talking about bananas. What? That's bananas! Boom, there you go. I'm totally psychic. You know the difference between a psychic and a physicist? Their salary, and where you put the h. That's the only difference.

Hey, I'm Jorge. I'm a cartoonist and the creator of PhD Comics. Hi, I'm Daniel. I'm a particle physicist, not a particle psychic, and I'm a co-author of our book We Have No Idea: A Guide to the Unknown Universe. Welcome to our podcast, Daniel and Jorge Explain the Universe, a production of iHeartRadio, in which we talk about all the amazing, the fascinating, the hot, the nasty, the wet, the bright, the dirty, the soft, the quiet, the loud, the hot, the cold, all the stuff in the universe that's fascinating and beautiful and amazing, and explain it to you in a way that we hope you actually understand and hopefully even enjoy. Yeah, and sometimes on the podcast we like to talk about predicting the future. What's gonna happen? What's gonna happen to the universe, what's going to happen to the Earth, what's gonna happen to the solar system, what's going to happen to this podcast? Where is this podcast going, anyway? Yeah, we like to talk about what science does and doesn't know about the future, what we can predict about what's going to happen, and where we are totally clueless. Yeah, well, as we were saying earlier, there's a fine line between being a psychic and being a physicist. I would say it's a bright but fine line. That's right. Okay, it's an impenetrable barrier, a quantum barrier.
It's difficult to tunnel between being a physicist and being a psychic, but in some sense we do have the same job. Physics also wants to predict the future. Yeah, that's kind of... I mean, that's kind of why, in a way, physics was invented, right? Like, we want to know where this catapult payload is going to land. We want to know how far my car is going to go. Were you there when physics was invented? I didn't get invited to that meeting. Oh yeah, and you didn't? Um, I think Einstein was there and Newton was there. And damn, I knew I should have checked my email that day. He missed the calendar invite. That's right. And all those examples, as you mentioned, are totally valuable, and they're examples of why science has an impact on everyday life. You know, is my catapult going to get over those castle walls, and this kind of stuff. But also physics just wants to understand what's going to happen to us. What's our fate? How can we plan our lives ahead? Um, and can we understand the mysteries of everything around us, so that we can know when lightning is going to strike or when disease is going to wipe out our cattle? Right. In a way, that's kind of the standard for physics, right? Like, we say we understand something if we can sort of predict what's going to happen. You want to verify that your theory of physics describes the real universe out there and not just some idea in your mind. You have to make a testable prediction. Einstein, for example, predicted how light was going to be bent during an eclipse, and people measured it and he was right. And that's really when people started to believe his theory of relativity. Because if your theory can't predict the future, then what use is it? Right. That's what science does: it predicts the outcome of future experiments. Yeah, and you know, I think people are comfortable with this idea that physics is able, to some degree, to predict the future.
Like, if you told somebody, hey, physics predicts that the Earth is going to be around for a very long time, or that the universe will never end, those are comforting. Or: if you eat that candy, you will get fat. Yeah, I think that's more of a physical education. Um, there's physics there. You're converting candy energy into squishy stomach energy. Yeah. So we're comfortable, I think, with physics predicting some things about the future. But I think we're a little bit uncomfortable about physics describing other things about the future, right? Yeah. We like to use physics and science to explore the universe around us and outside us. But then sometimes we turn that science on ourselves and we seek to gain insight into how we work. And then that makes you wonder. Yeah, it makes you wonder if you are predictable in a way, you know? Like, could physics actually one day, you know, simulate the human mind, or simulate your mind, and predict what you're gonna think and do? It's a fascinating question, because so many things that humans have puzzled over for so many years, how eclipses happen, where lightning strikes, how reproduction works, all this stuff, all of these things have in the end been explained through science. It turns out they are mechanistic. We can understand microscopically how they work and predict what's going to happen in the future. And so then it's a natural question to wonder how far you can extend that strategy. Can you turn that around and extend it into your own inner life? Right, yeah, because, you know, if physics can predict what, uh, you know, a can of gas particles is going to do, why can't it predict what a brain full of neurons is going to do? What's the difference in the end between a can of gas and a brain full of neurons? Really depends on the person, probably. But it depends on what that person ate recently.
Yeah. So it's a big question, with, I think, some really deep philosophical implications about who we are and whether or not we're predictable, or whether or not we have this thing called free will. Yeah, that's right. Like many of the topics we touch on in this podcast, there are deep philosophical implications, and so I think today we should walk carefully and focus on the science, and then think about what the philosophical implications are of what we do and do not know. Yeah, and so today on the podcast we'll be tackling the question: can science predict what you're going to do? And I predict that there will be a lot of predictions in this episode, but maybe not a lot of answers. Well, that's sort of our specialty, right? Opening questions and not really answering them. I think what you mean is getting people excited about the questions, right? That's right, sparking people's fundamental curiosity. I think it's wonderful to talk about things that we don't understand very well, because hopefully it's a preview of what's going to happen in the future. It's like a fantasy for future science. Maybe in five years, or in a thousand years, science will have figured out the exact workings of the human brain, and it can tell you exactly what you're gonna do tomorrow. That would be amazing. It would totally change the way life operates, what it's like to be a human being, right? Yeah, I mean, we wrote a whole book about all the things we don't know, Daniel. I'm sure we can pull off a podcast also. I predict we will. So, yeah, this is an interesting question, and it kind of goes back to a long time ago, when science started seeing that the universe was what they called deterministic, meaning, like, just a giant machine that follows the laws of physics like a clock. I think that's when people started to think, like, hey, maybe our humanity is also predictable like a clock. Yeah.
I think that's sort of shocking. I think when people first had that idea, it might have been a terrifying moment to imagine that this experience they're having might actually just be explainable, that things could be determined from the past. Yeah. Well, I think if you told my eleven-year-old self that I was a robot, I'd be like, cool. Nowadays, though, you know, it's more of an uncomfortable statement. You're a robot, and we gave the remote control to your sister. Sorry. Yeah. No, I think also it's fun to think even more deeply into sort of the origins of human thought on this question. There's sort of the hubris that the universe is explainable at any level. You know, I think a thousand years ago, or five thousand years ago, people might have been comfortable with the idea that the universe doesn't follow laws, it just sort of is, and maybe there are these omniscient, sentient beings out there that are in control of stuff and can do whatever they like. So the idea that we could write down rules, that we could discover rules that the universe follows and use those to predict what's going to happen, that's an incredible step forward in human intellectual history. But also it's not something we can necessarily explain. Like, why would the universe be deterministic? Why would the universe even follow laws? Is it true even today, given our amazing success in science, that every natural phenomenon we discover will eventually be explained by science, or can be explained by science? Those are open questions in philosophy. Yeah, and so I guess the question then is, you know, how far can we push science? Like, if science can predict the future to some degree, like, you know, I can predict that the airplane is not going to fall from the sky, or it can predict that, um, you know, if I shoot this laser, this is what's gonna happen.
You know, how far can we push the science, and maybe even predict what your brain is going to do? Fascinating questions. And I think you'll see from some of these reactions that there's a wide variety of opinions out there. So, as usual, I walked around campus at UC Irvine and I asked folks this question: can science predict what you're going to think or do? So think about it for a second. Do you think you're just a big, squishy biological robot machine, or do you think you have some sort of free will, or some sort of free spirit, such that nobody can predict what you're going to think or do? Here's what people had to say.

Not precisely. To an extent. There's just so many variables. No, no, not, um... because we humans are unpredictable. I'm a psychology major, so yeah, I don't believe that we can. We can try, but it's not likely. Yeah, I think so. You think so? Yeah. So then is there free will in that case? Yeah, because it's still the person themselves that's deciding; it's just the other person that's doing the predicting correctly. So it's not their choice, theoretically. I mean, if we did have these capabilities, then that might be possible. We just need to advance science far enough to be able to figure that out. No, I don't. I don't think so. So do you think the brain is, like, not described by physical laws, or is something else happening there? No, I think, um, strong determinism, I think, is a very, uh, optimistic view. Yeah? You do? Why is that? Um, I don't know, I just feel like it could. And if it does, does that mean that you have free will or don't have free will? Um, if you can predict what people are going to think, then probably not. Yes. Yes? Why is that? I don't know.
I think I saw, like, a video where they're trying to map out, like, a brain or something, and they were talking about, like, how to recreate it, and that everything we think is just, like, a series of decisions, and that you can put it in, like, binary or something, so, like, zero for yes or one for no, and it all leads to what choices you make. I think that if there were a method for it, it wouldn't be super exact, just because there are so many... Does that play into it? I don't think it's possible to be able to predict every single one of them. Yes. Yes, um... oh, no, no, no. So you don't think the brain follows physical laws? I think it does, but I think they're chaotic. Chaotic or random? If given enough information. Are you just a complicated mechanical watch? Uh, no. I think I may or may not subscribe to the view that there's some quantum mechanics at play. So maybe you're citing quantum mechanics because it's an opening for free will? Mmm, now, I don't know if that's true. I'm a pessimist. I feel like eventually, yes, but I feel like we're not there yet. So does that mean that your brain is deterministic, that you're just a product of what's happened in the past and the stimulus it's getting? Obviously there are, like, some varying factors, but I think there are some things that are, like, patternistic, like whether you're gonna wake up and, like, drink a coffee, or whether you're gonna wake up and, like, go for a walk. Like, those things are patterns that, like, are habitual. But then there's things that aren't. I feel like you could predict it to an extent, but not, like, every single action by action. Like, there might be, like, faults in the prediction. But yeah, it's hard to do, because it's impossible. I just... I feel like that's probably impossible, because things can change, you know. A very complicated robot. Does that mean that there's no free will? I currently think that there's no free will.
Yes. Yeah, so maybe that's where it becomes a bit complicated to answer this question, because I do believe that the laws of physics govern everything in the universe, so that would include us.

All right. I feel like these answers were all very yes and no. You know, some people were like no, some people were like yes. Nobody said, like, maybe, I don't know, you know? Like, people had very strong opinions. They certainly did. And I should have taken some data to see if these were, like, all science majors who had confidence that science would eventually figure it out, or not. But the last two were maybe the most fun, because they turned out to be a husband and wife, and he said yes, he's a complicated robot, and she said no, we're not robots. And then after I left, I heard them arguing about it, in a good-natured way. In a good-natured way? Oh, I see: one of them thought that the other one was predictable, and the other one did not like to be predicted. Yeah, I thought that was fascinating. Maybe it sparked some dinner table conversation. But you're right, and in contrast to some other questions, this is definitely a topic everybody felt comfortable giving an answer to. Sometimes I'll ask people a question and they'll be like, what, I've never heard of that before, or, I don't know. But here everybody had something thoughtful to say. Yeah, because I think it touches something very deep within us, you know? Just this idea that there's something more to me than just, like, a big biological clock, or a big biological, you know, clump of cells doing what they would do without thinking about it. Yeah, I think most people feel like they are steering. Even if your body is a big biological robot, they feel like they're in charge. They're making choices. They decide to eat that cookie, or they decide to step on that crack. They feel like they're making these decisions.
And so it doesn't sort of jibe with that experience to imagine that those decisions are just the product of the situation you were in an instant before. It's hard to imagine how you could have such a visceral experience of free will if you don't actually have it. All right, so let's dig into the question here, which is: can science predict what you're going to do? And let's maybe paint the picture for our listeners of what that might look like. You know, like, how could science, physics, or, you know, a combination of biology and computer science, possibly predict what a human brain might think or do? Yeah. So, the typical strategy for understanding something is to think about it in terms of its microscopic bits, like, what's going on inside of it? Can we understand those bits, and from that build up some sort of understanding? You know, like, if you wanted to, say, understand how a watch worked, you would take it apart, and you would say, oh, there's a gear here, and there's a lever there, and this lever touches that gear, which turns this thing, and that's how this thing works. And if you can make a model of all the things inside of it, and each one of those things is following the laws of physics, then together the whole thing has to follow the laws of physics, right? If something is made up of pieces which are predictable, then, putting them together, it should also be predictable. Right, you mean, like, so science might maybe break down your brain and build, like, a computer model of each one of your neurons, one that somehow, you know, acts exactly the same way and is put together the exact same way that your brain is. And so maybe, like, if you captured your brain in a computer, could that predict what you're thinking, what you'll think and do?
Yeah. And it doesn't have to be a computer model in the end; it just has to follow the mathematical laws of physics, and you can express those as a computer model. You could also build a mechanical model. You can imagine building, basically, a physical copy of your brain. So I think, uh, that detail is not critical to the concept. The critical concept is understanding what's going on inside the little bits and then putting those together to basically replicate your brain. But replicating is not enough. Like, if I had a perfect copy of your brain, like, if I created another Jorge, that wouldn't necessarily help me understand what you're going to do. In order to predict what you're gonna do, I need to be able to run experiments. And so, having a simulation of your brain, I could, like, say, well, what would Jorge do if I offered him a cookie? Would he say yes or no? So if I had a perfect simulation of your brain, I could run those experiments, right?
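To make that concrete, here is a minimal sketch in Python of "running experiments" on a simulated network. Everything in it (the four toy neurons, the connection weights, the firing threshold) is invented for illustration; it's a cartoon of the concept, not a model of any real brain:

```python
import numpy as np

# A made-up "snapshot" of a tiny brain: the connection strengths between
# four toy neurons, plus the current activity of each one.
rng = np.random.default_rng(seed=0)
weights = rng.normal(size=(4, 4))        # how strongly each neuron drives the others
state = np.array([1.0, 0.0, 0.0, 1.0])   # which neurons are currently firing

def step(state, weights, threshold=0.5):
    # Each toy neuron fires (1.0) if its summed weighted input crosses
    # the threshold, and stays quiet (0.0) otherwise.
    return (weights @ state > threshold).astype(float)

# "Run the experiment": step the simulated brain forward a few ticks.
for _ in range(3):
    state = step(state, weights)

print(state)  # same snapshot in, same "decision" out, every single run
```

Because every piece follows a fixed rule, the whole network's response is fixed by its snapshot, which is the "predictable pieces, predictable whole" argument in miniature.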
Yeah, and I think it goes to the idea that, you know, we're really complicated as human beings, as thinking beings. But, you know, if you break down our brain, it's made out of, you know, lobes and chunks of brain tissue, and those are made out of neurons connected to each other, and neurons are made out of molecules, and so all of these things, down to the molecule level, kind of follow the laws of physics. You know, it is sort of, in the end, just a big, and very, very complicated, but still a big clock. Yes. And that's a really deeply powerful implication of this discovery that we made a long, long time ago: that everything is made out of the same bits. Right? I'm made out of atoms, you're made out of atoms, this chair I'm sitting on is made out of atoms. Things are not made out of their own kind of stuff, which means that, in the end, they all follow the same rules. The rules that hold a chair together are the same rules that hold your cat together, and your hamster together, and you together. So if we can figure out what the rules are that describe how molecules and atoms work, then in principle, and that's an important distinction, in principle, we should be able to extrapolate up and understand how you work. Right. It's kind of like saying, like, you know, we are made out of inanimate bits. You know, I think we're made out of things that are just plain things, things whose behavior you can maybe predict. And so does that mean that we are also, in a way, predictable? Imagine you came across a robot, for example, and it was doing weird stuff and you wanted to understand it. How would you understand it? Well, you would take it apart and say, oh, it's made out of these pieces, and can I understand each of those pieces? And if so, you could put that together into an understanding of the whole robot. And so the idea is: apply that to a person. Yeah, and you could do things like, you know, if I poke this little bit here in this engine, you know, I predict that the car is gonna move forward or backwards. Exactly. And all of this, of course, assumes a few things. It assumes that you can get a picture of what's happening inside your brain. The idea you presented requires that we know each neuron, that we know the situation that neuron is in, right? For the watch, it's like knowing where the levers are and where the gears are. And it assumes that we can extrapolate up from those levers and gears to the operation of a whole brain. That's not trivial. And then, you know, there are questions about, like, is it actually deterministic, or is there, like, some funky quantum magic going on, or is there something in there that, like, science can't describe?
So it's not a trivial thing to say that just because your brain is made of atoms, we could therefore predict what you're gonna do. Right. Yeah, I think that's the core question here, and so let's get into it. And, um, just to save some time, Daniel: I will always take the cookie. You don't need a fancy computer simulation to predict that I will take the cookie. All right, I'm gonna cross that off my deep-questions-of-the-universe list. Yeah, so let's get into it a little bit more deeply and figure out how you might even do this, or whether we can, or if there is some kind of quantum magic that might give a little bit of a loophole to get free will. But first, let's take a quick break.

All right, we're talking about whether science can predict what you're gonna think and do, and we talked about, um, yeah, how maybe your brain is made out of little bits and pieces, which are sort of little biological machines. And so does that mean that your whole brain is just a machine, or like a robot, such that anyone could predict what it's going to do? So let's get into it. What are some of the things, Daniel, that physicists know or think that might make that picture really difficult, or that maybe make it impossible to predict what you're going to do? The first thing that comes to my mind is just knowing the current status of your brain. Like, if I wanted to model the flight of a baseball, if I wanted to know exactly where the baseball went, I would need to know its current position and its current direction. If you wanted to predict where it's gonna land, you would sort of need to know, you know, where it is and where it's going at any given time. Yeah, and at any moment; just one moment in time would determine where it's going to be.
So if I knew where that baseball was now, and which direction it was going in, I could just apply the laws of physics and propagate that information forward. I would say, oh, it's going to move in a certain path under gravity, and I can tell you exactly where it's going to land. That's physics predicting the future, and that works for one little tiny particle, or a baseball, or whatever.
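That "take a snapshot, then propagate it forward under the laws" recipe is simple enough to write down. Here is a minimal sketch in Python, with made-up numbers for the throw, that steps a ball's position and velocity forward under gravity until it lands:

```python
def predict_landing(x, y, vx, vy, g=9.81, dt=0.001):
    """Step the ball's state forward under gravity until it hits the
    ground, and return the horizontal distance where it lands."""
    while y > 0.0 or vy > 0.0:
        x += vx * dt   # position changes according to velocity
        y += vy * dt
        vy -= g * dt   # gravity changes the vertical velocity
    return x

# One snapshot of the present (position in meters, velocity in m/s)
# completely determines this little system's future.
print(predict_landing(x=0.0, y=2.0, vx=30.0, vy=10.0))
```

The entire "prediction" is just one update rule applied over and over; the only inputs are the laws and the snapshot.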
But in order to do that for your brain, I would need to know the current situation of your brain. Like, what is the situation of every neuron? Is it about to fire a little pulse? Is it not about to fire a pulse? How strong is the connection between it and the next neuron? And, you know, there are billions and billions of neurons in your brain. We're talking about an enormous amount of information you'd need to know. I think what you mean is that even if I had a giant computer simulation of your brain, I would need to give each of the neurons kind of a starting value, right? Like, I would need to know where each neuron was in order to predict what the whole brain was going to do. Precisely. If I had a computer program that could simulate an arbitrary brain, I would somehow need to configure it to your brain, and that would mean knowing where all the neurons were, and how they were connected to each other, and all that kind of stuff. Just like, if you want to calculate where the baseball is going to go, you need to know the initial conditions: you need to know where it is now and which direction it's going in. So, like, even if I had that computer program, and even if it was possible, um, to use that computer program to predict the future, how do I get that information from your brain? Do I have to, like, scan your brain somehow? Do you have to take your head off and slice it into hyperfine little bits? Like, how do you even physically do that? Right, because each neuron might be in a different state. You know, like, one neuron might be more excited than others, or you might be, you know, in a bad mood, and so there are, I don't know, dopamine levels that are, you know, suppressing some of your neurons right now. You need to know that in order to predict what you're going to think and do. Yeah, and you need to know it all at the same moment, right? I don't want to know what the left half of your brain is doing right now and what the right half of your brain was doing two seconds ago. I need a complete picture of your brain at one moment, so that I know what direction everything is moving in. I'm just thinking of your brain as, like, a lot of tiny little baseballs, and I need to know where all those baseballs are at the same time. So even slicing your brain, you know, like a ham sandwich or whatever, wouldn't work. I need to somehow instantly scan everything in your brain. And that just seems almost impossible, right? It seems almost impossible. I mean, I read a lot of science fiction, and there's some good stuff where people are uploading brains into the cloud, and sometimes they talk about how scanning the brain is a destructive process, but they never talk about how fast it has to be. Like, if you want to scan the brain, it's got to be a snapshot. You've got to know where everything is in an instant, or else it's all averaged, like, over the last five minutes, or ten minutes, or twenty minutes, or however long it takes to scan the brain. And that's going to be a disaster. So this is a technological issue, not a philosophical one. But I don't know if that's even possible. Oh, I see. Okay, so that's kind of the first, um, barrier that might prevent you from predicting somebody's brain: getting a snapshot of it.
But, uh, it sounds impossible... though I think technically you can't rule it out, right? Like, it's not technically or theoretically impossible that maybe one day people will come up with a brain scan that can somehow capture all of your brain in one instant. That's right, unless you need to know it at the level of quantum mechanical objects, like, you need to know the exact location of every electron. Then it's theoretically impossible, because you can't measure the perfect quantum state of the entire brain all at the same time. I think that's theoretically impossible.
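For reference, the theoretical limit Daniel is pointing at here is quantum uncertainty: a particle's position and momentum cannot both be pinned down at once. In Heisenberg's form,

    Δx · Δp ≥ ħ/2

so a perfect, instantaneous snapshot of every electron's exact state is forbidden in principle, not just beyond current technology.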
But as long as it's, you know, not necessarily sensitive to quantum effects, in theory it's possible, but technologically way out of our grasp today. Today, yeah. Let's assume that that's possible and then think about what the other problems would be. Yeah, yeah, because scanning your brain is not the only problem. First we have to achieve this almost-impossible-sounding technological feat; then we've got the real problems. Yeah, well, you know, that's what they said about the iPhone, and look where we are now. Is that what they said about the iPhone? Was it? Was that at the physics meeting you had with Einstein and, you know, all those guys? Well, it was science fiction, like, fifty years ago, right? Like, this device that fits in your pocket and you can talk video with people and play addictive video games on for hours. I mean, that was like Star Trek. You know, that's true. And it'd be folly to say this kind of technology will never be developed, or will take years or centuries or whatever, because people always look silly five years after making those kinds of predictions. So we should avoid that. And maybe technology will progress rapidly and they'll invent the brain scan next week, and they'll have a brain scan app that you can put on your phone. Maybe. But then there are other things that would make this really difficult, or that kind of make your brain almost pretty much unpredictable, right? Yeah, there are a lot of practical issues in actually accomplishing predicting what your brain is going to do. Say we had a snapshot of your brain: we knew the current status of every neuron, we could load that into the computer, and we wanted to run it forward. We wanted to say, all right, well, what's this person going to do in five minutes? That's not always easy. Some systems, even if they're made of atoms which follow physical laws, and even if you knew where all those atoms were, are hard to describe. Yeah, because, um, you were telling me earlier, they're not easy systems to kind of simulate, or they're not easy systems to, um, kind of predict what they're going to do. Yeah, and this is again a technological problem, but it's a real problem. Like, why can't we predict the weather? In theory, weather follows rules that we understand: water droplets and air resistance and wind and all that stuff. That's not complicated physics; they're just atoms. Yeah, they're just atoms. And we don't even need to worry about the atoms; we can think about the water droplets and the temperature and stuff. So why is it so hard to predict when it's going to rain or when a hurricane is going to come, if it's following the laws of physics and we have amazing satellites that are, like, gathering all this data about what the temperature is everywhere? It's a very similar problem. The reason that weather is hard to predict is not because it doesn't follow the laws of physics; it's because it's very, very sensitive to the exact tiny details of the situation. A small change in temperature over here leads to a big change over there. That's something we call chaos.
Right, it's this idea that kind of goes back to that whole butterfly idea, right? Like, they say that if a butterfly flaps its wings in one part of the world, it can actually affect the weather in another part of the world, just because the weather is so sensitive to even small things, like a butterfly flapping its wings. That's right. And not every system is chaotic; some things are not, right? For example, when you throw a ball, if you change the angle you're throwing it at by a tiny bit, then the outcome is changed by a tiny bit, right? You throw it a tiny bit harder, it goes a tiny bit further. But other things, if you change how you do them by a tiny bit, you get a very different outcome, like rolling a die. If you roll a die slightly differently, you spin your hand a tiny bit differently, you get a four instead of a two, or a five instead of a one. So the outcome depends very, very sensitively on exactly how you did it. It's not random; it's following the laws of physics. In theory it's predictable, but in practice it's very difficult, because the outcome depends very sensitively on how you flap those butterfly wings. Yeah, or how you throw the die. Like, to predict a die roll, you would need to really, kind of, like, pay attention to the exact angle that each face is at when it leaves your hand, and you'd have to predict, also, or simulate, how, like, when the corners hit the ground, how that's going to affect the spin of the die. And so it's really just a much more complicated system to simulate and predict than, like, just throwing a baseball.
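That sensitivity is easy to see in even the simplest chaotic system. Here is a minimal sketch in Python using the logistic map, a standard toy example of chaos (the update rule and the starting values are purely illustrative): two copies of the same deterministic rule, started one part in a billion apart, end up in completely different states after a few dozen steps.

```python
def logistic(x, r=3.9):
    # One deterministic update rule; r = 3.9 puts the map in its chaotic regime.
    return r * x * (1.0 - x)

a = 0.500000000   # two starting points that differ by one part in a billion
b = 0.500000001

for step in range(50):
    a, b = logistic(a), logistic(b)

# Despite identical laws and nearly identical initial conditions,
# the two trajectories are now completely different.
print(a, b)
```

A die, a hurricane, and possibly a brain are like this: lawful and deterministic, but so sensitive to the snapshot that any tiny measurement error eventually ruins the prediction.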
And if you 580 00:30:30,680 --> 00:30:34,360 Speaker 1: knew pretty much the current situation, could you predict 581 00:30:34,360 --> 00:30:36,000 Speaker 1: how it's going to come out? You know, is your 582 00:30:36,000 --> 00:30:38,960 Speaker 1: brain a hurricane, or is it just a baseball, right? 583 00:30:39,440 --> 00:30:41,480 Speaker 1: Or, you know, if I have butterflies in my stomach 584 00:30:41,520 --> 00:30:43,520 Speaker 1: and they flap, how is that going to 585 00:30:44,400 --> 00:30:47,320 Speaker 1: affect my decision to eat a cookie or not? Exactly. 586 00:30:47,360 --> 00:30:49,720 Speaker 1: So the point is that even if you knew exactly 587 00:30:49,760 --> 00:30:52,320 Speaker 1: the current situation of the brain, could you have a 588 00:30:52,400 --> 00:30:55,120 Speaker 1: powerful enough computer to predict what's going to happen 589 00:30:55,120 --> 00:30:57,680 Speaker 1: going forward? And in the end, that's really the limitation. 590 00:30:58,080 --> 00:31:01,200 Speaker 1: For example, for hurricanes, if we knew the location of every 591 00:31:01,280 --> 00:31:04,200 Speaker 1: drop of water on Earth, its current position and velocity, 592 00:31:04,400 --> 00:31:07,320 Speaker 1: and we had a super duper powerful computer, then yeah, 593 00:31:07,360 --> 00:31:11,400 Speaker 1: we could probably predict the weather very accurately. But we 594 00:31:11,480 --> 00:31:14,560 Speaker 1: don't have those computers. Right. Well, I feel like 595 00:31:14,560 --> 00:31:16,720 Speaker 1: we're getting better though, you know. Like, sometimes I'm really 596 00:31:16,760 --> 00:31:19,400 Speaker 1: impressed by how far ahead we can predict the weather, 597 00:31:19,480 --> 00:31:22,360 Speaker 1: and how accurately we can. This is just a 598 00:31:22,440 --> 00:31:26,840 Speaker 1: hurdle of technology. Some of the fastest computers in the world, 599 00:31:26,840 --> 00:31:30,160 Speaker 1: like the supercomputers, are all devoted to this problem: predicting 600 00:31:30,200 --> 00:31:32,840 Speaker 1: the future of the weather, understanding the atmosphere and all 601 00:31:32,920 --> 00:31:35,880 Speaker 1: of its chaos. And so it's really just like throwing 602 00:31:35,880 --> 00:31:39,080 Speaker 1: more computational power at the problem. Well, and that's because 603 00:31:39,200 --> 00:31:42,920 Speaker 1: weather is a chaotic system, right? Definitely, weather is very chaotic. 604 00:31:42,960 --> 00:31:45,480 Speaker 1: It's very difficult to predict the outcome, even if you 605 00:31:45,560 --> 00:31:47,960 Speaker 1: know the current conditions. But do we know if the 606 00:31:48,040 --> 00:31:52,200 Speaker 1: brain is chaotic, or humans are chaotic? Humans seem chaotic 607 00:31:52,240 --> 00:31:53,760 Speaker 1: to me. I mean, some of them I've known for 608 00:31:53,800 --> 00:31:56,280 Speaker 1: decades and I still don't understand why they make the decisions 609 00:31:56,320 --> 00:32:01,240 Speaker 1: they do. And not just their children? Not just my children. No, 610 00:32:01,440 --> 00:32:03,280 Speaker 1: we don't know for sure, but it seems to me 611 00:32:03,600 --> 00:32:07,400 Speaker 1: very likely. I mean, it's a hyper-connected, very sensitive 612 00:32:07,600 --> 00:32:10,560 Speaker 1: set of neurons. It seems to me very unlikely that 613 00:32:10,640 --> 00:32:12,440 Speaker 1: it wouldn't be chaotic. But we don't know.
It 614 00:32:12,560 --> 00:32:16,080 Speaker 1: might be that there are sort of emergent phenomena, that 615 00:32:16,360 --> 00:32:19,160 Speaker 1: you're not really sensitive to all the little details, and 616 00:32:19,200 --> 00:32:22,760 Speaker 1: that you could build a model that predicts roughly where 617 00:32:22,800 --> 00:32:25,280 Speaker 1: things are going. If you don't care about the details, 618 00:32:25,360 --> 00:32:27,960 Speaker 1: you can tell whether somebody's gonna have a cookie and 619 00:32:28,000 --> 00:32:29,960 Speaker 1: who they're going to vote for in the next election. It's 620 00:32:30,000 --> 00:32:32,720 Speaker 1: possible that you could build those models, but that requires 621 00:32:32,720 --> 00:32:36,400 Speaker 1: sort of another layer of insight. Right. Imagine you 622 00:32:36,800 --> 00:32:39,840 Speaker 1: only knew the baseball in terms of the little particles inside 623 00:32:39,840 --> 00:32:41,200 Speaker 1: of it. You're like, oh, there's no way I could 624 00:32:41,200 --> 00:32:43,240 Speaker 1: predict the way this baseball is gonna move. There's ten 625 00:32:43,280 --> 00:32:47,080 Speaker 1: to the twenty-three particles. It's impossible. But if you 626 00:32:47,080 --> 00:32:49,280 Speaker 1: took a step back and saw the flight 627 00:32:49,320 --> 00:32:51,280 Speaker 1: of it, you could say, oh, actually, I can describe 628 00:32:51,320 --> 00:32:53,760 Speaker 1: this and ignore all the particles. I can ignore all 629 00:32:53,760 --> 00:32:56,800 Speaker 1: those details, they're not relevant, and I can just describe 630 00:32:56,840 --> 00:32:59,520 Speaker 1: this in terms of simple motion. So it's possible that 631 00:32:59,600 --> 00:33:03,640 Speaker 1: there's an emergent theory of psychology, a mathematical psychology, that could 632 00:33:03,920 --> 00:33:05,960 Speaker 1: describe the motion of the brain. We just don't know. 633 00:33:06,760 --> 00:33:09,600 Speaker 1: Or, I wonder if it might vary with people. You know, 634 00:33:09,680 --> 00:33:12,960 Speaker 1: some people might be more predictable than others. Yeah, I 635 00:33:13,000 --> 00:33:17,000 Speaker 1: certainly know some people who seem pretty chaotic. All right. Well, 636 00:33:17,280 --> 00:33:19,600 Speaker 1: those are two pretty big hurdles, but then there's, 637 00:33:19,760 --> 00:33:23,719 Speaker 1: um, another hurdle coming up, or possibly a 638 00:33:23,720 --> 00:33:27,000 Speaker 1: loophole, that might still let us have some free will 639 00:33:27,040 --> 00:33:31,760 Speaker 1: in this deterministic, machine-like universe. So let's get into that. 640 00:33:31,800 --> 00:33:47,240 Speaker 1: But first, let's take a quick break. All right, Daniel, 641 00:33:47,280 --> 00:33:53,040 Speaker 1: the last topic here is about randomness, and whether randomness 642 00:33:53,040 --> 00:33:57,480 Speaker 1: at the quantum level can maybe affect what we see 643 00:33:57,520 --> 00:34:00,000 Speaker 1: as free will, or whether it affects whether or 644 00:34:00,080 --> 00:34:02,400 Speaker 1: not we can predict what people will think or do.
645 00:34:02,600 --> 00:34:04,760 Speaker 1: The whole premise here, the assumption we're making, is that 646 00:34:04,840 --> 00:34:08,560 Speaker 1: the brain should be predictable if its various bits are predictable: 647 00:34:08,719 --> 00:34:10,600 Speaker 1: if it's made out of predictable bits, then you should 648 00:34:10,600 --> 00:34:13,279 Speaker 1: be able to put those predictable bits together into a 649 00:34:13,320 --> 00:34:17,240 Speaker 1: predictable brain. But listeners of the podcast are probably wondering, 650 00:34:17,600 --> 00:34:20,520 Speaker 1: are those bits predictable? The brain is made of atoms, 651 00:34:20,560 --> 00:34:22,719 Speaker 1: and atoms are made of protons and electrons and all 652 00:34:22,800 --> 00:34:25,680 Speaker 1: this stuff, and we know that those things are quantum mechanical. 653 00:34:25,719 --> 00:34:28,880 Speaker 1: And we talked on the podcast recently about the crazy 654 00:34:28,920 --> 00:34:33,120 Speaker 1: probabilistic nature of quantum mechanics, that things are not determined 655 00:34:33,160 --> 00:34:36,200 Speaker 1: until they're measured, that there is really true randomness at 656 00:34:36,200 --> 00:34:39,080 Speaker 1: the quantum scale. So you might be wondering, how can 657 00:34:39,120 --> 00:34:43,399 Speaker 1: you build determinism on top of this fuzzy quantum randomness? Yeah, 658 00:34:43,440 --> 00:34:45,520 Speaker 1: because we talked about some of the challenges, right? We 659 00:34:45,600 --> 00:34:48,680 Speaker 1: talked about scanning your brain and about chaos, but 660 00:34:48,760 --> 00:34:52,239 Speaker 1: those are technical problems. You're saying that maybe just in 661 00:34:52,280 --> 00:34:55,400 Speaker 1: the bits themselves of the brain, there is some inherent 662 00:34:55,920 --> 00:35:00,319 Speaker 1: randomness that might make it impossible to predict. Absolutely, we 663 00:35:00,440 --> 00:35:03,239 Speaker 1: know that there's inherent randomness. Everything, as we say, is 664 00:35:03,320 --> 00:35:06,440 Speaker 1: made of atoms, and those things are governed by fundamentally 665 00:35:06,480 --> 00:35:11,000 Speaker 1: quantum mechanical properties. What we don't know is if that matters. Right? 666 00:35:11,040 --> 00:35:14,520 Speaker 1: It certainly doesn't matter for predicting the motion of a 667 00:35:14,560 --> 00:35:18,080 Speaker 1: mechanical watch. People can build mechanical watches that are super 668 00:35:18,120 --> 00:35:20,640 Speaker 1: accurate for years at a time. We could predict the 669 00:35:20,680 --> 00:35:23,800 Speaker 1: flight of a baseball without even knowing that quantum mechanics 670 00:35:23,840 --> 00:35:26,799 Speaker 1: was a thing. Remember, quantum mechanics affects things only 671 00:35:26,800 --> 00:35:31,799 Speaker 1: on super duper tiny scales. The uncertainty principle, delta x 672 00:35:31,840 --> 00:35:36,320 Speaker 1: delta p, the relationship between the uncertainty in momentum and position: 673 00:35:36,800 --> 00:35:39,880 Speaker 1: the uncertainty there is related to Planck's constant, 674 00:35:39,960 --> 00:35:42,520 Speaker 1: h bar, which is like the fundamental unit of the universe. 675 00:35:42,600 --> 00:35:45,120 Speaker 1: But this is a super tiny number. It's ten to 676 00:35:45,160 --> 00:35:49,120 Speaker 1: the minus thirty four joule seconds.
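Written out, the relation being described is the Heisenberg uncertainty principle. The worked numbers below, a 0.145 kg baseball with its velocity pinned down to a millimeter per second, are illustrative assumptions, not figures from the episode:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar \approx 1.055\times10^{-34}\ \mathrm{J\,s}.

% Illustrative baseball: m = 0.145 kg, velocity known to
% \Delta v = 10^{-3} m/s, so \Delta p = m\,\Delta v:
\Delta x \;\ge\; \frac{\hbar}{2\,m\,\Delta v}
\;\approx\; \frac{1.055\times10^{-34}}{2 \times 0.145 \times 10^{-3}}\ \mathrm{m}
\;\approx\; 3.6\times10^{-31}\ \mathrm{m}.
```

That minimum position fuzziness is roughly fifteen orders of magnitude smaller than a proton, which is why baseballs and watches look perfectly deterministic.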
And so it might be 677 00:35:49,520 --> 00:35:52,120 Speaker 1: that on top of the quantum randomness, we do have 678 00:35:52,360 --> 00:35:55,640 Speaker 1: a layer of physics which is deterministic. Or it could 679 00:35:55,680 --> 00:35:57,840 Speaker 1: be that, you know, the quantum randomness sort of 680 00:35:57,880 --> 00:36:00,279 Speaker 1: seeps up from below and affects the whole working of 681 00:36:00,360 --> 00:36:03,120 Speaker 1: the brain. Okay, so you're saying that there is a 682 00:36:03,200 --> 00:36:08,759 Speaker 1: fundamental randomness in the universe and in my brain cells, 683 00:36:08,800 --> 00:36:13,520 Speaker 1: like, at the smallest level, I can't possibly 684 00:36:13,520 --> 00:36:16,880 Speaker 1: predict my brain, but maybe if I go up a 685 00:36:16,920 --> 00:36:20,120 Speaker 1: few levels, then the brain starts to be more predictable. 686 00:36:20,200 --> 00:36:22,040 Speaker 1: We see this in physics. We see that at the very 687 00:36:22,080 --> 00:36:24,960 Speaker 1: smallest levels you cannot predict what's going to happen. You 688 00:36:25,000 --> 00:36:27,800 Speaker 1: shoot the same photon into the same experiment twice, you 689 00:36:27,880 --> 00:36:31,719 Speaker 1: get two different outcomes. There's really true randomness there, and 690 00:36:31,800 --> 00:36:36,680 Speaker 1: that breaks determinism. So photons: not deterministic. Electrons: not deterministic. 691 00:36:36,920 --> 00:36:41,840 Speaker 1: Protons: not deterministic. But somehow, when you put these things together, 692 00:36:42,080 --> 00:36:45,719 Speaker 1: you put enough of them together, all those random fluctuations 693 00:36:45,760 --> 00:36:48,560 Speaker 1: average out, they basically cancel each other. This is a 694 00:36:48,680 --> 00:36:51,440 Speaker 1: deep theorem in physics. It's called the Ehrenfest theorem: 695 00:36:51,480 --> 00:36:53,680 Speaker 1: if you have enough of these things, it doesn't 696 00:36:53,719 --> 00:36:55,759 Speaker 1: matter anymore. As long as you're measuring things on the 697 00:36:55,840 --> 00:36:59,960 Speaker 1: sort of classical scales and sizes that we care about, centimeters, millimeters 698 00:37:00,040 --> 00:37:03,200 Speaker 1: even, then the quantum mechanical effects average out, which is 699 00:37:03,320 --> 00:37:06,239 Speaker 1: why we didn't notice quantum mechanics for thousands of years. 700 00:37:06,400 --> 00:37:09,560 Speaker 1: It's hidden inside the summation of all of these little quantum 701 00:37:09,600 --> 00:37:12,399 Speaker 1: events. Yeah, which is why it was so hard 702 00:37:12,440 --> 00:37:15,600 Speaker 1: to accept. Right, we had just accepted the mind-boggling 703 00:37:15,600 --> 00:37:19,120 Speaker 1: consequences of a deterministic universe, like, wow, the universe seems 704 00:37:19,120 --> 00:37:21,040 Speaker 1: to follow rules and we can predict it. And then 705 00:37:21,080 --> 00:37:23,880 Speaker 1: we discovered, oh, actually no, at its deepest level, it 706 00:37:23,920 --> 00:37:27,000 Speaker 1: seems weirdly random. And that was totally in contradiction with 707 00:37:27,040 --> 00:37:29,760 Speaker 1: everything we thought we understood, which is why quantum mechanics 708 00:37:29,800 --> 00:37:33,360 Speaker 1: is so counterintuitive and so fuzzy. But still, it didn't 709 00:37:33,400 --> 00:37:36,400 Speaker 1: mean that all those things we learned were wrong.
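This averaging out can be illustrated with no quantum mechanics at all. The sketch below is only a loose classical analogy for the statistics involved (the Ehrenfest theorem proper concerns quantum expectation values obeying Newton-like equations): the fluctuation in an average of N independent random pieces shrinks like 1/sqrt(N), so by the time you have something like ten to the twenty-three atoms, the randomness of the individual constituents is utterly invisible.

```python
# Loose classical analogy for quantum randomness "averaging out":
# average N independent random +1/-1 jitters and watch the typical
# fluctuation of that average shrink like 1/sqrt(N).
# This is a statistics demo, not a quantum simulation.
import random

def fluctuation_of_average(n: int, trials: int = 100) -> float:
    """Estimate the standard deviation of the mean of n random +/-1 steps."""
    means = []
    for _ in range(trials):
        total = sum(random.choice((-1, 1)) for _ in range(n))
        means.append(total / n)
    avg = sum(means) / trials
    var = sum((m - avg) ** 2 for m in means) / trials
    return var ** 0.5

for n in (10, 1_000, 100_000):
    print(f"N={n:>7}: typical fluctuation of the average ~ "
          f"{fluctuation_of_average(n):.4f}")
# Prints values near 1/sqrt(N): ~0.32, ~0.032, ~0.0032. Extrapolated to
# ~1e23 particles, the jitter of the average is hopelessly tiny, which is
# why the baseball looks deterministic.
```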
It 710 00:37:36,480 --> 00:37:39,280 Speaker 1: just meant that they applied to sort of the larger scales, 711 00:37:39,320 --> 00:37:43,240 Speaker 1: that those same rules can't be applied to electrons and protons. 712 00:37:43,280 --> 00:37:46,040 Speaker 1: But they can still be applied to baseballs and watches. 713 00:37:46,600 --> 00:37:49,800 Speaker 1: So then the question is, is your brain a baseball 714 00:37:49,800 --> 00:37:52,319 Speaker 1: or a watch, or is it an electron? Yeah, it's kind 715 00:37:52,320 --> 00:37:55,080 Speaker 1: of like if I said, um, hey, Daniel, I'm gonna 716 00:37:55,080 --> 00:37:57,279 Speaker 1: flip this coin, and if it's heads, I'm gonna take 717 00:37:57,320 --> 00:37:59,279 Speaker 1: your cookie, and if it's tails, I'm not going to 718 00:37:59,360 --> 00:38:01,799 Speaker 1: take your cookie. Then it's kind of like, 719 00:38:02,200 --> 00:38:04,440 Speaker 1: you might say it's unpredictable what I'm gonna do. Right, 720 00:38:04,680 --> 00:38:07,680 Speaker 1: because you don't know. Except that a coin is deterministic. 721 00:38:07,800 --> 00:38:10,279 Speaker 1: A coin is a classical object. If I knew exactly 722 00:38:10,320 --> 00:38:12,839 Speaker 1: how you're gonna flip it, I could predict exactly how 723 00:38:12,880 --> 00:38:14,960 Speaker 1: that coin was gonna land. And I would know that 724 00:38:15,000 --> 00:38:16,480 Speaker 1: you were going to take the cookie no matter how 725 00:38:16,480 --> 00:38:19,000 Speaker 1: the coin flipped, actually, because I know you as a person. 726 00:38:19,680 --> 00:38:22,640 Speaker 1: But yeah, exactly, the coin flip is not random. 727 00:38:22,680 --> 00:38:25,200 Speaker 1: But if you, for example, had a radioactive particle and 728 00:38:25,239 --> 00:38:27,920 Speaker 1: you said, I'm gonna wait one minute. If the particle 729 00:38:28,040 --> 00:38:30,440 Speaker 1: decays within this minute, I'm eating the cookie. And 730 00:38:30,440 --> 00:38:32,960 Speaker 1: if it doesn't decay within this minute, I'm not eating 731 00:38:33,000 --> 00:38:35,920 Speaker 1: the cookie. Then I can't predict whether or not you're 732 00:38:35,920 --> 00:38:38,440 Speaker 1: gonna eat the cookie. That's because you linked a macroscopic 733 00:38:38,480 --> 00:38:42,600 Speaker 1: action, eating the cookie, to a quantum mechanical thing. That's rare. Right, 734 00:38:42,640 --> 00:38:45,960 Speaker 1: that's the whole Schrödinger's box argument. It's very difficult to 735 00:38:46,000 --> 00:38:48,840 Speaker 1: find quantum mechanical effects that you can see on the 736 00:38:48,880 --> 00:38:52,480 Speaker 1: macroscopic scale. Oh, I see. Like, if I made 737 00:38:52,560 --> 00:38:59,319 Speaker 1: my cookie eating dependent on whether this particle decays or not, 738 00:38:59,760 --> 00:39:02,960 Speaker 1: then it's unpredictable. But if I make my eating the 739 00:39:03,000 --> 00:39:07,240 Speaker 1: cookie depend on whether the particle will decay over a minute, 740 00:39:07,760 --> 00:39:10,960 Speaker 1: then we know a lot more. Right, it's less unpredictable, 741 00:39:10,960 --> 00:39:14,040 Speaker 1: because you know that, on average, most of these particles, 742 00:39:14,080 --> 00:39:17,680 Speaker 1: let's say, decay within a minute. Yeah, 743 00:39:17,840 --> 00:39:20,960 Speaker 1: that's true.
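For the radioactive-particle version of the cookie experiment, the only handle physics gives you is a probability. Here is a minimal worked equation, assuming a hypothetical isotope whose half-life happens to be exactly one minute (the episode doesn't name one):

```latex
P(\text{decay within time } t) \;=\; 1 - e^{-\lambda t},
\qquad \lambda \;=\; \frac{\ln 2}{t_{1/2}}.

% Hypothetical isotope with half-life t_{1/2} = 1 min, waiting t = 1 min:
P \;=\; 1 - e^{-\ln 2} \;=\; 1 - \tfrac{1}{2} \;=\; 50\%.
```

The average over many such particles is rock solid, but whether this particular particle decays in this particular minute is genuinely random, which is what makes the cookie decision unpredictable even in principle.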
And if you want to make statements about averages, 744 00:39:21,239 --> 00:39:23,799 Speaker 1: then you're on rock solid footing, even quantum mechanically, because 745 00:39:23,840 --> 00:39:26,319 Speaker 1: quantum mechanics doesn't mean things don't follow laws. It just 746 00:39:26,360 --> 00:39:29,200 Speaker 1: means those laws are probabilistic. And so you can say, 747 00:39:29,200 --> 00:39:31,120 Speaker 1: on average, this is going to happen. On average, that's 748 00:39:31,160 --> 00:39:34,520 Speaker 1: gonna happen. For one particular particle, you can't make any 749 00:39:34,600 --> 00:39:37,799 Speaker 1: predictions on a quantum mechanical scale. But you know, what 750 00:39:37,920 --> 00:39:41,040 Speaker 1: quantum mechanical effects do you observe as a person? Like, 751 00:39:41,520 --> 00:39:45,160 Speaker 1: for something quantum to affect your life, something that's really random, 752 00:39:45,400 --> 00:39:48,120 Speaker 1: it has to have an impact like that: somebody's measuring 753 00:39:48,160 --> 00:39:51,200 Speaker 1: this quantum mechanical thing and making a decision based on it. 754 00:39:51,200 --> 00:39:54,840 Speaker 1: It's really pretty rare for quantum mechanical randomness to affect 755 00:39:54,960 --> 00:39:58,920 Speaker 1: the macroscopic world. People do really complicated experiments to try 756 00:39:58,920 --> 00:40:01,799 Speaker 1: to set this kind of thing up. A Bose-Einstein condensate 757 00:40:02,200 --> 00:40:05,400 Speaker 1: is like a macroscopic quantum state that you can see, 758 00:40:05,719 --> 00:40:08,960 Speaker 1: that actually behaves quantum mechanically. It's really weird and amazing, 759 00:40:08,960 --> 00:40:12,400 Speaker 1: and people won the Nobel Prize for that. So too, 760 00:40:12,440 --> 00:40:15,800 Speaker 1: for the brain to be dependent on quantum mechanical randomness, 761 00:40:15,840 --> 00:40:18,480 Speaker 1: you'd have to show somehow that, like, the motion of 762 00:40:18,560 --> 00:40:23,040 Speaker 1: these electrons and this weird quantum mechanical effect was triggering neurons, 763 00:40:23,040 --> 00:40:26,759 Speaker 1: and neurons were somehow sensitive to these things. Yeah. So 764 00:40:26,920 --> 00:40:30,000 Speaker 1: it's fundamentally random. But the question, 765 00:40:30,120 --> 00:40:32,279 Speaker 1: it seems like the core and the key question here, 766 00:40:32,360 --> 00:40:36,360 Speaker 1: is whether that randomness at the quantum level really affects 767 00:40:36,360 --> 00:40:38,400 Speaker 1: whether I'm going to eat my cookie or not, or 768 00:40:38,400 --> 00:40:41,960 Speaker 1: whether that all gets drowned out by the bazillion electrons 769 00:40:41,960 --> 00:40:43,719 Speaker 1: that I have in my brain. And there are some 770 00:40:43,760 --> 00:40:46,239 Speaker 1: folks who have made that argument, and I think that 771 00:40:46,239 --> 00:40:50,320 Speaker 1: that argument is largely in response to fear of determinism. 772 00:40:50,360 --> 00:40:52,479 Speaker 1: They don't want to think that the brain 773 00:40:52,600 --> 00:40:54,920 Speaker 1: is just a mechanical watch that can be predicted. So 774 00:40:54,920 --> 00:40:58,400 Speaker 1: they're sort of striving for some way to leave a 775 00:40:58,440 --> 00:41:00,600 Speaker 1: window open for free will.
And they say, oh, well, if 776 00:41:00,640 --> 00:41:03,279 Speaker 1: we can connect it to quantum randomness, then, you know, 777 00:41:03,320 --> 00:41:05,640 Speaker 1: you can't be predicted, and therefore there might be room 778 00:41:05,680 --> 00:41:08,640 Speaker 1: for free will. And famous people like Roger Penrose make 779 00:41:08,680 --> 00:41:11,680 Speaker 1: this argument, but it's more like 780 00:41:11,680 --> 00:41:14,560 Speaker 1: an outline of an argument. There's no evidence that the 781 00:41:14,600 --> 00:41:17,600 Speaker 1: brain is dependent on quantum mechanics. It's just like, can 782 00:41:17,680 --> 00:41:20,880 Speaker 1: we find some path we can maybe go down that leads us 783 00:41:20,920 --> 00:41:24,400 Speaker 1: to a nondeterministic brain? There's no real evidence or argument there. 784 00:41:24,400 --> 00:41:27,600 Speaker 1: It's just like a suggestion. Well, I think the 785 00:41:27,640 --> 00:41:30,000 Speaker 1: thing is that it's a yes or 786 00:41:30,040 --> 00:41:32,719 Speaker 1: no decision, right? Whether I eat the cookie, it's either 787 00:41:32,840 --> 00:41:35,440 Speaker 1: yes or no. So I'm kind of like sitting on 788 00:41:35,520 --> 00:41:39,759 Speaker 1: the edge of a really thin knife, right? And, you 789 00:41:39,800 --> 00:41:42,279 Speaker 1: know, who knows. Can you really rule out 790 00:41:42,280 --> 00:41:46,600 Speaker 1: that, you know, how one particular electron behaved 791 00:41:46,840 --> 00:41:49,120 Speaker 1: resulted in pushing me one way or the other? Or 792 00:41:49,160 --> 00:41:54,040 Speaker 1: are you saying that pretty much that's unlikely? No, I'm 793 00:41:54,040 --> 00:41:55,560 Speaker 1: not saying that we can rule it out at all. 794 00:41:55,600 --> 00:41:57,520 Speaker 1: You're totally right, and it could be that there are 795 00:41:57,600 --> 00:42:01,040 Speaker 1: quantum mechanical effects that determine whether you make that decision. 796 00:42:01,080 --> 00:42:03,920 Speaker 1: It's possible, but I'm saying we have no evidence for that. 797 00:42:04,080 --> 00:42:06,800 Speaker 1: Nobody has shown that; we don't even understand the mechanism 798 00:42:06,880 --> 00:42:08,799 Speaker 1: of it. That doesn't mean it's not happening. It just 799 00:42:08,880 --> 00:42:12,160 Speaker 1: means that it's more of a suggestion, an open door, 800 00:42:12,400 --> 00:42:15,400 Speaker 1: than an actual idea. All right, well, I feel 801 00:42:15,440 --> 00:42:17,799 Speaker 1: pretty good. I feel like we're, um, right 802 00:42:17,840 --> 00:42:20,960 Speaker 1: where we predicted we would be. I think you deserve 803 00:42:20,960 --> 00:42:25,279 Speaker 1: a cookie, or a banana at the very least. Oh no, wait, 804 00:42:25,320 --> 00:42:28,319 Speaker 1: what did you say? I would say, there you go! But 805 00:42:28,440 --> 00:42:31,000 Speaker 1: I think that there are some fascinating questions there. Like, even 806 00:42:31,040 --> 00:42:33,000 Speaker 1: if you knew the answer to this question, even if 807 00:42:33,000 --> 00:42:36,759 Speaker 1: you showed that the brain was deterministic, or wasn't, that 808 00:42:36,840 --> 00:42:40,000 Speaker 1: the brain was quantum mechanically random, what would that mean 809 00:42:40,280 --> 00:42:43,400 Speaker 1: for this experience of free will? Let me see if 810 00:42:43,440 --> 00:42:46,280 Speaker 1: I can recap what we learned.
We learned that 811 00:42:46,280 --> 00:42:50,200 Speaker 1: predicting what people are going to do is super technically 812 00:42:50,280 --> 00:42:53,040 Speaker 1: hard, what with scanning all of the neurons in your 813 00:42:53,040 --> 00:42:58,000 Speaker 1: brain, and chaos maybe playing a large part in making it unpredictable. 814 00:42:58,120 --> 00:43:00,480 Speaker 1: But let's say, like, we invented technology to take care 815 00:43:00,520 --> 00:43:02,879 Speaker 1: of that. There's still sort of the question of whether 816 00:43:03,000 --> 00:43:09,040 Speaker 1: randomness trickles up to influence decisions, you know, randomness at 817 00:43:09,080 --> 00:43:12,440 Speaker 1: the quantum level, whether that trickles up to influence decisions. 818 00:43:12,520 --> 00:43:14,680 Speaker 1: And it sounds like we don't know. It 819 00:43:14,719 --> 00:43:16,960 Speaker 1: sounds like we can't say either way. That's right. Quantum 820 00:43:16,960 --> 00:43:20,919 Speaker 1: mechanics might make it theoretically impossible to predict the brain. 821 00:43:21,000 --> 00:43:23,680 Speaker 1: We don't know. And even if it doesn't 822 00:43:23,680 --> 00:43:26,400 Speaker 1: make it impossible, we know it's still super duper 823 00:43:26,480 --> 00:43:29,319 Speaker 1: hard, because you need to know the exact state of 824 00:43:29,320 --> 00:43:32,800 Speaker 1: the brain and you need to overcome the potentially overwhelming 825 00:43:32,960 --> 00:43:36,919 Speaker 1: chaos of your billions and billions of neurons. So even 826 00:43:36,960 --> 00:43:40,239 Speaker 1: if it's not impossible, it's definitely very, very tricky. Right. 827 00:43:40,320 --> 00:43:42,000 Speaker 1: And then we could probably have a whole podcast just 828 00:43:42,080 --> 00:43:45,279 Speaker 1: on the implications for how we feel about free will, 829 00:43:45,360 --> 00:43:48,640 Speaker 1: and whether it would still exist even if it was 830 00:43:48,960 --> 00:43:51,879 Speaker 1: governed by quantum randomness. That's right. And this is an 831 00:43:51,920 --> 00:43:55,480 Speaker 1: area of philosophy, and neither you nor I have formal 832 00:43:55,480 --> 00:43:57,400 Speaker 1: training in philosophy. What do you mean? I have a doctorate 833 00:43:57,400 --> 00:44:03,520 Speaker 1: in philosophy... in engineering. So then let me ask you, Jorge: 834 00:44:03,800 --> 00:44:06,560 Speaker 1: if I could prove that you were deterministic, does that 835 00:44:06,640 --> 00:44:08,960 Speaker 1: mean you don't have free will? Does being a big 836 00:44:09,000 --> 00:44:11,759 Speaker 1: mechanical robot mean you're not making choices? Or does that 837 00:44:11,840 --> 00:44:15,640 Speaker 1: just describe the choices you are making? I 838 00:44:15,680 --> 00:44:18,759 Speaker 1: think that would be bananas. No, I think for me 839 00:44:18,880 --> 00:44:20,839 Speaker 1: it sort of doesn't matter. I feel like it doesn't matter. 840 00:44:20,840 --> 00:44:23,160 Speaker 1: I'm not someone who sweats free will too much, 841 00:44:23,640 --> 00:44:25,680 Speaker 1: to be honest. Like, I feel like I might be 842 00:44:25,719 --> 00:44:29,239 Speaker 1: a robot. That's fine. And, um, you know, currently it's 843 00:44:29,239 --> 00:44:31,120 Speaker 1: almost like, who would want to predict what I'm going 844 00:44:31,160 --> 00:44:33,160 Speaker 1: to do? Do you know what I mean? Like, 845 00:44:33,239 --> 00:44:37,160 Speaker 1: why is this problem even interesting?
Who would 846 00:44:37,160 --> 00:44:39,200 Speaker 1: take the time to care about what I'm going to 847 00:44:39,320 --> 00:44:42,279 Speaker 1: think and do? Um, so as long as nobody else 848 00:44:42,320 --> 00:44:44,640 Speaker 1: can know, or would want to know, what I think 849 00:44:44,680 --> 00:44:46,759 Speaker 1: and do, then to me, like, okay, I could be 850 00:44:46,800 --> 00:44:48,520 Speaker 1: a robot or I could not be a robot. Well, 851 00:44:48,520 --> 00:44:50,960 Speaker 1: I think there's an implication of moral responsibility. We've been 852 00:44:51,000 --> 00:44:53,400 Speaker 1: talking about you eating a cookie. What if that was 853 00:44:53,480 --> 00:44:56,920 Speaker 1: somebody else's cookie, right, and you ate that person's cookie? 854 00:44:57,160 --> 00:44:59,600 Speaker 1: Then could you say, hey, I didn't make that choice, 855 00:44:59,640 --> 00:45:02,000 Speaker 1: I have no free will, and therefore I can't be 856 00:45:02,040 --> 00:45:05,640 Speaker 1: morally culpable? If everybody is deterministic and people aren't 857 00:45:05,719 --> 00:45:08,680 Speaker 1: making choices, then our sort of whole theory of morality 858 00:45:08,760 --> 00:45:12,000 Speaker 1: kind of falls apart, and we can't really punish anybody 859 00:45:12,000 --> 00:45:14,920 Speaker 1: for anything. All right, we'll save that for our other podcast, 860 00:45:15,000 --> 00:45:18,839 Speaker 1: Daniel and Jorge Explain the Moral Universe, where Daniel and Jorge, 861 00:45:19,160 --> 00:45:21,400 Speaker 1: um, blather on about philosophy they don't really know what 862 00:45:21,440 --> 00:45:24,680 Speaker 1: they're talking about. I would listen to that, and I 863 00:45:24,760 --> 00:45:27,799 Speaker 1: predict that a lot of people will probably not. But 864 00:45:27,840 --> 00:45:29,680 Speaker 1: I think these are, like many of the things that 865 00:45:29,680 --> 00:45:31,879 Speaker 1: we talk about on the show, questions with deep implications 866 00:45:31,920 --> 00:45:34,240 Speaker 1: for just what it means to be a human being. Often 867 00:45:34,239 --> 00:45:37,319 Speaker 1: we talk about how the universe was created, how 868 00:45:37,360 --> 00:45:39,200 Speaker 1: it came to be, and its future, and that has 869 00:45:39,560 --> 00:45:43,000 Speaker 1: deep importance for people, for what people think about their 870 00:45:43,040 --> 00:45:45,279 Speaker 1: place in the cosmos and how they should live their lives. 871 00:45:45,360 --> 00:45:47,840 Speaker 1: And this is similar. It's a question, you know: is 872 00:45:47,840 --> 00:45:50,239 Speaker 1: this a threshold that science will ever cross? And if so, 873 00:45:50,320 --> 00:45:52,600 Speaker 1: what does it mean? And so I love the fact 874 00:45:52,640 --> 00:45:55,719 Speaker 1: that science is so relevant, sometimes, that it touches the 875 00:45:55,800 --> 00:45:57,560 Speaker 1: deep core of how you're gonna live your life and 876 00:45:57,600 --> 00:45:59,480 Speaker 1: what it means to be human. Yeah, it's amazing to 877 00:45:59,560 --> 00:46:02,359 Speaker 1: think that physics can have such an impact on who 878 00:46:02,400 --> 00:46:06,520 Speaker 1: we are, in our souls, in our conception 879 00:46:06,600 --> 00:46:08,640 Speaker 1: of who we are. Well, physics is deep in my 880 00:46:08,680 --> 00:46:11,640 Speaker 1: soul, and now maybe you're discovering it's actually deep in 881 00:46:11,719 --> 00:46:16,080 Speaker 1: your soul as well.
We got you, we infected your soul, 882 00:46:16,880 --> 00:46:19,880 Speaker 1: dear audience, with physics. All right. Well, thanks for joining us. 883 00:46:19,920 --> 00:46:22,279 Speaker 1: We hope you enjoyed that, and we hope that it met 884 00:46:22,320 --> 00:46:25,680 Speaker 1: all of your expectations about what this podcast was going 885 00:46:25,719 --> 00:46:27,920 Speaker 1: to be. That's right. Thanks for tuning in. And for 886 00:46:27,960 --> 00:46:31,400 Speaker 1: those of you who wonder whether science can actually explain 887 00:46:31,400 --> 00:46:34,239 Speaker 1: the universe, remember that we don't know. Science has worked 888 00:46:34,239 --> 00:46:36,120 Speaker 1: pretty well so far and been able to explain a 889 00:46:36,200 --> 00:46:38,279 Speaker 1: lot of what we've seen, but there might be some 890 00:46:38,360 --> 00:46:40,719 Speaker 1: day in the future where we find some phenomenon that 891 00:46:40,840 --> 00:46:43,800 Speaker 1: can't be explained by science, or by our current vision 892 00:46:43,840 --> 00:46:47,000 Speaker 1: of science or our mathematical rules. So we are continuing 893 00:46:47,040 --> 00:46:49,600 Speaker 1: on this journey of trying to explore and explain the 894 00:46:49,680 --> 00:46:52,399 Speaker 1: universe we find around us. Thanks for joining us, see 895 00:46:52,400 --> 00:47:02,360 Speaker 1: you next time. If you still have a question after 896 00:47:02,400 --> 00:47:05,520 Speaker 1: listening to all these explanations, please drop us a line. 897 00:47:05,560 --> 00:47:07,719 Speaker 1: We'd love to hear from you. You can find us 898 00:47:07,719 --> 00:47:11,520 Speaker 1: on Facebook, Twitter, and Instagram at Daniel and Jorge, that's 899 00:47:11,560 --> 00:47:14,920 Speaker 1: one word, or email us at feedback at Daniel and 900 00:47:15,040 --> 00:47:18,480 Speaker 1: Jorge dot com. Thanks for listening, and remember that Daniel 901 00:47:18,520 --> 00:47:21,040 Speaker 1: and Jorge Explain the Universe is a production of 902 00:47:21,280 --> 00:47:24,719 Speaker 1: iHeartRadio. For more podcasts from iHeartRadio, visit 903 00:47:24,760 --> 00:47:28,239 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 904 00:47:28,320 --> 00:47:29,840 Speaker 1: listen to your favorite shows.