1 00:00:09,119 --> 00:00:11,159 Speaker 1: Hey, Jorge, how do you feel when you get an 2 00:00:11,240 --> 00:00:15,880 Speaker 1: unexpected piece of mail, like a letter to my house? 3 00:00:16,840 --> 00:00:20,680 Speaker 1: It's usually either good news or bad news exactly. But 4 00:00:20,720 --> 00:00:22,919 Speaker 1: do you get a feeling about it before you open it, 5 00:00:22,920 --> 00:00:25,319 Speaker 1: about whether it's good or bad? Sure. Like, if it 6 00:00:25,320 --> 00:00:28,200 Speaker 1: looks like a check, it's probably good news. If it 7 00:00:28,240 --> 00:00:32,520 Speaker 1: comes from the IRS, probably not good news. Yeah, 8 00:00:32,560 --> 00:00:37,559 Speaker 1: and anything handwritten is probably also not bad news. What 9 00:00:37,680 --> 00:00:40,440 Speaker 1: if the IRS sends you a handwritten note? That's 10 00:00:40,479 --> 00:00:44,000 Speaker 1: probably extra bad news. Actually, well, it could be a 11 00:00:44,000 --> 00:00:45,800 Speaker 1: fan at the IRS. You could have a 12 00:00:46,360 --> 00:00:48,640 Speaker 1: fan letter, unless it's cut out from little pieces of 13 00:00:48,720 --> 00:00:52,640 Speaker 1: magazine fonts. I don't want anybody at the IRS 14 00:00:52,680 --> 00:00:55,440 Speaker 1: to be impressed by the creativity of my tax return. 15 00:01:10,560 --> 00:01:13,560 Speaker 1: I'm Jorge, a cartoonist and the creator of PhD Comics. 16 00:01:14,080 --> 00:01:17,000 Speaker 1: I'm Daniel. I'm a particle physicist and a professor at 17 00:01:17,040 --> 00:01:20,520 Speaker 1: UC Irvine, where my mailbox is usually filled with 18 00:01:20,959 --> 00:01:24,000 Speaker 1: people sending me their theories of the universe and letters 19 00:01:24,120 --> 00:01:27,000 Speaker 1: from prisoners. Is that true, you get letters from prison, 20 00:01:27,120 --> 00:01:30,080 Speaker 1: like physical letters? I do.
I get handwritten letters from 21 00:01:30,120 --> 00:01:32,959 Speaker 1: prison from folks who have found our book in their 22 00:01:33,000 --> 00:01:36,800 Speaker 1: prison library. Wow. That's pretty interesting. That's pretty cool. Yeah, 23 00:01:36,840 --> 00:01:39,200 Speaker 1: a little window out into the universe. I thought you 24 00:01:39,200 --> 00:01:41,040 Speaker 1: were going to say they're from your friends in prison, 25 00:01:41,680 --> 00:01:45,039 Speaker 1: although I guess now you have friends in prison. They 26 00:01:45,040 --> 00:01:47,760 Speaker 1: are my friends now. I actually used to teach in 27 00:01:47,800 --> 00:01:49,640 Speaker 1: prison when I was at Berkeley. I used to go 28 00:01:49,720 --> 00:01:51,720 Speaker 1: up and be a TA at San Quentin. They 29 00:01:51,720 --> 00:01:54,800 Speaker 1: had a class there in mathematics for prisoners. So I'd go 30 00:01:55,040 --> 00:01:58,400 Speaker 1: and help teach arithmetic to prisoners. Oh, that's pretty cool. 31 00:01:58,560 --> 00:02:00,840 Speaker 1: Until the day I discovered that one of the prisoners 32 00:02:00,840 --> 00:02:04,040 Speaker 1: there used to be a grad student working for my advisor, 33 00:02:04,200 --> 00:02:07,240 Speaker 1: also at Berkeley. No way. So what did you think? 34 00:02:07,240 --> 00:02:10,040 Speaker 1: Did they know more than you, or did it sort of 35 00:02:10,080 --> 00:02:13,760 Speaker 1: scare you about your potential future? I thought, Wow, there 36 00:02:13,760 --> 00:02:16,240 Speaker 1: are more career paths from grad school than I realized, 37 00:02:17,200 --> 00:02:20,720 Speaker 1: creative ones at that. But welcome to our podcast Daniel 38 00:02:20,720 --> 00:02:23,240 Speaker 1: and Jorge Explain the Universe, a production of iHeartRadio, 39 00:02:23,320 --> 00:02:25,800 Speaker 1: in which we try to unlock your mind and 40 00:02:25,840 --> 00:02:29,200 Speaker 1: release you from the prison of our human ignorance.
We 41 00:02:29,200 --> 00:02:31,920 Speaker 1: want to set everybody's brains free, and we want to 42 00:02:32,000 --> 00:02:34,840 Speaker 1: understand everything that's out there in the universe, from the 43 00:02:34,919 --> 00:02:38,440 Speaker 1: true nature of the fundamental elements of the universe. Are 44 00:02:38,560 --> 00:02:41,360 Speaker 1: space and time fundamental, or do they emerge from some 45 00:02:41,480 --> 00:02:45,240 Speaker 1: crazy frothing quantum reality? All the way up to the largest, 46 00:02:45,320 --> 00:02:48,280 Speaker 1: most dramatic structures in the universe, to the very edges 47 00:02:48,320 --> 00:02:51,280 Speaker 1: of the known universe and beyond. We aim to explain 48 00:02:51,360 --> 00:02:53,400 Speaker 1: all of it to you. Yeah, because it is a 49 00:02:53,400 --> 00:02:56,720 Speaker 1: pretty amazing universe. It's vast and full of interesting and 50 00:02:56,760 --> 00:02:59,000 Speaker 1: crazy mysteries. But we seem to be sort of maybe 51 00:02:59,040 --> 00:03:02,680 Speaker 1: stuck, and maybe sort of also prisoners, in one little 52 00:03:02,720 --> 00:03:05,080 Speaker 1: tiny corner of it, sort of blocked off by the 53 00:03:05,200 --> 00:03:07,800 Speaker 1: huge walls of space around us that make it really 54 00:03:07,840 --> 00:03:10,560 Speaker 1: hard for us to go visit other places. It does 55 00:03:10,600 --> 00:03:13,239 Speaker 1: sort of seem like we are in solitary confinement, and 56 00:03:13,440 --> 00:03:16,480 Speaker 1: we don't know if there are other intelligent beings out 57 00:03:16,520 --> 00:03:19,960 Speaker 1: there in the galaxy also wondering about the same questions 58 00:03:20,000 --> 00:03:22,919 Speaker 1: that we are, probing the nature of matter, and wondering 59 00:03:23,000 --> 00:03:25,880 Speaker 1: if they are also alone. Yeah, we're like in a 60 00:03:25,960 --> 00:03:31,280 Speaker 1: cosmological time out.
Maybe we misbehaved. We've definitely been misbehaving, 61 00:03:31,280 --> 00:03:32,920 Speaker 1: but I don't know if we deserve to be in 62 00:03:33,000 --> 00:03:36,920 Speaker 1: isolation for thousands and thousands of years. It seems a 63 00:03:36,920 --> 00:03:38,760 Speaker 1: bit extreme. We can't play with the rest of the 64 00:03:38,840 --> 00:03:42,120 Speaker 1: kids, the kid species in the universe. Maybe we're just waiting 65 00:03:42,160 --> 00:03:43,640 Speaker 1: for the day we get let out into the yard 66 00:03:43,680 --> 00:03:46,760 Speaker 1: and we get to meet all the other prisoners. Although 67 00:03:46,760 --> 00:03:49,240 Speaker 1: it's not a bad prison, metaphorically speaking. I mean, the 68 00:03:49,280 --> 00:03:54,040 Speaker 1: Earth is pretty comfortable. It's like home confinement. That's right. 69 00:03:54,080 --> 00:03:56,280 Speaker 1: I got no complaints. I love the Earth. And we 70 00:03:56,360 --> 00:03:58,280 Speaker 1: got a pretty good view out of our window. We 71 00:03:58,320 --> 00:04:01,600 Speaker 1: can see really far into the universe, across billions and 72 00:04:01,760 --> 00:04:06,200 Speaker 1: billions of light years, and are witnessing cataclysmic cosmic events 73 00:04:06,240 --> 00:04:09,240 Speaker 1: that give us clues into the nature of reality. One 74 00:04:09,280 --> 00:04:11,839 Speaker 1: thing we haven't seen yet, however, is evidence that there's 75 00:04:11,880 --> 00:04:13,920 Speaker 1: somebody else out there. Yeah, it kind of makes you 76 00:04:13,960 --> 00:04:17,159 Speaker 1: wonder if there are other alien species sort of looking 77 00:04:17,200 --> 00:04:19,479 Speaker 1: at the universe like we are, and maybe looking at 78 00:04:19,520 --> 00:04:21,640 Speaker 1: the same events at the same time as we are, 79 00:04:21,800 --> 00:04:24,640 Speaker 1: and asking the same questions we're asking about the universe.
80 00:04:24,760 --> 00:04:27,240 Speaker 1: I think it's even more interesting if there are alien 81 00:04:27,279 --> 00:04:30,800 Speaker 1: astronomers looking at the same events, but before we are. 82 00:04:30,960 --> 00:04:33,919 Speaker 1: Like, some photons that have taken five hundred million years 83 00:04:33,960 --> 00:04:36,360 Speaker 1: to get to Earth, aliens might have seen photons from 84 00:04:36,400 --> 00:04:40,080 Speaker 1: that same event a hundred million years ago. They might 85 00:04:40,080 --> 00:04:42,240 Speaker 1: have been, like, working on this and writing papers, and 86 00:04:42,240 --> 00:04:46,679 Speaker 1: scooped us by a hundred million years. Are you saying 87 00:04:46,680 --> 00:04:49,359 Speaker 1: the science we have right now is old news, like 88 00:04:49,440 --> 00:04:52,880 Speaker 1: it's based on stale data? That's right. It's 89 00:04:52,920 --> 00:04:54,920 Speaker 1: time to go out there and get some fresh results. 90 00:04:55,200 --> 00:04:57,800 Speaker 1: But depending on where you are in the universe, you 91 00:04:57,839 --> 00:05:01,159 Speaker 1: may see something a lot sooner than somebody else. So 92 00:05:01,200 --> 00:05:03,760 Speaker 1: it could be that there's something amazing that happened in 93 00:05:03,800 --> 00:05:06,719 Speaker 1: the universe and the light from it is still getting here, 94 00:05:06,720 --> 00:05:09,800 Speaker 1: but it's already hit alien astronomers and they have received 95 00:05:09,839 --> 00:05:13,960 Speaker 1: this incredible insight from this one of a kind cosmic event. Right, 96 00:05:14,000 --> 00:05:16,800 Speaker 1: but I guess in a relative universe there's no such 97 00:05:16,800 --> 00:05:20,039 Speaker 1: thing as, sort of, instantaneity, right? Like, maybe they saw 98 00:05:20,040 --> 00:05:21,800 Speaker 1: it before we did, but by the time they tell 99 00:05:21,880 --> 00:05:24,799 Speaker 1: us they saw it, we have already seen this discovery.
100 00:05:24,880 --> 00:05:27,839 Speaker 1: That's true. Photons from the event will arrive here before 101 00:05:28,040 --> 00:05:30,160 Speaker 1: their paper on it arrives here, so we'll have a 102 00:05:30,279 --> 00:05:32,039 Speaker 1: narrow window to claim that we came up with the 103 00:05:32,080 --> 00:05:35,960 Speaker 1: ideas ourselves, unless their paper is about wormholes and somehow 104 00:05:35,960 --> 00:05:38,360 Speaker 1: they get it to us before we see the light, 105 00:05:39,040 --> 00:05:42,599 Speaker 1: in which case it's like maybe getting a preview or something. 106 00:05:42,680 --> 00:05:46,039 Speaker 1: Maybe. Or maybe the first interstellar war will be started 107 00:05:46,080 --> 00:05:50,080 Speaker 1: by physicists arguing about who came up with an idea first. Yeah, 108 00:05:50,200 --> 00:05:54,520 Speaker 1: it'll be fought on the wormhole Internet probably. Or maybe 109 00:05:54,520 --> 00:05:56,560 Speaker 1: their first paper will be like, hey, Earth, watch out, 110 00:05:56,560 --> 00:05:59,120 Speaker 1: there's a giant gamma ray burst coming your way that will 111 00:05:59,200 --> 00:06:01,400 Speaker 1: kill you. That would be very kind of them. I hope 112 00:06:01,520 --> 00:06:03,200 Speaker 1: that aliens are out there and looking out for us. 113 00:06:03,320 --> 00:06:05,039 Speaker 1: But it is a pretty big question of whether or 114 00:06:05,040 --> 00:06:07,280 Speaker 1: not we are alone in the universe. You know, 115 00:06:07,320 --> 00:06:09,480 Speaker 1: it's a vast universe and we are able to sort 116 00:06:09,560 --> 00:06:12,080 Speaker 1: of see and look out there, and we're also able 117 00:06:12,120 --> 00:06:14,839 Speaker 1: to send messages. But it's a little bit of a 118 00:06:14,880 --> 00:06:17,680 Speaker 1: debate about whether we should be sending out messages or not. 119 00:06:17,880 --> 00:06:21,520 Speaker 1: That's right.
Some folks are desperately curious to contact aliens 120 00:06:21,600 --> 00:06:24,640 Speaker 1: or even just to know about their existence, and other 121 00:06:24,680 --> 00:06:28,120 Speaker 1: folks are wondering, is it a good idea for aliens 122 00:06:28,160 --> 00:06:30,840 Speaker 1: to know that we are here? Yeah, because you might 123 00:06:30,880 --> 00:06:34,560 Speaker 1: be alerting dangerous aliens to our presence. Right, you 124 00:06:34,640 --> 00:06:36,600 Speaker 1: might be telling them, hey, there's a whole bunch of 125 00:06:36,600 --> 00:06:40,919 Speaker 1: delicious looking meat running around here on this blue planet 126 00:06:40,960 --> 00:06:43,040 Speaker 1: where you can also get a nice drink of water, 127 00:06:43,279 --> 00:06:45,760 Speaker 1: come get it. That's right. Or you might accidentally say 128 00:06:45,839 --> 00:06:50,040 Speaker 1: something offensive and spark an interstellar war. Sometimes just staying 129 00:06:50,120 --> 00:06:52,479 Speaker 1: quiet is the best approach. You're saying the aliens could 130 00:06:52,480 --> 00:06:55,120 Speaker 1: have thin skin. Maybe they don't even have skin, you know. 131 00:06:55,160 --> 00:06:58,640 Speaker 1: Maybe they just take offense to everything instantly. That's right. 132 00:06:59,000 --> 00:07:00,880 Speaker 1: Who knows? They're aliens, right? It could just be like 133 00:07:00,920 --> 00:07:04,080 Speaker 1: a cloud of goop or something. But I guess it's 134 00:07:04,120 --> 00:07:06,520 Speaker 1: sort of the idea that maybe our quest to, 135 00:07:06,640 --> 00:07:10,120 Speaker 1: like, reach out and contact other civilizations may be naive, right, 136 00:07:10,200 --> 00:07:13,239 Speaker 1: because it could be leading us into a potentially dangerous 137 00:07:13,240 --> 00:07:16,440 Speaker 1: situation. Exactly.
And it's a question we often are confronted 138 00:07:16,440 --> 00:07:20,240 Speaker 1: by in physics: not just whether we can, but whether 139 00:07:20,320 --> 00:07:22,760 Speaker 1: we should. So today on the podcast, we'll be tackling 140 00:07:22,760 --> 00:07:31,680 Speaker 1: the question, is it dangerous to try to communicate with aliens? 141 00:07:32,160 --> 00:07:35,120 Speaker 1: That question sounds dangerous. I mean, anytime you say, sort 142 00:07:35,120 --> 00:07:37,480 Speaker 1: of, aliens, you gotta step back for a second. 143 00:07:37,640 --> 00:07:39,880 Speaker 1: You and I often joke about this on the podcast. 144 00:07:39,960 --> 00:07:42,320 Speaker 1: I'm looking forward to the day the aliens arrive because 145 00:07:42,360 --> 00:07:44,920 Speaker 1: I hope I can ask them physics questions about, like, 146 00:07:44,960 --> 00:07:48,360 Speaker 1: what is the true nature of reality and quantum gravity, 147 00:07:48,400 --> 00:07:51,080 Speaker 1: and how did the universe begin, etcetera. But sometimes you 148 00:07:51,120 --> 00:07:53,520 Speaker 1: get the impression that you're a little bit more concerned 149 00:07:53,600 --> 00:07:56,240 Speaker 1: about whether or not the aliens will just fry us 150 00:07:56,320 --> 00:08:00,400 Speaker 1: from orbit. Yeah, well, you know, you look at it 151 00:08:00,440 --> 00:08:02,640 Speaker 1: from all sides, I guess. But, um, you know, first 152 00:08:02,640 --> 00:08:04,240 Speaker 1: of all, I guess you're assuming that the aliens know 153 00:08:04,360 --> 00:08:06,800 Speaker 1: the answer to these questions. Maybe they're just as clueless 154 00:08:06,840 --> 00:08:09,720 Speaker 1: as we are, and hungry. That's a great point. 155 00:08:09,720 --> 00:08:11,800 Speaker 1: It depends a lot on how we actually get in 156 00:08:11,840 --> 00:08:14,080 Speaker 1: touch with these aliens.
If aliens come to Earth, then 157 00:08:14,120 --> 00:08:16,520 Speaker 1: it's likely that they have traveled through space in a 158 00:08:16,520 --> 00:08:18,480 Speaker 1: way that we haven't been able to, so it's very 159 00:08:18,520 --> 00:08:22,160 Speaker 1: likely they're more advanced than we are, and so probably 160 00:08:22,280 --> 00:08:25,160 Speaker 1: they have some answers, or at least different perspectives, on 161 00:08:25,200 --> 00:08:27,640 Speaker 1: these questions. But if we just get a message from them, 162 00:08:27,640 --> 00:08:30,560 Speaker 1: then there's no guarantee that they have developed anything that 163 00:08:30,600 --> 00:08:32,840 Speaker 1: we haven't developed. And if all they do is 164 00:08:32,880 --> 00:08:35,800 Speaker 1: hear our message, they might not have very sophisticated technology 165 00:08:35,840 --> 00:08:38,520 Speaker 1: at all. I think it's just kind of dangerous to 166 00:08:38,640 --> 00:08:41,560 Speaker 1: have those expectations. You know, like, what if you hail 167 00:08:41,640 --> 00:08:44,280 Speaker 1: the aliens, they come all the way here, and then 168 00:08:44,320 --> 00:08:45,680 Speaker 1: you ask them, like, hey, what are the secrets to 169 00:08:45,679 --> 00:08:48,760 Speaker 1: the universe? And they're like, we don't know. What are 170 00:08:48,760 --> 00:08:51,600 Speaker 1: you gonna do? It's gonna be really awkward. I'm 171 00:08:51,600 --> 00:08:54,679 Speaker 1: gonna be like, I made all this guacamole just 172 00:08:54,760 --> 00:08:57,240 Speaker 1: to have this party with you to celebrate, you know, 173 00:08:57,360 --> 00:09:00,160 Speaker 1: the solutions to the universe, and now it's just gonna go bad. 174 00:09:00,400 --> 00:09:01,600 Speaker 1: And then what are you gonna do? Or you could 175 00:09:01,640 --> 00:09:03,760 Speaker 1: disinvite them.
What are you going to talk about for 176 00:09:03,800 --> 00:09:06,600 Speaker 1: the next, you know, billion years with them in your house? 177 00:09:06,720 --> 00:09:10,559 Speaker 1: I'm definitely not great at keeping conversation going, especially with strangers, 178 00:09:10,920 --> 00:09:12,719 Speaker 1: so it would be a tricky situation. But I think 179 00:09:12,720 --> 00:09:15,960 Speaker 1: it's worth the risk because they could have incredible answers, 180 00:09:16,160 --> 00:09:18,280 Speaker 1: or at least, you know, maybe they even just ask 181 00:09:18,440 --> 00:09:22,520 Speaker 1: different questions, questions we haven't thought to ask. I think, 182 00:09:22,800 --> 00:09:26,479 Speaker 1: joking aside, the greatest thing we would learn from communicating 183 00:09:26,559 --> 00:09:29,560 Speaker 1: intellectually with another species is just learning from all the 184 00:09:29,559 --> 00:09:33,680 Speaker 1: differences in our approaches, not even necessarily the answers. Interesting. 185 00:09:33,760 --> 00:09:35,800 Speaker 1: But wouldn't it be easier just to try a 186 00:09:35,840 --> 00:09:40,520 Speaker 1: different approach ourselves? Yeah, sit down, brainstorm other ways we 187 00:09:40,520 --> 00:09:43,520 Speaker 1: could think about things. Why do you need to outsource? 188 00:09:45,000 --> 00:09:47,000 Speaker 1: Sometimes I think human thought is sort of trapped 189 00:09:47,000 --> 00:09:49,400 Speaker 1: in a rut, and we don't even realize all the 190 00:09:49,400 --> 00:09:52,480 Speaker 1: time the decisions we've made and the arbitrary choices that 191 00:09:52,679 --> 00:09:55,640 Speaker 1: have influenced us, and just the path of human thought, 192 00:09:55,720 --> 00:09:57,679 Speaker 1: you know, who thought of what when, and who had 193 00:09:57,720 --> 00:10:00,040 Speaker 1: time to go into science and influence the direction 194 00:10:00,120 --> 00:10:02,840 Speaker 1: of human science.
So I think if you ran the 195 00:10:02,920 --> 00:10:05,240 Speaker 1: Earth as an experiment a thousand times, you would probably 196 00:10:05,240 --> 00:10:08,160 Speaker 1: get lots of different ideas about science. But we only 197 00:10:08,200 --> 00:10:12,199 Speaker 1: have this one experiment, so it's a little frustrating. All right. Well, 198 00:10:12,200 --> 00:10:15,000 Speaker 1: as usual, we were wondering how people thought about this 199 00:10:15,080 --> 00:10:18,439 Speaker 1: idea of contacting aliens or looking for aliens, or whether 200 00:10:18,480 --> 00:10:21,880 Speaker 1: they thought it was a good idea or a risky one. 201 00:10:22,720 --> 00:10:25,640 Speaker 1: So Daniel went out there, this time in person, out 202 00:10:25,679 --> 00:10:29,040 Speaker 1: onto the campus of UC Irvine, to ask people 203 00:10:29,080 --> 00:10:31,680 Speaker 1: this question. That's right, UCI opened up again 204 00:10:31,720 --> 00:10:33,560 Speaker 1: after the pandemic, and so I was able to walk 205 00:10:33,600 --> 00:10:36,680 Speaker 1: around campus and make people feel weird by asking them 206 00:10:36,720 --> 00:10:38,880 Speaker 1: about aliens. So thank you to all the UCI 207 00:10:39,000 --> 00:10:41,679 Speaker 1: students who answered this question. Wow, was it extra 208 00:10:41,840 --> 00:10:45,640 Speaker 1: weird being back and asking questions of strangers? Like, I 209 00:10:45,640 --> 00:10:48,080 Speaker 1: imagine it was weird before to have this, you know, 210 00:10:48,160 --> 00:10:50,960 Speaker 1: scruffy looking physicist approach you and ask you questions about 211 00:10:50,960 --> 00:10:53,400 Speaker 1: the universe. But now you're doing it with a mask 212 00:10:53,640 --> 00:10:56,840 Speaker 1: and there's a pandemic going on. Was that weird? Well, 213 00:10:56,880 --> 00:10:58,839 Speaker 1: I think I'm a little less scruffy because of the mask.
214 00:10:58,920 --> 00:11:01,800 Speaker 1: It hides some of the scruff, so maybe it makes 215 00:11:01,840 --> 00:11:03,880 Speaker 1: me look a little more professional. But it was nice. 216 00:11:03,920 --> 00:11:05,880 Speaker 1: It's good to be out there again. It's good to 217 00:11:05,920 --> 00:11:08,079 Speaker 1: see people. It's good to have the campus be alive. 218 00:11:08,280 --> 00:11:11,119 Speaker 1: So I think people are a little hungry for some interaction. 219 00:11:11,280 --> 00:11:14,040 Speaker 1: I was surprised everybody was very receptive. All right, well, 220 00:11:14,120 --> 00:11:16,200 Speaker 1: think about it for a second. If someone asked you 221 00:11:16,240 --> 00:11:18,760 Speaker 1: whether it's a good idea to look for aliens or 222 00:11:18,920 --> 00:11:22,240 Speaker 1: is it risky, what would you answer? Here's what people 223 00:11:22,240 --> 00:11:24,240 Speaker 1: had to say. I don't really know. I mean, I 224 00:11:24,280 --> 00:11:28,120 Speaker 1: think it's pretty cool. I guess I think it's more 225 00:11:28,200 --> 00:11:31,439 Speaker 1: interesting rather than, like, dangerous if you were to try 226 00:11:31,480 --> 00:11:35,280 Speaker 1: to communicate. So, yeah, there is always a risk, but, uh, 227 00:11:36,120 --> 00:11:39,760 Speaker 1: there's also an opportunity. So I feel, yeah, I feel 228 00:11:39,760 --> 00:11:42,880 Speaker 1: it's actually not necessarily... It's hard for me to say 229 00:11:42,920 --> 00:11:45,360 Speaker 1: it's cool or bad, but it's also hard for me 230 00:11:45,440 --> 00:11:48,920 Speaker 1: to say it's necessarily bad. I think it's incredibly risky. 231 00:11:49,040 --> 00:11:52,640 Speaker 1: I think that the prospect of an extraterrestrial civilization 232 00:11:52,760 --> 00:11:55,920 Speaker 1: showing up on Earth is very scary.
You don't know 233 00:11:56,840 --> 00:11:59,960 Speaker 1: what their intentions might be, and I think 234 00:12:00,440 --> 00:12:03,520 Speaker 1: history kind of shows us that when people who are 235 00:12:03,559 --> 00:12:08,240 Speaker 1: outsiders show up in a place where, you know, they 236 00:12:08,280 --> 00:12:11,240 Speaker 1: don't really think of the other inhabitants as so much 237 00:12:11,320 --> 00:12:13,680 Speaker 1: like them, or, you know, the equivalent of them, then 238 00:12:13,960 --> 00:12:19,079 Speaker 1: these don't usually turn out so well. Really? Because we 239 00:12:19,160 --> 00:12:22,800 Speaker 1: could advance our knowledge about how life is formed and 240 00:12:22,840 --> 00:12:27,760 Speaker 1: how it functions in ways we cannot even begin to 241 00:12:27,800 --> 00:12:30,839 Speaker 1: comprehend right now. And what about the risks? Worth it? 242 00:12:31,120 --> 00:12:36,600 Speaker 1: Every new idea brings risks, and everything we do depends 243 00:12:36,640 --> 00:12:40,200 Speaker 1: on who we are and how we go about it. 244 00:12:40,320 --> 00:12:46,319 Speaker 1: So based on that, we could never do anything if 245 00:12:46,360 --> 00:12:50,280 Speaker 1: we don't trust the people that do it. They could bring 246 00:12:50,400 --> 00:12:54,280 Speaker 1: us diseases that we couldn't fight. Well, I think it's 247 00:12:54,280 --> 00:12:57,080 Speaker 1: a good idea for a multitude of reasons. I think, 248 00:12:57,080 --> 00:12:59,719 Speaker 1: first and foremost, it's just informative to know whether or 249 00:12:59,760 --> 00:13:03,200 Speaker 1: not we're alone. And then that just opens 250 00:13:03,280 --> 00:13:06,320 Speaker 1: up a whole new line of questioning, which is, you know, 251 00:13:06,360 --> 00:13:08,920 Speaker 1: how did they develop differently from us? Where did we diverge? 252 00:13:08,920 --> 00:13:10,600 Speaker 1: Do we look at the universe in the same way?
253 00:13:11,120 --> 00:13:12,680 Speaker 1: You know, I think in much the same way that 254 00:13:13,160 --> 00:13:16,720 Speaker 1: it's informative to get other cultures' experiences on certain things, 255 00:13:16,720 --> 00:13:19,600 Speaker 1: it can be interesting to get an entirely different species' 256 00:13:19,880 --> 00:13:23,360 Speaker 1: perspective on something. What about the risks? There are risks, 257 00:13:23,840 --> 00:13:27,440 Speaker 1: but there are risks inherent in anything, right? Um, the 258 00:13:27,480 --> 00:13:30,920 Speaker 1: Manhattan Project, when they built the nuke, you know, the risk was, 259 00:13:31,600 --> 00:13:33,400 Speaker 1: are we going to ignite the atmosphere? Are we just gonna 260 00:13:33,400 --> 00:13:35,720 Speaker 1: wipe out the human race? Probably not, but 261 00:13:35,720 --> 00:13:37,440 Speaker 1: the risk is there, right? And if we go out 262 00:13:37,440 --> 00:13:40,000 Speaker 1: into space looking for extraterrestrials, the same risk is 263 00:13:40,040 --> 00:13:42,600 Speaker 1: inherent. But I think the benefits outweigh the risks. 264 00:13:43,000 --> 00:13:45,800 Speaker 1: But what if it's potentially wiping out the human race? Well, 265 00:13:47,520 --> 00:13:49,600 Speaker 1: some might argue that there is a benefit to wiping 266 00:13:49,600 --> 00:13:51,720 Speaker 1: out the human race. I wouldn't go that far, but 267 00:13:51,720 --> 00:13:53,760 Speaker 1: some people would make that argument, that it is 268 00:13:53,760 --> 00:13:56,600 Speaker 1: actually a benefit. I definitely think it's a good idea. 269 00:13:56,640 --> 00:13:58,680 Speaker 1: Of course we should. I guess it does have to 270 00:13:58,720 --> 00:14:03,040 Speaker 1: be balanced with other needs for resources.
So, for example, um, 271 00:14:03,040 --> 00:14:07,240 Speaker 1: should we educate our children or do biomedical research versus 272 00:14:07,679 --> 00:14:10,880 Speaker 1: pour all our resources into the hope of maybe finding 273 00:14:11,000 --> 00:14:14,600 Speaker 1: alien life? I guess I'd still vote for, like, keeping 274 00:14:14,679 --> 00:14:17,920 Speaker 1: our planet working right here at home, I think, would 275 00:14:17,920 --> 00:14:19,800 Speaker 1: be a good idea. I could see how it would 276 00:14:19,800 --> 00:14:22,440 Speaker 1: be dangerous, but I would choose to believe that 277 00:14:23,160 --> 00:14:25,320 Speaker 1: they would be just as curious as we are and 278 00:14:25,400 --> 00:14:29,800 Speaker 1: not aggressive or hostile in any way, and then it would 279 00:14:29,840 --> 00:14:34,000 Speaker 1: be a mutual communication between both of us. All right, 280 00:14:34,040 --> 00:14:37,200 Speaker 1: we have a pretty wide range of opinions here from people. 281 00:14:37,360 --> 00:14:39,800 Speaker 1: Some people think it's super risky, some people think it's 282 00:14:39,800 --> 00:14:42,520 Speaker 1: super cool, yeah, and some people think it's risky but 283 00:14:42,720 --> 00:14:45,320 Speaker 1: worth the risk. I guess I wonder if they're thinking 284 00:14:45,320 --> 00:14:49,800 Speaker 1: about the full range of risk potential here. You know, 285 00:14:50,000 --> 00:14:52,160 Speaker 1: there was somebody who argued that the human race itself 286 00:14:52,200 --> 00:14:55,680 Speaker 1: getting wiped out might be a benefit, not necessarily a risk. 287 00:14:55,800 --> 00:14:59,040 Speaker 1: Oh my goodness. There were some pretty dark thoughts going on. Yeah, 288 00:14:59,560 --> 00:15:02,720 Speaker 1: pretty, uh, dark people who go to UCI, apparently.
289 00:15:03,080 --> 00:15:06,040 Speaker 1: But it's an interesting question whether it's worth the risk, right, 290 00:15:06,080 --> 00:15:08,480 Speaker 1: because the risk could be pretty bad. You could 291 00:15:08,480 --> 00:15:11,320 Speaker 1: maybe wipe out the whole human race potentially. Yeah, we're 292 00:15:11,320 --> 00:15:15,600 Speaker 1: talking about alien technology with unknown capabilities. So definitely, like, 293 00:15:15,760 --> 00:15:19,640 Speaker 1: sterilizing the entire planet is in the realm of possibility. 294 00:15:19,720 --> 00:15:21,600 Speaker 1: And so it seems like a weird balance to strike 295 00:15:21,680 --> 00:15:23,640 Speaker 1: there, right? Like, on the one hand, we could be 296 00:15:23,720 --> 00:15:26,200 Speaker 1: destroyed and wipe out our species. On the other hand, 297 00:15:26,360 --> 00:15:29,840 Speaker 1: we might satisfy Daniel Whiteson's curiosity. I don't know, man, 298 00:15:31,000 --> 00:15:33,560 Speaker 1: that's a tough call. That is a tough call. It's a 299 00:15:33,640 --> 00:15:36,560 Speaker 1: tough call for the rest of us, to be honest. Well, 300 00:15:36,600 --> 00:15:38,320 Speaker 1: I promise to share some of the insights if they 301 00:15:38,360 --> 00:15:41,560 Speaker 1: do come. How about that? Does that balance it a little bit? Of 302 00:15:41,600 --> 00:15:43,920 Speaker 1: course you're going to share that, Daniel. I mean, 303 00:15:44,600 --> 00:15:46,440 Speaker 1: were you thinking that you're going to keep it to yourself? 304 00:15:47,000 --> 00:15:49,720 Speaker 1: I might. I might just build a prison of solitude 305 00:15:49,720 --> 00:15:52,040 Speaker 1: and sit there knowing all these answers on my own. 306 00:15:52,440 --> 00:15:54,720 Speaker 1: I see, you're gonna put the entire human race at 307 00:15:54,880 --> 00:15:57,960 Speaker 1: risk and then keep all of the reward to yourself.
308 00:15:58,200 --> 00:16:00,360 Speaker 1: I'll just dribble out a paper every ten years or so, 309 00:16:00,640 --> 00:16:04,360 Speaker 1: blowing everything out of the water. From the alien prison? 310 00:16:04,520 --> 00:16:06,320 Speaker 1: You'll be writing us, and then the rest of us 311 00:16:06,360 --> 00:16:09,400 Speaker 1: will get it through the prison newsletter? Is that the 312 00:16:09,600 --> 00:16:11,880 Speaker 1: idea? And then we're gonna think, oh wow, 313 00:16:11,920 --> 00:16:14,120 Speaker 1: that was totally worth it. Yeah. Well, jokes aside, there 314 00:16:14,120 --> 00:16:17,040 Speaker 1: are some really interesting questions there about how to communicate 315 00:16:17,120 --> 00:16:19,440 Speaker 1: with aliens if they do come, or if they do 316 00:16:19,520 --> 00:16:22,600 Speaker 1: send us a message. How to speak for Earth, you know? 317 00:16:22,680 --> 00:16:25,920 Speaker 1: Like, who gets to speak for Earth? It's a difficult question. 318 00:16:26,000 --> 00:16:29,200 Speaker 1: We actually talked to Jill Tarter, head of SETI, about that, 319 00:16:29,240 --> 00:16:32,400 Speaker 1: about how they get a community together, a multicultural, multi 320 00:16:32,440 --> 00:16:36,280 Speaker 1: perspective community, to think about how to respond potentially to 321 00:16:36,320 --> 00:16:39,560 Speaker 1: an alien message. It's a hard question. Interesting. All right, 322 00:16:39,600 --> 00:16:41,560 Speaker 1: but before I guess we get into those details, let's 323 00:16:41,560 --> 00:16:44,000 Speaker 1: maybe take a step back here and think about this 324 00:16:44,080 --> 00:16:46,760 Speaker 1: larger question of, are we alone? What have we learned 325 00:16:46,760 --> 00:16:49,800 Speaker 1: about that question? What have we done to answer that question, Daniel?
326 00:16:49,880 --> 00:16:52,320 Speaker 1: It's a really interesting question, and obviously we all want 327 00:16:52,360 --> 00:16:55,200 Speaker 1: to know the answer: are we alone? But it's sort 328 00:16:55,200 --> 00:16:58,120 Speaker 1: of puzzling that we don't know the answer already. It 329 00:16:58,160 --> 00:17:01,160 Speaker 1: was famously said by Fermi, who was wondering, like, where 330 00:17:01,280 --> 00:17:04,800 Speaker 1: is everybody? Because the galaxy is pretty big, I mean, 331 00:17:04,880 --> 00:17:08,640 Speaker 1: stars are far away, but it's also really really old 332 00:17:08,840 --> 00:17:12,000 Speaker 1: and filled with planets. And so if you think 333 00:17:12,040 --> 00:17:15,160 Speaker 1: that life is not, like, totally unlikely, not like one 334 00:17:15,200 --> 00:17:17,840 Speaker 1: in a trillion, then the number of planets means there 335 00:17:17,880 --> 00:17:20,240 Speaker 1: should be a lot of life out there, which is 336 00:17:20,280 --> 00:17:23,639 Speaker 1: a lot of opportunity for intelligent aliens. And because the 337 00:17:23,720 --> 00:17:27,720 Speaker 1: galaxy is really old compared to how big it is, 338 00:17:27,880 --> 00:17:31,240 Speaker 1: it's totally possible for an intelligent civilization to have sent 339 00:17:31,359 --> 00:17:34,640 Speaker 1: us messages or to have explored the universe. So it's 340 00:17:34,680 --> 00:17:37,960 Speaker 1: sort of puzzling, like, why haven't we been contacted yet? 341 00:17:37,960 --> 00:17:41,560 Speaker 1: Why haven't aliens found us or sent us messages yet? Right, 342 00:17:41,640 --> 00:17:44,960 Speaker 1: that's the famous Fermi paradox, right? Like, we look around, 343 00:17:45,040 --> 00:17:48,360 Speaker 1: there's so many stars, so many potentially habitable planets. There 344 00:17:48,440 --> 00:17:50,880 Speaker 1: must be others; we can't be, like, the only ones.
345 00:17:50,960 --> 00:17:53,320 Speaker 1: That would be sort of a crazy coincidence. And so 346 00:17:53,320 --> 00:17:55,880 Speaker 1: you've got to wonder, why haven't we made contact? 347 00:17:56,080 --> 00:17:57,880 Speaker 1: Though you sort of also think that, you know, being 348 00:17:57,880 --> 00:18:00,920 Speaker 1: an old universe also makes it harder to contact 349 00:18:00,960 --> 00:18:03,520 Speaker 1: people, right? Because maybe there 350 00:18:03,680 --> 00:18:06,200 Speaker 1: was an alien species nearby, but they lived billions of 351 00:18:06,280 --> 00:18:08,800 Speaker 1: years ago, and so maybe they did send some messages, but 352 00:18:09,359 --> 00:18:12,120 Speaker 1: we weren't around or listening to hear them. Yeah, 353 00:18:12,160 --> 00:18:15,120 Speaker 1: there's lots of potential solutions to this paradox. And that's 354 00:18:15,119 --> 00:18:16,920 Speaker 1: what I love about the paradox. It's not really, like, 355 00:18:17,000 --> 00:18:21,080 Speaker 1: a contradiction necessarily; it makes you question your assumptions, and you say, 356 00:18:21,200 --> 00:18:23,119 Speaker 1: if all these things I think are true, then we 357 00:18:23,119 --> 00:18:25,680 Speaker 1: should have been contacted. So let's go back and examine 358 00:18:25,680 --> 00:18:27,680 Speaker 1: those assumptions.
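The back-of-the-envelope reasoning in this part of the conversation (lots of planets, life not too unlikely, so there should be lots of civilizations) can be made concrete with a toy Drake-style estimate. Every number below is an illustrative assumption for the sketch, not a value quoted in the episode:

```python
# Toy Drake-style estimate of communicating civilizations in the galaxy.
# All factors are illustrative assumptions, not measured values.

n_stars = 2e11            # rough star count in the Milky Way (assumed)
f_with_planets = 0.5      # fraction of stars with planets (assumed)
f_habitable = 0.1         # fraction of those with a habitable planet (assumed)
f_life = 1e-3             # chance life starts on a habitable planet (assumed)
f_intelligent = 1e-3      # chance that life becomes intelligent (assumed)
f_communicating = 0.1     # chance intelligence builds radio tech (assumed)
overlap = 1e-5            # fraction of galactic history a civilization
                          # overlaps with us (the "great filter" term)

n_civilizations = (n_stars * f_with_planets * f_habitable * f_life
                   * f_intelligent * f_communicating * overlap)
print(f"expected communicating civilizations: {n_civilizations:.4f}")
```

With these particular guesses the expected count comes out well below one, which is one way the "paradox" dissolves; nudging `f_life` up by a factor of a hundred pushes it to about one. The point of the sketch is only how wildly the conclusion swings with each assumed factor.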
And one of the important ones is the one 359 00:18:27,720 --> 00:18:29,600 Speaker 1: you just mentioned, which has to do with how 360 00:18:29,680 --> 00:18:34,119 Speaker 1: long civilizations survive. Like, if intelligent civilizations only last for 361 00:18:34,160 --> 00:18:36,920 Speaker 1: a few thousand years before they blow themselves up or, 362 00:18:36,960 --> 00:18:39,720 Speaker 1: you know, ruin their environments or something, then it'd be 363 00:18:39,800 --> 00:18:42,919 Speaker 1: very difficult to have them line up in time so 364 00:18:42,960 --> 00:18:45,240 Speaker 1: that they could discover each other and maybe even communicate 365 00:18:45,320 --> 00:18:47,800 Speaker 1: with each other. This is sometimes called the great filter: 366 00:18:48,320 --> 00:18:52,040 Speaker 1: that maybe most civilizations are extinguished before they can become 367 00:18:52,119 --> 00:18:56,600 Speaker 1: long-lasting civilizations. Yeah, like it's inevitable, sort of. Yeah. 368 00:18:56,640 --> 00:18:58,439 Speaker 1: And then the question is, like, well, did we already 369 00:18:58,440 --> 00:19:01,000 Speaker 1: survive the great filter? Are we through it, or is 370 00:19:01,040 --> 00:19:03,399 Speaker 1: it ahead of us? You know, are we about to 371 00:19:03,440 --> 00:19:08,679 Speaker 1: extinguish ourselves? All signs point to yes. It depends on 372 00:19:08,720 --> 00:19:11,080 Speaker 1: whether those planets have a Daniel Whiteson whose crazy 373 00:19:11,080 --> 00:19:14,359 Speaker 1: curiosity leads them down the wrong path. I see. So 374 00:19:14,440 --> 00:19:17,000 Speaker 1: you're the filter. Is that what you're saying? The thing 375 00:19:17,080 --> 00:19:19,600 Speaker 1: that filters us from longevity. All I'm saying is 376 00:19:19,600 --> 00:19:21,680 Speaker 1: that we have one of me and we've survived so far. 377 00:19:21,880 --> 00:19:26,880 Speaker 1: You know, that's what the data says.
Yeah, but Daniel, you're 378 00:19:26,920 --> 00:19:29,679 Speaker 1: barely getting started. Look, I have destroyed the Earth and 379 00:19:29,720 --> 00:19:33,320 Speaker 1: extinguished humanity exactly zero times in my life. So that's 380 00:19:33,320 --> 00:19:35,040 Speaker 1: all I can say. In fact, the trend is going 381 00:19:35,320 --> 00:19:38,800 Speaker 1: the wrong way. Like, recently you've got a microphone and 382 00:19:39,800 --> 00:19:44,600 Speaker 1: you're reaching more people, and so the filter is 383 00:19:44,720 --> 00:19:47,280 Speaker 1: getting more clogged up. I think that's true. But there's lots 384 00:19:47,280 --> 00:19:49,520 Speaker 1: of ways to attack this problem. You know, one is 385 00:19:49,560 --> 00:19:52,440 Speaker 1: to say, well, maybe life just is much more rare 386 00:19:52,600 --> 00:19:55,760 Speaker 1: than we ever imagined. Maybe we really are alone out 387 00:19:55,760 --> 00:19:58,000 Speaker 1: there in the universe, and that's certainly possible. You know, 388 00:19:58,040 --> 00:20:00,880 Speaker 1: we don't know how many times life started. We've only 389 00:20:00,880 --> 00:20:03,239 Speaker 1: ever seen this one example. Though, if you look at 390 00:20:03,240 --> 00:20:04,960 Speaker 1: the history of life on Earth, we see that it 391 00:20:05,080 --> 00:20:09,119 Speaker 1: started pretty quickly after the conditions were favorable, and we 392 00:20:09,160 --> 00:20:11,919 Speaker 1: think those conditions are not that rare. So it'd be 393 00:20:11,960 --> 00:20:15,240 Speaker 1: pretty strange if life itself, at least, you know, tiny 394 00:20:15,280 --> 00:20:18,680 Speaker 1: microbial life, hadn't started somewhere else. What do you mean it 395 00:20:18,760 --> 00:20:21,959 Speaker 1: started early? Like, as soon as the Earth formed, we 396 00:20:22,000 --> 00:20:25,400 Speaker 1: had sort of self-assembling molecules in our primordial soup?
397 00:20:25,600 --> 00:20:29,320 Speaker 1: After we had reasonable conditions and liquid water, it only 398 00:20:29,320 --> 00:20:32,439 Speaker 1: took a few hundred million years at most before we 399 00:20:32,520 --> 00:20:35,440 Speaker 1: had very simple forms of life. You know, on the 400 00:20:35,480 --> 00:20:38,080 Speaker 1: time scale of the Earth, billions of years, that's not 401 00:20:38,160 --> 00:20:40,320 Speaker 1: that long. And it might have started sooner. That's just, 402 00:20:40,359 --> 00:20:42,959 Speaker 1: like, the oldest evidence that we have for life. It's 403 00:20:43,080 --> 00:20:46,480 Speaker 1: very difficult to find very, very ancient signs of life 404 00:20:46,480 --> 00:20:49,440 Speaker 1: in old rocks, so it could have started even sooner. Wow. 405 00:20:49,480 --> 00:20:51,240 Speaker 1: So I guess the fact that we got it early 406 00:20:51,440 --> 00:20:53,680 Speaker 1: means it must have been easy, right? I guess is 407 00:20:53,720 --> 00:20:56,359 Speaker 1: that the reasoning? Like, oh, we got it pretty quick, 408 00:20:56,400 --> 00:20:58,960 Speaker 1: it must not be that hard. That's the reasoning, although 409 00:20:59,000 --> 00:21:00,960 Speaker 1: we only have one example. It could be that we 410 00:21:01,080 --> 00:21:04,159 Speaker 1: got super duper crazy lucky, right? We just don't know. 411 00:21:04,280 --> 00:21:06,480 Speaker 1: We need to find life somewhere else to get a 412 00:21:06,520 --> 00:21:09,920 Speaker 1: clearer view of that situation. But you know, intelligence arrived 413 00:21:09,920 --> 00:21:13,200 Speaker 1: fairly late on Earth, and so that makes the opposite suggestion: 414 00:21:13,320 --> 00:21:16,760 Speaker 1: maybe life is fairly common in the universe, but mostly 415 00:21:16,800 --> 00:21:20,879 Speaker 1: it's not very smart, and intelligence is very rare, or 416 00:21:20,920 --> 00:21:23,120 Speaker 1: maybe the Earth is unusual. Right.
We just can't draw 417 00:21:23,160 --> 00:21:25,960 Speaker 1: a lot of conclusions from one example, right? Right. That's 418 00:21:25,960 --> 00:21:28,439 Speaker 1: why I'm always late making things and turning things in, 419 00:21:28,560 --> 00:21:30,520 Speaker 1: because if you do it too early, people are going 420 00:21:30,560 --> 00:21:33,320 Speaker 1: to think it's too easy, right? Jorge must be really 421 00:21:33,359 --> 00:21:37,520 Speaker 1: intelligent because he's always very late. That's exactly it, yes, yes. 422 00:21:38,200 --> 00:21:41,399 Speaker 1: Or what he's doing must be really hard; it's taking 423 00:21:41,440 --> 00:21:45,480 Speaker 1: him so long. Another argument for procrastination. And there are also, 424 00:21:45,520 --> 00:21:47,879 Speaker 1: of course, questions about the nature of intelligence. You know, 425 00:21:48,240 --> 00:21:51,119 Speaker 1: would we even recognize intelligent life if it sent us 426 00:21:51,119 --> 00:21:54,040 Speaker 1: a message or if it flew by? Are we trapped 427 00:21:54,080 --> 00:21:56,920 Speaker 1: in a box of thinking about intelligent life as sort 428 00:21:56,960 --> 00:22:00,680 Speaker 1: of variations on the human experience and the human example, 429 00:22:01,040 --> 00:22:04,240 Speaker 1: unable to even imagine the crazy forms that life or 430 00:22:04,240 --> 00:22:07,320 Speaker 1: intelligence might take out there in the universe? Yeah, you mean, 431 00:22:07,359 --> 00:22:09,960 Speaker 1: like, they could have been sending us messages all this time. 432 00:22:09,960 --> 00:22:12,960 Speaker 1: There could be messages from aliens washing over us right now, 433 00:22:13,000 --> 00:22:16,320 Speaker 1: but maybe in a totally different way than what we 434 00:22:16,320 --> 00:22:20,040 Speaker 1: were expecting.
Right. Like, we're listening in the way we send signals, 435 00:22:20,119 --> 00:22:22,600 Speaker 1: but maybe they're sending signals in a totally different 436 00:22:22,640 --> 00:22:25,280 Speaker 1: way. Exactly. And we're sending signals in ways that seem 437 00:22:25,359 --> 00:22:28,159 Speaker 1: obvious to us, like, anybody would do it this way, right? 438 00:22:28,320 --> 00:22:30,240 Speaker 1: But that's exactly what we want to learn when we 439 00:22:30,320 --> 00:22:32,800 Speaker 1: talk to aliens. We want to learn how they might 440 00:22:32,840 --> 00:22:35,240 Speaker 1: think differently. So when we start by assuming that they're 441 00:22:35,240 --> 00:22:37,760 Speaker 1: doing things the same way, we trap ourselves into only 442 00:22:37,800 --> 00:22:40,359 Speaker 1: being able to find aliens that are basically like us, 443 00:22:40,480 --> 00:22:44,720 Speaker 1: you know, with maybe wrinkly foreheads or pointy ears. Right, 444 00:22:45,280 --> 00:22:47,600 Speaker 1: but to be honest, that seems like a very human 445 00:22:47,640 --> 00:22:51,200 Speaker 1: thing to do. Like, let's only talk to people who 446 00:22:51,240 --> 00:22:55,000 Speaker 1: are exactly like us. I try to only talk to 447 00:22:55,040 --> 00:22:58,280 Speaker 1: physicists, except twice a week when I talk to you. Good, 448 00:22:58,359 --> 00:23:02,360 Speaker 1: I'm your alien. You're taking a risk here, a big 449 00:23:02,480 --> 00:23:05,359 Speaker 1: risk that I might destroy you, but you get so 450 00:23:05,480 --> 00:23:08,040 Speaker 1: much out of our conversations. Exactly, it's worth the risk, 451 00:23:08,040 --> 00:23:10,720 Speaker 1: I always say. That's right. Yeah, well, thank you, Daniel. 452 00:23:10,720 --> 00:23:13,000 Speaker 1: I appreciate that. Thank you for not frying me with 453 00:23:13,080 --> 00:23:18,680 Speaker 1: your death ray so far.
Yet. Well, there are other possibilities, 454 00:23:19,000 --> 00:23:21,440 Speaker 1: and it also raises the question, why should we be 455 00:23:21,600 --> 00:23:24,520 Speaker 1: looking for aliens out there? So these are interesting and 456 00:23:24,600 --> 00:23:27,120 Speaker 1: big questions that we are going to talk about. But first, 457 00:23:27,240 --> 00:23:41,920 Speaker 1: let's take a quick break. All right. We're talking about 458 00:23:42,000 --> 00:23:44,440 Speaker 1: whether it's a good idea or a bad idea to 459 00:23:44,680 --> 00:23:47,680 Speaker 1: look for aliens out there in the universe, and it's 460 00:23:47,720 --> 00:23:50,240 Speaker 1: a tricky balance, Daniel. I guess, maybe, what are some 461 00:23:50,359 --> 00:23:53,680 Speaker 1: of the arguments for looking for aliens? The arguments for 462 00:23:53,840 --> 00:23:56,080 Speaker 1: are easy to make. I mean, we just want to know. 463 00:23:56,320 --> 00:23:59,320 Speaker 1: It's one of the deepest questions in modern science: are 464 00:23:59,400 --> 00:24:02,040 Speaker 1: we alone in the universe? It tells us so much 465 00:24:02,119 --> 00:24:05,080 Speaker 1: about the nature of our existence. You know, I'm always 466 00:24:05,119 --> 00:24:07,440 Speaker 1: saying that the reason we do science is not just 467 00:24:07,600 --> 00:24:10,600 Speaker 1: to have spinoffs and technology and faster phones, but to 468 00:24:10,760 --> 00:24:13,920 Speaker 1: understand the very context of our lives. What is this 469 00:24:14,160 --> 00:24:16,600 Speaker 1: universe we're living in? How did it come to be? 470 00:24:16,920 --> 00:24:19,600 Speaker 1: Because it changes how we live our lives. And knowing 471 00:24:19,760 --> 00:24:22,280 Speaker 1: that we are alone in the universe, or knowing that 472 00:24:22,359 --> 00:24:26,120 Speaker 1: we are not, both of those totally change the context 473 00:24:26,200 --> 00:24:28,040 Speaker 1: of our lives.
And so I think it's just one 474 00:24:28,080 --> 00:24:31,160 Speaker 1: of the deepest questions, and we're desperate to know. Yeah, 475 00:24:31,400 --> 00:24:33,760 Speaker 1: I totally agree, although I also have to kind of 476 00:24:33,800 --> 00:24:36,440 Speaker 1: say that we sort of have that answer in a way, right? Like, 477 00:24:36,600 --> 00:24:38,320 Speaker 1: at this point, we know so much about the universe 478 00:24:38,400 --> 00:24:41,680 Speaker 1: that we think that the answer is probably, right? Like, 479 00:24:41,920 --> 00:24:44,800 Speaker 1: probably there's life out there, or most likely there's life 480 00:24:44,800 --> 00:24:46,960 Speaker 1: out there. Isn't that sort of good enough? Yeah, I'd 481 00:24:46,960 --> 00:24:51,000 Speaker 1: say the answer is probably, plus or minus, you know. But 482 00:24:51,119 --> 00:24:53,320 Speaker 1: we basically don't know anything. You know, we have 483 00:24:53,520 --> 00:24:56,879 Speaker 1: this one example. We can make estimates, but until you know, 484 00:24:57,400 --> 00:25:00,240 Speaker 1: you don't really know anything. That's why we do experiments, right? 485 00:25:00,359 --> 00:25:02,159 Speaker 1: We think we figured out how this works, and we 486 00:25:02,240 --> 00:25:04,960 Speaker 1: go out there, and then the universe surprises us. Every 487 00:25:05,040 --> 00:25:07,920 Speaker 1: time we open up new eyeballs or new ears into 488 00:25:07,960 --> 00:25:11,560 Speaker 1: the universe, we learn something surprising and shocking that changes 489 00:25:11,840 --> 00:25:13,720 Speaker 1: the very way we think about the universe. So it's 490 00:25:13,720 --> 00:25:16,399 Speaker 1: definitely worth looking, because I bet the answer is pretty 491 00:25:16,480 --> 00:25:19,840 Speaker 1: different from what we expect. Well, I guess, you know, 492 00:25:19,920 --> 00:25:23,240 Speaker 1: we expect it to be yes.
So you're saying maybe 493 00:25:23,280 --> 00:25:26,040 Speaker 1: the answer is no? The answer very much depends on 494 00:25:26,119 --> 00:25:28,920 Speaker 1: who you ask. Like, personally, I think it's very likely 495 00:25:29,000 --> 00:25:31,760 Speaker 1: that there's life all over the universe in lots of 496 00:25:31,800 --> 00:25:35,240 Speaker 1: ways we can't even imagine, and that intelligence takes such 497 00:25:35,280 --> 00:25:38,320 Speaker 1: varied forms that it would be very difficult for us 498 00:25:38,400 --> 00:25:41,159 Speaker 1: to recognize it, not to mention communicate with it or 499 00:25:41,240 --> 00:25:44,040 Speaker 1: understand a message. I think aliens are probably more alien 500 00:25:44,119 --> 00:25:46,879 Speaker 1: than we can even imagine. Interesting. And so you just 501 00:25:47,000 --> 00:25:49,280 Speaker 1: want to kind of find them just to 502 00:25:49,440 --> 00:25:52,080 Speaker 1: see how weird they can be? Yeah, absolutely. The same 503 00:25:52,119 --> 00:25:54,359 Speaker 1: reason I like to go traveling. Like, wow, look what 504 00:25:54,400 --> 00:25:56,240 Speaker 1: people put on their French fries in this country. I 505 00:25:56,359 --> 00:25:58,639 Speaker 1: never even thought of that. Oh my gosh. It just, 506 00:25:58,800 --> 00:26:01,320 Speaker 1: you know, expands the range of your brain. It makes 507 00:26:01,359 --> 00:26:04,080 Speaker 1: you think in new and different ways. Right. But would you 508 00:26:04,119 --> 00:26:07,320 Speaker 1: still eat it, though? Do you always try them? I 509 00:26:07,480 --> 00:26:12,960 Speaker 1: try every condiment once. I see, you just want to 510 00:26:13,000 --> 00:26:15,800 Speaker 1: sample the aliens, not take a picture of them. 511 00:26:17,400 --> 00:26:22,480 Speaker 1: Maybe these aliens are delicious also, you never know. Oh jeez. Well, 512 00:26:22,520 --> 00:26:24,520 Speaker 1: I guess. Um.
Now the question is, how are we 513 00:26:24,720 --> 00:26:27,560 Speaker 1: looking for these aliens? Are we actively listening? I know 514 00:26:27,680 --> 00:26:30,119 Speaker 1: there's a SETI program out there. Um, what else is 515 00:26:30,160 --> 00:26:33,760 Speaker 1: out there? And SETI means Search for Extraterrestrial Intelligence. 516 00:26:34,080 --> 00:26:36,760 Speaker 1: They're the biggest deal out there in terms of trying 517 00:26:36,840 --> 00:26:40,720 Speaker 1: to identify messages from space, and so this is very 518 00:26:40,760 --> 00:26:43,840 Speaker 1: different from, like, questions about whether alien craft are in 519 00:26:43,920 --> 00:26:47,159 Speaker 1: the skies and these alien UFO videos. SETI's like, 520 00:26:47,280 --> 00:26:50,760 Speaker 1: let's just listen for messages from across the universe, trying 521 00:26:50,800 --> 00:26:53,000 Speaker 1: to see if somebody out there is similar enough to 522 00:26:53,119 --> 00:26:56,720 Speaker 1: us in intelligence and technology that they're sending us messages. 523 00:26:56,720 --> 00:26:58,840 Speaker 1: At least we should be listening. They got about a 524 00:26:58,920 --> 00:27:02,440 Speaker 1: hundred million bucks from an eccentric billionaire about five years ago, 525 00:27:02,760 --> 00:27:05,080 Speaker 1: and they've been using it to buy telescope time, like at 526 00:27:05,359 --> 00:27:08,520 Speaker 1: the Green Bank Observatory and the Parkes Observatory, to try 527 00:27:08,560 --> 00:27:11,320 Speaker 1: to listen for these messages. But I guess they're listening 528 00:27:11,880 --> 00:27:16,000 Speaker 1: in the electromagnetic spectrum, right? Like, looking for radio signals, 529 00:27:16,080 --> 00:27:18,879 Speaker 1: just like we use radio signals. And so that's assuming 530 00:27:18,920 --> 00:27:22,520 Speaker 1: the aliens use radio signals and within a certain frequency 531 00:27:22,600 --> 00:27:25,520 Speaker 1: range. They are doing that.
They also have some visible 532 00:27:25,640 --> 00:27:29,040 Speaker 1: light observations, and they also look for weird stuff. 533 00:27:29,280 --> 00:27:31,720 Speaker 1: Jill Tarter was on the podcast a few weeks ago, 534 00:27:31,840 --> 00:27:33,840 Speaker 1: and she was telling us about how they're trying to 535 00:27:33,920 --> 00:27:36,800 Speaker 1: imagine other ways aliens might communicate, like, what if they're 536 00:27:36,880 --> 00:27:41,040 Speaker 1: changing the frequency of pulsars through some crazy engineering project? 537 00:27:41,160 --> 00:27:43,920 Speaker 1: And so not just, like, listening for messages in the 538 00:27:43,960 --> 00:27:46,320 Speaker 1: way that we might format them. They're out there trying 539 00:27:46,359 --> 00:27:48,879 Speaker 1: to think about other ways aliens might be affecting the 540 00:27:49,000 --> 00:27:52,120 Speaker 1: cosmos that we could discover. Right. We talked once 541 00:27:52,200 --> 00:27:54,840 Speaker 1: on the podcast about, like, are aliens building a giant 542 00:27:55,359 --> 00:27:58,399 Speaker 1: Dyson sphere, or are they, you know, moving galaxies around 543 00:27:58,520 --> 00:28:00,880 Speaker 1: or something. And so they're actually being very thoughtful about 544 00:28:00,920 --> 00:28:04,280 Speaker 1: this and involving philosophers and cultural historians and trying to 545 00:28:04,320 --> 00:28:06,880 Speaker 1: think really carefully about the assumptions they are making when 546 00:28:06,880 --> 00:28:09,400 Speaker 1: they're searching for these signals. But in the end, it's 547 00:28:09,480 --> 00:28:13,119 Speaker 1: limited to our imagination, right? It's possible that there are 548 00:28:13,200 --> 00:28:15,399 Speaker 1: signals already out there in our data that we just 549 00:28:15,680 --> 00:28:19,040 Speaker 1: haven't interpreted in the right way. Interesting. So listen to 550 00:28:19,119 --> 00:28:21,240 Speaker 1: that episode if you have a chance.
But there are 551 00:28:21,280 --> 00:28:23,920 Speaker 1: other things we're doing, not just listening for messages or 552 00:28:24,240 --> 00:28:27,280 Speaker 1: strange things in the universe. We're also sending stuff out there. 553 00:28:27,640 --> 00:28:31,119 Speaker 1: That's right. There's another program called METI, M-E-T-I, 554 00:28:31,280 --> 00:28:35,040 Speaker 1: which is Messaging Extraterrestrial Intelligence. And this is a community 555 00:28:35,080 --> 00:28:37,280 Speaker 1: of people who think that we should not just be listening 556 00:28:37,320 --> 00:28:40,760 Speaker 1: for these messages; we should be broadcasting our location into 557 00:28:40,880 --> 00:28:44,400 Speaker 1: space and letting other civilizations know that we are here. 558 00:28:44,640 --> 00:28:46,880 Speaker 1: Interesting, like a group of people who think we should 559 00:28:46,880 --> 00:28:50,520 Speaker 1: be more proactive about contacting aliens. Yeah, absolutely. And 560 00:28:50,600 --> 00:28:52,880 Speaker 1: this actually has a long history. It's not just a 561 00:28:53,000 --> 00:28:55,680 Speaker 1: recent effort. There's some really fun stories. There's an Austrian 562 00:28:55,760 --> 00:28:59,920 Speaker 1: astronomer who wanted to dig massive trenches in the Sahara desert, 563 00:29:00,360 --> 00:29:03,080 Speaker 1: fill them with water, top them with kerosene, and then 564 00:29:03,160 --> 00:29:06,520 Speaker 1: set them on fire to communicate with people who might 565 00:29:06,600 --> 00:29:10,160 Speaker 1: be living on Mars. What, was this recent or a 566 00:29:10,280 --> 00:29:11,920 Speaker 1: long time ago? This was a long time ago.
This 567 00:29:12,040 --> 00:29:13,720 Speaker 1: is, like, more than a hundred years ago, back when 568 00:29:13,960 --> 00:29:17,640 Speaker 1: we didn't know whether there were civilizations on Mars, and people thought, well, 569 00:29:17,800 --> 00:29:19,840 Speaker 1: here's a way we could send a message, because we 570 00:29:19,960 --> 00:29:22,560 Speaker 1: saw what looked like canals on Mars, and so for 571 00:29:22,600 --> 00:29:25,400 Speaker 1: a while people thought maybe there was a civilization on Mars. 572 00:29:25,520 --> 00:29:28,080 Speaker 1: So it's like writing "help" in the sand on your 573 00:29:28,160 --> 00:29:30,000 Speaker 1: desert island, you know. Yeah, it seems a little 574 00:29:30,040 --> 00:29:32,520 Speaker 1: desperate, if you ask me. I think you might as well 575 00:29:32,680 --> 00:29:35,160 Speaker 1: stand outside their driveway holding a boombox or something. 576 00:29:36,880 --> 00:29:39,000 Speaker 1: And there was a French guy who asked for money 577 00:29:39,040 --> 00:29:40,960 Speaker 1: from the French government because he wanted to build a 578 00:29:41,040 --> 00:29:45,240 Speaker 1: giant mirror which would focus sunlight and write messages onto 579 00:29:45,440 --> 00:29:52,040 Speaker 1: the surface of Mars. Like, wouldn't that piss off the people you might 580 00:29:52,120 --> 00:29:55,440 Speaker 1: send the message to? Don't you think so? Sorry about frying 581 00:29:55,480 --> 00:29:58,400 Speaker 1: your elementary school filled with alien children. We just really 582 00:29:58,440 --> 00:30:02,560 Speaker 1: wanted to get in touch. Boy.
Well, it also doesn't 583 00:30:02,560 --> 00:30:05,080 Speaker 1: seem to make a lot of sense, because if there were 584 00:30:05,600 --> 00:30:08,440 Speaker 1: Martians looking at us, they would already see us, right? 585 00:30:08,480 --> 00:30:10,560 Speaker 1: They would look at our, you know, lights at night 586 00:30:10,640 --> 00:30:12,720 Speaker 1: and stuff, right? Because you can see those from space. Yeah, 587 00:30:12,760 --> 00:30:14,680 Speaker 1: you could. But this was a long time ago, before, 588 00:30:14,800 --> 00:30:17,479 Speaker 1: I think, the Earth was as electrified, when some people were 589 00:30:17,520 --> 00:30:20,080 Speaker 1: thinking about this kind of stuff. But more recently, you know, 590 00:30:20,200 --> 00:30:23,520 Speaker 1: people like Carl Sagan have thought about how somebody might 591 00:30:23,600 --> 00:30:26,400 Speaker 1: respond to seeing one of our space probes. Like, 592 00:30:26,520 --> 00:30:29,680 Speaker 1: on Voyager, we sent this golden record that has 593 00:30:29,800 --> 00:30:33,640 Speaker 1: transmissions from Earth, you know, whale songs and people singing, 594 00:30:33,920 --> 00:30:36,240 Speaker 1: and information on there about, like, how to find 595 00:30:36,280 --> 00:30:38,280 Speaker 1: the planet, done in a way that's supposed to be 596 00:30:38,320 --> 00:30:41,000 Speaker 1: sort of self-explanatory, that anybody with a mathematical and 597 00:30:41,080 --> 00:30:44,600 Speaker 1: astronomical understanding might be able to decode. Right, yeah, and 598 00:30:44,680 --> 00:30:48,480 Speaker 1: you can find images of the plaque online, right? Yeah, 599 00:30:48,560 --> 00:30:52,080 Speaker 1: that's right. And more aggressively, we actually sent a dedicated 600 00:30:52,160 --> 00:30:55,360 Speaker 1: message to space from the Arecibo Observatory, this huge 601 00:30:55,520 --> 00:30:58,120 Speaker 1: radio dish that used to be operating in Puerto Rico.
602 00:30:58,360 --> 00:31:01,360 Speaker 1: We sent a message out into space specifically for 603 00:31:01,560 --> 00:31:04,080 Speaker 1: aliens, designed in a way we thought maybe they would 604 00:31:04,120 --> 00:31:07,800 Speaker 1: be able to decode. It's sort of a pictogram. Interesting. Now, 605 00:31:08,320 --> 00:31:10,400 Speaker 1: is this group of people, METI, is this, like, an 606 00:31:10,440 --> 00:31:12,880 Speaker 1: actual institution, or is it just sort of, like, what 607 00:31:13,000 --> 00:31:15,760 Speaker 1: you call people who want to send messages out? It's 608 00:31:15,760 --> 00:31:18,400 Speaker 1: an actual institution. There's a longer history here of this 609 00:31:18,480 --> 00:31:21,040 Speaker 1: sort of movement of people sending messages out, but there 610 00:31:21,200 --> 00:31:24,000 Speaker 1: is a group, actually, based in San Francisco, called METI, 611 00:31:24,480 --> 00:31:27,400 Speaker 1: and they are sending messages. Like, in two thousand seventeen, 612 00:31:27,480 --> 00:31:30,920 Speaker 1: they sent a message consisting of a scientific and mathematical 613 00:31:31,000 --> 00:31:34,520 Speaker 1: tutorial to one particular star that's located twelve light years 614 00:31:34,560 --> 00:31:37,560 Speaker 1: from Earth, you know, hoping that the aliens would get 615 00:31:37,600 --> 00:31:39,760 Speaker 1: it and learn about our math and science. Is there 616 00:31:39,800 --> 00:31:42,600 Speaker 1: a shorter version of this group called IMETI? 617 00:31:43,960 --> 00:31:48,720 Speaker 1: Instant Messaging Extraterrestrial Intelligence. Somebody wants to text 618 00:31:48,840 --> 00:31:51,400 Speaker 1: other planets. Yeah, it seems like it might be a 619 00:31:51,440 --> 00:31:54,920 Speaker 1: little more efficient and, you know, casual. Gosh, maybe that's 620 00:31:54,920 --> 00:31:57,840 Speaker 1: the problem.
You know, their inboxes are so overflowing with 621 00:31:57,960 --> 00:32:01,160 Speaker 1: interstellar spam that they're not reading ours. That's right, we're 622 00:32:01,160 --> 00:32:04,320 Speaker 1: in their spam folder. Yeah, that's the nightmare, isn't it? 623 00:32:04,920 --> 00:32:07,840 Speaker 1: The nightmare. Like, hey, we're here, we're desperate to connect 624 00:32:07,880 --> 00:32:13,560 Speaker 1: with other people. Spam, delete. Exactly. They get so many offers. 625 00:32:13,880 --> 00:32:17,120 Speaker 1: Maybe that's the great filter, I think. Really, it's the 626 00:32:17,160 --> 00:32:21,960 Speaker 1: great spam filter of the Google of the universe. You see, 627 00:32:22,000 --> 00:32:26,480 Speaker 1: spam really does have costs. Well, despite this idea of 628 00:32:26,680 --> 00:32:29,600 Speaker 1: sending messages to aliens, some people think it's a bad 629 00:32:29,720 --> 00:32:34,080 Speaker 1: idea, and maybe the reason it's a bad 630 00:32:34,080 --> 00:32:36,720 Speaker 1: idea could be an explanation for why we haven't 631 00:32:36,760 --> 00:32:39,800 Speaker 1: been contacted by other aliens. Like, maybe it's a bad idea; that's 632 00:32:39,800 --> 00:32:43,320 Speaker 1: why nobody else in the universe is sending signals out. Yeah, 633 00:32:43,360 --> 00:32:47,160 Speaker 1: that's right. Such famous physicists as Michio Kaku have said trying 634 00:32:47,200 --> 00:32:51,479 Speaker 1: to contact aliens is, quote, the biggest mistake in human history. 635 00:32:52,040 --> 00:32:53,840 Speaker 1: So there are definitely some people who think this is 636 00:32:54,040 --> 00:32:56,560 Speaker 1: not a good plan. Yeah.
I think the picture they're 637 00:32:56,560 --> 00:32:58,280 Speaker 1: trying to paint is, like, maybe the universe is kind 638 00:32:58,320 --> 00:33:01,240 Speaker 1: of like a dark forest, and there'd be predators out there, 639 00:33:01,960 --> 00:33:04,440 Speaker 1: and so we're sort of like a vulnerable species. We 640 00:33:04,600 --> 00:33:07,080 Speaker 1: maybe don't want to advertise our presence. Exactly, in the 641 00:33:07,160 --> 00:33:09,880 Speaker 1: same way that if you're in a horror movie and 642 00:33:09,960 --> 00:33:12,760 Speaker 1: you're walking through dark woods, you don't start singing a 643 00:33:12,840 --> 00:33:16,880 Speaker 1: loud song, right? You tiptoe very carefully because you don't 644 00:33:16,880 --> 00:33:19,640 Speaker 1: want to announce your presence, and that's why everybody else 645 00:33:19,760 --> 00:33:21,880 Speaker 1: is being quiet. So the lesson is, like, if the 646 00:33:21,960 --> 00:33:27,160 Speaker 1: universe seems quiet, maybe there's a reason. That's right. And 647 00:33:27,240 --> 00:33:30,600 Speaker 1: if you're a cheerleader or a football player, you're probably 648 00:33:30,640 --> 00:33:34,520 Speaker 1: toast. Exactly. And we are alone, right? We're separated. 649 00:33:34,640 --> 00:33:36,880 Speaker 1: We decided to go out into this forest by ourselves, 650 00:33:37,360 --> 00:33:40,120 Speaker 1: and so maybe we should think twice before announcing our location. 651 00:33:40,280 --> 00:33:42,200 Speaker 1: So it's sort of an interesting scenario and a 652 00:33:42,400 --> 00:33:45,560 Speaker 1: potentially dangerous idea to contact other aliens. And so to 653 00:33:45,600 --> 00:33:48,440 Speaker 1: get more perspective, Daniel went out there to interview an 654 00:33:48,520 --> 00:33:50,720 Speaker 1: expert on this topic. That's right.
I found a really 655 00:33:50,800 --> 00:33:55,000 Speaker 1: interesting paper from Dr Karim Jebari, a professor of philosophy, 656 00:33:55,240 --> 00:33:57,960 Speaker 1: who wrote a paper about the dangers of communicating with 657 00:33:58,120 --> 00:34:00,200 Speaker 1: aliens. I reached out to him and he was kind 658 00:34:00,320 --> 00:34:02,640 Speaker 1: enough to spend a few minutes explaining the ideas to me. 659 00:34:02,800 --> 00:34:06,160 Speaker 1: So here is Daniel's interview with Dr Karim Jebari on 660 00:34:06,360 --> 00:34:08,960 Speaker 1: whether it's a good idea or a bad idea to 661 00:34:09,160 --> 00:34:16,840 Speaker 1: contact aliens. All right, so then it's my great pleasure 662 00:34:17,000 --> 00:34:21,000 Speaker 1: to introduce to the program Dr Karim Jebari, who has 663 00:34:21,080 --> 00:34:23,920 Speaker 1: a PhD in philosophy and is a researcher at the 664 00:34:24,080 --> 00:34:27,200 Speaker 1: Institute for Future Studies. Karim, thank you very much for 665 00:34:27,320 --> 00:34:29,840 Speaker 1: joining us today. Thanks for the invitation. So you wrote 666 00:34:29,880 --> 00:34:34,960 Speaker 1: this fascinating paper about the dangers of communicating with extraterrestrials. 667 00:34:35,200 --> 00:34:37,560 Speaker 1: Tell me a little bit how you got interested in 668 00:34:37,680 --> 00:34:40,400 Speaker 1: this topic. Is this something people discuss a lot at 669 00:34:40,440 --> 00:34:43,440 Speaker 1: the Institute for Future Studies? No, not at all. Actually, 670 00:34:43,719 --> 00:34:47,360 Speaker 1: it's more focused around the future of child poverty and 671 00:34:47,480 --> 00:34:50,040 Speaker 1: those kinds of more mundane issues. But I've always been 672 00:34:50,080 --> 00:34:53,160 Speaker 1: a huge fan of science fiction, and I made a 673 00:34:53,480 --> 00:34:57,320 Speaker 1: stab at trying to write a short science fiction story.
674 00:34:57,520 --> 00:34:59,480 Speaker 1: It turned out to be much more difficult than I 675 00:34:59,560 --> 00:35:02,440 Speaker 1: anticipated. But I did a lot of research 676 00:35:02,520 --> 00:35:05,040 Speaker 1: for that story, and that research led me to this idea. 677 00:35:05,120 --> 00:35:07,920 Speaker 1: And so how was it received by your colleagues? Do 678 00:35:08,040 --> 00:35:10,880 Speaker 1: they see this as like, wow, what a fascinating discussion 679 00:35:10,960 --> 00:35:12,920 Speaker 1: of this important topic, or do they think of you 680 00:35:13,000 --> 00:35:17,759 Speaker 1: as like, oh my gosh, it's Karim, he's in contact with aliens? No, no, 681 00:35:18,160 --> 00:35:22,680 Speaker 1: I think my colleagues really appreciate it. Although a 682 00:35:22,800 --> 00:35:24,839 Speaker 1: lot of the ideas that I presented in this paper are 683 00:35:24,920 --> 00:35:29,920 Speaker 1: not especially novel for other philosophers, so when I talk 684 00:35:30,000 --> 00:35:32,719 Speaker 1: to them, they go like, yeah, this is interesting, but yeah, 685 00:35:32,800 --> 00:35:36,400 Speaker 1: we know this already. But I thought that the added 686 00:35:36,440 --> 00:35:39,560 Speaker 1: value was trying to reach out then and try to 687 00:35:40,040 --> 00:35:43,640 Speaker 1: share these ideas with the non-philosophical community. Right, well, 688 00:35:43,800 --> 00:35:46,600 Speaker 1: as a non-philosopher, or an amateur philosopher of science, I 689 00:35:46,640 --> 00:35:49,000 Speaker 1: found it very interesting. So I want to begin by 690 00:35:49,200 --> 00:35:54,839 Speaker 1: speculating with you about what level of technology or civilization 691 00:35:55,440 --> 00:35:59,040 Speaker 1: we might be able to guess that extraterrestrial intelligences have. 692 00:35:59,440 --> 00:36:01,920 Speaker 1: I mean, if they arrive on Earth, obviously they have 693 00:36:02,280 --> 00:36:04,839 Speaker 1: some technology that we don't have.
But in the case 694 00:36:04,920 --> 00:36:09,279 Speaker 1: that we're communicating with distant civilizations around other stars, do 695 00:36:09,400 --> 00:36:11,480 Speaker 1: we know anything, or can we guess anything, about what 696 00:36:11,760 --> 00:36:15,640 Speaker 1: level of technology or civilization they might have? Sure, if 697 00:36:15,760 --> 00:36:18,560 Speaker 1: we got a signal from them, for example, by 698 00:36:18,840 --> 00:36:22,800 Speaker 1: radio waves, we could infer that they have some machine, 699 00:36:22,960 --> 00:36:25,759 Speaker 1: some apparatus, to produce that signal, because I think that 700 00:36:25,960 --> 00:36:30,319 Speaker 1: most people in the field assume that biological systems could 701 00:36:30,400 --> 00:36:35,279 Speaker 1: not produce a signal that is strong enough and 702 00:36:35,360 --> 00:36:39,239 Speaker 1: accurate enough to convey across many light years. So 703 00:36:39,560 --> 00:36:42,560 Speaker 1: I think we could conclude that, and 704 00:36:42,680 --> 00:36:46,200 Speaker 1: if the signal allows us to know where the planet is, 705 00:36:46,360 --> 00:36:48,600 Speaker 1: we can pinpoint it. Then we could direct our 706 00:36:48,640 --> 00:36:51,640 Speaker 1: telescopes there, and maybe that would help us to 707 00:36:51,719 --> 00:36:55,000 Speaker 1: find out other things about these creatures. Maybe if 708 00:36:55,120 --> 00:36:58,759 Speaker 1: their star is a yellow star like ours, then 709 00:36:59,200 --> 00:37:02,000 Speaker 1: that makes it more likely, for example, that they 710 00:37:02,040 --> 00:37:04,960 Speaker 1: would have a visual system that is similar to ours, 711 00:37:05,040 --> 00:37:07,759 Speaker 1: because eyes are really good to have, and we see 712 00:37:07,800 --> 00:37:11,320 Speaker 1: that a lot of different animals on Earth have developed 713 00:37:11,400 --> 00:37:14,440 Speaker 1: eyes independently of each other.
So if they have a 714 00:37:14,480 --> 00:37:17,680 Speaker 1: Sun-like star, then the odds are pretty good that 715 00:37:18,040 --> 00:37:21,280 Speaker 1: they would have eyes. Other than that, it's kind of difficult. 716 00:37:21,520 --> 00:37:24,520 Speaker 1: Can we get anything based on the fact that many 717 00:37:24,640 --> 00:37:28,080 Speaker 1: stars out there are older than our Sun? So it 718 00:37:28,239 --> 00:37:30,840 Speaker 1: might be that other civilizations have had, you know, a 719 00:37:30,960 --> 00:37:35,799 Speaker 1: billion-year head start in terms of evolution and developing technology. Yes, well, 720 00:37:35,840 --> 00:37:40,120 Speaker 1: I mean, it's certainly possible that they could be really advanced. 721 00:37:40,520 --> 00:37:43,440 Speaker 1: On the other hand, if they're so advanced, how come 722 00:37:44,360 --> 00:37:47,600 Speaker 1: they have not changed more stuff? So one of the 723 00:37:47,719 --> 00:37:51,680 Speaker 1: ideas out there is that if a civilization is 724 00:37:51,800 --> 00:37:54,800 Speaker 1: very advanced, they will have great energy needs. And 725 00:37:54,880 --> 00:37:59,080 Speaker 1: if that civilization has all this need for energy, they 726 00:37:59,120 --> 00:38:02,719 Speaker 1: would construct a Dyson swarm, or they would cover their 727 00:38:02,800 --> 00:38:06,440 Speaker 1: sun, their star, in solar cells or something like that, 728 00:38:06,719 --> 00:38:09,799 Speaker 1: and that would be visible to us, because we would 729 00:38:09,840 --> 00:38:13,800 Speaker 1: see stars disappearing. That suggests that if we don't see it, 730 00:38:13,880 --> 00:38:16,440 Speaker 1: then we can assume that they're maybe not that advanced.
731 00:38:16,920 --> 00:38:19,640 Speaker 1: But of course it's also possible that they are very 732 00:38:19,680 --> 00:38:23,640 Speaker 1: advanced and just don't care about energy, or 733 00:38:23,760 --> 00:38:25,800 Speaker 1: they get their energy in some other way. Well, I 734 00:38:25,920 --> 00:38:30,040 Speaker 1: think the prospect of receiving a message from an alien 735 00:38:30,200 --> 00:38:32,719 Speaker 1: civilization and then trying to figure out what they're like, 736 00:38:33,040 --> 00:38:35,239 Speaker 1: and what that message means, and what they're trying to 737 00:38:35,239 --> 00:38:37,799 Speaker 1: communicate to us, and how we might communicate with them, 738 00:38:38,000 --> 00:38:40,800 Speaker 1: is certainly a very fascinating one. In your paper, you 739 00:38:40,960 --> 00:38:44,319 Speaker 1: make the argument that this problem might be harder than 740 00:38:44,440 --> 00:38:47,239 Speaker 1: people anticipate. In fact, you write in the paper, no 741 00:38:47,560 --> 00:38:50,759 Speaker 1: entity can translate any message that humans could send with 742 00:38:51,080 --> 00:38:55,480 Speaker 1: nothing but electromagnetic transmission. Essentially, you're saying that it's impossible 743 00:38:55,920 --> 00:38:59,080 Speaker 1: for us to send something which could be decoded by aliens, 744 00:38:59,160 --> 00:39:02,399 Speaker 1: which I guess means, conversely, that there's no way 745 00:39:02,440 --> 00:39:07,440 Speaker 1: we could understand any message aliens send with just electromagnetic transmissions. 746 00:39:07,760 --> 00:39:09,560 Speaker 1: Can you sketch out that argument for us? Why do you 747 00:39:09,640 --> 00:39:12,600 Speaker 1: believe that it's impossible for us to understand an alien message? 748 00:39:12,800 --> 00:39:15,480 Speaker 1: I can also add that this would be true even 749 00:39:15,600 --> 00:39:19,160 Speaker 1: if the aliens were very similar to us.
So, just 750 00:39:19,360 --> 00:39:23,360 Speaker 1: as a thought experiment, imagine that the aliens are actually 751 00:39:24,280 --> 00:39:28,480 Speaker 1: biologically identical to us, they just look like humans. According 752 00:39:28,560 --> 00:39:30,520 Speaker 1: to this argument, it would be impossible for us to 753 00:39:30,600 --> 00:39:33,920 Speaker 1: translate that message. But of course it's very unlikely 754 00:39:34,000 --> 00:39:35,839 Speaker 1: that they would be identical to us, so that would 755 00:39:35,880 --> 00:39:38,640 Speaker 1: make it even more difficult. So the argument basically goes 756 00:39:39,320 --> 00:39:42,080 Speaker 1: back to a theory that is fairly prominent in the philosophy of 757 00:39:42,200 --> 00:39:47,920 Speaker 1: language that was articulated by Willard Van Orman Quine, 758 00:39:48,960 --> 00:39:52,640 Speaker 1: and that builds on some very interesting insights that were 759 00:39:52,719 --> 00:39:56,279 Speaker 1: formulated by the philosopher Ludwig Wittgenstein. And the 760 00:39:56,360 --> 00:40:01,280 Speaker 1: idea is that utterances in a language do not contain 761 00:40:01,640 --> 00:40:06,080 Speaker 1: the meaning that they are trying to convey. Rather, meaning 762 00:40:06,320 --> 00:40:10,120 Speaker 1: is something that exists entirely in our heads and 763 00:40:10,239 --> 00:40:15,560 Speaker 1: that we derive from observing the behaviors of other people. 764 00:40:15,760 --> 00:40:20,120 Speaker 1: And to illustrate this, Quine has this thought experiment where 765 00:40:20,200 --> 00:40:24,600 Speaker 1: he imagines a person going into a jungle, and this 766 00:40:24,719 --> 00:40:27,560 Speaker 1: person is a linguist, so he tries to communicate with 767 00:40:27,680 --> 00:40:30,080 Speaker 1: the people who are living in the jungle. They don't 768 00:40:30,160 --> 00:40:33,880 Speaker 1: know his language and he doesn't know their language.
And 769 00:40:34,239 --> 00:40:37,839 Speaker 1: the way you start to communicate, or start to learn 770 00:40:37,880 --> 00:40:43,880 Speaker 1: another people's language, is by observing their behavior in their environment, 771 00:40:44,080 --> 00:40:47,839 Speaker 1: and with the help of different hypotheses of how these 772 00:40:47,920 --> 00:40:51,680 Speaker 1: people work and what they do, you can slowly but 773 00:40:51,800 --> 00:40:54,960 Speaker 1: surely construct a translation manual. So let's say that 774 00:40:55,680 --> 00:40:58,040 Speaker 1: you have this person in the jungle and he points 775 00:40:58,080 --> 00:41:00,399 Speaker 1: at a rabbit and he says, gavagai. And then 776 00:41:01,160 --> 00:41:04,120 Speaker 1: you make a hypothesis here that gavagai can mean rabbit, 777 00:41:04,800 --> 00:41:07,719 Speaker 1: but gavagai could also mean food, or maybe 778 00:41:07,800 --> 00:41:12,040 Speaker 1: gavagai could mean rabbitness, or being 779 00:41:12,160 --> 00:41:14,640 Speaker 1: very fast, because this rabbit happened to be very fast, 780 00:41:14,880 --> 00:41:19,880 Speaker 1: and all these hypotheses are consistent with your observation. So 781 00:41:20,000 --> 00:41:23,680 Speaker 1: what you need to do is to make more observations. 782 00:41:23,760 --> 00:41:27,799 Speaker 1: For example, if the people that you're trying to talk 783 00:41:27,880 --> 00:41:30,640 Speaker 1: to point at a potato and they say, 784 00:41:30,760 --> 00:41:34,920 Speaker 1: gavagai, then we can start discarding some of the 785 00:41:35,040 --> 00:41:38,239 Speaker 1: hypotheses, because the potato is not fast, right, but the 786 00:41:38,280 --> 00:41:41,000 Speaker 1: potato is food, or so you assume.
So then you 787 00:41:41,120 --> 00:41:42,960 Speaker 1: say that, yeah, it's more likely that gavagai 788 00:41:43,040 --> 00:41:46,560 Speaker 1: means food, because that's what is common between these 789 00:41:46,600 --> 00:41:49,440 Speaker 1: two things. But this is just a tentative hypothesis, right. 790 00:41:50,200 --> 00:41:54,160 Speaker 1: But at some point, after a long period of interaction, 791 00:41:54,400 --> 00:41:57,160 Speaker 1: you will be good enough in understanding them that you 792 00:41:57,200 --> 00:42:00,640 Speaker 1: will be able to ask them. So you will 793 00:42:00,680 --> 00:42:02,879 Speaker 1: be able to ask them, by rabbit, do you mean 794 00:42:02,960 --> 00:42:05,400 Speaker 1: this or that? But this is not something you can 795 00:42:05,440 --> 00:42:08,200 Speaker 1: do initially. Initially, the only thing you can do is 796 00:42:08,280 --> 00:42:12,640 Speaker 1: to observe behavior, and from your understanding about how humans work. 797 00:42:12,800 --> 00:42:16,080 Speaker 1: For example, you know that humans eat rabbits sometimes, 798 00:42:16,120 --> 00:42:18,160 Speaker 1: and they also eat potatoes, and that's why you can 799 00:42:18,520 --> 00:42:20,920 Speaker 1: kind of make this hypothesis. But with the help of 800 00:42:21,000 --> 00:42:23,360 Speaker 1: this knowledge, this background knowledge about how humans work, you 801 00:42:23,400 --> 00:42:26,879 Speaker 1: can start producing these hypotheses.
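The elimination process described above can be pictured as intersecting candidate-meaning sets across observations. A toy sketch, not from the episode; the function name and candidate sets are invented for illustration:

```python
# Quine-style hypothesis elimination, as a toy sketch: each
# observation pairs an utterance with the set of candidate meanings
# consistent with it, and intersecting the sets for repeated
# utterances narrows the surviving translation hypotheses.

def narrow_hypotheses(observations):
    """Intersect the candidate-meaning sets for each utterance."""
    surviving = {}
    for utterance, candidates in observations:
        if utterance in surviving:
            surviving[utterance] &= set(candidates)
        else:
            surviving[utterance] = set(candidates)
    return surviving

# "gavagai" uttered while pointing at a rabbit, then at a potato;
# the potato observation rules out "rabbit" and "fast thing".
observations = [
    ("gavagai", {"rabbit", "food", "fast thing"}),
    ("gavagai", {"potato", "food"}),
]
print(narrow_hypotheses(observations))  # {'gavagai': {'food'}}
```

The point of the sketch is also its limit: the candidate sets themselves come from background knowledge about how the speakers work, which is exactly what we would lack for an alien signal.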
The problem is that when 802 00:42:27,000 --> 00:42:29,719 Speaker 1: we're dealing with a string of digits that an 803 00:42:29,880 --> 00:42:33,440 Speaker 1: extraterrestrial might send to us, then, first of all, 804 00:42:33,520 --> 00:42:36,520 Speaker 1: we don't have any information, or very little information, about, 805 00:42:36,719 --> 00:42:38,560 Speaker 1: you know, if they like potatoes or not, or if 806 00:42:38,600 --> 00:42:40,480 Speaker 1: they call them French fries or if they call them 807 00:42:40,560 --> 00:42:44,480 Speaker 1: something else, or if potatoes are like some ritual object 808 00:42:44,600 --> 00:42:48,239 Speaker 1: in their everyday lives, or maybe potatoes do move fast 809 00:42:48,280 --> 00:42:49,880 Speaker 1: on their planet. But let me back you up and 810 00:42:49,920 --> 00:42:52,040 Speaker 1: ask you a question, just to make sure I understand 811 00:42:52,080 --> 00:42:54,920 Speaker 1: the point you're making. You're saying that even just pointing 812 00:42:55,000 --> 00:42:58,600 Speaker 1: to things and saying the words requires some sort of 813 00:42:58,760 --> 00:43:02,560 Speaker 1: common context. You have to have some common experience, some common culture, 814 00:43:02,600 --> 00:43:06,400 Speaker 1: some common inherent understanding to make those connections. That you 815 00:43:06,600 --> 00:43:10,120 Speaker 1: can't truly translate an alien language because you don't have 816 00:43:10,239 --> 00:43:13,239 Speaker 1: the context to understand it, the framework to fit 817 00:43:13,320 --> 00:43:16,600 Speaker 1: those things into. So what Quine actually says is that 818 00:43:16,760 --> 00:43:20,320 Speaker 1: you can never truly translate anything. So think about it.
819 00:43:20,640 --> 00:43:24,880 Speaker 1: We learn language by observing human behavior: 820 00:43:25,880 --> 00:43:31,359 Speaker 1: observing utterances and inferring what those utterances can mean by 821 00:43:31,440 --> 00:43:34,279 Speaker 1: then observing the behavior. For example, if you say, pass 822 00:43:34,360 --> 00:43:37,160 Speaker 1: me the salt, and then I give you the salt, 823 00:43:37,200 --> 00:43:40,120 Speaker 1: and you seem satisfied with that, then I infer that, 824 00:43:40,680 --> 00:43:42,879 Speaker 1: you know, I was in the ballpark of what you meant. 825 00:43:44,239 --> 00:43:47,480 Speaker 1: But we can never have a complete, total understanding of 826 00:43:47,600 --> 00:43:50,719 Speaker 1: what you meant by the different utterances that you make. 827 00:43:50,760 --> 00:43:53,160 Speaker 1: All right, but sometimes we can accomplish what we need 828 00:43:53,239 --> 00:43:55,279 Speaker 1: to, right. I don't really know if you have, like, 829 00:43:55,360 --> 00:43:58,719 Speaker 1: a deep, you know, philosophical understanding of salt, but I 830 00:43:58,800 --> 00:44:00,960 Speaker 1: know that you passed me the salt, just like, you know, 831 00:44:01,160 --> 00:44:03,440 Speaker 1: we are speaking English, which is not a perfect language 832 00:44:03,480 --> 00:44:06,080 Speaker 1: to convey all of our ideas as well. In that sense, 833 00:44:06,239 --> 00:44:08,440 Speaker 1: you know, you could say no language is perfect, but 834 00:44:08,560 --> 00:44:10,560 Speaker 1: that's not the sense that we're grasping for, right? We're 835 00:44:10,560 --> 00:44:12,960 Speaker 1: trying to develop some basic understandings. So is this 836 00:44:13,120 --> 00:44:15,880 Speaker 1: argument saying you can never have a perfect translation of 837 00:44:15,960 --> 00:44:18,799 Speaker 1: an alien language, or you can never have any translation?
838 00:44:19,200 --> 00:44:24,520 Speaker 1: So Quine's theory would allow for a good enough translation 839 00:44:24,880 --> 00:44:29,279 Speaker 1: of an alien language if we had those two preconditions: 840 00:44:29,400 --> 00:44:34,360 Speaker 1: if we had the interaction, and if that interaction was 841 00:44:34,520 --> 00:44:37,280 Speaker 1: in a context where we can observe the behavior. 842 00:44:37,440 --> 00:44:42,200 Speaker 1: So imagine you get these people coming here from Proxima 843 00:44:42,280 --> 00:44:47,239 Speaker 1: Centauri with their embassy, their alien embassy, and then we 844 00:44:47,320 --> 00:44:50,399 Speaker 1: can look at their behavior, and then we can see, like, oh, 845 00:44:50,600 --> 00:44:54,680 Speaker 1: they eat, they seem to be absorbing substances through that orifice. 846 00:44:55,640 --> 00:44:58,960 Speaker 1: Let us postulate that they're eating now, and they seem 847 00:44:58,960 --> 00:45:01,160 Speaker 1: to be eating this kind of, I don't know, it 848 00:45:01,360 --> 00:45:03,400 Speaker 1: looks like porridge, and then we can do a chemical 849 00:45:03,440 --> 00:45:06,320 Speaker 1: analysis of that and realize that, yeah, it's pretty similar 850 00:45:06,360 --> 00:45:10,520 Speaker 1: to porridge. And that would allow us to slowly 851 00:45:10,640 --> 00:45:15,000 Speaker 1: but surely construct a translation manual over many interactions. But 852 00:45:15,400 --> 00:45:18,440 Speaker 1: when we just get the string, we 853 00:45:18,640 --> 00:45:20,719 Speaker 1: can't make much of it. I see. So if we 854 00:45:20,800 --> 00:45:23,200 Speaker 1: could spend some time with the aliens and develop some 855 00:45:23,360 --> 00:45:27,440 Speaker 1: common context, then we could have the framework for developing 856 00:45:27,520 --> 00:45:30,759 Speaker 1: some sort of effective but imperfect translation.
But you're saying, 857 00:45:30,760 --> 00:45:33,640 Speaker 1: if all we get are their messages, even if they're actually 858 00:45:33,680 --> 00:45:36,279 Speaker 1: biologically human and think the same way we do, it 859 00:45:36,280 --> 00:45:39,200 Speaker 1: would be impossible for us to translate, because we don't 860 00:45:39,239 --> 00:45:41,320 Speaker 1: have a context for what their lives are like and 861 00:45:41,400 --> 00:45:43,920 Speaker 1: what they might mean. Exactly. That's really interesting. And you 862 00:45:44,000 --> 00:45:47,480 Speaker 1: make this comment in your paper about Chomsky's analysis of 863 00:45:47,600 --> 00:45:51,680 Speaker 1: language, his universal grammar, this claim that all human languages 864 00:45:51,719 --> 00:45:54,480 Speaker 1: are based on the same sort of mathematical structure. And 865 00:45:54,600 --> 00:45:57,360 Speaker 1: you write in the paper that Chomsky never claimed that 866 00:45:57,520 --> 00:46:00,799 Speaker 1: universal grammar was universal in the sense that it would 867 00:46:00,840 --> 00:46:04,600 Speaker 1: be shared across all possible species able to use language. 868 00:46:04,760 --> 00:46:07,520 Speaker 1: That is, you know, is universal grammar universal in the 869 00:46:07,560 --> 00:46:11,320 Speaker 1: sense that it might also apply to structures of alien languages, 870 00:46:11,440 --> 00:46:13,759 Speaker 1: which is a really cool concept. And Chomsky was here 871 00:46:13,760 --> 00:46:15,920 Speaker 1: on our podcast a couple of weeks ago.
I actually 872 00:46:15,920 --> 00:46:18,520 Speaker 1: asked him about this, whether he thought that it was 873 00:46:18,600 --> 00:46:22,239 Speaker 1: possible that alien languages might be constructed in the same 874 00:46:22,280 --> 00:46:25,719 Speaker 1: way as human languages, and he made some argument that 875 00:46:26,000 --> 00:46:29,000 Speaker 1: the development of language and symbolic thought might be due 876 00:46:29,080 --> 00:46:31,919 Speaker 1: to evolutionary pressure, and so it might be some sort 877 00:46:31,920 --> 00:46:35,840 Speaker 1: of, like, optimum solution to this problem, and therefore it 878 00:46:36,000 --> 00:46:38,520 Speaker 1: might be, or it could be likely, or you could 879 00:46:38,600 --> 00:46:42,320 Speaker 1: argue at least, that alien languages might have a similar 880 00:46:42,400 --> 00:46:45,160 Speaker 1: structure. If that were true, are you saying that that 881 00:46:45,400 --> 00:46:47,680 Speaker 1: still doesn't help us? Like, even if we have human 882 00:46:47,760 --> 00:46:51,440 Speaker 1: biological aliens with minds similar to ours that have a 883 00:46:51,560 --> 00:46:54,480 Speaker 1: universal grammar, in that case we still couldn't develop enough 884 00:46:54,520 --> 00:46:58,520 Speaker 1: common understanding just by sending signals to effectively communicate? I'm 885 00:46:58,560 --> 00:47:01,560 Speaker 1: not exactly sure what Chomsky means by that, but 886 00:47:02,040 --> 00:47:05,600 Speaker 1: it's certainly true that we can see convergent evolution across 887 00:47:05,680 --> 00:47:10,120 Speaker 1: many traits, right. But, for example, the wings of bats 888 00:47:10,239 --> 00:47:14,680 Speaker 1: are similar to the wings of birds, but they're not identical, right? 889 00:47:15,000 --> 00:47:17,880 Speaker 1: So they're similar in some respects and different in other respects.
890 00:47:17,920 --> 00:47:22,520 Speaker 1: And I think that a language that emerges, say, on 891 00:47:22,600 --> 00:47:26,560 Speaker 1: Proxima Centauri is probably going to emerge under, you know, 892 00:47:27,239 --> 00:47:31,760 Speaker 1: some evolutionary pressure. And I think it's sensible to assume 893 00:47:31,880 --> 00:47:35,440 Speaker 1: that it would be similar in some respects to human languages, 894 00:47:35,960 --> 00:47:39,439 Speaker 1: and different in other respects. The crucial point 895 00:47:39,520 --> 00:47:42,200 Speaker 1: here is: would it be similar in the sense 896 00:47:42,360 --> 00:47:47,400 Speaker 1: that would allow this innate language sense, or innate language 897 00:47:47,880 --> 00:47:52,759 Speaker 1: grammar, to allow us to pick out the right translation 898 00:47:53,239 --> 00:47:57,840 Speaker 1: from a whole range of potential translations? And I think that, 899 00:47:58,200 --> 00:48:00,319 Speaker 1: you know, it might be possible, but I think then 900 00:48:01,480 --> 00:48:04,239 Speaker 1: you would be committed to some fairly controversial views in 901 00:48:04,280 --> 00:48:08,080 Speaker 1: the philosophy of language. And I think that, I mean, 902 00:48:08,160 --> 00:48:11,600 Speaker 1: I have the utmost respect for Chomsky and his work, 903 00:48:12,000 --> 00:48:15,200 Speaker 1: but I think that his view of universal grammar remains 904 00:48:15,520 --> 00:48:18,319 Speaker 1: quite controversial to this day. Yeah, I agree. I think 905 00:48:18,360 --> 00:48:21,280 Speaker 1: it's interesting also to wonder why we haven't been able 906 00:48:21,360 --> 00:48:24,400 Speaker 1: to cross the species barrier.
I mean, if we're going 907 00:48:24,440 --> 00:48:28,000 Speaker 1: to solve the puzzle of speaking with another intelligent species from 908 00:48:28,040 --> 00:48:30,479 Speaker 1: another planet, we might first want to tackle the question 909 00:48:30,560 --> 00:48:34,200 Speaker 1: of, like, speaking to dolphins or to other intelligent mammals. 910 00:48:34,360 --> 00:48:36,279 Speaker 1: Do you have thoughts about why we have not yet 911 00:48:36,320 --> 00:48:40,000 Speaker 1: been able to crack the dolphin language? Well, it's controversial 912 00:48:40,360 --> 00:48:43,040 Speaker 1: as to whether the dolphins have a language. I mean, 913 00:48:43,360 --> 00:48:45,680 Speaker 1: we know that they communicate, and I think this is 914 00:48:45,760 --> 00:48:51,160 Speaker 1: what Chomsky says too, that dolphins and other clever mammals 915 00:48:51,400 --> 00:48:55,120 Speaker 1: have advanced communication, but they don't have this 916 00:48:55,239 --> 00:49:00,680 Speaker 1: ability to generate an infinite number of sentences from a 917 00:49:00,760 --> 00:49:04,200 Speaker 1: finite set of words. So they don't have grammar, basically, 918 00:49:04,320 --> 00:49:08,920 Speaker 1: and that makes their communication fundamentally different from what we 919 00:49:09,040 --> 00:49:13,560 Speaker 1: call language. But if we broaden the bracket, so 920 00:49:13,680 --> 00:49:16,800 Speaker 1: to speak, we think that we are actually pretty 921 00:49:16,840 --> 00:49:21,080 Speaker 1: good at communicating with some animals, especially animals we have 922 00:49:21,200 --> 00:49:24,799 Speaker 1: bred to be attentive to our utterances, for example, dogs. 923 00:49:25,040 --> 00:49:27,160 Speaker 1: I don't have a dog myself, but I find it 924 00:49:27,280 --> 00:49:30,239 Speaker 1: amazing the things that people can do when they train 925 00:49:30,320 --> 00:49:33,600 Speaker 1: their dogs.
And I think it shows a very high 926 00:49:33,719 --> 00:49:37,480 Speaker 1: level of communication, of ability to communicate. I mean, you 927 00:49:37,560 --> 00:49:41,120 Speaker 1: can't say we understand the language of dogs, because they don't 928 00:49:41,120 --> 00:49:44,680 Speaker 1: have a language, on that kind of more precise 929 00:49:44,960 --> 00:49:47,800 Speaker 1: notion of what language is. I understand that you also have 930 00:49:47,880 --> 00:49:51,879 Speaker 1: a pet, one that you've named after Chomsky. Yeah, yeah, 931 00:49:52,160 --> 00:49:56,000 Speaker 1: she's a rabbit. Her name is Chomska, and she's not 932 00:49:56,160 --> 00:50:01,600 Speaker 1: as eloquent as Chomsky, but she's very charismatic. I 933 00:50:01,719 --> 00:50:04,040 Speaker 1: hope that you have a shared emotional context with your 934 00:50:04,160 --> 00:50:06,800 Speaker 1: Chomska, at least. So a lot of people speculate that 935 00:50:06,880 --> 00:50:10,120 Speaker 1: it might be impossible to have a common cultural context 936 00:50:10,200 --> 00:50:13,560 Speaker 1: with aliens, and therefore these questions of language are always 937 00:50:13,600 --> 00:50:17,040 Speaker 1: going to be impossible. But many folks argue that mathematics 938 00:50:17,160 --> 00:50:19,719 Speaker 1: and physics and the sort of physical rules about the 939 00:50:19,840 --> 00:50:23,160 Speaker 1: universe might be that context, that we might be able 940 00:50:23,160 --> 00:50:26,759 Speaker 1: to communicate with aliens by first starting with simple mathematics 941 00:50:26,880 --> 00:50:29,040 Speaker 1: and building up from there. You make a reference in 942 00:50:29,080 --> 00:50:33,040 Speaker 1: your paper to Freudenthal's self-explaining message in his language 943 00:50:33,239 --> 00:50:36,880 Speaker 1: Lincos, the lingua cosmica, which is built up from these mathematical primitives.
944 00:50:37,000 --> 00:50:39,160 Speaker 1: Do you think it's not possible to start from just 945 00:50:39,320 --> 00:50:42,759 Speaker 1: mathematics and develop some way to discuss with each other, 946 00:50:42,920 --> 00:50:46,279 Speaker 1: some way to communicate and transfer information? I think that 947 00:50:46,360 --> 00:50:50,040 Speaker 1: would run into similar problems. So, first of all, if 948 00:50:50,080 --> 00:50:52,520 Speaker 1: we just talk about science, and this goes 949 00:50:52,560 --> 00:50:55,640 Speaker 1: also back to a theory articulated by Quine and others 950 00:50:55,719 --> 00:51:00,560 Speaker 1: in that philosophical movement. What Quine says is that science 951 00:51:00,640 --> 00:51:05,560 Speaker 1: is underdetermined by observation. So imagine that you have a 952 00:51:05,640 --> 00:51:09,840 Speaker 1: theory and you make an observation that seems inconsistent with 953 00:51:09,920 --> 00:51:13,120 Speaker 1: that theory. That means that, as good scientists, we 954 00:51:13,160 --> 00:51:16,560 Speaker 1: need to revise something, right? But the observation doesn't tell 955 00:51:16,600 --> 00:51:20,760 Speaker 1: you what to revise. So, for example, according 956 00:51:20,800 --> 00:51:25,640 Speaker 1: to Newtonian mechanics, the behavior of the planet Mercury 957 00:51:25,840 --> 00:51:30,000 Speaker 1: was rather odd. But one of the suggested solutions to 958 00:51:30,080 --> 00:51:32,240 Speaker 1: this problem was to postulate that there was a planet 959 00:51:32,280 --> 00:51:36,160 Speaker 1: there that we couldn't see, that they named Vulcan, which 960 00:51:36,200 --> 00:51:38,400 Speaker 1: I think is a pretty awesome name for a planet. 961 00:51:40,200 --> 00:51:44,160 Speaker 1: It's a very logical choice. Yes, but yeah, 962 00:51:45,160 --> 00:51:48,279 Speaker 1: I don't think it was a bad hypothesis.
But it 963 00:51:48,400 --> 00:51:52,960 Speaker 1: just shows you that making the observation that Mercury behaves 964 00:51:53,040 --> 00:51:56,640 Speaker 1: in an odd way doesn't tell you what is the truth. 965 00:51:57,080 --> 00:52:00,880 Speaker 1: It just tells you that something needs revising. So the 966 00:52:00,960 --> 00:52:04,040 Speaker 1: idea that Quine has is that you could imagine two 967 00:52:04,960 --> 00:52:14,000 Speaker 1: distinct scientific worlds, or scientific systems, with coherent 968 00:52:14,440 --> 00:52:18,680 Speaker 1: theories about everything, about, you know, evolution and physics and chemistry, 969 00:52:19,320 --> 00:52:22,440 Speaker 1: and that you can do the same things with these theories, 970 00:52:23,440 --> 00:52:27,520 Speaker 1: but that these theories imply very different views about 971 00:52:27,560 --> 00:52:31,080 Speaker 1: the world. And in the paper I have an example, 972 00:52:31,560 --> 00:52:33,759 Speaker 1: and I don't think it's mine, actually, it's also 973 00:52:33,840 --> 00:52:37,560 Speaker 1: from Quine, where you can imagine a civilization, perhaps on 974 00:52:37,880 --> 00:52:43,080 Speaker 1: Alpha Centauri, that developed general relativity and special 975 00:52:43,160 --> 00:52:48,400 Speaker 1: relativity without first having developed Newtonian mechanics. Might be unlikely, 976 00:52:48,480 --> 00:52:51,560 Speaker 1: but it's certainly possible.
And that means that if 977 00:52:51,640 --> 00:52:54,600 Speaker 1: they had this, then they could make the same predictions, 978 00:52:55,480 --> 00:52:58,279 Speaker 1: and, you know, they could be as successful as a 979 00:52:58,320 --> 00:53:02,440 Speaker 1: Newtonian civilization, like we were in the nineteenth century, but 980 00:53:02,560 --> 00:53:05,200 Speaker 1: they would have very different views about the world, because 981 00:53:05,440 --> 00:53:09,360 Speaker 1: Newtonian physics implies one set of beliefs about the world, 982 00:53:09,400 --> 00:53:13,360 Speaker 1: about the universe, and general and special relativity implies a 983 00:53:13,480 --> 00:53:15,680 Speaker 1: very different set of beliefs. So you're saying that even 984 00:53:15,760 --> 00:53:18,720 Speaker 1: though we exist in the same physical universe and observe 985 00:53:18,840 --> 00:53:21,719 Speaker 1: the same things about the universe, we might come to 986 00:53:21,960 --> 00:53:26,560 Speaker 1: different internal mental explanations of that universe, two different theories 987 00:53:26,640 --> 00:53:30,000 Speaker 1: of science that both work, and therefore just being in 988 00:53:30,080 --> 00:53:33,160 Speaker 1: the same physical universe doesn't give us enough of a 989 00:53:33,280 --> 00:53:38,040 Speaker 1: shared mental context to develop communication. Is that the argument? 990 00:53:38,160 --> 00:53:40,120 Speaker 1: I would like to say that this is different, 991 00:53:40,560 --> 00:53:43,720 Speaker 1: because sometimes this view is conflated with relativism, for example 992 00:53:44,120 --> 00:53:47,279 Speaker 1: Kuhn's view about, you know, scientific revolutions and so on. 993 00:53:47,840 --> 00:53:50,759 Speaker 1: But this is different. This is not relativism, because here 994 00:53:51,600 --> 00:53:55,360 Speaker 1: we would say that both scientific theories or scientific systems 995 00:53:55,480 --> 00:53:58,920 Speaker 1: are true in the relevant sense.
So let's say that 996 00:53:59,040 --> 00:54:01,840 Speaker 1: you have a scientific system that can make correct predictions 997 00:54:01,920 --> 00:54:05,080 Speaker 1: about the universe and about physical systems and so on, 998 00:54:05,840 --> 00:54:09,279 Speaker 1: and it also gives an adequate explanation of how things work. 999 00:54:09,480 --> 00:54:12,520 Speaker 1: I think in many respects we would say that that 1000 00:54:12,800 --> 00:54:15,200 Speaker 1: is sufficient for us to say that this is true. 1001 00:54:15,320 --> 00:54:19,200 Speaker 1: The fact that there are other descriptions of the universe 1002 00:54:19,320 --> 00:54:22,600 Speaker 1: and other ways of making the same predictions, that is 1003 00:54:23,160 --> 00:54:26,120 Speaker 1: also true but different, does not make our view 1004 00:54:26,600 --> 00:54:28,480 Speaker 1: less true. Right, I think that that would blow the 1005 00:54:28,600 --> 00:54:33,280 Speaker 1: minds of most practicing scientists, especially particle physicists, to imagine 1006 00:54:33,600 --> 00:54:36,520 Speaker 1: that the description we're building of the universe, you know, 1007 00:54:36,560 --> 00:54:41,240 Speaker 1: whether it be tiny strings or bouncing particles or wiggling fields, 1008 00:54:41,560 --> 00:54:44,359 Speaker 1: that these are not necessarily unique. That even if it's 1009 00:54:44,400 --> 00:54:48,160 Speaker 1: true and it works perfectly, it might just reflect mathematical 1010 00:54:48,320 --> 00:54:51,840 Speaker 1: models in our minds rather than the actual structure of 1011 00:54:51,920 --> 00:54:54,920 Speaker 1: the universe in any sort of objective way. That's the argument, 1012 00:54:55,000 --> 00:54:58,320 Speaker 1: isn't it? Yeah, well almost, because I would say that 1013 00:54:58,840 --> 00:55:03,799 Speaker 1: what makes a theory true is our ability to understand 1014 00:55:04,120 --> 00:55:07,360 Speaker 1: and manipulate the world around us.
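The point that two descriptions can make the same predictions in a given regime can be sketched numerically, for example by comparing Newtonian and relativistic kinetic energy. A minimal sketch (the numbers are illustrative, not from the episode):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def newtonian_ke(m, v):
    """Kinetic energy in Newtonian mechanics: (1/2) m v^2."""
    return 0.5 * m * v**2

def relativistic_ke(m, v):
    """Kinetic energy in special relativity: (gamma - 1) m c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * m * C**2

# Even at 1% of light speed -- far faster than anything a
# nineteenth-century experiment could probe -- the two theories
# agree to about one part in ten thousand, so both "work" here.
m, v = 1.0, 0.01 * C
rel_diff = abs(relativistic_ke(m, v) - newtonian_ke(m, v)) / newtonian_ke(m, v)
print(f"relative difference at 0.01c: {rel_diff:.2e}")
```

Two theories with very different pictures of the world can be observationally indistinguishable at the precision available, which is the underdetermination point being made above.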
So if if a 1015 00:55:07,480 --> 00:55:11,080 Speaker 1: theory is good at that, then it's true in the 1016 00:55:11,200 --> 00:55:13,839 Speaker 1: relevant sense. I see. So you're saying my theory could 1017 00:55:13,840 --> 00:55:16,239 Speaker 1: be true and your theory could be true, and they 1018 00:55:16,280 --> 00:55:20,400 Speaker 1: could have basically nothing in common other than that they 1019 00:55:20,480 --> 00:55:23,680 Speaker 1: both work. Yes, But it's also true that that a 1020 00:55:23,760 --> 00:55:26,160 Speaker 1: theory could be better than the other even though both 1021 00:55:26,200 --> 00:55:29,920 Speaker 1: are true. So, for example, we would say that special 1022 00:55:30,000 --> 00:55:33,640 Speaker 1: and general relativity as a theory is better than Newtonian mechanics, 1023 00:55:33,920 --> 00:55:39,840 Speaker 1: but Newtonian mechanics is true for predicting the behavior of 1024 00:55:40,719 --> 00:55:43,600 Speaker 1: small objects moving at non relativistic speeds. All right, well, 1025 00:55:43,680 --> 00:55:47,120 Speaker 1: this is really fascinating. Dive into how to communicate with aliens? 1026 00:55:47,360 --> 00:55:49,400 Speaker 1: Are a lot more questions for you, but first we 1027 00:55:49,480 --> 00:56:04,000 Speaker 1: have to take a short break. All right, we're back 1028 00:56:04,040 --> 00:56:06,840 Speaker 1: and we're talking with Dr Kareem Jabari about how to 1029 00:56:06,960 --> 00:56:11,719 Speaker 1: communicate with aliens and especially how to stave off interstellar war. 
1030 00:56:12,040 --> 00:56:14,800 Speaker 1: So the reason that we're talking about how to communicate 1031 00:56:14,880 --> 00:56:17,640 Speaker 1: with aliens is that we're wondering about the question of 1032 00:56:17,719 --> 00:56:21,600 Speaker 1: how to get along with aliens or why aliens haven't 1033 00:56:21,680 --> 00:56:24,600 Speaker 1: contacted us, and the hypothesis that the universe out there 1034 00:56:24,680 --> 00:56:27,040 Speaker 1: is like a dark forest where everybody is being quiet 1035 00:56:27,239 --> 00:56:30,879 Speaker 1: to avoid being wiped out by some sort of predator race. 1036 00:56:31,120 --> 00:56:34,360 Speaker 1: So in your paper, you build on this argument that 1037 00:56:34,800 --> 00:56:38,680 Speaker 1: we can't understand aliens to suggest that we might essentially 1038 00:56:38,840 --> 00:56:42,320 Speaker 1: stumble our way into an interstellar war because we have 1039 00:56:42,400 --> 00:56:44,960 Speaker 1: no good way to communicate with them. So walk me 1040 00:56:45,040 --> 00:56:47,279 Speaker 1: through the argument there. You use some game theory to 1041 00:56:47,440 --> 00:56:49,880 Speaker 1: suggest that we'll end up in a situation where it 1042 00:56:49,880 --> 00:56:54,120 Speaker 1: seems logical to us to essentially first strike against these aliens. 1043 00:56:54,239 --> 00:56:56,600 Speaker 1: How do we get there from "we got a message 1044 00:56:56,640 --> 00:56:59,000 Speaker 1: and we don't understand it" to "now we're sending a 1045 00:56:59,120 --> 00:57:04,359 Speaker 1: nuke to Alpha Centauri"? Yeah, that would really suck. Yeah. 1046 00:57:04,640 --> 00:57:08,120 Speaker 1: So the idea comes from a game theorist called 1047 00:57:08,320 --> 00:57:12,880 Speaker 1: Thomas Schelling, who in turn got the idea from Thomas Hobbes.
1048 00:57:13,040 --> 00:57:16,200 Speaker 1: And the idea is that we can think about our 1049 00:57:16,360 --> 00:57:20,560 Speaker 1: interactions with these extraterrestrials as a coordination game, and 1050 00:57:20,640 --> 00:57:25,360 Speaker 1: there are many classical coordination games. 1051 00:57:25,400 --> 00:57:28,320 Speaker 1: But basically in a coordination game, you have 1052 00:57:28,400 --> 00:57:31,200 Speaker 1: two alternatives. One is to cooperate and the other is 1053 00:57:31,280 --> 00:57:35,680 Speaker 1: to play it safe and not cooperate. And the idea 1054 00:57:35,800 --> 00:57:38,320 Speaker 1: is that when you cooperate, then you take a small 1055 00:57:38,440 --> 00:57:41,640 Speaker 1: risk, or you take a risk. But if I also cooperate, 1056 00:57:42,000 --> 00:57:44,880 Speaker 1: then there will be a great advantage for both of us. 1057 00:57:45,080 --> 00:57:48,439 Speaker 1: But if you play it safe and don't cooperate, then you're 1058 00:57:48,480 --> 00:57:51,240 Speaker 1: not taking as much risk. On the other hand, you're 1059 00:57:51,280 --> 00:57:53,600 Speaker 1: not getting a chance to get a big payoff. And 1060 00:57:53,680 --> 00:57:57,240 Speaker 1: a classical example to illustrate this is to imagine two 1061 00:57:57,440 --> 00:58:00,640 Speaker 1: people walking in the forest deciding whether 1062 00:58:00,800 --> 00:58:03,760 Speaker 1: to chase a stag or to chase a rabbit. 1063 00:58:04,080 --> 00:58:07,520 Speaker 1: And to chase a stag, we need to cooperate. So 1064 00:58:07,840 --> 00:58:11,120 Speaker 1: if we both choose to chase a stag, then we'll 1065 00:58:11,160 --> 00:58:13,640 Speaker 1: get the stag and we'll get a lot of food 1066 00:58:13,680 --> 00:58:16,200 Speaker 1: and we'll be very happy.
But if you choose to 1067 00:58:17,440 --> 00:58:20,480 Speaker 1: chase the stag and I go for the rabbit, then 1068 00:58:20,560 --> 00:58:25,280 Speaker 1: you won't get the stag and you'll be very hungry. 1069 00:58:25,560 --> 00:58:28,920 Speaker 1: Whereas to get the rabbit, you can do 1070 00:58:29,000 --> 00:58:31,280 Speaker 1: it by yourself. So I'll get the rabbit. I'll get 1071 00:58:31,680 --> 00:58:34,400 Speaker 1: one piece of food, not as much as the stag, 1072 00:58:34,480 --> 00:58:35,920 Speaker 1: but I would get enough. You know, I have to say, 1073 00:58:35,960 --> 00:58:38,320 Speaker 1: you talk a lot about eating rabbits for somebody who 1074 00:58:38,400 --> 00:58:40,600 Speaker 1: has a pet rabbit. You know, there's a little 1075 00:58:40,600 --> 00:58:44,040 Speaker 1: bit of a mental tension there for you. Well, yes, sometimes, 1076 00:58:44,800 --> 00:58:47,320 Speaker 1: but it's just because they appear so much in the 1077 00:58:47,360 --> 00:58:51,760 Speaker 1: philosophical literature. Well, I hope you cover her ears when 1078 00:58:51,840 --> 00:58:54,360 Speaker 1: you talk about this in front of her. All right, 1079 00:58:54,560 --> 00:58:58,440 Speaker 1: so you're explaining why you need to communicate in order 1080 00:58:58,480 --> 00:59:01,240 Speaker 1: to cooperate and take risks together. So if we trust 1081 00:59:01,320 --> 00:59:03,960 Speaker 1: each other, then of course we'll cooperate, right, because I 1082 00:59:04,240 --> 00:59:06,400 Speaker 1: know that you're a good guy who cooperates with people. 1083 00:59:06,880 --> 00:59:08,840 Speaker 1: And then we'll go and we'll get a stag and 1084 00:59:08,880 --> 00:59:11,920 Speaker 1: we'll be very happy.
But if I am suspicious about you, 1085 00:59:12,160 --> 00:59:14,440 Speaker 1: then I will try to go for the rabbit. So 1086 00:59:14,720 --> 00:59:18,000 Speaker 1: you would say that the rabbit is a risk dominant equilibrium, 1087 00:59:18,040 --> 00:59:21,840 Speaker 1: whereas the stag is a payoff dominant equilibrium. And both 1088 00:59:21,880 --> 00:59:26,720 Speaker 1: of these strategies are rational, or neither of these strategies 1089 00:59:26,960 --> 00:59:29,880 Speaker 1: is more rational than the other if we don't know anything. 1090 00:59:30,360 --> 00:59:33,280 Speaker 1: But the problem here is that if we can't communicate, 1091 00:59:34,120 --> 00:59:37,600 Speaker 1: and we also have this 1092 00:59:37,760 --> 00:59:42,120 Speaker 1: opportunity to defect, to not be cooperative, then Hobbes and 1093 00:59:43,960 --> 00:59:49,200 Speaker 1: Schelling and others argue that the option to defect or 1094 00:59:49,280 --> 00:59:53,400 Speaker 1: to be non-cooperative becomes the focal point. So we 1095 00:59:53,600 --> 00:59:57,840 Speaker 1: both get attracted to that option, because I start thinking that, 1096 00:59:58,760 --> 01:00:01,160 Speaker 1: oh my god, he's going to go for the rabbit. 1097 01:00:01,680 --> 01:00:04,440 Speaker 1: And he also probably thinks that I think that he's 1098 01:00:04,480 --> 01:00:06,800 Speaker 1: going to go for the rabbit. But that means that 1099 01:00:07,480 --> 01:00:10,560 Speaker 1: he's more inclined to go for the rabbit himself, and 1100 01:00:10,760 --> 01:00:14,320 Speaker 1: he knows that I know that he knows that I 1101 01:00:14,520 --> 01:00:16,840 Speaker 1: think this way, so that means that he will be 1102 01:00:16,920 --> 01:00:18,920 Speaker 1: even more inclined to go for the rabbit, which 1103 01:00:19,000 --> 01:00:20,880 Speaker 1: makes me more inclined to go for the rabbit, 1104 01:00:20,920 --> 01:00:23,320 Speaker 1: and so on.
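The stag hunt just described can be written down as a small payoff table, and the two equilibria checked mechanically. A sketch, with illustrative payoff numbers that are not from the episode:

```python
# Stag hunt payoff table: each player picks "stag" (cooperate)
# or "rabbit" (play it safe). Values are (player 1, player 2) payoffs.
payoffs = {
    ("stag", "stag"): (4, 4),      # both cooperate: big payoff
    ("stag", "rabbit"): (0, 3),    # I hunt the stag alone: I go hungry
    ("rabbit", "stag"): (3, 0),
    ("rabbit", "rabbit"): (3, 3),  # both play safe: modest payoff
}

ACTIONS = ("stag", "rabbit")

def is_nash(a1, a2):
    """Neither player gains by unilaterally switching actions."""
    u1, u2 = payoffs[(a1, a2)]
    best1 = all(payoffs[(alt, a2)][0] <= u1 for alt in ACTIONS)
    best2 = all(payoffs[(a1, alt)][1] <= u2 for alt in ACTIONS)
    return best1 and best2

equilibria = [(a1, a2) for a1 in ACTIONS for a2 in ACTIONS if is_nash(a1, a2)]
print(equilibria)  # both (stag, stag) and (rabbit, rabbit) are equilibria

# (stag, stag) is payoff dominant: it pays both players more.
# (rabbit, rabbit) is risk dominant with these numbers: rabbit
# guarantees 3 whatever the other player does, while stag risks 0.
# Without communication or trust, play can slide toward the rabbit.
```

With these numbers, neither equilibrium is "irrational"; which one you land on depends on whether you can establish trust, which is exactly the problem if you can't communicate.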
And in the context of interstellar war, 1105 01:00:23,960 --> 01:00:27,520 Speaker 1: going for the rabbit, or being non-cooperative, could 1106 01:00:27,600 --> 01:00:31,160 Speaker 1: imply a first strike. Right. So if our options, instead 1107 01:00:31,200 --> 01:00:33,400 Speaker 1: of hunting for a stag or hunting for a rabbit, 1108 01:00:33,680 --> 01:00:37,320 Speaker 1: if our choices are (a) share all of our theories 1109 01:00:37,400 --> 01:00:39,520 Speaker 1: of physics and all of our knowledge and the beauty 1110 01:00:39,560 --> 01:00:42,520 Speaker 1: that we developed as humanity, and (b) send a nuclear 1111 01:00:42,600 --> 01:00:45,360 Speaker 1: weapon to avoid being wiped out by their nuclear weapon, 1112 01:00:45,520 --> 01:00:48,640 Speaker 1: then you're saying that a rational choice would be to 1113 01:00:49,280 --> 01:00:52,800 Speaker 1: fire first to avoid being wiped out by their nuke, 1114 01:00:53,080 --> 01:00:56,840 Speaker 1: rather than just, like, dumping the Encyclopedia Britannica into a 1115 01:00:56,880 --> 01:00:59,680 Speaker 1: message and sending it to Alpha Centauri. Yeah, that's our 1116 01:00:59,720 --> 01:01:02,240 Speaker 1: conclusion in this paper. But I was always 1117 01:01:02,320 --> 01:01:05,640 Speaker 1: very dissatisfied with this conclusion, so I wrote another paper 1118 01:01:05,720 --> 01:01:09,840 Speaker 1: where I argue that we are wrong. But that's 1119 01:01:09,880 --> 01:01:13,560 Speaker 1: another kind of discussion. But yeah, basically, if we 1120 01:01:13,640 --> 01:01:16,400 Speaker 1: assume that there's only one other player there, then the 1121 01:01:16,480 --> 01:01:19,680 Speaker 1: rational strategy should be to attack first.
I think this 1122 01:01:19,840 --> 01:01:24,080 Speaker 1: game theory approach is really fascinating because you're thinking carefully 1123 01:01:24,080 --> 01:01:26,640 Speaker 1: about the risks you want to take, and obviously we 1124 01:01:26,720 --> 01:01:28,880 Speaker 1: have a lot at stake here. We're just on one planet, 1125 01:01:28,960 --> 01:01:31,400 Speaker 1: and it goes to the heart of understanding what the 1126 01:01:31,480 --> 01:01:33,720 Speaker 1: other player might do, you know, the choices that the 1127 01:01:33,760 --> 01:01:36,000 Speaker 1: other player might make, and the risks we are taking. 1128 01:01:36,240 --> 01:01:37,600 Speaker 1: And I think, you know, we see that of course 1129 01:01:37,600 --> 01:01:39,560 Speaker 1: a lot on Earth and game theory developed in the 1130 01:01:39,640 --> 01:01:43,040 Speaker 1: context of you know, human warfare and the prisoner's dilemma 1131 01:01:43,320 --> 01:01:45,280 Speaker 1: and all this kind of stuff. But isn't that also 1132 01:01:45,400 --> 01:01:48,240 Speaker 1: a case of projection? I mean, aren't we assuming in 1133 01:01:48,320 --> 01:01:50,960 Speaker 1: that case that the aliens would be thinking in a 1134 01:01:51,040 --> 01:01:53,680 Speaker 1: game theoretical way, that they would be rational in a 1135 01:01:53,760 --> 01:01:57,280 Speaker 1: way according to our definition of rationality. You know, if 1136 01:01:57,320 --> 01:01:59,680 Speaker 1: we don't understand these aliens and we can't communicate with them, 1137 01:01:59,720 --> 01:02:03,600 Speaker 1: should we just assume we may never even understand their motives, 1138 01:02:03,760 --> 01:02:07,600 Speaker 1: that they might operate essentially randomly. Yes, No, that's certainly true. 
1139 01:02:07,680 --> 01:02:09,840 Speaker 1: So it's a pretty big assumption that we're 1140 01:02:09,880 --> 01:02:14,120 Speaker 1: making here, that game theory is a plausible tool 1141 01:02:14,440 --> 01:02:19,080 Speaker 1: to understand or predict alien behavior. I can add that 1142 01:02:19,560 --> 01:02:22,960 Speaker 1: game theory has, with some success, actually been used in 1143 01:02:23,120 --> 01:02:27,200 Speaker 1: many non-human contexts. So we've used game theory 1144 01:02:27,280 --> 01:02:34,600 Speaker 1: to understand how pathogens emerge and behave and evolve, but 1145 01:02:34,760 --> 01:02:39,200 Speaker 1: also how animal populations interact with each other in complex 1146 01:02:39,760 --> 01:02:45,240 Speaker 1: ecological systems. So there seems to be some fundamental aspect 1147 01:02:45,320 --> 01:02:48,760 Speaker 1: of game theory that allows us to predict these systems 1148 01:02:48,840 --> 01:02:52,040 Speaker 1: where we're dealing with utterly alien life forms. So 1149 01:02:52,200 --> 01:02:55,040 Speaker 1: that is one reason to believe that perhaps 1150 01:02:55,240 --> 01:02:57,240 Speaker 1: this tool can be used in this context. But yeah, 1151 01:02:57,280 --> 01:02:59,760 Speaker 1: it's certainly an assumption that needs to be made. It would 1152 01:02:59,920 --> 01:03:02,400 Speaker 1: be a fascinating problem to have. I think that a more 1153 01:03:02,520 --> 01:03:05,920 Speaker 1: likely scenario, if we do meet aliens or get messages 1154 01:03:05,920 --> 01:03:10,320 Speaker 1: from aliens, is that we sort of hilariously misunderstand them, 1155 01:03:10,480 --> 01:03:12,760 Speaker 1: or that we send them a message of friendship which 1156 01:03:12,880 --> 01:03:15,920 Speaker 1: is interpreted to be offensive in their context.
You know, 1157 01:03:16,000 --> 01:03:17,720 Speaker 1: we send them a picture of our rabbit and then, 1158 01:03:18,080 --> 01:03:24,479 Speaker 1: you know, to them, rabbits are terrifying. Yeah. 1159 01:03:24,840 --> 01:03:27,040 Speaker 1: The paper that I wrote was in the context of 1160 01:03:27,160 --> 01:03:30,720 Speaker 1: this idea that some astronomers have that we should send 1161 01:03:30,760 --> 01:03:35,880 Speaker 1: out signals to nearby solar systems, even though we 1162 01:03:35,920 --> 01:03:39,240 Speaker 1: don't know if somebody is there, just to see 1163 01:03:39,280 --> 01:03:42,600 Speaker 1: what happens. And we argue that this could be very 1164 01:03:42,680 --> 01:03:45,720 Speaker 1: dangerous, because it would alert the aliens to 1165 01:03:45,760 --> 01:03:48,720 Speaker 1: our presence. Perhaps, I mean, it's not certain that 1166 01:03:48,800 --> 01:03:51,120 Speaker 1: they know where we are, but if we do this, 1167 01:03:51,320 --> 01:03:55,720 Speaker 1: then that chance increases. If they contact us first, that 1168 01:03:55,880 --> 01:03:58,240 Speaker 1: means that we are not certain that they know where 1169 01:03:58,320 --> 01:04:01,240 Speaker 1: we are. Maybe they have this kind of equivalent of 1170 01:04:01,640 --> 01:04:04,760 Speaker 1: the SETI people, who just want to send 1171 01:04:04,800 --> 01:04:08,800 Speaker 1: out some random signals. So answering could also be very 1172 01:04:08,840 --> 01:04:12,360 Speaker 1: dangerous, because it could reveal our location. But as a 1173 01:04:12,440 --> 01:04:15,680 Speaker 1: philosopher of science and somebody who's desperately curious about the world, 1174 01:04:15,800 --> 01:04:18,080 Speaker 1: if we get a message from a distant star and 1175 01:04:18,200 --> 01:04:21,480 Speaker 1: it clearly has information in it, we spend time decoding it.
1176 01:04:21,640 --> 01:04:24,000 Speaker 1: Are you saying that we shouldn't respond, for example, that 1177 01:04:24,040 --> 01:04:27,840 Speaker 1: we should stay silent, that we should ignore a message 1178 01:04:27,920 --> 01:04:30,560 Speaker 1: from an alien world? I don't think we should ignore it, 1179 01:04:30,680 --> 01:04:33,040 Speaker 1: but I don't think we should respond, and we should 1180 01:04:33,840 --> 01:04:39,120 Speaker 1: instead invest every effort that we can spare to gather 1181 01:04:39,240 --> 01:04:42,000 Speaker 1: more information about them. So that would be my 1182 01:04:42,160 --> 01:04:47,040 Speaker 1: first option. But of course, if I am correct in 1183 01:04:47,120 --> 01:04:50,560 Speaker 1: the article that I'm writing at the moment, then 1184 01:04:51,280 --> 01:04:53,840 Speaker 1: it would be fine to answer them. But I 1185 01:04:53,960 --> 01:04:57,800 Speaker 1: think we would rather want 1186 01:04:57,840 --> 01:05:00,160 Speaker 1: to collect as much information as possible before we 1187 01:05:00,240 --> 01:05:02,800 Speaker 1: do it. In any case, I mean, if a message is going to travel 1189 01:05:05,520 --> 01:05:09,200 Speaker 1: for fifty or sixty years, it wouldn't hurt to, you know, 1190 01:05:09,360 --> 01:05:13,360 Speaker 1: take five years, build some cool space-based telescopes, and 1191 01:05:13,440 --> 01:05:16,480 Speaker 1: try to gather as much information about that planetary 1192 01:05:16,480 --> 01:05:19,600 Speaker 1: system first. And then we discover that they all look 1193 01:05:19,640 --> 01:05:21,800 Speaker 1: like rabbits, and then we think, oh, how could they 1194 01:05:21,880 --> 01:05:25,440 Speaker 1: possibly be dangerous?
So one last question is, you suggested 1195 01:05:25,600 --> 01:05:28,800 Speaker 1: that you're writing an article currently which disagrees with the 1196 01:05:28,880 --> 01:05:31,080 Speaker 1: previous article and suggests it might be fine to talk to 1197 01:05:31,160 --> 01:05:34,200 Speaker 1: the aliens. What's the essential argument there? What evolved in 1198 01:05:34,240 --> 01:05:36,120 Speaker 1: your thinking to make you think it might be fine 1199 01:05:36,280 --> 01:05:40,840 Speaker 1: to talk to the aliens? So let's assume that we 1200 01:05:41,360 --> 01:05:46,080 Speaker 1: stumble upon some aliens in a nearby planetary system, let's 1201 01:05:46,080 --> 01:05:51,040 Speaker 1: say in Tau Ceti or TRAPPIST. So that means that either 1202 01:05:51,360 --> 01:05:56,800 Speaker 1: extraterrestrial intelligences are extremely common, that's one possibility that would 1203 01:05:56,840 --> 01:06:02,160 Speaker 1: explain why we find aliens so close, or it would 1204 01:06:02,160 --> 01:06:07,919 Speaker 1: be a fantastic coincidence that two intelligent species evolved independently 1205 01:06:08,000 --> 01:06:11,920 Speaker 1: of each other in such proximity. And 1206 01:06:12,200 --> 01:06:15,200 Speaker 1: the second alternative is of course not very plausible. So 1207 01:06:15,440 --> 01:06:19,080 Speaker 1: the idea is that upon finding one alien 1208 01:06:19,080 --> 01:06:21,680 Speaker 1: species on, say, Tau Ceti, then that would mean that 1209 01:06:21,800 --> 01:06:25,280 Speaker 1: we should make a Bayesian update on our subjective 1210 01:06:25,320 --> 01:06:32,280 Speaker 1: probability of the average density of extraterrestrial intelligences in the galaxy. 1211 01:06:32,440 --> 01:06:35,000 Speaker 1: And at the moment we don't know.
We don't have 1212 01:06:35,080 --> 01:06:38,160 Speaker 1: any information about that, so we have a uniform prior. 1213 01:06:38,560 --> 01:06:41,200 Speaker 1: You know, we might be alone in 1214 01:06:41,240 --> 01:06:44,000 Speaker 1: the galaxy, or there might be aliens in almost every 1215 01:06:44,040 --> 01:06:46,080 Speaker 1: solar system. We don't know. But if we find 1216 01:06:46,320 --> 01:06:48,800 Speaker 1: some aliens at Tau Ceti, that's about twelve light years 1217 01:06:48,840 --> 01:06:52,320 Speaker 1: from here, then we should expect there to be 1218 01:06:52,680 --> 01:06:55,080 Speaker 1: lots of them, and lots of them nearby. The 1219 01:06:55,160 --> 01:06:58,400 Speaker 1: next step is that in that case, we should ask ourselves, like, 1220 01:06:58,600 --> 01:07:01,880 Speaker 1: how come we're still here? You know, if there's so 1221 01:07:02,040 --> 01:07:07,000 Speaker 1: many aliens out there, and it's very unlikely that 1222 01:07:07,200 --> 01:07:11,960 Speaker 1: nobody is aggressive, you get a conundrum: how come 1223 01:07:12,240 --> 01:07:14,680 Speaker 1: we're still here? And the only way to explain that 1224 01:07:15,280 --> 01:07:18,720 Speaker 1: would be to postulate that there must be some reason 1225 01:07:19,120 --> 01:07:21,919 Speaker 1: for why aliens don't attack each other, and that could 1226 01:07:21,960 --> 01:07:24,720 Speaker 1: be many different reasons, but there must be some reason 1227 01:07:24,840 --> 01:07:29,960 Speaker 1: for why, that explains our existence. So it's sort of 1228 01:07:30,000 --> 01:07:33,200 Speaker 1: like an inverse Fermi paradox. The Fermi paradox 1229 01:07:33,320 --> 01:07:36,360 Speaker 1: is saying, why is nobody out there? The answer is maybe 1230 01:07:36,520 --> 01:07:39,520 Speaker 1: there's somebody out there squashing anybody who raises their head.
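The Bayesian update being described can be sketched in a few lines: start from a roughly uniform prior over how common civilizations are, then condition on "we found at least one within about twelve light years." The density hypotheses and the Poisson model below are illustrative assumptions, not figures from the episode or the paper:

```python
import math

# Candidate hypotheses: average density of civilizations,
# in civilizations per cubic light year (illustrative values).
densities = [1e-9, 1e-7, 1e-5, 1e-3, 1e-1]
prior = [1.0 / len(densities)] * len(densities)  # uniform: "we don't know"

radius = 12.0  # light years, roughly Tau Ceti's distance
volume = 4.0 / 3.0 * math.pi * radius**3

# Likelihood of finding at least one neighbor within that radius,
# modeling the count as Poisson with mean density * volume.
likelihood = [1.0 - math.exp(-d * volume) for d in densities]

unnorm = [p * l for p, l in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]

# The posterior mass piles onto the high-density hypotheses:
# finding aliens that close implies aliens are probably common.
print([round(p, 4) for p in posterior])
```

The exact numbers don't matter; the shape of the argument does. A detection that nearby is astronomically improbable under the "civilizations are rare" hypotheses, so almost all the posterior weight shifts to "civilizations are common," which is the premise of the how-come-we're-still-here step that follows.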
1231 01:07:39,840 --> 01:07:41,600 Speaker 1: But you're saying, if we go out there and find 1232 01:07:41,720 --> 01:07:44,280 Speaker 1: that actually the galaxy is filled with aliens, then that 1233 01:07:44,360 --> 01:07:48,200 Speaker 1: suggests that there isn't anybody out there hunting species that 1234 01:07:48,400 --> 01:07:51,520 Speaker 1: speak up and make themselves known. Exactly, because otherwise we 1235 01:07:51,640 --> 01:07:54,640 Speaker 1: would be dead, or we would have an invasion already 1236 01:07:55,000 --> 01:07:57,720 Speaker 1: on our way, or something like that. So for some reason, 1237 01:07:58,040 --> 01:08:04,040 Speaker 1: aliens seem peaceful, or an observation of that 1238 01:08:04,240 --> 01:08:07,200 Speaker 1: kind would suggest that aliens are peaceful, and that means 1239 01:08:07,320 --> 01:08:11,520 Speaker 1: that if the Tau Cetians see us, then they would 1240 01:08:11,560 --> 01:08:14,360 Speaker 1: think the same thing. If we both have this assumption, 1241 01:08:14,920 --> 01:08:18,600 Speaker 1: then we can move ourselves in the coordination game to 1242 01:08:18,760 --> 01:08:22,479 Speaker 1: the other equilibrium, to the payoff dominant equilibrium, because we 1243 01:08:22,600 --> 01:08:25,400 Speaker 1: are not concerned about them attacking, because we assume that 1244 01:08:25,520 --> 01:08:29,080 Speaker 1: it's uncommon for aliens to attack each other, and that 1245 01:08:29,240 --> 01:08:32,519 Speaker 1: makes us more confident that they think that we are 1246 01:08:32,560 --> 01:08:35,400 Speaker 1: not going to attack them, and that makes us even 1247 01:08:35,680 --> 01:08:38,559 Speaker 1: more confident that we shouldn't attack them, which makes us 1248 01:08:38,600 --> 01:08:41,280 Speaker 1: even more confident that they shouldn't attack us. All right, 1249 01:08:41,360 --> 01:08:44,280 Speaker 1: so I guess the message is, if we discover aliens, 1250 01:08:44,360 --> 01:08:46,960 Speaker 1: we
hope that they are nearby, because it suggests that 1251 01:08:47,080 --> 01:08:52,080 Speaker 1: there's lots more aliens out there and maybe they're all peaceful. Yeah. 1252 01:08:52,280 --> 01:08:55,360 Speaker 1: Or another possibility, of course, would be that, imagine that 1253 01:08:55,439 --> 01:08:58,800 Speaker 1: we would find an alien civilization, but they were not 1254 01:08:59,120 --> 01:09:02,160 Speaker 1: very advanced, say that they're in the Bronze Age or 1255 01:09:02,560 --> 01:09:05,519 Speaker 1: the Iron Age. Then that would not rule out the 1256 01:09:05,640 --> 01:09:10,120 Speaker 1: possibility that aliens in general are aggressive, because if 1257 01:09:10,160 --> 01:09:13,160 Speaker 1: we would find that primitive civilization, that would rather suggest 1258 01:09:13,240 --> 01:09:16,120 Speaker 1: that we just happened to be lucky to be first 1259 01:09:16,479 --> 01:09:20,160 Speaker 1: in our neighborhood, or very early. So if we would 1260 01:09:20,160 --> 01:09:23,640 Speaker 1: find a civilization like that, then we couldn't draw the 1261 01:09:23,720 --> 01:09:26,520 Speaker 1: same inference. On the other hand, a Bronze Age civilization 1262 01:09:26,680 --> 01:09:28,880 Speaker 1: is not gonna be a threat to us, not 1263 01:09:29,080 --> 01:09:32,760 Speaker 1: for some time, so yeah, that's also a consideration. All right, 1264 01:09:32,800 --> 01:09:34,760 Speaker 1: thank you very much for coming on the podcast and 1265 01:09:34,840 --> 01:09:37,920 Speaker 1: talking about these amazing and super fun and mind-bending 1266 01:09:38,000 --> 01:09:40,799 Speaker 1: ideas with us. Very grateful for your time and your energy, 1267 01:09:41,200 --> 01:09:44,559 Speaker 1: and give our regards to your rabbit. I will, thank 1268 01:09:44,600 --> 01:09:48,160 Speaker 1: you. All right. Pretty interesting interview, and I'm sort of 1269 01:09:48,200 --> 01:09:50,320 Speaker 1: getting the general message of these things.
It's a bad 1270 01:09:50,400 --> 01:09:55,800 Speaker 1: idea because of the potential for miscommunication or misunderstandings? Yeah, 1271 01:09:55,960 --> 01:09:58,600 Speaker 1: or just lack of ability to communicate at all. And 1272 01:09:58,720 --> 01:10:01,240 Speaker 1: so if they misinterpret our message and they don't know 1273 01:10:01,360 --> 01:10:04,560 Speaker 1: what we mean, they might be worried about our intentions. 1274 01:10:04,760 --> 01:10:07,040 Speaker 1: And then who knows what they would do if they're, 1275 01:10:07,200 --> 01:10:09,680 Speaker 1: you know, run by some generals that have their 1276 01:10:09,760 --> 01:10:11,960 Speaker 1: fingers on the button. They might get nervous and launch 1277 01:10:12,040 --> 01:10:15,000 Speaker 1: a first strike when they get our message. You mean, 1278 01:10:15,040 --> 01:10:17,000 Speaker 1: like, they're going to get a message and they're gonna 1279 01:10:17,080 --> 01:10:21,360 Speaker 1: interpret the message as like a first strike or something? 1280 01:10:21,760 --> 01:10:23,960 Speaker 1: Like, who gets a message that they can't understand 1281 01:10:24,000 --> 01:10:28,240 Speaker 1: and then assumes that they're insulting you? Well, 1282 01:10:28,320 --> 01:10:30,439 Speaker 1: you don't know. Like, the Daniel on that planet is 1283 01:10:30,479 --> 01:10:32,920 Speaker 1: probably like, yeah, we heard from aliens, let's send them 1284 01:10:32,960 --> 01:10:35,639 Speaker 1: all of our physics textbooks, you know. But everybody else 1285 01:10:35,680 --> 01:10:37,760 Speaker 1: on that planet is like, hold on a second, do 1286 01:10:37,840 --> 01:10:39,599 Speaker 1: we really want to let those folks know we're here?
1287 01:10:39,680 --> 01:10:41,840 Speaker 1: Maybe it's a trap, and the risks, as you say, 1288 01:10:42,040 --> 01:10:44,080 Speaker 1: are large. And so if you put yourself in the 1289 01:10:44,200 --> 01:10:46,960 Speaker 1: minds of those aliens, they might have the same worries, 1290 01:10:47,040 --> 01:10:50,360 Speaker 1: and so they might be aggressive in response. I see. 1291 01:10:50,640 --> 01:10:52,120 Speaker 1: But then he sort of said that, you know, 1292 01:10:52,439 --> 01:10:54,600 Speaker 1: if there are a lot of aliens out there, the 1293 01:10:54,680 --> 01:10:57,679 Speaker 1: fact that nobody has attacked us maybe means that 1294 01:10:57,760 --> 01:11:00,800 Speaker 1: that's not a realistic scenario. Perhaps there aren't a lot 1295 01:11:00,840 --> 01:11:03,320 Speaker 1: of trigger-happy aliens out there. Yeah. I like that 1296 01:11:03,400 --> 01:11:05,680 Speaker 1: he ended on a hopeful note there. If there are 1297 01:11:05,720 --> 01:11:08,439 Speaker 1: a lot of civilizations out there, they've somehow learned to 1298 01:11:08,520 --> 01:11:11,160 Speaker 1: live together, and that gives us hope that we can 1299 01:11:11,280 --> 01:11:13,800 Speaker 1: join that community. I guess either way, you're sort of 1300 01:11:14,280 --> 01:11:18,080 Speaker 1: projecting our own human bias onto these scenarios, right? Like, 1301 01:11:18,400 --> 01:11:20,479 Speaker 1: maybe we would react a certain way, but that doesn't 1302 01:11:20,520 --> 01:11:22,920 Speaker 1: mean other aliens would react the same way. Exactly. And 1303 01:11:22,960 --> 01:11:25,320 Speaker 1: I put that same question to him, and maybe you 1304 01:11:25,400 --> 01:11:27,799 Speaker 1: heard his answer.
He thinks that this game theory analysis 1305 01:11:27,960 --> 01:11:30,680 Speaker 1: might be universal, that it applies beyond just humans, that 1306 01:11:30,760 --> 01:11:33,920 Speaker 1: it's useful in understanding, like, even microbes and all sorts 1307 01:11:33,960 --> 01:11:36,560 Speaker 1: of systems. And so there are some arguments made there that 1308 01:11:36,640 --> 01:11:38,519 Speaker 1: it might be universal. But in the end, you're right, 1309 01:11:38,720 --> 01:11:41,320 Speaker 1: we don't know what motivates an alien, so we definitely 1310 01:11:41,439 --> 01:11:44,760 Speaker 1: don't have a basis for speculating about their choices. Right, 1311 01:11:44,960 --> 01:11:47,120 Speaker 1: and then I know you've brought something up before, which 1312 01:11:47,200 --> 01:11:49,320 Speaker 1: is like, it doesn't make a lot of sense for 1313 01:11:49,439 --> 01:11:52,640 Speaker 1: aliens to come here for our resources or to eat us, right? 1314 01:11:53,080 --> 01:11:56,200 Speaker 1: Like, that's what the alien movies and TV shows always show. 1315 01:11:56,280 --> 01:11:58,000 Speaker 1: But I'm always, like you, I think, thinking in the 1316 01:11:58,080 --> 01:11:59,800 Speaker 1: back of my head, like, why would you come all 1317 01:11:59,840 --> 01:12:02,080 Speaker 1: this way just for, like, a little bit of water, 1318 01:12:02,400 --> 01:12:05,880 Speaker 1: or just to eat, like, you know, a few billion humans? 1319 01:12:06,000 --> 01:12:08,559 Speaker 1: It's like driving across the country to eat a hamburger. 1320 01:12:08,680 --> 01:12:10,760 Speaker 1: You know, I know you want some water. Neptune is 1321 01:12:10,800 --> 01:12:14,320 Speaker 1: basically all water. Help yourself, right? You want platinum? We 1322 01:12:14,439 --> 01:12:16,880 Speaker 1: got asteroids filled with platinum. We don't even know how 1323 01:12:16,920 --> 01:12:20,320 Speaker 1: to get to them to take some.
Please, nobody would come here 1324 01:12:20,400 --> 01:12:22,960 Speaker 1: just for those resources. Or, you like meat? Hey, we 1325 01:12:23,200 --> 01:12:27,559 Speaker 1: have this great animal-free thing called the Impossible Burger. That's 1326 01:12:27,640 --> 01:12:29,960 Speaker 1: just as good. I'm sure with your advanced civilization you 1327 01:12:30,040 --> 01:12:32,160 Speaker 1: can figure it out too. Exactly. And so I agree 1328 01:12:32,200 --> 01:12:34,640 Speaker 1: with you that it's unlikely to have interstellar war for 1329 01:12:34,920 --> 01:12:38,479 Speaker 1: resources, because they seem essentially infinite. But this is another angle. 1330 01:12:38,600 --> 01:12:40,680 Speaker 1: You know, we might have an interstellar war due to 1331 01:12:40,800 --> 01:12:44,440 Speaker 1: a misunderstanding. If those aliens are worried that we're aggressive, 1332 01:12:44,520 --> 01:12:46,760 Speaker 1: they might be aggressive in response, even if it doesn't 1333 01:12:46,800 --> 01:12:49,560 Speaker 1: make any sense for anybody to be aggressive. And so 1334 01:12:49,680 --> 01:12:52,479 Speaker 1: there is the potential there for misunderstanding. Like, maybe the 1335 01:12:52,520 --> 01:12:55,120 Speaker 1: aliens are just doing it for sport. You mean, like, 1336 01:12:55,240 --> 01:12:57,400 Speaker 1: they just, maybe, like going to war? Maybe they're just 1337 01:12:57,479 --> 01:12:59,720 Speaker 1: not as curious about the universe. They don't really care 1338 01:12:59,760 --> 01:13:03,120 Speaker 1: about everything we figured out. We need to Daniel them, Daniel. 1339 01:13:03,640 --> 01:13:07,320 Speaker 1: We need to, like, implant a Daniel in there amongst their ranks. Now 1340 01:13:07,400 --> 01:13:09,919 Speaker 1: you got it. We need to spread Daniel across the universe. 1341 01:13:13,640 --> 01:13:15,599 Speaker 1: And that's what this podcast is all about.
So maybe 1342 01:13:15,680 --> 01:13:18,400 Speaker 1: if they hear this podcast, they'll realize, hey, we're just 1343 01:13:18,520 --> 01:13:21,400 Speaker 1: kind of goofy and curious. We mean no harm. That 1344 01:13:21,400 --> 01:13:23,800 Speaker 1: would be hilarious. Like, in a thousand years aliens 1345 01:13:23,920 --> 01:13:26,040 Speaker 1: come and they say, hey, we were going to wipe you out, 1346 01:13:26,080 --> 01:13:28,400 Speaker 1: but then we picked up this podcast that you're transmitting, 1347 01:13:28,520 --> 01:13:33,880 Speaker 1: and this famous prophet Daniel, the Light and 1348 01:13:34,000 --> 01:13:37,000 Speaker 1: the Chosen One, illuminated us into what it means to 1349 01:13:37,040 --> 01:13:40,200 Speaker 1: be curious about the universe. We were slightly offended by Jorge's jokes, 1350 01:13:40,280 --> 01:13:43,759 Speaker 1: but we decided... That's right, all the banana jokes were terrible. 1351 01:13:44,280 --> 01:13:47,760 Speaker 1: We wish Jorge would just go away. Bring on this 1352 01:13:47,920 --> 01:13:50,679 Speaker 1: other guest, José, and they were much more interesting. Anyways, 1353 01:13:52,080 --> 01:13:54,639 Speaker 1: can we be a guest on your show? Yeah, exactly. Aliens, 1354 01:13:54,680 --> 01:13:56,320 Speaker 1: you are welcome to come and be a guest on 1355 01:13:56,360 --> 01:13:58,679 Speaker 1: the show. Please do come and tell us all about 1356 01:13:58,720 --> 01:14:01,320 Speaker 1: the secrets of the universe. That is my scientific fantasy. 1357 01:14:01,479 --> 01:14:04,439 Speaker 1: That's right. Just don't eat Jorge or anybody. Don't 1358 01:14:04,479 --> 01:14:09,160 Speaker 1: eat anybody. Have some Impossible Burgers, have some impossible humans.
1359 01:14:10,080 --> 01:14:12,040 Speaker 1: All right. Well, I guess it's something to think about, 1360 01:14:12,120 --> 01:14:14,120 Speaker 1: you know, whether or not we want to contact aliens 1361 01:14:14,160 --> 01:14:16,640 Speaker 1: out there, whether it would be worth the risk. You know, 1362 01:14:16,720 --> 01:14:20,000 Speaker 1: how would our lives change? And we need a precise answer 1363 01:14:20,080 --> 01:14:22,800 Speaker 1: to that question. Yeah, it is an important question, and 1364 01:14:22,880 --> 01:14:25,880 Speaker 1: it's a decision that no individual person can or should 1365 01:14:26,040 --> 01:14:28,640 Speaker 1: make on behalf of humanity. That's what scares me a 1366 01:14:28,760 --> 01:14:31,639 Speaker 1: little bit about this rogue group that's just sending messages 1367 01:14:31,720 --> 01:14:34,120 Speaker 1: out there. You know, they're making decisions for the entire 1368 01:14:34,320 --> 01:14:37,360 Speaker 1: human race, and it's not a small decision to make. Right. Well, 1369 01:14:37,400 --> 01:14:40,320 Speaker 1: I have bad news for you, Daniel. The fate of 1370 01:14:40,400 --> 01:14:42,920 Speaker 1: humankind is already in the hands of, like, four people. 1371 01:14:43,280 --> 01:14:46,080 Speaker 1: That's true. So it sounds like you have a little 1372 01:14:46,080 --> 01:14:48,880 Speaker 1: deeper problem with the way humans organize themselves. That's true, 1373 01:14:48,920 --> 01:14:52,240 Speaker 1: but out of scope, I think, for today's podcast. 1374 01:14:52,240 --> 01:14:55,719 Speaker 1: Stay tuned for our next episode: Daniel and Jorge 1375 01:14:56,000 --> 01:15:00,200 Speaker 1: Overthrow the Global Elite. That's right. Yes, don't quote me on 1376 01:15:00,240 --> 01:15:03,439 Speaker 1: politics and other political leanings. All right. Well, we hope 1377 01:15:03,479 --> 01:15:06,320 Speaker 1: you enjoyed that. Thanks for joining us. See you next time.
1378 01:15:14,200 --> 01:15:17,000 Speaker 1: Thanks for listening, and remember that Daniel and Jorge Explain 1379 01:15:17,080 --> 01:15:19,920 Speaker 1: the Universe is a production of I Heart Radio. For 1380 01:15:20,080 --> 01:15:23,000 Speaker 1: more podcasts from I Heart Radio, visit the I Heart 1381 01:15:23,120 --> 01:15:26,680 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 1382 01:15:26,760 --> 01:15:29,400 Speaker 1: favorite shows.