Speaker 1: What exactly is intelligence? And why don't crows invent an Internet? And why don't gophers write novels? And what does any of this have to do with George Orwell's analysis of Adolf Hitler, or why World War Five could involve many other species besides Homo sapiens? Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and an author at Stanford University, and I've spent my whole career studying the intersection between how the brain works and how we experience life. On today's episode, we're going to explore the future not just for humans, but for other species as well. The other day, my kids were in the car while I was pumping gas, and I was thinking about the future for them, and specifically what my kids would look back on in twenty years or thirty years or fifty years and think, wow, I can't believe we did that. So California has new legislation that all new cars will have to be electric by twenty thirty five, and so I was realizing that when my kids are the age I am now, they'll look back on moments like this and think, wow, I remember when Dad used to pump fossil fuel into his car, and they'll see this as a surprising feature of their childhoods and not something that exists for them anymore. It's analogous to the way that I look back on black and white televisions, or pay phones, or cassette tapes, or Walkmans, or any of the other things from my childhood that we just don't have anymore. So when I got back in the car, I asked them to brainstorm about things that would be different for them when they're adults, and they brought up more things than I expected. We were on a road trip and we were passing a ranch where we saw thousands of cows grazing, and what we ended up talking about was whether they might look back strangely in thirty years on raising animals to kill them for meat, because we looked at those cows and we understood that they would all be dead soon and on plates around tables.
Speaker 1: Now that might change, and not necessarily because society drifts toward vegetarianism, and not even necessarily because companies develop better and better plant-based alternatives to burgers. Our bodies, after all, have been clearly carved by evolution to eat meat. So I think the more likely scenario will be the move toward lab-grown meat: we'll pop a single cell off a cow and reproduce it in the lab dish to grow the burger or the steak directly from that cell, rather than raise the entire animal and feed it for years and then slit its throat and process it and ship the meat. So we could eat meat free of cruelty and with a lot less ranching effort. And as time goes on, we'll get better and better at growing the proper ratios of fat for marbling and blood vessels to carry in the right kinds of nutrition, and you'll be able to grow a filet mignon without actually having to have thousands of acres and raise and slaughter a living being that seems to care about its young and tries to avoid pain and presumably has a whole range of emotions. There are several companies working on this already, and with time we'll be able to do this at a price that competes with traditional meat, and eventually one that's far cheaper, because you won't have to raise the cows and feed them and tend to them and shelter them for years. You won't drive your gas-powered vehicle along the highway and look at these thousands of cows waiting for their own murder. And once we can grow burgers from single cells, we can enjoy barbecues with lion burgers or tiger burgers or elephant burgers or whatever, because we're just reproducing cells.
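To put rough numbers on that "grow it from one cell" idea, here's a minimal Python sketch. The cell mass and patty mass are illustrative stand-ins, not measured values; the point is just that exponential doubling makes the climb from one cell to a burger surprisingly short.

```python
import math

# Illustrative, assumed numbers -- not measurements.
cell_mass_g = 1e-9     # ballpark mass of a single muscle cell (~1 nanogram)
patty_mass_g = 113.0   # roughly a quarter-pound patty

cells_needed = patty_mass_g / cell_mass_g
doublings = math.ceil(math.log2(cells_needed))

print(f"cells needed: {cells_needed:.1e}")           # ~1.1e11 cells
print(f"doublings from a single cell: {doublings}")  # ~37 doublings
```

Under those assumptions, a few dozen doublings suffice; the hard parts are the ones mentioned above, like marbling, blood vessels, and price, not the raw cell count.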
Speaker 1: Anyhow, as we talked, we got into the future of genetic editing, and there are tools like CRISPR-Cas9. This is a molecular tool that allows scientists to splice out a well-defined section of a genome, in other words a particular sequence of A's and C's and T's and G's, and replace that sequence with a different sequence that they want, so maybe it's one that doesn't have a particular genetic mutation that causes a disease or a disorder. Now, this is a technology, CRISPR-Cas9, that started to come into dim focus in the mid nineteen nineties, but using it to edit genomes wasn't demonstrated until twenty twelve, and suddenly the technology blew up and it's in every lab around the planet now. Now, the idea of using that kind of technology on the human germline has been an area of really heated ethical debate, and it appears to have been used on humans illegally by a doctor in China in twenty eighteen, and it's an open question where this all goes from here. All countries have said this is illegal for human use, but it's probably just a matter of time before more examples hit the news. And by the way, nowadays, instead of cutting and replacing whole chunks of DNA like CRISPR does, there are methods to change just a single base pair. Molecular biologists from MIT and Harvard made a gene-editing technique that rewrites individual A's and C's and T's and G's in the genetic code. This is called base editing.
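As a loose software analogy for the two approaches just described, and nothing more than an analogy, you can picture the genome as a long string over the alphabet A, C, G, T: CRISPR-style editing swaps out a defined stretch of sequence, while base editing rewrites one letter in place. A toy Python sketch, with a made-up sequence:

```python
toy_genome = "ATGCCGTTAGCATTAGGCCTA"  # made-up sequence, not a real gene

def splice_replace(seq: str, target: str, replacement: str) -> str:
    """CRISPR-like edit: cut out a well-defined section and swap in another."""
    if target not in seq:
        raise ValueError("target sequence not found")
    return seq.replace(target, replacement, 1)

def base_edit(seq: str, position: int, new_base: str) -> str:
    """Base-editing-like edit: rewrite a single base pair in place."""
    if new_base not in "ACGT":
        raise ValueError("not a valid base")
    return seq[:position] + new_base + seq[position + 1:]

print(splice_replace(toy_genome, "TTAGCA", "TTCGCA"))  # whole chunk replaced
print(base_edit(toy_genome, 5, "A"))                   # single letter changed
```

The real molecular machinery is of course nothing like string replacement; the sketch only shows the difference in granularity between the two kinds of edit.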
Speaker 1: But what difference would it make if we could edit genomes? Well, the thing that interests me today is whether we could leverage these advances in genetic engineering to someday allow us to boost human intelligence. Now, there are several things to unpack about that prospect. The first is that every few years in the neuroscience community there's buzz about the possibility that someone has discovered a nootropic drug, and that just means one that's going to boost memory or recall or some other cognitive function. But despite the occasional buzz, there's really nothing that has changed our cognitive function beyond just drinking a cup of coffee. But instead of a drug, could we improve intelligence by tweaking the genome? Now, one possibility is that nature has already optimized our intelligence and there's not too much room to improve it. So let me explain that by an analogy. Take lifespan. There's been a lot of research on how to increase human lifespan, but the length of a human's lifespan actually has not really changed. When we look back to a time before modern medicine, we see that people in history were capable of living long lives. Take Benjamin Franklin, who was born in seventeen oh six and died in seventeen ninety at age eighty-four. That's a very ripe old age, especially for a man. So it's not that people couldn't live that long. What has changed is the average lifespan, which is known as life expectancy. So when researchers talk about the massive improvements in human life expectancy, they're talking about the average human lifespan. As we've developed medicine and antibiotics and methods for reducing child mortality, and even for reducing deaths from simple things like getting gangrene when you get a cut, that means that on average, people don't die in their childhood or young adult lives anymore. It turns out that some of the most important steps in reducing mortality were the basic medications introduced last century to control diarrhea and vomiting. These were massively important in reducing death tolls. So people make it down the road of possibility and live into their old age. So what's been improved upon is the average age of death.
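That average-versus-maximum distinction is easy to see with toy numbers. Here's a minimal Python sketch using invented ages at death: removing early deaths moves the average enormously while the maximum barely budges.

```python
# Invented ages at death for two toy populations of ten people each.
pre_modern  = [1, 2, 5, 30, 45, 50, 60, 70, 80, 84]   # heavy child mortality
post_modern = [60, 65, 70, 72, 75, 78, 80, 82, 85, 88]

for label, ages in [("pre-modern", pre_modern), ("post-modern", post_modern)]:
    mean = sum(ages) / len(ages)
    print(f"{label}: life expectancy (mean) = {mean:.1f}, max lifespan = {max(ages)}")

# pre-modern:  life expectancy (mean) = 42.7, max lifespan = 84
# post-modern: life expectancy (mean) = 75.5, max lifespan = 88
```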
Speaker 1: The point I want to draw here is that it's not as though any of our researchers said, oh, I've got it, here's how to make somebody live until they're two hundred years old. And this is because the human body, at least at this point, doesn't appear to be set up to live more than about a century. So by analogy, it could be the case that human intelligence has been carefully shaped by evolutionary pressures and is essentially maxed out. In other words, if you tweaked the brain circuitry, you might not be able to squeeze much more juice out of the lemon. Nonetheless, even if this is the case, we could in theory improve the intelligence expectancy of the population by pushing everyone up towards the ceiling, so everyone is essentially operating at the level of the best brains, like Albert Einstein or Toni Morrison, or all the people who are super talented in domains like math or music or language. Those people are massive outliers from the average. So that opens the possibility that, on average, most people could get a bit smarter, however we define intelligence. Now, I'll come back to genetic editing in just a moment, but first I need to lay another brick in the foundation by addressing the issue of what is intelligence. The main challenge we have with understanding it is that we don't have a precise definition for it. The interest in this concept of intelligence goes back to at least the time of Plato, but just over a century ago people started working on how to quantify it, and by nineteen oh four the English psychologist Charles Spearman had found that people who scored well on one sort of task, like verbal skills, also tended to perform well on other sorts of tasks, like spatial skills. So that led Spearman to conclude that intelligence could be generalized across tasks, and other researchers then found that this seemed to be true. In other words, tests of different abilities tended to have high positive correlations. So what that means is a person with a high general intelligence score tends to do better at language and memory and tackling new sorts of problems and real-life activities and so on. And in nineteen twenty three a researcher wrote a famous paper in which he concluded that intelligence is what the tests test, by which he meant that whatever the heck intelligence is, there are ways to put a number to it.
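Spearman's finding is, at bottom, a statistical pattern, often called the positive manifold: scores on different tasks correlate positively, as if one shared factor were at work. Here's a minimal Python sketch that generates that pattern from a simulated latent "g" plus task-specific noise; all the numbers are invented for illustration.

```python
import random

random.seed(0)
N = 1000

# Each simulated person is a latent general ability 'g' plus task-specific noise.
people  = [random.gauss(0, 1) for _ in range(N)]
verbal  = [g + random.gauss(0, 0.8) for g in people]
spatial = [g + random.gauss(0, 0.8) for g in people]

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

print(f"verbal-spatial correlation: {corr(verbal, spatial):.2f}")  # roughly 0.6 here
```

The simulation builds the shared factor in by hand, of course; Spearman's inference ran the other way, from the observed correlations to a hypothesized general factor.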
Speaker 1: Now, I'll just mention this has been an area of debate, where people point out that there can be cross-cultural issues with testing intelligence. But that's not the point I'm making here. I'm talking about this: if you have a hundred people from the same culture in the same place and they all sit down for the test, you'll get a spectrum of scores, and these scores generally correlate with success in many domains, including how they do later in life. So what these early researchers noted is that something is getting captured. But what exactly is intelligence? Well, it's proven difficult to nail down with a single definition, because it's not one thing, and it's certainly not defined by any single area in the brain. So what do we know about it? Well, first we know it's not defined by brain size. At the end of the eighteen nineties, the Spanish neuroscientist Ramón y Cajal was blown away by the fact that an elephant's brain, let's say, is so much larger than a mouse's brain, and yet they're both equally capable of strategies and clever moves and finding mates and so on. So a bigger brain is not necessarily any smarter. Or take somebody like Andre the Giant. He probably wasn't eight times smarter than you, even though his brain may have had eight times the volume of yours. So Ramón y Cajal said, think about it like the giant clock tower of Big Ben compared to a wristwatch: they both keep time with the same accuracy. And so it goes with brains big and small. It's not the amount of stuff, but the algorithms that are running on it. It may be that what we call intelligence is about how many things you can hold in your short-term memory at one time, or maybe it's the idea that you can store stronger associations between facts, or maybe it's the ability to resolve cognitive conflict, or maybe it's about being able to better squelch distractors.
Speaker 1: So all these things have been suggested as the basis of intelligence, but it's going to involve all of these to some degree or another, and probably a bunch of other things, like whether you can better structure the information that you've stored, or do more parallel processing, or, and I'll talk about this in a future episode, whether you're better at simulating possible futures. Okay, so all this is to say that intelligence doesn't represent a simple, single thing that we can point to in the brain. Instead, intelligence is one of those words that carries a lot of semantic weight, meaning there are all kinds of things that are being lumped together in that one word. But whatever intelligence is, it sits at the heart of what is special about Homo sapiens. Other species seem to come out more hardwired to solve particular problems, but we are more livewired, which means our brains adapt. So when we drop into the world, we can in our first several years learn all the most important things that our species has figured out before us, and then we can springboard from there. And that means we can solve open-ended problems, whether that's surviving in the Arctic Circle or surviving at the equator, whether that's figuring out quantum physics or how to distill nitrogen from the air for fertilizer or whatever. Other species do smart things, but none can even come close to our capacity to tackle abstract, open-ended problems. And all of this leads me to the main point of today's podcast, which is one of the most interesting possibilities for how the world could be different for our children. They've grown up with Homo sapiens being the only intelligent species that is doing anything remarkable on this planet. And that's not to say other animals aren't intelligent in their own ways, but they are not, as far as we know, building an internet, or vaccines, or quantum computation, or certainly not building satellites and rocket ships and getting off the planet.
Speaker 1: For example, did you know that at any given moment in time, there are approximately one million Homo sapiens up in the atmosphere, above the clouds, sitting in comfortable leather chairs? So we might argue about animal intelligence, but what's clear is that other animals are not building airplanes, or for that matter, fleets of electric vehicles, or neutron bombs, or smartphones. And it's an interesting question, right? Why don't we find wolves constructing great libraries, or kangaroos building hospital systems, or just universities where, say, koalas from all over the world come loping over and they spend four years to learn everything about koala history and the future and new technologies, and they study all night to take koala qualifications to make them experts in law or medicine, or to study their own koala brains? So what I want to consider is whether we will be able to leverage our growing knowledge of genetics to uplift other animal species. That means to help them, presumably genetically, to have much higher levels of intelligence. The idea is that as we better understand genetics, we'll be able to tweak the circuitry to optimize things. After all, the important part is that every single organism on the Earth is made of exactly the same letters, A, C, T, G, and there's something just slightly different about the way that our letters structure the human brain, so that we not only have bipedal walking and hairless faces and acne and really useful thumbs, but also this massive intelligence. Could we crack the code on that part and give intelligence to our animal cousins? Now, that might sound like pure fantasy, but in fact there have been a growing number of experiments that make this easier to envision. For example, there's a gene called FOXP2, which is thought to have something to do with the sudden explosion of speech and language in humans.
Speaker 1: Now, the interpretation of FOXP2 is a bit controversial, but the thing I want to point out is that colleagues at MIT took a version of this gene that was like the human gene, and they put it into mice, and they were able to show that the mice could learn mazes faster than normal mice. Now, there's still a long way to go, and it won't just be this gene but hundreds of other genes that will be required to make cells in the brain hook up in the right ways, and it probably won't happen in our lifetime. But the point is, animal uplift is a possibility in the distant future, and it's something that's important to start thinking about now. As Charles Choi said in a recent Scientific American blog, if we cannot find aliens in the stars, we might create alien intelligences on Earth. Imagine our children looking back on this time and saying, wow, that was the time before dogs could get elected as senators, or crows ran certain university courses, or zebras were winning the Nobel Prize. What a world we might be heading into. Now, as you might imagine, these sorts of studies, and the imagination that gets opened up about animal uplift, spark massive debate among bioethicists and philosophers. The idea is this: possibly we will get to the point where we can uplift an animal species, but should we? The author David Brin, who writes great science fiction, wrote a series of novels called the Uplift series, and he explores this idea. In these books we come to live with all kinds of intelligent animals who live among us, and they're working parts of society. And in an interview, Brin pointed out that he felt the benefits of animal uplift could be amazing and really change the planet. And in fact, others argue that it is a moral imperative, meaning we're obligated to do this.
Speaker 1: We're like the first ones to get up the ladder onto the roof, and we need to extend our hands down and lift up all our cousin species who happen to be behind us. I think of it like this: if my dog hurts her leg, I will use human discoveries to help her, like antibiotics and a cast, and if she gets cancer, I'm going to use imaging technology like MRI and oncology drugs. My colleague George Dvorsky, who's a futurist and an ethicist, said in an interview with The Boston Globe, quote: there are other creatures on this planet that may be in need or deserving of also getting these sorts of interventions. We should always be considering the larger family of sentient organisms on this planet, not just human beings. And in fact, Dvorsky argues that if you deny this future technology of making a brain better, if you deny this to an animal species, it's just as unethical as if you deny education to some group of people based on their nationality or race or whatever. In other words, we're morally obligated to help. Now, on the other side of the argument, philosophers argue that the problem is that our long line of experiments to make this happen in animals won't be instant, because it never is, and this will lead to lots of suffering among the animal species while we try to uplift them but don't quite get it right, especially if we make them smarter but don't get the rest of the thing perfect right away, such that they have mental illness and are smart enough to know it, or they have a ticking clock of a short lifespan and they're smart enough to know it. And this could get really terrible if we have to keep experimenting on them once they are cognitively enhanced like us. The difficulty is that uplift seems like an easy task. Perhaps we just make an animal's brain bigger. But it's going to be way harder than that.
Speaker 1: One quick example: in Sweden, some researchers tried to breed bigger brains in guppies, these little fish. So they took the ones with the biggest brains and kept breeding them together for several generations, and they did the same with the smallest brains too. Now, this sort of artificial selection is what agriculturalists and dog breeders and horse breeders have been doing forever. So eventually they had two lines with a nine percent size difference in their brains, and they were able to show that the bigger-brained guppies could perform a little better on some simple cognitive tasks. So that seemed great. But it turned out the fish with the big brains were incapable of producing as many offspring, and they had smaller guts. Brains are massively energy-hungry engines, and the conclusion the scientists came to is that if you want bigger brains, the body needs to scale back somewhere else. In other words, big brains come with a cost.
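The breeding procedure in that guppy study, truncation selection, is simple enough to sketch in a few lines of Python. The numbers here (population size, heritability, the top fraction bred) are invented for illustration, and the one-line genetics is the classic breeder's equation, not the actual Swedish protocol.

```python
import random

random.seed(1)

def next_generation(pop, top_fraction=0.2, heritability=0.4):
    """One round of truncation selection: breed only the largest-brained fish."""
    selected = sorted(pop, reverse=True)[: int(len(pop) * top_fraction)]
    parent_mean = sum(selected) / len(selected)
    pop_mean = sum(pop) / len(pop)
    # Breeder's equation: response R = h^2 * S, where S is the selection differential.
    new_mean = pop_mean + heritability * (parent_mean - pop_mean)
    return [random.gauss(new_mean, 1.0) for _ in range(len(pop))]

pop = [random.gauss(100.0, 1.0) for _ in range(500)]  # arbitrary 'brain size' units
for generation in range(5):
    pop = next_generation(pop)
print(f"mean after 5 generations: {sum(pop) / len(pop):.1f}")  # drifts upward, above 100
```

Sort in the other direction and you get the small-brained line. What the sketch leaves out is exactly what the study found: the selected trait drags other traits, like gut size and fecundity, along with it.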
Speaker 1: Now, the study doesn't really tell us much about intelligence anyway, because we already saw that brain size is almost entirely uncorrelated with intelligence. So there are far better studies that need to be done on this. But I just want to make a single point here: changes in biology never come by themselves. They're always tied to the rest of the functioning of the machinery. So if you make a heart bigger, or legs longer, or a head bigger, then other things change as well. So in the context of uplifting animals, it might mean that we need to figure out genetic changes not only to the brain, but to all kinds of aspects of their body, which could make the whole endeavor more challenging than we thought it was when we were just looking at the brain. And again, we know it won't just be about making brains bigger. For example, horses have brains as big as humans', but they're not discovering mathematics. Presumably the issue is that they're not running the same algorithms in their forests of neurons. Or if you look at the great apes, our nearest neighbors, they seem to have everything that you would need, but they're not building the Internet and writing novels and doing molecular biology. Why not? Well, it's true their brains are a little smaller, but presumably the issue is a more microscopic detail than that. It's almost certainly something in the algorithms that are getting run by the billions of neurons. So it's more than size; it's what programs are running, and there are decades of work ahead of us to even scratch the surface of that scientific problem. And there's another challenge here. Typically, when we think about animal uplift, we picture an animal speaking English and having a conversation with us. But it's possible that it's going to be very difficult to do animal uplift, because the intelligence of our species is so tied to other things, like our opposable thumbs or our larynx, which allows us to do things like make podcasts and sing songs. And so the argument is, if you give a squirrel a better brain or the right algorithms to run, it still won't be able to play a flute. And even if it can think of some new solution to quantum electrodynamics, it can't simply tell it to you, because it doesn't have a larynx to produce language. So there are a whole series of things that may need to be put into place so that we can translate languages. And I'm not talking about English to Chinese like I have on my Google Translate app, but human to dolphin, or gopher to human, which takes into account the evolutionary history of the species and things that would have meaning to that animal. For example, a dolphin lives in a three-dimensional world of movements and patterns, and so it's picking up on different things, and it presumably cares about different things.
Speaker 1: And that's what Google Translate may look like a century from now: figuring out how to communicate the needs and desires and insights of one species to another. Now, there's one other thing to consider, and this one we can't ignore, so I'm going to take a tangent here for just a few moments. Many years ago, when the European Union was first getting put together, I wrote an article about the possible problems that such a union might confront in the future, and I quoted George Orwell, who wrote a deeply insightful essay about Adolf Hitler in nineteen forty. Orwell pointed out the following, and I quote: Hitler knows that human beings don't only want comfort, safety, short working hours, hygiene, birth control, and, in general, common sense; they also, at least intermittently, want struggle and self-sacrifice, not to mention drums, flags, and loyalty parades. I used Orwell's quotation to illustrate something that could end up being the Achilles' heel of the European Union, which is that eventually neighbors will fight one another, just as in America the North and South did in the Civil War. People don't just want peace. Sometimes they want to bang their drums and wave their flags. If you look at the history of our species, this is the standard. It's very hard to get groups to form a union forever, or even for very long, and that's because of identity politics and our desire to show that we are different from, and usually better than, our neighbors. Just look at the entire history of the twentieth century to see how easily groups will foment genocide against other people they're living with, whether that's the Communists in the USSR, or the Hutu in Rwanda, or Hitler telling the Germans that they deserve the entirety of Europe around them. It is shockingly easy to get groups to orient against other groups that they used to be fine with, that they used to be neighbors with.
Speaker 1: Now, I'm going to do another episode about empathy and how people view one another and why we as a species so easily slip into war. But in today's context, you can guess my reason for mentioning all this: will an uplifted animal species at some point want to beat its own drums and fly its own flags and have its own loyalty parades? In other words, what happens when we get our dogs to communicate with us and work with us? Are they always going to put up with us? Or do they at some point say: I don't want you to put me outside. I don't want you to hit me on the nose when I pee in the house. I think that all of our problems stem from our historical relationship with you, and it's time for us to shed our collars and bare our fangs and bite you right in the jugular, in the name of freedom and dignity. Now, this is similar to the situation that was portrayed in the new Planet of the Apes movies: humans genetically engineer the apes to uplift them, and then things devolve into a war between the species. So this is one worry about the future of animal uplift. If human intelligence is any guide, and it's really the only guide we have, then the introduction of intelligence will probably open the door to the horrors not of a civil war, but of a species war. And what's interesting is you could extrapolate from what happened in previous world wars, where unlikely countries banded together, like in World War Two, when the United States banded with its unlikely bedfellow, Stalin and the USSR, or, for that matter, Germany linked arms with Japan. Everyone collaborates where they see the most benefit in the political sphere. So just imagine what could happen when we enter not just a world war, but a multi-species world war, where humans have to fight against dolphins and camels and hippos who have all linked up with one another.
Speaker 1: Or maybe in the more distant future, as the world gets used to having multiple intelligent species, things will get complex in the way they always do, where some dolphins collaborate with some humans and they ride the waves as a unit, and other dolphins and other humans are fighting on the other side, so you have species that are split down the middle, like families in a civil war. Or maybe, just maybe, this all happens in a different way, and they are smart enough, or just different enough from us, to bring us peace, or at least lock us into non-aggression, and we learn the meaning of humanity from non-humans. So let's wrap up. When my children are my age, they'll think about how they used to eat animals and pump gas the way that we look back now on physicians promoting cigarettes or babies riding in the car without seat belts. My children might also look backward and think about the time when they were the single species on Earth building durable knowledge and making discoveries about our cosmos. And maybe this will be our grandkids or our great-grandkids. But when I try to imagine what their experiences will be like, I wonder if it'll be like the way we travel around and are always surprised and impressed by other cultures. Like when the Spaniards first set foot on the shores of what we now call North America and discovered entire empires built by the Aztec and the Inca and the Maya, complete with buildings and communities and ziggurats, and other forms of writing, such as tying knots in strings, and completely different religions with their own deities and stories of creation. I wonder whether it'll be like this when we stumble on animal communities that we've uplifted, let's say gophers, and discover entire mythologies that the gophers have written, and whole types of architecture and technologies we simply wouldn't have thought of because of differences in our needs or imaginations.
Speaker 1: Just like different cultures invent different technologies based on their local needs, so will different species. In any case, someday, after I'm long gone and my podcast exists on some medium that we can't even imagine, I hope that you, dear listener of the future, will really picture me pulling up to a gas station and having to buy gas to get from place to place. And maybe, listener, you are a member of an animal species that in my day had very little cognitive power. So I dedicate this podcast to you, especially if you've joined our ranks: fascinated not only by where we are, but also simulating possible paths to where we could go, and then leveraging the scientific method to build pathways to get there. To find out more and to share your thoughts, head over to eagleman dot com slash podcasts. I have links there for further reading about animal uplift, and I also have that incredible George Orwell essay about Hitler, which I recommend reading as one of the most insightful political essays I know. Even after more than eight decades, it is as relevant as ever. Until next time, I'm David Eagleman, and thank you for joining me in the Inner Cosmos.