Speaker 1: What is creativity and how does it work in the brain? Are all brains creative? And what does that tell us about the evolution of species and the evolution of ideas? How do the arts expose what's happening in the brain, and what does that have to do with how cell phones got their names, and why koala bears don't write novels?

Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and an author at Stanford, and in these episodes we dive into our three-pound universe to uncover the most surprising secrets behind our daily lives. Today's episode is about creativity. What is it, and how do brains do it? And why do human brains do it better than anyone else in the animal kingdom? So I'm going to start tangentially with two stories that have hit the news in recent weeks. The first is that there are no fewer than nine copyright infringement lawsuits raging in the courts right now, with groups suing OpenAI and Meta and other companies for hundreds of millions of dollars. Now, the groups suing are creatives, they're writers, and they're suing because their books were consumed in the training of these massive generative AI systems, these large language models like ChatGPT. One of the suits calls this, quote, "systematic theft on a mass scale." In other words, the suit asserts that these large language models are producing nothing fundamentally new. They're just recycling the old and remixing it in different ways. Okay, so that was one story. The second story in the news is that the company Under Armour made an advertisement that featured a boxer named Anthony Joshua, and this set off a firestorm on Instagram because it claimed to be the first AI-powered sports commercial. Now the critics are hating on it because they say the ad is just repurposing work from other humans, other creatives, without proper acknowledgment.
And the debate around both these stories is highlighting the broad issue of whether AI is going to undermine human creativity, because it treats all the accomplishments of previous artists just as training data for doing its own thing, for generating its own content, which it then takes credit for. So many creatives are arguing that AI is going to diminish the value of human creativity: it simply remixes things that actual people have done before it. So that's what I want to talk about today, because human creativity is the most important reason our species has achieved what it has. When you fly over a forest and you look down on it from an airplane, the species in that forest are doing the same thing that they were doing one hundred thousand years ago. There's no difference. But when you reach your destination and you're coming into any modern city, you can see that the landscape has been so influenced by one single species that it looks like a giant, colorful motherboard of buildings has risen out of the ground. There's something really different going on with humans. And it's a reasonable question to ask: why don't squirrels design elevators? Why don't alligators invent speedboats? Why don't koalas spend their free time building an internet to surf? So we're going to get into that shortly. Let me just say that the reason we are communicating over a podcast, shooting zeros and ones across the planet to transmit a message from one human mouth to multiple human ears, is because we are doing something very different than any other species. We have occupied every corner of the planet. We have a million human beings above the clouds at any moment because of air travel. We have humans floating in the International Space Station, and we've landed on other orbiting bodies like the Moon, and our machines have landed on Mars. And no other animal has even invented the wheel or discovered fire, much less coordinated millions of its congeners into kingdoms and nations.
So let's be clear that there's something very different going on with our species. Homo sapiens represents a runaway species, and that term, the runaway species, is in fact the title of a book that I wrote with my close friend Anthony Brandt. Anthony is a professor of music composition at the Shepherd School of Music, which is part of Rice University, and generally he is a sharp and sensitive thinker about creativity, and he's a real student of the brain. So years ago he and I were getting a cup of coffee and we started talking about creativity from the psychology point of view, from the neuroscience point of view, from the point of view of artistic endeavor, and we realized that our views on this converged, so much so that we both learned a lot, and we realized that we had a framework for understanding what the creative act is, and we ended up writing this book together. So I called up Anthony today to be part of this conversation with me, and I started by asking him what NASA and Picasso have in common.

Speaker 2: You know, back in nineteen seventy there was the famous incident where the Apollo thirteen spacecraft completely lost power, three astronauts' lives hanging in the balance, and the NASA engineers basically had just a few hours to improvise a solution. And they had a very closed system: all they could use to save those wonderful lives was what was on board the spacecraft. They had to use shrink wrap to help make an air scrubber. They had to power the main capsule with the lunar module. Everything that they did was spur of the moment, working together as an amazing team to bring the astronauts home safely. Cut to Picasso, working alone in his studio near the turn of the twentieth century, painting this radical painting, the likes of which no one had ever seen before. And he was so nervous about it. He invited his mistress at the time and his gallery distributor to come, and they laughed at it, and he was feeling so uncomfortable and ashamed.
He actually rolled up the painting and hid it in his closet, but he kept feeling drawn to it, kept working on it. But still, when he finished, he wasn't even quite sure he was finished. He rolled it up again, put it in his closet for nine years, and then put it out into the world, and it became one of the breakthrough paintings of the twentieth century: Les Demoiselles d'Avignon. And so you've got the NASA engineers working collaboratively, Picasso working all by himself. The NASA engineers with a very targeted result: what they do has to work, it has to succeed, otherwise the astronauts perish. Picasso is doing something much more open-ended and speculative: he's inventing a new type of art. And yet when you look carefully at what they were doing, essentially their brains were doing very similar mental operations. They were creating variations on things that were already there. They were tearing up the world as they knew it and putting it back together in new ways, and they were mixing things in combinations that hadn't existed before. Each was applying it in their own domain, but under the hood of their brains, what was happening is very deeply related.

Speaker 1: And that seems to be the key with creativity, right? It's absorbing everything in the world around you, absorbing your own experiences, and then remixing them, making something new out of it. And what we did when we wrote our book together is outline sort of three different ways that this can happen, three different cognitive operations that the software of the brain is running. So tell us about those three operations. So we called them bending, breaking, and blending. Bending is making a copy and altering it. So fonts are a great example of bending. I mean, why do we need different typefaces for the same letters? They convey the same information. But we have this compulsion to bend, and so we create this unbelievable proliferation of variations. Jazz improvisation is a type of bending.
Different car models are a type of bending. Anytime you take a prototype and you remake it in some way, that's a version of bending.

Speaker 2: So breaking involves taking something complete and deconstructing it, tearing it apart, pulling it apart into little pieces. That's what makes an LED screen possible. It's where the birth of cell phones happened. Originally, there was only one phone tower per urban area, and as a result, only a few dozen people could get on their mobile phones at the same time, and calls were dropping. It was very frustrating. And then engineers at Bell Labs had this idea: oh wait, what if we break up the urban area into individual cells and give them each their own tower? And then they designed software so you could drive in your car from one of those cells to another without losing the call. And basically, modern mobile communication was born out of that cognitive process of breaking.

Speaker 1: And not everyone is aware, but that's how the cell phone gets its name, because the landscape is divided up into different...

Speaker 2: Cells. Exactly, exactly. And like so many things, that ingenuity is hidden behind the scenes. We just take it for granted, not realizing that this fundamental cognitive operation is responsible for it. So blending, which is a term first introduced by our colleagues Mark Turner and Gilles Fauconnier, involves combining two or more sources. And you know, a smartphone is a beautiful example of a blend. Mermaids and centaurs are blends of humans and animals. Compromise is a form of blending, where you work out with another person, oh, you know, what is our middle ground? I'll contribute something to the solution, and you'll contribute something to the solution. Blending is so much a part of how we approach our world. Again, we take it for granted, but that's how we get metaphors and houseboats and fusion cuisine and so many of the things that are familiar to us in our world.
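To make the cell idea above concrete, here is a toy Python sketch of a phone being handed off between towers as it moves. The tower layout, the phone's path, and the nearest-tower rule are all invented for illustration; real networks hand off based on measured signal strength, not raw distance.

```python
# Toy illustration of the "breaking" behind cellular networks: one big
# coverage area is split into cells, each with its own tower, and a
# moving phone is simply re-assigned to whichever tower is closest.

from math import dist

towers = {"cell_A": (0, 0), "cell_B": (10, 0), "cell_C": (20, 0)}

def serving_cell(phone_xy):
    """Return the tower nearest the phone -- our stand-in for handoff."""
    return min(towers, key=lambda name: dist(towers[name], phone_xy))

# Drive east across the map; the call survives because each step just
# re-evaluates which cell should carry it.
current = None
for x in range(0, 21, 2):
    cell = serving_cell((x, 1))
    if cell != current:
        print(f"x={x:2d}: handoff to {cell}")
        current = cell
```

The point is only the breaking move itself: the area becomes many small cells, and continuity is recovered in software.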
Speaker 1: Actually, so let's talk about that for a second, about the things that are hidden behind the scenes and the things that are obvious. So one of the things about the world of art is that the bending, breaking and blending is overt; it's done so that everybody can see it. And in the case of scientific breakthroughs or technological breakthroughs, it's typically covert, as in most people don't see the thing that's happening there. But the argument we made is that it's actually the same processes happening under the hood. Tell us about that.

Speaker 2: Yeah, I think that's one of the arts' great cultural contributions. In a sense, they are contributing to science in externalizing internal features of how our minds work, in a way that is often hard to see. You know, there's a cool example with YouTube. When you're watching HD video, you're actually not watching HD video. You're watching a mosaic of different levels of resolution, where YouTube is monitoring your computer's bandwidth in real time and sending the resolution that'll make it through. And the hope is that if there's enough HD in that stream, you just think you're watching a perfectly wonderful HD video, and you don't realize that it's actually created by all these little fragments.

Speaker 1: So you mean what I'm seeing is some sort of rough video, and then when my bandwidth is better for a moment, I see the very clear high-definition video, and then I see some mixed-up stuff in there?

Speaker 2: If it's done well, it happens so fast that you don't notice the pebbles among the pearls, and you just think, oh, this is great, my streaming is working perfectly, I'm not freezing, and so on and so forth. And if the YouTube engineers have done their job, the creativity is completely hidden and you don't even know that it's there. You just think you're watching a video.
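A minimal sketch of the adaptive streaming idea just described, assuming a made-up bitrate ladder and bandwidth trace; real players (DASH, HLS) also buffer ahead and smooth their estimates. For each chunk of video, the player picks the highest resolution whose bitrate fits under the bandwidth measured at that moment.

```python
# Toy adaptive-bitrate picker: one rendition choice per video chunk.

LADDER = [("1080p", 5000), ("720p", 2500), ("480p", 1000), ("240p", 400)]  # kbps

def pick_rendition(bandwidth_kbps):
    """Highest rung of the ladder that fits the current bandwidth."""
    for name, rate in LADDER:
        if rate <= bandwidth_kbps:
            return name
    return LADDER[-1][0]  # nothing fits: fall back to the lowest quality

measured = [6000, 3000, 800, 4500, 350]  # per-chunk bandwidth samples (invented)
stream = [pick_rendition(bw) for bw in measured]
print(stream)  # ['1080p', '720p', '240p', '720p', '240p']
```

The resulting stream is exactly the "mosaic of different levels of resolution" described above: pebbles among the pearls, swapped in chunk by chunk.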
Speaker 2: What's wonderful and significant about the arts is that the goal is actually to share the inner workings of the creativity and have it be out in the open, so you see the bending, breaking and blending in a way that everyone can share. There's an exhibition that the installation artist Christian Marclay made where essentially he told the time by clips from movies that showed a clock in the background at that specific time. The way that installation works is it runs twenty-four hours a day. You can show up at any time you want and stay as long as you want, and each second is shown by a different scene from a movie. It's exactly the same principle at work as in the YouTube example, but now you're actually experiencing the bending, breaking and blending in front of you.

Speaker 1: Okay. So one of the questions that comes up often is you look at some new invention and you think, wow, that came out of the blue. For example, in two thousand and six, when Steve Jobs introduced the iPhone, one of the reporters there called it the Jesus phone, by which he meant it was an immaculate conception: nothing like that had existed before, and then it just showed up. But in fact, every idea has a genealogy that can be traced. In this case, you know, in nineteen ninety three, IBM had introduced a cell phone with a touch screen. Now this was, you know, much earlier technology, so it was a big brick of a cell phone and it had just a lousy little touch screen. But the idea is that there's a smooth progression from there to here. And this is true of all ideas. And this doesn't reduce anything about the amazingness of the idea, the importance of the idea. But everything has a genealogy, because we are remixing what's already in there and coming up with new versions of things. So the question is, you know, how do new ideas evolve?

Speaker 2: So new ideas evolve, we would say, through this cognitive process of bending, breaking, and blending.
We take our storehouse of experiences, and essentially our minds are like a food processor that scrambles it all up and comes up with new possibilities. And that ability to generate what-ifs and hypotheticals is frankly part of what makes it most fun to be a human being. I don't think we appreciate just how much improvisation is part of our daily lives. I mean, having a conversation is an improvisation. You know, cooking dinner can be improvising. It takes a lot of improvisation to raise your kids. Every time, we're making flexible solutions, combining our inputs and coming up with some new, wonderful way to, you know, entertain the kids or keep our loved ones interested in us. We are being creative, and all of that relies on the bending, breaking and blending.

Speaker 1: So given this, given that we absorb our world and remix it and spit out something new, how does your time and place matter?

Speaker 2: So one thing you've very beautifully shared with your podcast audiences is that a lot of our brain's development happens out in the world rather than in the womb. And that makes us neurologically incredibly susceptible to our environment and our upbringing. And you combine that with human creativity and the way that we build community, and you end up with incredible diversity around the world. I know this very well in music, my field, because you take somebody like Beethoven, who was probably the most experimental, radical European composer of his day, doing extraordinary things that were far ahead of his time. But he never asked people to detune their instruments to create beating. He didn't use breathing into a flute for expressive purposes, making extraneous noises and considering that part of the drama. But that was happening in honored traditions in Japan and the Far East at the same time, halfway around the world.
Their notion of what was beautiful, what was meaningful, was very different from Beethoven's, and to the people in each of those cultures, those were complete, self-contained, absolutely compelling worlds, and yet in many ways without much overlap.

Speaker 1: That's right, and this is why all new ideas evolve from the old, based on your culture. One of the pictures that we have in the book is the evolution of Greek helmets through time, and it's fascinating to see the way that the models evolve and change, but each one is clearly a function of what came before it. But if you look at war helmets from some other place in the world, they are quite different. And this is because we take whatever we have around us, whatever we absorb, and we use that as the fodder that we're operating on. So what's fascinating is that humans are so compelled to bend, break and blend all the time. We sort of never stop doing this. So tell us about that.

Speaker 2: You know, sometimes there's this fear that everything that can be done has already been done. There was even a book at the turn of the twentieth century that said all diseases had been cured, there was nothing left to do. And you know, artists are always anxious: it's all been done, there's nothing new. But the story of humanity is that there is no finish line, because this software is running in our brains all the time, and just by bouncing off of what came right before us, we're constantly pushing the envelope and going farther and farther.
And so, you know, you go from the Stone Age and the very first knives, to this amazing display in places like Polynesia of knives of every shape and size, to, you know, Swiss Army knives and the latest gizmos that are sold on late-night TV. And on and on it goes, without stopping. And there's really nothing where you can say, well, we reached the endpoint and it can't get any better. Including Stradivarius violins, where now they're making them out of carbon fiber. They're lighter, they're more durable, they don't warp in different weather, and people can't even tell the difference sometimes; the new instrument sounds just as beautiful as a Strad. Umbrellas: most of us think, okay, the umbrella, it's functional, it does what it needs to do, it keeps the rain off, and pretty much people have stopped working on that. Actually, so many people are trying to make umbrellas that there are four people in the patent office whose full-time job it is to review patents for new umbrellas. There's nowhere you look where you say, case closed, it's finished, just stop.

Speaker 1: Right. People are trying to patent umbrellas where, for example, the ribs bend the other way, or it's asymmetrical for the wind, or you wear it like a backpack, and on and on. Yeah, exactly. And in your field, in music, how do you see that, this constant compulsion to bend, break and blend?

Speaker 2: Well, you see that in covers of songs, in jazz improvisation. I mean, night after night, the same performer will go and play the same wonderful jazz standard, Somewhere Over the Rainbow, but it will never be the same way twice. And you know, if you walked up to that performer and said, well, you know, is that the last time you're ever going to do this, they'll look at you, you know, like, what are you talking about? The whole thrill of it is to constantly reimagine and, you know, come up with a new version.
Speaker 1: And you see this bending, breaking and blending in language all around us too.

Speaker 2: Language is really built as a wonderful bending, breaking and blending tool. For instance, there's French verlan slang, where syllables are flipped. So the word méchant, which means you're being mean, is chanmé. And originally it was developed by, you know, people in the underworld, so they could talk in code and the police wouldn't understand what they were saying. But verlan became so commonplace that now just about everybody in France uses it. And then abbreviations are great examples of breaking. The word gymnasium is the Greek word for exercising in the nude, and we very comfortably turned that into gym, with, you know, a less exacting dress code. And blending happens all the time in composite words: railroad, heartthrob, suitcase, etc. And in fact, one of the interesting things is different languages will blend different words to express the same idea. So we call it a railroad; in French, it's chemin de fer, which means iron road. And part of the fun of excavating languages is to see how different brains in different communities are marrying ideas in different ways.

Speaker 1: Yeah, you know, it's always been interesting to me how invisible these blends become to us. For example, I think it's the case that most people don't think about the word rainbow as, you know, a bow, like a bow and arrow, that you get from the rain. But that's how language evolves, by putting things together like that. Let's come back to this issue about science and art. So one example that is very cool is with chimeras. So in the world of art, there's this idea of putting together different creatures. You mentioned a mermaid before, where we're putting together a woman and a fish, and you get this mermaid. And what's interesting is that you see the same kind of things happening in science. So tell us about the goat named Freckles.

Speaker 2: So there's Freckles, a spider goat.
How did that come to be? Well, spider silk is one of the strongest materials in nature and could potentially be used for things like bulletproof vests. But the problem is it's very hard to harvest spider silk. You put enough spiders together and they eat each other. It takes literally millions of them just to create a few square yards of silk. So scientists were wondering, you know, is there a more efficient way that we could produce this? And there was a biologist who came up with the idea of splicing the gene for spider silk into the genetic code of a goat, and thus was born Freckles, the spider goat, who secretes spider silk in her milk. And they've been able to harvest significant quantities of that by creating an actual, real-life chimera.

Speaker 1: So is what is happening in the brain of a chemist different than what's happening in the brain of a composer?

Speaker 2: To some degree, of course: it takes special expertise and knowledge to be a chemist that's really different from the preparation it takes to write a piece of music. But the amazing thing about human beings is that we apply our creativity to everything we do. So there is a part of what we are born equipped with that is a general-purpose creativity software, which can be mobilized for anything we care to apply our creativity to. That, again, is one of the superpowers that we have as a species.

Speaker 1: This is really what has launched humans into the success that we've had compared to all our animal neighbors. You know, we've erected skyscrapers and composed symphonies and invented, you know, medicines, and gotten to the moon and stuff like that. Because each generation that's born doesn't have to, you know, just live the life of what happened before them. They can learn about that because of brain plasticity; they can absorb everything that came before them and then springboard off that into the future.
And the way they do that is by taking in all that data and then remixing it and coming up with new versions of things.

Speaker 2: No, I mean, again, not to take anything away from animals, but if you were like a guidance counselor for crocodiles fifty thousand years ago, and, you know, what are you going to do with your life? The answer would be pretty much the same then as it is now. And if you were a guidance counselor for the earliest hominids, I mean, to say they would be astonished at the possibilities of what a human being can do now, you know, would be a big understatement. And every single bit of that is due to creativity running in every single human brain.

Speaker 1: That's right. And we do see creativity happening in various ways in the animal world. There's just not nearly as much of it.

Speaker 2: So, yes, there's creativity in the animal kingdom, for sure, but it's very targeted. So there are birds that design very colorful, creative nests, but they sing the same song all the time. There are whale songs that change in pods over time, but they don't apply their creativity in other parts of their lives. What makes humans special is we treat creativity as this all-purpose, general-purpose tool that we apply to everything that we do.

Speaker 1: Right. So, as brains take in data, they're constantly bending and breaking and blending that data to come up with new things. And we do this more than other animals. And when we look at what is different or special about human brains, what we see are a few things. One of them is just the size of the cortex, the wrinkly outer part. We just have much more of it than any of our neighbors, relative to our body size. And one of the things that gives us is much more real estate in between the inputs and the outputs. So if you're a cat and I put some food in front of you, your visual system is going to see that and your motor system is going to eat the food.
But with humans, because we have more computational capacity in between the in and the out, we can say, okay, maybe I won't eat it, I'm on a diet, maybe I'll eat it later, things like that. We just have more options, more possibilities that we can take as a result. The other thing is we have these huge frontal lobes, that's the part behind the forehead, and this is what allows us to simulate what-ifs, possible futures. And this is a really big deal. And this is what I've suggested in a different episode is what, at least at the moment, as far as we can tell, separates us from AI. AI can be massively creative, because it's constantly bending and breaking and blending things; it does a really good job at that. But the thing that allows humans to do scientific discovery, that at least at the moment AI can't do, is to think about what-ifs and then evaluate those: okay, how would I know if I were riding on a photon and going the speed of light? What would things be like? What would things look like? That's what we are really good at doing.

Speaker 2: Building out what you're saying: a lot of the large language models are also aimed towards the mean. They're aimed towards the average; they're aimed towards the most common solution, because that makes the AI more likely to be correct. And humans are able sometimes to choose unlikely propositions and then develop them and build them. So, you know, you think about the theory of relativity, for instance. Einstein comes up with the idea: wait, the speed of light is constant to all observers. Rather than trying to fold that into our real-life experience and our common sense of how the world works, he's able to go beyond that, beyond any available data that he had in his time, to realize that that meant you get heavier as you approach the speed of light, that absolute simultaneity doesn't exist, that there's no such thing as superluminal connections.
And right now, AI systems are totally constrained by their data, and they will always attempt to normalize it.

Speaker 1: What's interesting is that you can push the AI, by the way, and force it to give you less and less probable answers. And interestingly, people talk about this as hallucination in these large language models. But another way to view this is hypothesis generation. It's really great, actually, at saying, okay, well, maybe this, maybe that. The interesting part is it doesn't have the capacity right now to say, hey, here's an interesting hypothesis, now I'm going to actually evaluate that and think about it that way. It just doesn't have the capacity to do that. But what it does seem to have is the capacity to bend, break, and blend, which is very cool. So, you know, ten years ago, when people were talking about computers, people said, okay, well, computers can't be creative, and the reason was because they weren't doing bending, breaking, blending. They were just repeating whatever was put in. Now they're actually doing that remixing all the time, which allows them to be creative. The interesting part is there are actually two sides to creativity. One is the generation of new ideas, and the other is the filtering. You say, okay, well, that was not so good, that was so good. And humans, you know, we happen to know what it is like to be other humans, and so we're pretty good at saying, okay, I'm going to bag that one and kill that one, because that won't be interesting to my fellow people. But computers at the moment aren't so good at the filtering part.

Speaker 2: I one hundred percent agree. That gets back to the whole time and place. We're born into a particular cultural situation, and we are very adaptable, and we're very sensitive to the community and how they will respond: what's taking things too far, what's so implausible no one will understand it, what's so ordinary it'll just get overlooked.
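For readers who want the mechanics behind "pushing the AI toward less probable answers": in most language models this is done with a temperature parameter that reshapes the output distribution before sampling. A minimal sketch, with invented scores standing in for a model's real outputs:

```python
# Softmax sampling with temperature: low temperature concentrates on the
# most likely option (the "mean"); high temperature flattens the
# distribution so rarer, hypothesis-like options get sampled too.

import math
import random

def sample(scores, temperature):
    """Divide scores by temperature, softmax, then draw one option."""
    exps = [math.exp(s / temperature) for s in scores.values()]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(list(scores), weights=probs, k=1)[0]

# Invented options and scores, purely for illustration.
scores = {"the obvious idea": 4.0, "a plausible twist": 2.0, "a wild hypothesis": 0.5}

random.seed(0)
print([sample(scores, 0.3) for _ in range(5)])  # almost always the obvious idea
print([sample(scores, 2.0) for _ in range(5)])  # more varied; rarer ideas appear
```

Low temperature keeps the model near the mean described above; high temperature is closer to the hypothesis-generation mode, spraying out less probable candidates that still need a filter.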
Speaker 2: And the computer right now isn't able to evaluate itself in that way, exactly as you're saying. It doesn't recognize when it has its own best ideas and then say, wait a second, that was a really good one, let me build on that. And it has no sense at all of: this one will fly, and that joke is just not funny at all.

Speaker 1: Right, because it doesn't know what it is like to be a human and find something funny.

Speaker 2: Right. Yes. And I think one of the most interesting things about AI creativity as it's developing is the light it shines on some of the unexplored and underappreciated facets of human creativity. So, for instance, AI is essentially a great improviser. It works on the same principle as the "yes, and" principle in theater: going from one thing to the next, to the next, to the next, but never looking backwards, never questioning itself, never saying, oh, well, now that I ended up here, let me go back and revise the first chapter. We have this incredible time-jumping, reverse-chronology, flexible way of looking at things holistically and granularly. I mean, again, one is just awestruck at the power of the machinery that we have, in terms of the flexibility of turning things right side up, upside down, and inside out.

Speaker 1: So how does our social context affect our creativity?

Speaker 2: So it's great to have this machinery we've been talking about working in our heads, but what really supercharges it is our need to engage each other. As you've shared on the podcast, brains are infotropic, so they're attracted to new information, and that's really critical for social bonding. Imagine waking up every day and saying the same thing to your loved one, getting them the same present for their birthday, cooking them the same thing for dinner.
It's hard to imagine that a relationship is going to last very long, because we tune out to the familiar. And there's a phenomenon, repetition suppression, where the more a stimulus is repetitive or predictable, the less attention we pay to it. And you can see this on neuroimaging scans, where the first time someone is shown a surprise, you know, a huge part of their brain lights up. But if you keep repeating that surprise at regular intervals, eventually it becomes a tiny little dot in the brain. Their brain has made it part of its internal model of reality and doesn't care anymore. And that is fatal to social relationships. And as essentially the most social species on the planet, we learned to leverage our creativity to keep the people around us enchanted. You know, it's often said that the birth of human culture was sitting around a campfire telling stories after hunting woolly mammoths, and to a large degree, that probably gets to the heart of why we are such a creative species.

Speaker 1: That's right. Yeah, you see it at a dinner party: someone tells a funny story and everyone laughs. You would never see someone then tell that story a second time right then, right? Exactly. And nobody watches the basketball game from last week a second time. There's no reason to watch it a second time. What we care about is the surprise, the new information. And this is because the brain is constantly seeking to update its internal model, and it does so by surprise. When it sees something that it was not expecting, then it knows: oops, there's a place where I need to improve the model over there. And so that's why we're totally attracted to surprise and things that are novel.

Speaker 2: And I would go so far as to say that creativity became one of our most useful tools for self-generating surprise.
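The repetition suppression described above can be caricatured in a few lines of Python: treat the brain's response as proportional to prediction error, and let each repetition of the same stimulus shrink that error as the internal model absorbs it. The learning rate and starting value are arbitrary; this is a toy model, not neuroscience.

```python
# Toy repetition-suppression curve: surprise decays as the model learns.

def responses(n_repeats, learning_rate=0.5, initial_error=1.0):
    """Response to each presentation of an identical stimulus."""
    out, error = [], initial_error
    for _ in range(n_repeats):
        out.append(error)            # big response = big surprise
        error *= (1 - learning_rate) # model updates; surprise decays
    return out

print(responses(6))  # [1.0, 0.5, 0.25, 0.125, 0.0625, 0.03125]
```

The first presentation gets the full response; by the sixth, the same stimulus barely registers, which is the "tiny little dot" on the scan.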
In other words, we like our internal model 575 00:36:04,600 --> 00:36:08,520 Speaker 2: of reality to always be under construction because that keeps 576 00:36:08,600 --> 00:36:12,560 Speaker 2: us very engaged and alert, and in a sense partners 577 00:36:12,600 --> 00:36:16,440 Speaker 2: with consciousness in keeping us present in the moment and 578 00:36:16,440 --> 00:36:17,479 Speaker 2: connected to each other. 579 00:36:18,120 --> 00:36:20,840 Speaker 1: Yeah. And in the animal world we see this trade 580 00:36:20,840 --> 00:36:26,120 Speaker 1: off between exploration and exploitation. In other words, animals explore 581 00:36:26,160 --> 00:36:28,040 Speaker 1: their environment and they try to figure out what's going on, 582 00:36:28,239 --> 00:36:30,279 Speaker 1: and once they get it, okay, I can get the 583 00:36:30,360 --> 00:36:33,640 Speaker 1: grubworms under these rocks or whatever, they spend about eighty 584 00:36:33,680 --> 00:36:36,520 Speaker 1: percent of their time exploiting the data that they know, 585 00:36:37,000 --> 00:36:40,240 Speaker 1: but across the animal kingdom they spend about twenty percent 586 00:36:40,239 --> 00:36:43,120 Speaker 1: of their time exploring new things. And it turns out 587 00:36:43,160 --> 00:36:46,839 Speaker 1: this is evolutionarily very useful because you never know when 588 00:36:46,840 --> 00:36:49,239 Speaker 1: the world is going to change. So if you're constantly 589 00:36:49,600 --> 00:36:53,200 Speaker 1: keeping a finger on the pulse of other possible paths, 590 00:36:53,760 --> 00:36:57,520 Speaker 1: that's the way to stay alive when things change. 591 00:36:57,840 --> 00:37:01,560 Speaker 2: And in a sense, those are the evolutionary roots of our creativity. 592 00:37:01,640 --> 00:37:04,080 Speaker 2: We have the rest of the animal kingdom to thank 593 00:37:04,480 --> 00:37:08,600 Speaker 2: for that exploration necessity being built into our brains, and 594 00:37:08,680 --> 00:37:12,120 Speaker 2: then our need to build community kind of took that 595 00:37:12,160 --> 00:37:13,000 Speaker 2: to the next level. 596 00:37:17,960 --> 00:37:20,960 Speaker 1: So that was my friend Anthony Brandt, and in the 597 00:37:20,960 --> 00:37:24,600 Speaker 1: book we co-authored, The Runaway Species, we examined how 598 00:37:24,719 --> 00:37:30,360 Speaker 1: human brains generate creativity by constantly remixing their inputs, and 599 00:37:30,480 --> 00:37:34,760 Speaker 1: this, we argue, is part of the basic operating system 600 00:37:34,800 --> 00:37:37,239 Speaker 1: of the brain. So the way to think about this 601 00:37:37,480 --> 00:37:41,520 Speaker 1: is if you look at a graphics program that says, hi, 602 00:37:41,640 --> 00:37:45,920 Speaker 1: I rotate photographs by forty-five degrees. The program doesn't 603 00:37:45,960 --> 00:37:48,960 Speaker 1: care what the photograph is. The photo can be of 604 00:37:49,080 --> 00:37:53,560 Speaker 1: ostriches or kazoos or volcanoes or whatever. The program doesn't care. 605 00:37:53,880 --> 00:37:58,480 Speaker 1: It just performs the rotation operation on the pixels. It 606 00:37:58,640 --> 00:38:01,279 Speaker 1: just doesn't matter what the photograph is about. And this 607 00:38:01,520 --> 00:38:05,439 Speaker 1: is the same thing your brain is doing.
You are 608 00:38:05,600 --> 00:38:09,239 Speaker 1: born in your moment of space and time, and you 609 00:38:09,440 --> 00:38:13,719 Speaker 1: move through the world and absorb the art and science 610 00:38:13,800 --> 00:38:17,400 Speaker 1: and language around you, and that's what you remix. So, 611 00:38:17,800 --> 00:38:21,160 Speaker 1: as Anthony said, if you're born in Japan five hundred 612 00:38:21,239 --> 00:38:24,520 Speaker 1: years ago, you will write one kind of music, and 613 00:38:24,560 --> 00:38:27,960 Speaker 1: if you are Beethoven, born in Europe at the same time, 614 00:38:28,360 --> 00:38:31,640 Speaker 1: you'll write a different kind of music because you're absorbing 615 00:38:32,120 --> 00:38:35,520 Speaker 1: different data. And if you are in New York in 616 00:38:35,560 --> 00:38:38,560 Speaker 1: the current day, you have a very different set of 617 00:38:38,640 --> 00:38:42,840 Speaker 1: worldwide music to draw from. And if you're composing twenty 618 00:38:42,840 --> 00:38:46,960 Speaker 1: five years from now, your diet will consist of AI 619 00:38:47,160 --> 00:38:50,840 Speaker 1: music and things you never even realized you hadn't thought 620 00:38:50,840 --> 00:38:55,920 Speaker 1: of: new innovations driven by humans and by machines, and 621 00:38:56,040 --> 00:38:59,319 Speaker 1: that's what you'll take to be the background furniture, and 622 00:38:59,360 --> 00:39:04,240 Speaker 1: you'll build on that. So, given what we talked about today, 623 00:39:04,239 --> 00:39:07,520 Speaker 1: I want to circle back to the controversies I highlighted 624 00:39:07,520 --> 00:39:10,480 Speaker 1: at the beginning of the episode, where OpenAI is 625 00:39:10,520 --> 00:39:14,920 Speaker 1: getting sued and Under Armour is getting criticized because people 626 00:39:15,000 --> 00:39:19,120 Speaker 1: are saying that the AI is merely absorbing what is 627 00:39:19,160 --> 00:39:22,719 Speaker 1: out there and then remixing its own versions. And I 628 00:39:22,760 --> 00:39:26,840 Speaker 1: think it's an interesting question to ask whether humans have 629 00:39:27,080 --> 00:39:32,279 Speaker 1: ever done otherwise, because that's what human creativity is, and 630 00:39:32,320 --> 00:39:36,440 Speaker 1: we typically like to compliment ourselves and tell the story 631 00:39:36,719 --> 00:39:39,720 Speaker 1: of how we came up with things out of the blue, 632 00:39:40,320 --> 00:39:44,799 Speaker 1: leveraging our individual brilliance. But the fact is we are 633 00:39:44,840 --> 00:39:47,680 Speaker 1: each born in a particular place, in a particular moment 634 00:39:47,719 --> 00:39:50,279 Speaker 1: of time, and all we ever have to work with 635 00:39:50,719 --> 00:39:55,240 Speaker 1: are the pieces and parts that our culture provides, pieces 636 00:39:55,239 --> 00:39:58,400 Speaker 1: and parts that have each been touched on and worked 637 00:39:58,440 --> 00:40:02,839 Speaker 1: over by the people before us. So, although it's a 638 00:40:03,000 --> 00:40:06,960 Speaker 1: jagged pill to swallow, it's interesting to consider this possibility. 639 00:40:07,040 --> 00:40:11,000 Speaker 1: Let's think about that Under Armour commercial that was accused 640 00:40:11,040 --> 00:40:15,200 Speaker 1: of copying the styles of work from several other creatives 641 00:40:15,239 --> 00:40:20,000 Speaker 1: without proper acknowledgment. If a human director had made that 642 00:40:20,280 --> 00:40:23,560 Speaker 1: same ad a few years ago, we would have agreed.
643 00:40:23,719 --> 00:40:27,680 Speaker 1: Of course, he took inspiration from the work of other creatives. 644 00:40:27,719 --> 00:40:30,800 Speaker 1: That's his diet, that's what he has to draw from, 645 00:40:31,120 --> 00:40:34,120 Speaker 1: and we certainly would have no expectation that at the 646 00:40:34,239 --> 00:40:37,600 Speaker 1: end of his commercial he would list all the sources, 647 00:40:37,640 --> 00:40:41,120 Speaker 1: all the other commercials from which he took inspiration. We 648 00:40:41,160 --> 00:40:46,839 Speaker 1: wouldn't say, hey, you remixed all these inspirations without proper acknowledgment. 649 00:40:47,239 --> 00:40:50,640 Speaker 1: And in fact, this is what companies like OpenAI and 650 00:40:50,760 --> 00:40:54,600 Speaker 1: Meta are arguing in court now: that their language models 651 00:40:55,080 --> 00:40:58,480 Speaker 1: learn from books the same way that humans do. The 652 00:40:58,600 --> 00:41:04,200 Speaker 1: language models produce original work that is a transformation of their sources, 653 00:41:04,520 --> 00:41:08,279 Speaker 1: just as humans do. Therefore, these companies are arguing in 654 00:41:08,320 --> 00:41:12,880 Speaker 1: court that this training is legal. It is quote quintessential 655 00:41:13,040 --> 00:41:17,799 Speaker 1: fair use unquote. So we're in an interesting time. Is 656 00:41:17,840 --> 00:41:22,759 Speaker 1: it just human chauvinism where we're okay remixing things ourselves, 657 00:41:22,800 --> 00:41:25,000 Speaker 1: but we don't want a machine doing it because it 658 00:41:25,080 --> 00:41:27,759 Speaker 1: can do it faster? I don't know the answer to this, 659 00:41:28,000 --> 00:41:31,680 Speaker 1: but I do think the questions about creativity will gain 660 00:41:32,200 --> 00:41:36,160 Speaker 1: refinement as we get clearer on what human brains actually 661 00:41:36,239 --> 00:41:40,120 Speaker 1: do and what AI does, and what these have in common. 662 00:41:41,000 --> 00:41:43,200 Speaker 1: In any case, one of the great pleasures of being 663 00:41:43,239 --> 00:41:47,600 Speaker 1: here on this planet, whether we're surrounded by other humans 664 00:41:47,719 --> 00:41:51,600 Speaker 1: or by machines, is that, even though we often feel 665 00:41:52,280 --> 00:41:55,200 Speaker 1: like everything that can be done has already been done, 666 00:41:55,840 --> 00:41:59,520 Speaker 1: there's just no end on the horizon for what this 667 00:41:59,640 --> 00:42:05,239 Speaker 1: runaway species and our inventions will create next. 668 00:42:09,600 --> 00:42:12,800 Speaker 1: Go to eagleman dot com slash podcast for more information 669 00:42:12,920 --> 00:42:16,200 Speaker 1: and to find further reading. Send me an email at 670 00:42:16,320 --> 00:42:19,920 Speaker 1: podcast at eagleman dot com with questions or discussion, and 671 00:42:20,120 --> 00:42:23,600 Speaker 1: check out and subscribe to Inner Cosmos on YouTube for 672 00:42:23,760 --> 00:42:27,960 Speaker 1: videos of each episode and to leave comments. Until next time. 673 00:42:28,320 --> 00:42:32,960 Speaker 1: I'm David Eagleman, and this is Inner Cosmos.