Jacob Goldstein: Pushkin.

Jacob Goldstein: A few weeks ago, I went to Chicago to interview two people on stage about creative work they've done using artificial intelligence. One of them was Stephen Marche. He's a writer: he's done nonfiction books, novels, magazine articles, and earlier this year he used AI to help him write a short novel called Death of an Author. That book, by the way, was published in audio form by Pushkin Industries, the same company that publishes this podcast. The other person on stage with us was Lucas Cantor. Lucas is a composer. Among other things, he's won a couple of Emmys for his work scoring the Olympics for NBC, and he co-produced a Lorde song that was in one of the Hunger Games movies. The reason he was there talking with us: Lucas used AI to help him finish Schubert's Unfinished Symphony. It was a really interesting conversation, and I thought it would make a great episode of What's Your Problem? So here it is.

Speaker 1: Now, please join me in welcoming our panel.

Jacob Goldstein: So let's do a thing you're never supposed to do in narrative: let's answer the question right at the beginning. The headline question for this panel is, will AI kill creativity? I want to ask both of you to answer in one word, at the same time, on the count of three. It's not "one, two, three, go"; you're going to go on the count itself. Okay: will AI kill creativity? One, two, three.

Stephen Marche and Lucas Cantor: No.

Jacob Goldstein: Great, done, let's go. Thank you, thank you very much. So, I'm delighted to be here with both of you, in particular because you have made things with AI. There have been countless panels of people sort of waving their hands about the theory of AI or the future of AI, but I love that we're here talking about things that you have made, creative work that you've made.
Jacob Goldstein: And so what I want to do is start by talking a little bit about process. I love talking about how people make creative things. We'll just do that in order, frankly, because I want to get into first the book and then the symphony, and then we can talk more generally about AI and creativity and humanity, and then we can wave our hands in that classic hand-wavy way. So Stephen, let's start with you. I want to read an excerpt from your book, in part because this book that was written with AI has a very particular, I don't know, quality to the prose. There's a really interesting feel to the prose, and I don't know if you'll quite get it from one paragraph, but I want to give you something to hold on to as we're talking about the book. So, I think I have this right: the passage I'm going to read is in the first person, and in the book it's spoken by a digital avatar, an AI avatar of a dead author whose death is the title of the book. The passage, in her voice, goes like this: "I learned the limits of machines when they wanted me to fly bombers. They were going to force me to push a button that would end the world. I hope you can understand that my stance as a pacifist wasn't cowardice or principle, but a confession. I could never bring myself to press that button. Human beings cannot stop making buttons, and once we've made them, we can't stop pushing them." Pretty good for a machine. Really, pretty good for a machine. I'm going to read that last sentence again, because I like it and because it comes up a couple of times in the book: "Human beings cannot stop making buttons, and once we've made them, we can't stop pushing them." So maybe, Stephen, we should actually start with that sentence. It's a great sentence, I think, or a really interesting sentence.
It sounds like a sentence a human being would write, and it ends up being important in the book, sort of thematically. How did the machine write that sentence?

Stephen Marche: Okay, let me see if I can get it exactly right. So that was the first-person voice from Death of an Author. Jacob came to me in February and said, we need to release this thing.

Jacob Goldstein: This is Jacob Weisberg, the person who runs Pushkin, which is the company where I make a podcast. Let's actually start at the beginning of the story, and then we'll get to that sentence. So Jacob Weisberg came to you in February...

Stephen Marche: ...and said, can you write a book that's generated by AI? In fact, he said, can you create an AI author and then have that author create a book? Now, I'd been working on this for a while. I'd written my first algorithmically generated story for Wired in 2017, which was before the Transformer, so the Dark Ages of AI, really. So I said, yes, I can definitely do that. It'll be about ninety-five percent computer generated. If I want to change "he" to the character's name or something like that, I want to be able to do that without forcing all these iterations and so on. Basically, I used GPT-4, and I would use it to generate text. I knew from having done AI text before that AI is very poor at generating plots, and it's very poor at certain other tasks, but it's incredibly good at style. So I would have very clear ideas of where the narrative was going, and I'd give very specific grammatical and syntactical commands: write a paragraph with high variability, very, very specific commands like that.

Jacob Goldstein: Wait, give me an example, in its entirety, of a command.
Stephen Marche: It would be almost impossible to give you one exactly. It's like what you've seen for visual stuff: to get really interesting AI-generated pictures, you often have like a hundred different references.

Jacob Goldstein: It's almost impossible, but just give me something. Give me something.

Stephen Marche: "Write a hard-boiled detective story paragraph with variability between short and long sentences and clear, elegant syntax, containing the following information," and then you write out the information, and it would generate that. Then I would take that and put it into a program called Sudowrite.

Jacob Goldstein: Wait, just before we go to the next program: when you say "containing the following information," what would that be? Would it be like, "the author says..."?

Stephen Marche: Well, that would be slightly different, because with characters I would use a whole different set of commands. The author here was basically a combination of Margaret Atwood and my dead father, because I was writing this thing fast, and I needed a character that I would automatically be interested in.

Jacob Goldstein: I should say, you're Canadian, so Margaret Atwood is basically the next closest thing after your father.

Stephen Marche: If one were alive for you, yes, right. And so I would say, write something like Sylvia Plath meets Philip Roth meets a bunch of different other things, and it would get at it.

Jacob Goldstein: So you're doing a very specific character. And then do you do all of the sort of exposition or plot points? In terms of substance, what is an example of what you might put in?

Stephen Marche: Well, that would probably actually be mostly the machine, but a plot detail would be something like, "she walks to a bridge."
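To make the workflow Stephen describes concrete, here is a minimal sketch of the command-plus-plot-details generation step, assuming the OpenAI Python client. The model name, helper function, and exact prompt wording are illustrative assumptions, not his actual setup.

```python
# A minimal, illustrative sketch of the drafting step Stephen describes:
# a precise stylistic command plus the plot details the paragraph must
# contain. Assumes the OpenAI Python client; model name, helper name,
# and prompt wording are guesses, not his actual setup.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_paragraph(plot_details: str) -> str:
    """Generate one paragraph from a very specific stylistic command."""
    prompt = (
        "Write a hard-boiled detective story paragraph with variability "
        "between short and long sentences and clear, elegant syntax, "
        "containing the following information: " + plot_details
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,  # looser sampling for more varied prose
    )
    return response.choices[0].message.content

print(draft_paragraph("She walks to a bridge."))
```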
Jacob Goldstein: But this paragraph about the buttons, "I wouldn't press the button," how do you get that? Would it be something like, "the character..."

Stephen Marche: It would be something like, "the character reminisces about her time as a pilot and expounds philosophically on the difference between AI and being a fighter pilot."

Jacob Goldstein: Or, "the character expounds on being a pacifist in the military."

Stephen Marche: Exactly. And sometimes more, sometimes less. I tried to give it as little as possible, but you want specificity there: the more precise the command, the better the information.

Jacob Goldstein: But the more precise the command, the more it's just you writing it, with a weird kind of intermediation.

Stephen Marche: It is my creation, right. This is a tool, and I will say the same thing every time, because this is the thing people don't understand: of course this is a creative act. It's just a different creative act. This is one hundred percent me. It's just that I didn't write the words. And that's weird. It's very weird.

Lucas Cantor: You didn't write the words that ended up in the book, but you wrote the words that were the instructions to the machine to write the words.

Stephen Marche: Which is as good as true of any computer program.

Jacob Goldstein: Okay, so I want to get back to the specific process narrative. You put this very specific prompt into GPT-4, which is basically ChatGPT.

Stephen Marche: I would say it's actually better. GPT-4 then was better than what ChatGPT is now for creative stuff.

Jacob Goldstein: Then you get some output, you get the paragraph, and then what?

Stephen Marche: Usually it's very bad, right? So then you take that and you put it into a program called Sudowrite, and Sudowrite is a stochastic writing instrument.
You then select the text, and you say "shorten," "lengthen," and then it has another button, a customized feature, which is "make it sound like X": make it sound like Ernest Hemingway, make it sound like F. Scott Fitzgerald. And of course, the thing I figured out very quickly is that if you want something to sound like Margaret Atwood, the very last thing you should do is put in "make it sound like Margaret Atwood."

Jacob Goldstein: That's not obvious, of course, to me.

Stephen Marche: Well, of course, because Margaret Atwood isn't trying to sound like Margaret Atwood. She's trying to sound like Sylvia Plath meets Philip Roth meets a bunch of other things.

Jacob Goldstein: Right, and then you ultimately always get her back.

Stephen Marche: Yeah. And so the way you get interesting things in this text is by essentially folding these layers of style onto each other.

Jacob Goldstein: And then Sudowrite has some output. Is that output what we're reading in the book?

Stephen Marche: Correct. Or, you know, if I don't like it, I just try again: refresh, refresh, refresh, until I get something that I like. So this is very much a creative act.

Jacob Goldstein: And you're doing that basically a paragraph at a time?

Stephen Marche: Yeah. Well, with dialogue it would be a lot longer, because you want flow, so I could do up to maybe five hundred words of dialogue at a time. That would have been part of a much longer series of instructions.

Jacob Goldstein: So this sentence, "human beings cannot stop making buttons, and once we've made them, we can't stop pushing them," a nice sentence, you know, a big idea.

Stephen Marche: I certainly didn't think of that.

Jacob Goldstein: You didn't? It just came out of some refreshing?
Stephen Marche: In a sense, I mean, obviously I made it, and I authorized it too. You know, I've compared it in The Atlantic to doing hip-hop, in the sense that you're folding things on top of each other: you're folding styles and metrics and effects on top of each other until you get something new and weird. And I would say about twenty times during the course of writing it, I felt like I was putting my hand up against something new and weird. That's fun, right? But for most of the process, it's just a writing tool. It writes for you, you decide if it works, and you tell it what to write in...

Jacob Goldstein: ...a very granular way.

Stephen Marche: The more granular the better, just like writing normally: the more planning you have for any essay, the better the essay is going to be. And in this case, you have a plan, and then you have the editing process, and in between there's this machine. But how much does that matter? Actually, I don't know. Maybe twenty times it did matter, where it was like: oh, that's not something I would have written, but it's very beautiful. And it's very strange. You know, there's a Danish journalist who deals with Go players who watch AIs play Go against each other, and they say it's like listening to an alien make music, because it's not how they would play Go, it's not how a human could play Go, but it obviously makes sense on some level. That's how I felt. Most of the time it's just a writing machine that does what I tell it, and then I correct it. But maybe twenty times, you feel this new presence. That's what's exciting.
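The "folding layers of style onto each other" and the refresh-until-you-like-it loop can be sketched the same way. Sudowrite is a GUI tool, so plain chat-completion calls stand in here for its shorten / lengthen / "make it sound like X" buttons; the style list and function names are hypothetical.

```python
# An illustrative sketch of "folding layers of style onto each other,"
# with plain chat-completion calls standing in for Sudowrite's
# "make it sound like X" button. Styles and names are hypothetical.
from openai import OpenAI

client = OpenAI()

STYLE_LAYERS = ["Sylvia Plath", "Philip Roth", "Ernest Hemingway"]

def restyle(text: str, style: str) -> str:
    """One 'make it sound like X' pass over the current draft."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": (
                f"Rewrite the following paragraph so it sounds like {style}. "
                f"Keep every plot detail intact.\n\n{text}"
            ),
        }],
        temperature=0.9,
    )
    return response.choices[0].message.content

def fold_styles(draft: str) -> str:
    """Fold each style layer onto the draft in turn."""
    for style in STYLE_LAYERS:
        draft = restyle(draft, style)
    return draft

def refresh_until_accepted(draft: str, accept) -> str:
    """The 'refresh, refresh, refresh' loop: regenerate until the
    human editor (the accept callback) likes a candidate."""
    while True:
        candidate = fold_styles(draft)
        if accept(candidate):  # in practice: a person reading and judging
            return candidate
```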
Jacob Goldstein: We'll be back in a minute to hear how Lucas Cantor used AI to help him finish Schubert's Unfinished Symphony.

Jacob Goldstein: Okay, back to the conversation in Chicago with Stephen Marche and Lucas Cantor. Lucas's story of using AI to finish Schubert's Unfinished Symphony goes back to 2019, when he was approached by a Chinese tech company called Huawei. They said: we want our phone, which runs AI, to finish Schubert's Unfinished Symphony. They didn't know what that meant, but they had a tech team in place that was running the AI.

Lucas Cantor: And I knew those people. That's why, I think, they brought me in. My friend, the technologist who brought me in on this project, told me he thought I would be a good fit because I have a corporate-friendly bio, so they could say, oh, he can do it. And he said, you don't have to say that part. He said: I know you can command an orchestra, but I don't think you'll be precious about the project. Meaning, he didn't think I would say, oh, well, this is heresy, we shouldn't take Schubert's perfect work, work so perfect that he didn't even finish it, and do something with it. And yeah, I think they thought that I would just press a button on the phone, a symphony would come out, and somehow a bunch of musicians would play it.

Jacob Goldstein: So what did they need you for, if they could just push the button?

Lucas Cantor: That was the conversation we had. Eventually I was on a call with them and I said: look, what you're asking for doesn't, in principle, exist. And what do you even want the machine to do? Do you want it to generate audio for you? Do you want it to generate a score? Do you want it to perform the score?
So right off the bat, this was a fascinating project, because I had to think about the very nature of music to even get started. I don't know if that answers the question...

Jacob Goldstein: I think it does. I just wanted you to set yourself up, and I think you've done it.

Lucas Cantor: I think I'm set up, so I'm going to try something new for you today. On the prep call for this event, I said something that I don't often say out loud, but that I've realized is a hallmark of my presence on stage: I like to do things that might spectacularly fail, in the hopes that they will be entertaining to an audience. So I'm going to do one of them for you now. I wrote a little thing about the Unfinished Symphony, and I'm going to explain it while I'm playing some music in the background, basically scoring it as I'm talking. So wish me luck, and hopefully it'll be interesting. This is how the Unfinished Symphony starts. A symphony has four movements, but Schubert only wrote two, and sketched a third, of his Eighth Symphony, the Unfinished Symphony. No one knows why he abandoned it, but he did, and now it's probably his most famous work, along with his greatest hit, Ave Maria. Some scholars believe that Schubert couldn't find a way to fit the Eighth Symphony into the orthodoxy of the time, which forbade three movements in a row in triple meter, meters like 3/4 and 6/8. But I don't believe this. Schubert showed little reverence for orthodoxy during his short life, and the AI that I used to finish Schubert's Unfinished Symphony didn't believe it either. At first, we trained the AI on recordings of Schubert's entire catalog, then prompted it with the first two movements of the Unfinished Symphony.
Seems like a reasonable strategy, right? This was the result. It sounds like cats walking on a piano, but it was actually pretty logical. Recorded music has almost no mathematically discernible patterns to it, so from the AI's perspective the input was nonsense, and more nonsense was a logical output. Music as an abstraction is math, but music in practice is convention. Music is understood by groups of humans, and like any language, music doesn't have objective meaning. Music is emotionally inert. I left myself a water break here. A symphony is like a skyscraper: it's enormous, but every inch of it is designed in meticulous detail. It's beautiful on the outside, but the inside is filled with utilitarian solutions to simple problems. A skyscraper has electrical columns to distribute power throughout the building, it has plumbing, it has elevators, but you don't see any of this essential detail when you admire the building from outside. A symphony is like a skyscraper, but a recording of a symphony is like a skyscraper's facade. There is no way to tell, from photos of even a million facades, that skyscrapers should have electricity, bathrooms, and a way for humans to move from one floor to another. Similarly, there is no way to tell, from the morass of frequencies that is a piece of recorded music, which frequencies are the most important. There we go. So analyzing recorded music got us nowhere, and I thought the best way to proceed was to simplify the task and train the AI on the blueprints of music rather than on the finished building. What you just heard, what you're hearing now, is the main theme from the Unfinished Symphony. Here it is again; really listen, and try to listen for the melody. And here is that same theme reduced to its blueprint. This structure, this blueprint in music, is just a simple melody.
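The conversation doesn't say what tooling Lucas's team used for this reduction, but here is a plausible sketch of stripping a score down to its melodic blueprint with the open-source music21 toolkit. The file name, and the assumption that the melody sits in the top staff, are illustrative.

```python
# A plausible sketch of reducing a score to its melodic "blueprint":
# keep only pitches and durations from the top staff, dropping chords,
# dynamics, and articulation. music21 is a stand-in; the team's actual
# tooling isn't specified, and the input file is a placeholder.
from music21 import converter, note, stream

def extract_blueprint(path: str) -> stream.Part:
    score = converter.parse(path)
    top_part = score.parts[0]  # assume the melody lives in the first staff
    melody = stream.Part()
    for n in top_part.flatten().notes:
        if isinstance(n, note.Note):  # skip chords; keep single notes
            melody.append(
                note.Note(n.pitch, quarterLength=n.duration.quarterLength)
            )
    return melody

blueprint = extract_blueprint("schubert_symphony_8.xml")  # placeholder file
blueprint.write("midi", fp="blueprint.mid")
```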
Lucas Cantor: So my team and I went to work extracting just the melodies from as much of Schubert's music as we could get our hands on. These are some examples of the melodies we extracted. They sound robotic because they are; they sound emotionally inert. But these are Schubert's melodies reduced to their simplest forms, the forms that human composition students would use when beginning a study of Schubert. Your ear knows how to pick a melody out of a dense arrangement, but an untrained AI cannot do this. Since the results we wanted were simple, we needed to train the AI on simple data. So we trained on hours of these simple melodies and then prompted again, this time with the Unfinished Symphony reduced to its blueprint, and these were some of the results. This is what it suggested might be something Schubert would have written. These are simple, but much more musical than the cats walking on a piano that came from the audio-only training data. This one, for some reason, caught my attention. Let's hear it again. I liked it, so I selected it for embellishment; I decided to use this blueprint. This melody is a bit more modern-sounding than any of Schubert's work, but if Schubert had lived to old age, these sonorities would have been available to him. The orthodoxy around triple meters and other constraints of form would have given way to the exploration of the Romantic period. Providing simple, singable melodies is perhaps not how most people would imagine an AI being useful in writing a symphony. But what is a symphony? Typically, people think about a symphony as something that you hear, with the score just a byproduct of the notated sounds. But to me, the sound is the byproduct, and the symphony is something that you see, something that you read. It's a collection of abstract ideas in abstract notation, markings on a page that serve as instructions for how to create sounds.
A symphony itself is a blueprint, and that blueprint will be executed differently at every performance. Let's just check out this music for a second; it's pretty cool. The sounds are a byproduct of the abstractions expressed in the notation, and that byproduct is what the audience experiences as a symphony. The byproduct is what you hear. I didn't know that I thought about music in this way until I had to explain how I think about music to a machine. This project taught me to question the assumptions I make when thinking about my own craft. I think this is the job of the AI-assisted composer today: to think about what we know, and to guide our audience to rethink what happens inside their own minds. I think it's our job to question orthodoxy. I think it's our job to use new tools to make new art. Today's artists are not on the verge of being replaced. On the contrary, we are possessed of powers so great that we will expose more truth about the human mind and the human soul than any generation before us. We stand on the shoulders of giants. They have given us the language, they have given us the blueprints, they have given us the technology. What we build with these tools will be more powerful, and more beautiful, and more profound than anything we can now imagine. Artificial intelligence is nothing less than a prosthetic for the human mind. It will enhance art the way writing enhanced memory, the way printing enhanced literature, the way the steam engine enhanced travel. Artificial intelligence is an automobile, and we're only beginning to emerge from the age of the horse and buggy. Artificial intelligence helped me write the music that you're hearing right now. So: will AI kill creativity? No.

Stephen Marche: That's really rather good.

Lucas Cantor: Did that more or less work?

Stephen Marche: I think that's really rather good.

Lucas Cantor: Thanks.

Jacob Goldstein: We'll be back in a minute to wave our hands a little bit about the future of AI and creativity.
Jacob Goldstein: That's the end of the ads. Now we're going back to the show.

Stephen Marche: The reason I knew AI was going to take off was when I was writing a piece for The New Yorker about GPT-3, and I got it to finish off Coleridge's Kubla Khan, his great unfinished poem. And it did it perfectly well. I mean, if somebody had told me, yeah, this is how it ended, I would have been like: great. And it did it like that, in one second. It was just so incredible to me.

Jacob Goldstein: Just to sort of close this part of the conversation, I'm curious. Both of these projects were very AI-forward, right? They were high-concept: let's explicitly wrap this thing in AI. Fine, interesting. But presumably the real action comes in the things you're working on that just happen to have AI as a tool, the same way, say, a Google search, which by the way is a kind of AI, is also a tool. So I'm curious: in your work now, on other projects, the ones that are not "hey, look, this was made with AI" kinds of projects, are you using AI? And if so, how? Who wants to go first?

Lucas Cantor: Yeah, I'll go first. Obviously, of course: it's in everybody's pockets, you use it all the time. And AI has done nothing so far other than help my career, and I don't mean just by doing this project, which was fantastic. When I write a piece of music and put it on Spotify, the reason you hear it is that an AI recommended it to you. That's the only reason you're going to find it. And these types of algorithms, the ones keeping people on apps longer, keeping people on Netflix and on Spotify longer, are putting money (not enough money, but that's another panel discussion) in our pockets directly.
Jacob Goldstein: Let me ask a more precise version of the question in response to that clever answer. Do you use generative AI?

Lucas Cantor: Yes. Although this is a terminology problem.

Jacob Goldstein: Do you use AI to generate musical ideas for you?

Lucas Cantor: Yes. But also, what is a musical idea? I use a parametric EQ that... well, the answer is yes.

Jacob Goldstein: I know what you're saying. But I feel like you know what I'm saying.

Lucas Cantor: Well, yes. The reason I'm trying to drill down here is that the question you want to ask doesn't have the answer that you want.

Jacob Goldstein: Fair. So what's the smarter version of the question, the one I'm not well enough equipped to ask?

Lucas Cantor: I don't know if I can help you with that.

Jacob Goldstein: Then let me ask the question to you, Stephen. Do you use generative AI when you're writing other things?

Stephen Marche: Okay, here's the thing, and I think this is sort of where we're going. When I write something for a magazine or a newspaper, or a novel that I'm working on, I would never use ChatGPT, even to get an idea, because I'm so much smarter than ChatGPT. And what you also have to understand is that the reason ChatGPT is so successful is exactly that it has been banalified. When you use the other generative AIs that we have access to, you realize that the ones the public uses are very poor creatively.

Jacob Goldstein: But you have access to the good ones, to the good stuff.
Stephen Marche: Here's the thing about the good stuff: the good stuff is going to be used for things that don't exist yet. What we're seeing here is the birth of a new medium. When it comes to an essay, what people want when they read an essay is a human being communicating their thoughts and feelings. That's why they go to it, and a generative AI cannot do that. It's sort of like asking: do you use film to make theater? When film was invented, at first all anyone did was cannibalize theater; they were putting on weird shows, or they were recreating news events, things like that. That's where we're at right now. This is going to be used for new art forms that don't exist yet, and that's the exciting stuff. And it's also why it's almost impossible to do.

Jacob Goldstein: You mean, like, the book that is never done? Like what?

Stephen Marche: I have written a short story that is infinite.

Jacob Goldstein: What do you have in mind when you say that?

Stephen Marche: Well, for example, I'm working with Cohere to recreate the Oracle at Delphi. There's a large amount of information that you can glean from that, and there's also a pretty interesting historical record.

Jacob Goldstein: And so you'll ask it a question and it will answer?

Stephen Marche: Yes. We're trying to recreate the experience of going to the Oracle at Delphi as closely as we can, using effects.

Lucas Cantor: Yeah, it's a perfect use of AI. Oracles are one of the things that have come up in my research: we use oracles because we're bad at doing things randomly. If we're out in the wilderness, we'll just go hunt in the same place over and over and over again, and eventually the animals figure it out and say: just don't hang out there, and you won't get eaten by the humans. So when we consult an oracle, or roll some dice, or ask the sacred chickens whether we should go to war, they're basically giving us a random answer.

Stephen Marche: That's right. They're randomization engines.
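Lucas's point is easy to make concrete: the oracle's whole job is to decorrelate a choice from habit. A toy sketch of an oracle as a randomization engine follows; it has nothing to do with Stephen's actual Cohere project, and every name in it is made up for illustration.

```python
# A toy sketch of "oracles are randomization engines": the oracle's value
# is that it decorrelates your choice from your habits. Illustration only,
# not Stephen's actual Cohere project.
import random

HUNTING_GROUNDS = ["north ridge", "river bend", "old forest", "salt marsh"]

def habitual_hunter(history):
    """Humans are bad at being random: we go back where it worked before."""
    return history[-1] if history else HUNTING_GROUNDS[0]

def oracle(choices):
    """The dice, the sacred chickens, the Pythia: a uniform random draw."""
    return random.choice(choices)

print("habit says:", habitual_hunter(["river bend"]))
print("oracle says:", oracle(HUNTING_GROUNDS))
```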
Stephen Marche: And it's things of this nature that I'm excited to use it for. We're cannibalizing forms; that's what I'm doing writing short stories with it, too. It's very interesting. But the truth is that what this can be used for, we don't know yet. What it's going to be used for is something weird, and the problem is that there are absolutely no institutions to do it with.

Jacob Goldstein: Nobody will buy your Oracle at Delphi?

Stephen Marche: Who's supposed to take it? "Hi, my name is Stephen, I'd like to recreate the Oracle at Delphi using generative AI." "I'm sorry, sir, this is a Kmart." You know what I mean? There's no one to go to. So that's where we're at. To me, the thing that's very obvious is that when you use generative AI, what it is very good at is the most stock answer. That's why it's such a threat to the undergraduate essay, because there you're basically looking for the fulfillment of a stylistic set pattern, which it can do. But people respond to the human. There's this weird idea that art is something external to our experience of it. It isn't. We create tools, and the moment we find a tool, all we're thinking is: can we do something weird with it? And one thing I've really learned doing this is that creativity is indestructible.
It doesn't matter what comes down technologically, what comes down politically. We are creative animals, and we have to understand that that's just our nature. Nothing is going to kill it. Nothing. Certainly not ChatGPT.

Lucas Cantor: I can sum up the history of music, from the year 60,000 before present to now, in one sentence, and maybe you'll agree that this sums up the history of art as well: it's the search for new sounds. That's it. That's all there is to it. If something already exists, nobody cares. ChatGPT doesn't do music, but there are many music-generative AIs, and they generate music that I would charitably call insipid. It's fine; it's music, you would recognize it as music, but you wouldn't listen to it.

Jacob Goldstein: It'll get better.

Lucas Cantor: It will; it'll sound better. But nobody cares about that. So, Jacob, for your podcast: as soon as you can have beautiful-sounding orchestral music like this for free, you're going to want something else, because this is available and it's everywhere. What you're going to want is the thing where Lucas plays a guitar with a really nice-sounding reverb. That's going to be the style. And we have a composer in the audience, a professor of this kind of thing, who hopefully will agree with me on this: you can trace musical styles in media, and whatever is ubiquitous just falls out of fashion, and then whatever the opposite of it is becomes fashionable. So yeah, that's my two cents: the search for new sounds.

Jacob Goldstein: Thanks, you guys. This is closure.

Jacob Goldstein: My conversation with Lucas Cantor and Stephen Marche was organized by Chicago Humanities.
Today's show was edited by Karen Chakerji, produced by Edith Russolo, and engineered by Amanda K. Wong. You can email us at problem@pushkin.fm. We are always, always, always trying to find interesting new guests for the show, so if there's somebody you think we should book, please let us know. I'm Jacob Goldstein, and we'll be back next week with another episode of What's Your Problem?