Welcome to Stuff to Blow Your Mind from HowStuffWorks.com. Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and my name is Christian Sager. Hey. Something that people probably don't know about both of us is that outside of doing this show, we're both quote-unquote creative people, right? Like, even in the show. Well, in the show we try to be creative. We can't strip the creativity off of cats like that. This is very true. But you are a fiction writer, which I always try to mention on the episodes that you're not on, so that you're not somehow embarrassed or anything like that. But I try to plug Robert's books. And then I do comics, and in the past I've done music as well. And I didn't know that. Yeah. And were you, like, in a band? You were in a band, right? Yeah. And I had a period of time, which I'm going to talk about in this episode, where I dabbled with electronic music as well. But yeah, I was in a band for a couple of years.

And there's this interesting thing going on right now that you saw when you went to the World Science Festival two weeks ago, that is sort of trying to come to grips with what happens when you teach artificial intelligence, or really, I guess the correct term is machine learning, to be quote-unquote creative, and how do people feel about that? And you saw this amazing panel where they had people who were dabbling in that, both in music and culinary arts and drawing, and then they had a psychologist on the panel too, who was, like, the skeptic, kind of. Yeah, this was a really good panel. I saw it live at the World Science Festival this year, and if you're listening to this, you can watch it as well. Not all the talks are available on video, but this one is, and I'll include a link to that talk on the landing page for this episode at StuffToBlowYourMind.com.
But it was titled Computational Creativity. It was moderated by WNYC's John Schaefer, and the panelists included artist Sougwen Chung, computer scientist and musician Jesse Engel, neuroscientist Peter Ulric Tse, and engineering theorist Lav Varshney. Yeah, and so the four of them had this interesting discussion, kind of a back and forth, about what they were experimenting with in terms of trying to teach machines to be creative, and then, more along the lines of defining creativity, systematizing creativity for human beings, so that you could somehow try to take that and apply it to machines and algorithms. And that's where Peter Tse was sort of the neuroscientist naysayer. Not naysayer really, but just, like, he was doubtful, right? Because he could not see a way that quote-unquote creativity, as he understood it, could be replicated in machines the way that the other three were pretty excited about. Yeah. I guess if you were thinking of this as, like, a jam band or something, he was the guy doing the jazzy solos. Like, all the other three had really particular examples tied into their research, and then Tse would come in and he'd just kind of weigh in a little bit. He had very interesting stuff to add too. Yeah, he was great, and it definitely brought some balance to it. So we thought, after you'd seen it, you said, you've got to check this out. I watched the video, took notes, and we thought this was worth us having a conversation about here on the show, especially because we're sort of at the intersection of science and creativity. Yeah. So what we're not going to do here today is just replicate their discussion.
We're going to spring off their discussion, talk about our own experiences, our own thoughts, and we're going to drag in a few additional resources as well, and some additional examples, to bring this conversation to you, and then hopefully hear back from all of you creative people about your thoughts on machine creativity, machine learning, and the future of human creativity. So along those lines, Robert, when you hear the idea of a machine being creative, how does that make you feel? As a person who, you know, I imagine you're like me: it's work, you know, being creative, writing, drawing, playing music. There's something that feels inescapably human about it. Yeah, yeah, you know, I hear about it, and I read articles or studies where they've taken a computer program and it's writing haikus, or it's producing paintings, that sort of thing. I see that, and then I think about these stories of, say, journalism jobs, local journalism jobs being outsourced to another country, and of course the impact of automation on manufacturing. And that makes me, you know, try and envision the future where suddenly you're going to have programs and machines that can replicate what I do. Yeah, exactly to that point, we've actually encountered that in this business, where people will occasionally come to us and say, like, hey, you want to make your job easier, podcasting? We've come up with this program that will do all the research for you, and all you have to do is just, you know, print it out and go into the studio with it. And, you know, that seems really interesting to me. I don't... like, I got that email. I got the one about where you're basically outsourcing it to other researchers. There's that, but then there's also, yeah, the one where it'll just basically pluck stuff for you and put it into a doc.
This has been pitched to me before, at least. And then the other thing that I've heard about is, like, you know, some of our listeners are familiar that we do videos occasionally; the old Stuff to Blow Your Mind videos would be a good example of this. So essentially you punch in parameters of what your video would be about, and this software spits out a video for you that has text on the screen and pulls image clips or video clips out of a library to just make something, rather than you working together with a producer, writing a script, all the things that we do when we make those videos. It just makes me feel like we need to be marketing our podcast as a craft podcast, you know, because it's like all the research is done by hand. Granted, you know, on a computer screen. A lot of people don't know this, but Marc Maron, all of his research is done by robot in his garage. Yeah, okay, among the cats. I'm imagining it's a robot cat, actually. Yeah. But yeah, I kind of feel the same way, especially when I hear these examples. But then, at the same time... so I've done commercial art before. I was a graphic designer for a number of years before I got into the podcasting business, and I've done creative stuff where I've been an illustrator, I've worked as a musician, I write. And so one of the most important lessons I've learned about creativity is that you have to impose limits upon your canvas, no matter what medium you're performing in, right? Or else you're just going to stare into this abyss forever, trying to figure out where to place your creative ideas, because there's this infinite possibility of, like, what could be right, what could be wrong, which things should I use, which combination of words, which kind of ink, all those things.
For instance, writers that I've encountered who have the problem where they'll rewrite the first paragraph that they write over and over and over again, and obsess on making that paragraph perfect before they'll move on, and then they never finish. Right. Yeah, I've done that before. Like, when I first started writing fiction, I was doing that model, and of course the thing I was writing is still, to this day, incomplete from that. Totally. Yeah, I've been there too, and that's one of those things that they don't ever really teach you, at least they didn't teach me very well, in composition: yes, just move on. But when I think about creative machines, then I have to wonder, first of all, are they aware of what their limitations are, or are they just executing an algorithm? And do they know how to make those limitations work for them creatively, the same way that human beings do? Well, it's interesting, because, you know, when you start picking apart your own creative process, you can begin to see how a machine could do some of it. You know, because you're talking about imposing limits on yourself. Like, I'm instantly thinking, all right, I'm going to write this science fiction short story. Okay, what kind of ideas can I explore within the confines of this story, this setting, maybe this type of character that I want to write? And then you start throwing in another parameter. It's like, what kind of stories are selling? What kind of story can I write that doesn't have a vampire in it? You know, stuff like that. And so there's a certain amount of creativity that is not necessarily this just magical explosion that takes place.
There's a lot of computation that goes into it, and therefore there's an entire side to it that could easily be mastered by an inhuman entity. Right, especially if we as humans are able to go to that machine and input the parameters of those limits to it, right, and say, like, okay, here's your limitations, now write this story, or here's your limitations, now create this music. And that seems to be what they're trying to do. Yeah. And I know when I think about, you know, these various anxieties about future writers and podcasters that are entirely machine, we all tell ourselves, well, they can replicate a lot of it, but there's a certain spark, there's something I have that cannot be replicated. And that's the same thing we tell ourselves when we're talking about human competition. You can say, all right, well, this individual may be better at this skill set than me. They may have a better voice, or what have you, but there's something special about me. And so do I extend that, or is that just sort of a self-inflating fiction that we're telling ourselves all the time anyway? And so maybe the machines will just swoop in and take all of our creative jobs. Maybe. Although I have to say, sitting down and listening to this conversation, and then filling out notes and doing a little extra research for this episode, made me think about my creative process in a systematized way that I haven't before. And it made me really realize that, for me at least, one of the big jumping-off points for human beings being creative is when they start creating new tools to be creative in different ways. And we'll talk about that later, because this is something they discussed in the panel as well.
And I can't see a future yet where AI is doing those things, where AI says, I like a guitar, and I can write these songs on guitar, but what if I go over to a tree and I carve an entirely different kind of instrument, so that I can get this effect? That's true. That's a good point, sort of a meta-creativity idea. All right, so we'll definitely get into the tools here in a bit, but before we roll on, I just want to really draw attention to two particular ideas that came up in the discussion that I thought were particularly striking, and that will come up again and again as we roll through the episode. First of all, they kept bringing up this analogy comparing creative technology to the airplane. So obviously we make airplanes that fly faster and higher than any bird, but that doesn't mean we fully understand how birds fly, nor does it mean we can replicate the evolved perfection of their powered flight. So technology essentially gives us a version of the same thing that outperforms the organic in many respects, but also underperforms in other areas. And this was Ulric Tse's primary argument, sort of, against the idea that AI could be quote-unquote creative, right? Because airplanes are not birds, right? And so he was saying, basically, that subsequently the software couldn't necessarily be human in its creativity. But it depends on how you define creativity, too. Yeah, it's a great metaphor, and I think all the panelists ended up picking it up and playing with it a little bit, and we'll play with it here as well. Now, Chung, for her part, highlighted her own projects with creative machine learning, in which she, and there are some wonderful overhead video examples of this, draws with, or paints with, a piano. I believe it was a piano. It looked to me like she was using some kind of stylus tablet thing.
But yeah, basically, from what I could tell as somebody who does illustration, she was drawing on the right half of a canvas, and what she would draw, the computer would then draw on the left half of the canvas. But it wouldn't just imitate her. It wasn't like, I mean, you can do this in Adobe Illustrator, where it just draws the exact same thing you draw and you've got a perfectly symmetrical drawing. So it's not a mirror image. This was, like, creating its own thing, learning from her curves and lines. Yeah, it was playing off her movement, sort of improvising with her to create a work of collaborative art. And so she argues that the notion of a robotic agent that creates with you, that dreams with you in harmony, is, quote, an underserved narrative in our culture, and I definitely agree with that. She was, to me, the most compelling person on the panel. I thought they were all really interesting, but there was something about her that made me think, this person is going to go on to do amazing work, and she's going to come back. Because she was relatively young, I can see her in, like, ten, twenty years coming back to us and just dropping some huge revelation on us about creativity and computer learning. After the Singularity, after she's melded into the mind: they chose me, I'm going to be the go-between. But she was dropping some really interesting tidbits into the conversation. Like, she was thinking about the way that she draws based on competitive gaming. She kept bringing up, and by this we mean video games where people are playing, I don't know, League of Legends or Overwatch, just the idea that she was thinking about how the AI in those games was compensating for the human movements, or play styles, as opposed to how she drew and how the computer drew along with her.
That's a very interesting connection that I would never have thought of before. Yeah, that's a really great example, because you don't think about this competitive model being applicable to a creative collaboration. And you and I play a lot of video games, you know, but I don't spend a lot of time thinking about how the AI in the game is compensating for my movements. The only time I've thought about this, and this is going to be kind of geeky and silly, is in playing pro wrestling video games. I've thought about this because there are certain wrestling styles that the characters have. Well, it basically comes down to this weirdness: professional wrestling is a simulated combat sport. So, yes, sorry, spoilers, but as you see it on TV, it's two humans pretending, you know, acting out a combat scenario. We actually have a video about this on KFE that you did a couple of years ago. Yeah. And so when people make a video game of that, they make a fighting game in which two humans, or a human and an AI, or two AIs if you're doing a demo mode, are fighting each other in a competitive encounter using the moves of professional wrestling. And then, on top of that, some players are going to play that video game just like a fighting game: I'm going to beat my opponent. But others will play it attempting to tell a story, attempting to get the most dramatic match. And in those cases, I don't know that any programmers have really gamed that really strange area of video game playing. But that's what I think of when she brings up this example. Like, here's an example of somebody trying to not only defeat an AI opponent, but to try and tell a story with an AI opponent.
Okay, so this is an episode where we're going to have a few more digressions than we normally do, because of our creative backgrounds and the sort of things we can bring to this. I have a story that I think can tie into this, that we can then maybe bring back to the AI. Okay. So a friend of mine is currently working on a comic book, a WWE comic. He's just doing, like, a short story, and it's about this famous fight between the Undertaker and Mankind. Mankind is the guy that... okay. And so, I never saw this. I'm not as big a wrestling fan as Robert, but when he told me about it, I was like, oh, cool, you could do some fun stuff with that, like have them be in a slaughterhouse, or have them be in a graveyard and they're hitting each other with gravestones or something, right? And he was like, no, we have to do an exact replica, in comic form, of this infamous fight they had in, like, I don't know, 2000 or something. And I was like, huh, that's interesting. But to me, here we go with limitations, right? Like, the limitation there is so limiting that I don't see what would be creative about it, or interesting, other than obviously my friend and his style and how he draws the thing. But he's not able to express anything. Mhm. Yeah, I see what you mean. The one thing, if you could say, tell this story, but they actually have magical powers, like the Undertaker is really an undead creature. Exactly. That's how I was thinking it would be. I was like, oh, this is the benefit of doing it in the comics medium, that you can actually give them weird powers and you can really play with the goofy narrative of their roles, right? But then this also leads to the creative challenge. Like, sometimes just because the constraints are there doesn't mean there's not an opportunity. Exactly.
And that's what I'm looking forward to, because when I hear that, I go, man, there are too many limitations there. But, you know, who knows, maybe there's a way within all those limitations. I trust that my friend is going to be able to do this, that there'll be a really cool way to depict that fight. All right, but it's going to take creativity. And so we're going to roll it back a little bit and talk about what creativity actually is, boiling it down, reconstructing it, so that we can understand what we're doing when we create, in order to make a machine engage in the same process. Right. So they come up with some of their kind of limited... limitations on what creativity can or cannot be. And this is where I think, ultimately, the split between the four of them, or maybe the three of them versus Ulric Tse, came down: Tse has this sort of broader definition of creativity as being this sort of magical thing that happens in the human brain, right? Like, at different times in the discussion he talks about the importance of pain, of intent, of consciousness, various attributes that are currently beyond the scope of AI, and that would be problematic if we had them. You know, it's a whole separate issue, but would it be ethical to make a, you know, a depressed artist robot? Because why, what kind of cruel god are you, to make a robot so depressed? You know, it's funny you mention that. Later on in the notes, I have a question on whether robots can be crazy enough to be creative, because, as we know, many creative people suffer mental health issues. Yeah, well, that's how we got David in Alien: Covenant. Well, they already answered it for us. Thanks a lot, Ridley Scott.
All right. So one of these sort of loose models for creativity that's thrown out here comes from engineering theorist Lav Varshney, and he breaks it all down to a combination of novelty and high quality that comes together in a way that changes your belief. So, like a Venn diagram, I'm imagining here: is it novel, is it new, is it something different? And is it of high enough quality? And does it come together in a way that actually has some sort of meaning to me? I think this is a really interesting way to model aesthetics that I had not thought of before. And when you try to lay this on top of creative artifacts and see how that works, it's interesting, right? Like, the first thing I thought of, and I was talking to you about this before we went into the studio: no spoilers, but the new Twin Peaks is on TV and I've been watching it every week, and, as you would expect from David Lynch, it's wacky, right? But in terms of Lav Varshney's Venn diagram here of novelty and high quality, it's just the right amount of novelty and just the right amount of quality that it works for me. Now, maybe not for other people, because obviously creativity and art are subjective. But yeah, I was seeing it like, oh, David Lynch is the perfect metaphor for this, because, let's be honest, with some of David Lynch's stuff, the novelty factor can get a little too big in that Venn-diagrammatic situation. And Varshney actually describes it as noise. He says that when either one of those things gets too big in comparison to the other, it turns into noise, and we're no longer experiencing something that's communicative.
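If you want to play with that idea, here's a minimal toy sketch in Python. To be clear, the panel never gives a formula; the scoring rule and the imbalance cutoff below are invented stand-ins for illustration, not Varshney's actual model.

```python
# Toy sketch of the novelty/quality idea from the panel. The formula and
# the imbalance cutoff are invented for illustration, not Varshney's model.

def creative_value(novelty: float, quality: float, imbalance_limit: float = 3.0) -> float:
    """Score an artifact from its novelty and quality, each in (0, 1].

    Returns 0.0 ("noise") when one ingredient overwhelms the other.
    """
    if min(novelty, quality) <= 0.0:
        return 0.0
    if max(novelty, quality) / min(novelty, quality) > imbalance_limit:
        return 0.0  # too lopsided: it reads as noise
    return novelty * quality  # both present and in balance

print(creative_value(0.9, 0.8))  # novel and well made: scores high
print(creative_value(0.9, 0.1))  # wildly novel, shoddy execution: 0.0
print(creative_value(0.2, 0.9))  # polished retread: 0.0
```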
And he also says that if there's too much of both at the same time, right, like, if it's incredibly novel and it's super high quality, then that just overloads people's circuits. Right, so I'm thinking of, like, Finnegans Wake or something like that, probably. Well, and it's going to vary from person to person, because, obviously... I think we can all imagine movies that we've seen where, like, I've seen plenty of movies where there are some great ideas but the quality is not there, you know; maybe the acting is bad, the monster doesn't look right, or whatever. And then you see the reverse in some beautiful films, and you're just like, my eyes are eating up everything that's on the screen, but I feel nothing. Yeah, exactly. You get the opposite, too, when these things are too small in comparison to one another, right? Like, if it's super high quality, but it's a story that you've seen and it's been done three times before, you're not necessarily going to be that engaged with it, right? So there's got to be a little bit of novel something to it. But at the same time, if it's totally novel, it's this cool idea, it's the ultimate Hollywood pitch, high concept, like "The Ocean Walker" on Arrested Development. Yeah, something like that. And then the quality is garbage? Well, you know, that's not going to do very well either. But to his point about both areas being blown out, both novelty and high quality: well, I guess one example would be if you take somebody who does not have much of a bedrock understanding of modern art or surrealism, and you just really throw them into the deep end at a modern art museum. So they might encounter a piece of work that is highly novel, created with wonderful, high-level craftsmanship, but they're just not in a place where they're going to be able to understand it.
It's just going to be essentially over their head, and therefore it's going to be a failure of the art, in a way. You may have created something wonderful and thought-provoking, but if it doesn't connect with people, then does it work? When he says noise, I immediately think of the image of a television with white noise on it, right, the snow. So for some people, that's what looking at modern art is like: just staring at the screen while it's playing, you know, the staticky snow. But Tse basically argues that, because brains and AI are using completely different processes, he doesn't even know that you can call what they're doing creativity, right? Because he's defining it as being pretty specific to human beings. And in that view, human creativity is seen as simple problem solving: generating a lot of possibilities and then selecting from those possibilities. And this is how we evolved as human beings, right? So we went from using stone tools to, eventually, Neanderthals using instruments and actually making cave art. You know, that process of evolution, through our physical evolution, also evolved our creativity. And he's wondering whether that type of selection, that type of problem solving, is even possible for computers. But where it really skyrockets, and this is what I think is most fascinating about creativity here, is when you start to imagine things that don't really exist. And possibly, he was saying, this kind of creativity is due to the fact that we have evolved a prefrontal cortex. And in the video you get to see, he shows you different images of various hominids and their skull shapes, and how they evolved over time, and then creativity evolved with them. And then you see examples where there's cave art with, say, a human with a beast's head.
This is something that doesn't actually exist, but is arguably an example of early creativity: creating these unreal things that then have various meanings and an effect on the viewer. And then they get to this point in their conversation where they start talking about where the AI quote-unquote fails at creativity. And that was interesting to me too, because they seemed to be talking about how the machines were failing, but not how humans could necessarily fail. Yeah, they were talking about how, essentially, in all of this, you're going to have this machine essentially brainstorming, but there's going to be a person that comes along, and the person is then selecting and judging the ideas. In the case of Varshney, he's doing a lot of culinary computation. So humans are going to cook the food based on the recipe that the computer has come up with, and then they're going to taste it. And in some cases it's going to be, oh, that's a novel combination that actually works; I wouldn't have thought it would. In other cases they would say, all right, slow down there, machine overlords, because this is not really all that tasty. So, yeah, you get into this idea where there are failures. But failures and mistakes are, of course, an important part of the creative process as well, for humans. Right. This is where I stepped back and was thinking about my creative process, because I was like, look, I know that I spend a good amount of time quote-unquote failing on every project that I work on. I mean, that's just part of being creative: doing a thing, then realizing that's not what it is, crossing it out, moving on to the next thing. You know, it's basically like what they were talking about before, where you're generating all these possibilities and then you're selecting from them.
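That generate-and-select loop is simple enough to sketch in code. Here's a minimal toy version in Python, loosely in the spirit of the culinary example; the ingredient list and the judging rule are made up for illustration, with the scoring function standing in for the human taster.

```python
import random

# Toy sketch of generate-and-select: the "machine" blindly proposes pairings,
# and a judge keeps the best one. Ingredients and scoring are invented stand-ins.

INGREDIENTS = ["apple", "basil", "chili", "cocoa", "lime", "miso"]

def generate_candidate() -> tuple:
    """Brainstorm step: propose a random pairing with no taste judgment at all."""
    return tuple(sorted(random.sample(INGREDIENTS, 2)))

def judge(pairing: tuple) -> float:
    """Selection step: a stand-in for the human taster."""
    tasty = {("apple", "chili"), ("chili", "cocoa"), ("basil", "lime")}
    return 1.0 if pairing in tasty else random.random() * 0.5

# Most proposals are duds; the selection step does the creative filtering.
best = max((generate_candidate() for _ in range(100)), key=judge)
print("selected pairing:", best)
```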
But you have to generate some possibilities that you're not going to use, that you're going to see as failures, before you can eventually get to the thing that you see as a success. Right. Yeah, I mean, I think anybody out there who has engaged in a creative process, you've come up with some... they can't all be winners, as they say. You're going to come up with some duds, and for a while you might think that dud is pretty amazing, because maybe it is really novel, you know, or maybe it is really high quality, but the balance is perhaps not there. Yeah. Now, there's also the notion of intent in all of this, and feelings, the latter of which is obviously currently not present in artificial intelligence and machine learning. But perhaps we're just talking about two different sorts of flight, to come back to the bird and airplane scenario. And indeed, moderator John Schaefer brought up the point of apophenia. So this is interesting. This is a concept coined by the German scientist Klaus Conrad in 1958, and it's the opposite of an epiphany, an epiphany being, you know, a true intuition of the world's interconnectedness. In statistics, apophenia is essentially a type I error, or false positive, where you think something's connected and it's not. In psychology, according to Conrad, it's the stuff of schizophrenia, right? And that's where it's primarily discussed in the present day, as a psychological problem. It's when unrelated details seem to be saturated with connections and meaning, but those are false; they ultimately lead nowhere. Right. And I've absolutely had this experience with writing before. I'm like, oh, I'm totally on the right path, and then I look back at it a week later and, like, that was garbage, you know.
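To make that type I error framing concrete, here's a small simulation in Python using only the standard library; the thresholds are arbitrary. Every series below is independent random noise, so any strong correlation found between two of them is a false positive, a "connection" that isn't really there.

```python
import random
import statistics

# Apophenia as a type I (false positive) error: all of these series are
# independent noise, so every "connection" found below is spurious by construction.

def pearson(xs, ys):
    """Pearson correlation of two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
series = [[random.gauss(0, 1) for _ in range(30)] for _ in range(40)]

spurious = [
    (i, j)
    for i in range(len(series))
    for j in range(i + 1, len(series))
    if abs(pearson(series[i], series[j])) > 0.4  # looks "meaningfully" linked
]
# With 780 pairs of pure noise, a handful will clear the bar anyway.
print(f"{len(spurious)} spurious 'connections' among {40 * 39 // 2} noise pairs")
```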
Yeah. And in the schizophrenia case, this would be like if you see the same person on the subway twice and you're convinced that someone's following you: making a connection that isn't there, and it might take on, you know, a pathological energy. What's interesting to me here, though, is that I think this type I error definition of it is sort of connected to the idea of creativity. So it's defined as believing something is real when it isn't. Okay, that could be said to be part of the creative process, when you're imagining something that isn't real, at least not yet, right? Like, if you're writing a fantasy story and it's, okay, my main character has wings and the head of a lion and carries a flaming sword, right? That doesn't exist, it probably won't ever exist, but that doesn't necessarily mean that it's not creative. Right. But in this sense, it's the actual belief that it's real, right? Like, if you're saying, so, there's this angel with a lion's head that's following me around everywhere with the flaming sword, then I can see how that would be schizophrenia. Yeah. So, yeah, in real life and in statistics, this is a problem. But in creativity, as John Schaefer points out, our brain engages with it. Our brain uses apophenia, essentially, to forge connections where there aren't any, and a lot of that is where we end up creating something unique. You know, like, for instance, the whole why does the human have a lion for a head and a flaming sword? You fill in the details, and you can reach the point where it's like, oh, well, now it makes sense: I have sewn these two things together and now I have this complete form.
Yeah. And so you make those associations in specific ways with different media, right? So, like, I'm thinking here of album art, because you listen to the music, that's the artifact that the album is ultimately for, but then there's album art that's created kind of as part of the creative package, but also as marketing, and ultimately there are connections between those two things, like when you buy them. It's interesting to me: since I moved away from buying physical records, and I mainly just get digital music now, I've realized that I had so many emotional connections with music purely because of the artwork it came with, because of the packaging it came with, and I wasn't simply judging the music on its own merits, outside of any kind of visual thing. Well, so here's a quick question on this idea of albums and their album art. What's an example, in your opinion, where the album art and the music perfectly matched? Oh, you know, it's funny, I was actually listening to this record this morning, thinking of you: Tool's... is it Undertow? Yeah. That, like, weird sculpture that's on the cover of that. And then, okay, surrounding that album coming out, there were at least three videos that used that similar kind of gothic claymation style, right? None of those things were the music themselves. But when I hear that music now, for whatever reason, in my head are these swirling clay sculpture forms. Yeah, I think that's a great example. My example would have been Tool's Ænima, actually. They're certainly a band that has always put a lot of thought into how their art and music come together. Now, that being said, there are plenty of albums out there where the art and the music seem to be on separate planets. And you did a fascinating experiment for this episode. I'm thrilled about this, and I can't wait for you to tell the audience,
Because 602 00:32:54,080 --> 00:32:57,040 Speaker 1: it's interactive, they can go and listen and look at 603 00:32:57,080 --> 00:33:00,200 Speaker 1: this album art as well. That's right. If you think 604 00:33:00,200 --> 00:33:03,080 Speaker 1: back to our episode on H.R. Giger, we mentioned how 605 00:33:03,200 --> 00:33:06,880 Speaker 1: he was generally cool with just about anybody using his 606 00:33:07,120 --> 00:33:10,040 Speaker 1: art for their album, provided they went through official channels. 607 00:33:10,360 --> 00:33:12,280 Speaker 1: You know, it was, you know, on the level. So 608 00:33:12,320 --> 00:33:15,360 Speaker 1: there are a number of different albums out there that 609 00:33:15,440 --> 00:33:19,000 Speaker 1: have Giger covers, and the music doesn't always match up 610 00:33:19,320 --> 00:33:23,560 Speaker 1: with, or, I would argue, rarely matches up with, the 611 00:33:23,560 --> 00:33:27,479 Speaker 1: aesthetic energy that he possessed. Yeah, with Giger, like, 612 00:33:27,600 --> 00:33:30,240 Speaker 1: I assume that it's going to be something that's kind 613 00:33:30,240 --> 00:33:35,239 Speaker 1: of industrial or metallic in some way, right. So 614 00:33:36,000 --> 00:33:37,680 Speaker 1: what I did is I went on Spotify. I made 615 00:33:37,680 --> 00:33:41,760 Speaker 1: a playlist with, like, one song from every album I 616 00:33:41,760 --> 00:33:45,680 Speaker 1: could find on Spotify that had a Giger cover, 617 00:33:45,680 --> 00:33:48,800 Speaker 1: or, in one case, came out at 618 00:33:48,840 --> 00:33:51,360 Speaker 1: a certain point with Giger art promoting it. 619 00:33:52,040 --> 00:33:55,600 Speaker 1: So I then listened to each example and I tried 620 00:33:55,640 --> 00:33:57,760 Speaker 1: to decide, all right, is this something where the album 621 00:33:58,000 --> 00:33:59,880 Speaker 1: art makes sense for me? Is there 622 00:33:59,880 --> 00:34:03,160 Speaker 1: an actual connection in my mind between Giger and 623 00:34:03,200 --> 00:34:07,200 Speaker 1: this artist? Or is there totally not? And can I detect 624 00:34:07,200 --> 00:34:09,960 Speaker 1: any moments where I feel apophenia kicking in, and this 625 00:34:10,080 --> 00:34:14,160 Speaker 1: creative process in my brain bringing the two together, finding 626 00:34:14,200 --> 00:34:16,960 Speaker 1: and forging the connections? So that's what I did, and 627 00:34:17,280 --> 00:34:19,479 Speaker 1: you can do this too. I'm gonna have the link 628 00:34:20,000 --> 00:34:23,080 Speaker 1: for this Spotify playlist on the landing page for this 629 00:34:23,080 --> 00:34:26,080 Speaker 1: episode of Stuff to Blow Your Mind dot Com. But yeah, 630 00:34:26,080 --> 00:34:28,799 Speaker 1: I started listening to these. I listened to Emerson, Lake 631 00:34:28,840 --> 00:34:33,560 Speaker 1: and Palmer's Brain Salad Surgery, which, uh, we were talking about 632 00:34:33,560 --> 00:34:35,719 Speaker 1: this before, I don't know a lot about them, but 633 00:34:35,760 --> 00:34:37,360 Speaker 1: I just kind of assumed it's sort of like a, 634 00:34:37,880 --> 00:34:41,479 Speaker 1: like, a prog post-hippie band. It's prog rock, for sure, 635 00:34:41,520 --> 00:34:44,440 Speaker 1: and many people love them. I think Joe's a 636 00:34:44,520 --> 00:34:47,239 Speaker 1: big fan. Loves the Tarkus cover for sure. 637 00:34:47,239 --> 00:34:49,319 Speaker 1: He talks about that a lot.
Yeah, so it's 638 00:34:49,320 --> 00:34:52,000 Speaker 1: a bit too jazzy for my personal taste. And so 639 00:34:52,040 --> 00:34:53,920 Speaker 1: this was an example where I'm like, yeah, I'm just 640 00:34:54,000 --> 00:34:57,360 Speaker 1: not feeling this Giger, Emerson, Lake and 641 00:34:57,360 --> 00:35:02,280 Speaker 1: Palmer connection. The same thing with Debbie Harry's, uh, KooKoo album. 642 00:35:02,920 --> 00:35:06,680 Speaker 1: I'm straining to make that connection. The same with Steve 643 00:35:06,800 --> 00:35:10,800 Speaker 1: Stevens' Atomic Playboys, and with the Dead Kennedys album as well. 644 00:35:10,960 --> 00:35:13,600 Speaker 1: And I think on the Dead Kennedys one, because that 645 00:35:13,719 --> 00:35:18,000 Speaker 1: was, like, so hotly contested with the PMRC, it 646 00:35:18,080 --> 00:35:20,840 Speaker 1: was only in the interior. They have, like, a different 647 00:35:20,880 --> 00:35:23,560 Speaker 1: cover that's, like, I think, like, Shriners in, like, little 648 00:35:23,840 --> 00:35:25,759 Speaker 1: roller cars or something. Yeah. And then I think on 649 00:35:25,800 --> 00:35:30,520 Speaker 1: the inside is that notorious, controversial thing, the landscape with 650 00:35:30,920 --> 00:35:36,040 Speaker 1: a lot of... yeah. Yeah. So those 651 00:35:36,040 --> 00:35:37,879 Speaker 1: are examples where I'm, all right, no connection at all, 652 00:35:37,920 --> 00:35:40,960 Speaker 1: I don't get this, the connection between the art and 653 00:35:41,000 --> 00:35:43,560 Speaker 1: the music. But then there's stuff on the other end 654 00:35:43,560 --> 00:35:47,360 Speaker 1: of the spectrum. So Celtic Frost, for example. Uh, that's 655 00:35:47,400 --> 00:35:51,759 Speaker 1: some early death metal that very much feels, uh, like 656 00:35:51,800 --> 00:35:55,960 Speaker 1: a Giger soundscape. Same thing with the Danzig III. 657 00:35:56,120 --> 00:35:59,719 Speaker 1: You know, it's gloomy and dark, and, you know, with 658 00:35:59,760 --> 00:36:05,280 Speaker 1: all this focus on, you know, weird ideas, it works, 659 00:36:05,320 --> 00:36:07,960 Speaker 1: all right. I totally buy these album covers. 660 00:36:08,840 --> 00:36:12,560 Speaker 1: But the one area, the one example where I felt 661 00:36:12,560 --> 00:36:15,160 Speaker 1: things coming together, and I had to, like, kind of struggle, 662 00:36:15,160 --> 00:36:17,160 Speaker 1: but also I could feel the connection taking 663 00:36:17,160 --> 00:36:21,000 Speaker 1: place, was with a group called Magma. So they had 664 00:36:21,040 --> 00:36:24,440 Speaker 1: this album called Attahk, A T T A H K. 665 00:36:24,760 --> 00:36:27,560 Speaker 1: I'd never heard of them until today, and 666 00:36:27,600 --> 00:36:29,440 Speaker 1: it has, like... Giger. It's one of these where Giger 667 00:36:29,560 --> 00:36:31,840 Speaker 1: made the album art with the name of the band, 668 00:36:31,880 --> 00:36:33,600 Speaker 1: it was, like, a commission. It looks kind of like 669 00:36:33,640 --> 00:36:36,799 Speaker 1: his Atomic Babies, very much so. Yeah, but I think 670 00:36:36,800 --> 00:36:41,040 Speaker 1: there are some clothes, some safety pins in there as well. So, 671 00:36:41,120 --> 00:36:43,480 Speaker 1: you know, very much a Giger piece, with Giger's 672 00:36:43,560 --> 00:36:47,279 Speaker 1: signature symbolism, and, uh.
And at first I'm like, I'm 673 00:36:47,320 --> 00:36:49,360 Speaker 1: not really feeling this connection. But then I started reading 674 00:36:49,400 --> 00:36:53,239 Speaker 1: about the band. French prog rock. Uh, their album is, 675 00:36:53,320 --> 00:36:56,440 Speaker 1: again, very weird. The drummer and founder, Christian 676 00:36:56,560 --> 00:37:01,640 Speaker 1: Vander, he composed the lyrics in a constructed language called Kobaïan, 677 00:37:02,360 --> 00:37:06,359 Speaker 1: and the whole project came together over an ecological, spiritual 678 00:37:06,440 --> 00:37:09,359 Speaker 1: vision for humanity's future. And so I read about that 679 00:37:09,440 --> 00:37:10,799 Speaker 1: and I listened to it a bit more, and I 680 00:37:10,840 --> 00:37:13,160 Speaker 1: have to say, yeah, I was beginning to feel the 681 00:37:13,160 --> 00:37:18,839 Speaker 1: connection between the track Nono and Giger's art. So 682 00:37:18,960 --> 00:37:21,839 Speaker 1: this makes me think of... so, you and I are 683 00:37:21,880 --> 00:37:25,680 Speaker 1: old enough that we're seeing this resurgence with vinyl, and 684 00:37:25,719 --> 00:37:28,000 Speaker 1: it's like, well, that's stuff that we used to listen 685 00:37:28,360 --> 00:37:31,240 Speaker 1: to, and then it went away, and now it's coming back, 686 00:37:31,400 --> 00:37:34,400 Speaker 1: and in a way, it's novel and high quality, 687 00:37:34,520 --> 00:37:39,080 Speaker 1: right. And, uh, the vinyl resurgence seems to 688 00:37:39,120 --> 00:37:43,520 Speaker 1: me to be inherently connected to this apophenia and to this, 689 00:37:43,640 --> 00:37:47,640 Speaker 1: like, creativity that's going on in the consumer's head. Right. 690 00:37:47,719 --> 00:37:51,000 Speaker 1: And, uh, my friend Charlie actually talks about this, because 691 00:37:51,000 --> 00:37:53,760 Speaker 1: he's a big vinyl fan. He refers to, like, having 692 00:37:53,800 --> 00:37:57,680 Speaker 1: the artifact as the material oomph of a thing. And 693 00:37:58,000 --> 00:38:01,400 Speaker 1: I just... I stopped collecting records because I was like, 694 00:38:01,520 --> 00:38:05,560 Speaker 1: they take up too much physical space. I can't carry these 695 00:38:05,600 --> 00:38:07,399 Speaker 1: around with me anymore for the rest of my life, 696 00:38:07,680 --> 00:38:10,640 Speaker 1: and moved almost entirely to digital. But so many of 697 00:38:10,680 --> 00:38:14,560 Speaker 1: my friends are really into this new resurgence of vinyl, 698 00:38:14,640 --> 00:38:17,120 Speaker 1: and I understand it, right? Because, like, if you're sitting 699 00:38:17,160 --> 00:38:20,120 Speaker 1: there, between, like, what you just said about Magma, you 700 00:38:20,600 --> 00:38:22,880 Speaker 1: look at this record art, and then, like, maybe you 701 00:38:22,880 --> 00:38:25,640 Speaker 1: look at the liner notes while the album's playing, and then 702 00:38:25,680 --> 00:38:28,000 Speaker 1: maybe you go on Wikipedia or something and you read 703 00:38:28,120 --> 00:38:31,520 Speaker 1: up on them. It creates this, like, series of 704 00:38:31,600 --> 00:38:36,480 Speaker 1: connections that may or may not exist but add something 705 00:38:36,480 --> 00:38:39,440 Speaker 1: to the music. Yeah. So, to bring it back 706 00:38:39,480 --> 00:38:43,080 Speaker 1: to machine learning and creativity, to what extent is it 707 00:38:43,120 --> 00:38:46,560 Speaker 1: a case of machines throwing together combinations
'til something begins 708 00:38:46,560 --> 00:38:49,839 Speaker 1: to catch, 'til you have this Magma moment, and then 709 00:38:49,880 --> 00:38:53,920 Speaker 1: the randomness touches on possible synchronicity, and then, just as 710 00:38:53,960 --> 00:38:56,480 Speaker 1: our brains try to make sense of it, we engage 711 00:38:56,480 --> 00:38:59,759 Speaker 1: in the creative process of refining the machine-made connections. 712 00:39:00,280 --> 00:39:03,440 Speaker 1: So I'll just say this before we take a break, 713 00:39:03,480 --> 00:39:07,799 Speaker 1: which is that whatever the first album is that's created 714 00:39:08,000 --> 00:39:12,200 Speaker 1: entirely by AI, it should have an H.R. Giger cover. That's true. 715 00:39:12,239 --> 00:39:14,560 Speaker 1: And if it's already out there, the first 716 00:39:14,719 --> 00:39:17,120 Speaker 1: AI album, it needs a new cover by Giger. As 717 00:39:17,160 --> 00:39:19,440 Speaker 1: we talked about in the H.R. Giger episode, that house 718 00:39:19,560 --> 00:39:23,680 Speaker 1: in Switzerland is just full of unused art. Yeah, contact them, 719 00:39:23,719 --> 00:39:26,959 Speaker 1: commission something and get the rights to it. So, okay, 720 00:39:27,000 --> 00:39:29,160 Speaker 1: let's take a break, and when we come back, we're 721 00:39:29,200 --> 00:39:31,680 Speaker 1: going to talk more about the effects of the actual 722 00:39:31,760 --> 00:39:39,759 Speaker 1: technology on the act of creativity. Alright, we're back. So, 723 00:39:39,880 --> 00:39:42,160 Speaker 1: in all of this, we're talking about machine learning and 724 00:39:42,840 --> 00:39:46,600 Speaker 1: robotic arms that are drawing, uh, computer programs that are 725 00:39:46,640 --> 00:39:50,200 Speaker 1: taking the various, you know, hierarchies of values 726 00:39:50,200 --> 00:39:52,720 Speaker 1: within a particular discipline and then using those to create, 727 00:39:53,520 --> 00:39:57,120 Speaker 1: quote unquote, art or fiction or, you know, 728 00:39:57,520 --> 00:40:00,480 Speaker 1: a recipe. In all of these cases, it's 729 00:40:00,520 --> 00:40:03,440 Speaker 1: essential to point out that the technology is working 730 00:40:03,480 --> 00:40:07,400 Speaker 1: as a tool. And obviously, creativity and tools have always 731 00:40:07,440 --> 00:40:10,080 Speaker 1: co-evolved. So think of any great work of human art, 732 00:40:10,360 --> 00:40:12,880 Speaker 1: and you have to contemplate the tools and the technologies 733 00:40:12,920 --> 00:40:16,080 Speaker 1: required to create it. The physical tools for carving 734 00:40:16,200 --> 00:40:20,680 Speaker 1: rock or woodworking and firing pottery, the chemical technologies 735 00:40:20,719 --> 00:40:24,360 Speaker 1: of paints, varnishes, and lacquers, the evolution of musical instruments, 736 00:40:24,400 --> 00:40:30,120 Speaker 1: electrical musical instruments and recording and producing technology, writing technologies 737 00:40:30,160 --> 00:40:33,960 Speaker 1: from clay tablets to pens to the printing press, the typewriter, 738 00:40:34,280 --> 00:40:37,839 Speaker 1: word processors, and beyond. Yeah. And so in this conversation, 739 00:40:37,960 --> 00:40:41,279 Speaker 1: some of the panelists 740 00:40:41,280 --> 00:40:44,440 Speaker 1: are essentially arguing that the AI would just be a tool 741 00:40:44,719 --> 00:40:47,560 Speaker 1: for humans to use creatively.
It's not that the AI 742 00:40:47,640 --> 00:40:50,399 Speaker 1: itself would be creative, but that we would be using 743 00:40:50,400 --> 00:40:52,160 Speaker 1: it in the same way we use a guitar or 744 00:40:52,200 --> 00:40:54,799 Speaker 1: a pen or something like that. Yeah. And that, 745 00:40:55,120 --> 00:40:57,799 Speaker 1: ultimately, it's not going to be any more destructive than, 746 00:40:58,120 --> 00:41:00,759 Speaker 1: you know, the effects of the synthesizer or any of 747 00:41:00,800 --> 00:41:04,680 Speaker 1: these various electronic musical technologies, which obviously did 748 00:41:04,719 --> 00:41:09,759 Speaker 1: not destroy traditional music or cause judgment day. Yeah, 749 00:41:09,800 --> 00:41:12,120 Speaker 1: or cause judgment day. They didn't destroy the 750 00:41:12,280 --> 00:41:16,480 Speaker 1: artistic traditions that came before. They took the existing traditions 751 00:41:16,480 --> 00:41:19,520 Speaker 1: in new directions, sometimes surprising new directions. They bring up 752 00:41:19,560 --> 00:41:22,279 Speaker 1: the example of the drum machine, where, when it came out, 753 00:41:22,640 --> 00:41:24,839 Speaker 1: you know, there were people who were figuring out 754 00:41:24,840 --> 00:41:28,319 Speaker 1: ways to use it where you're creating sounds that, 755 00:41:28,360 --> 00:41:31,680 Speaker 1: you know, weren't perfectly replicating real- 756 00:41:31,880 --> 00:41:34,640 Speaker 1: world drumming, but they were able to use them 757 00:41:34,640 --> 00:41:37,879 Speaker 1: in novel ways that brought about some new 758 00:41:37,960 --> 00:41:40,600 Speaker 1: sounds in hip hop and electronic music. But then one 759 00:41:40,600 --> 00:41:43,120 Speaker 1: of the panelists, he's the guy from Google, I'm forgetting 760 00:41:43,120 --> 00:41:46,800 Speaker 1: his name right now, but he likened AI to a garden, 761 00:41:47,239 --> 00:41:49,719 Speaker 1: and he said it's where you're growing things, but you're 762 00:41:49,760 --> 00:41:53,560 Speaker 1: growing them with intentionality, and that computer systems themselves are 763 00:41:53,600 --> 00:41:57,720 Speaker 1: not in a state where they can reflect upon that intentionality yet. 764 00:41:58,080 --> 00:42:01,319 Speaker 1: So does this count as creativity when they're doing 765 00:42:01,320 --> 00:42:04,000 Speaker 1: it without intentionality? And it really depends on what you're 766 00:42:04,040 --> 00:42:06,759 Speaker 1: focusing on here. Some of them were focusing on the 767 00:42:06,800 --> 00:42:10,759 Speaker 1: process of creativity and others were focusing on the artifacts 768 00:42:10,760 --> 00:42:14,880 Speaker 1: of creativity. Right. So a computer program can certainly, currently, 769 00:42:15,000 --> 00:42:18,520 Speaker 1: create an artifact, right, whether it's a recipe, a song, 770 00:42:19,239 --> 00:42:22,600 Speaker 1: or a drawing, right, as we saw examples of all 771 00:42:22,840 --> 00:42:27,239 Speaker 1: three in this panel. Um, but it's not necessarily engaged 772 00:42:27,280 --> 00:42:29,279 Speaker 1: in the process. And so that seems to be where 773 00:42:29,320 --> 00:42:33,680 Speaker 1: they're diverging, especially because, for the latter, intentionality may not 774 00:42:33,800 --> 00:42:37,160 Speaker 1: actually be necessary. You can make artifacts without having any 775 00:42:37,200 --> 00:42:40,319 Speaker 1: intention, right.
And to your point on the process, Chung 776 00:42:40,480 --> 00:42:43,200 Speaker 1: points out that the process is vital to what she does, 777 00:42:43,280 --> 00:42:46,800 Speaker 1: like it's really more important than the finished artifact. Exactly. 778 00:42:47,040 --> 00:42:48,959 Speaker 1: I mean, the painting is one thing, but it's 779 00:42:49,000 --> 00:42:53,160 Speaker 1: that video as well, of her interacting with this 780 00:42:53,360 --> 00:42:58,360 Speaker 1: robotic arm that is essentially jamming with her. Yeah, yeah, exactly. 781 00:42:58,719 --> 00:43:01,040 Speaker 1: I think this is really interesting because it forces us 782 00:43:01,120 --> 00:43:05,719 Speaker 1: to systematize our own creativity. And ironically, you know, we 783 00:43:05,760 --> 00:43:09,080 Speaker 1: as human beings like to think that our own internal system, 784 00:43:09,200 --> 00:43:12,359 Speaker 1: especially when we're being creative, is not computational, right? But 785 00:43:12,640 --> 00:43:15,920 Speaker 1: maybe it is, more than we know. Well, this is 786 00:43:15,920 --> 00:43:18,319 Speaker 1: always one of those areas where you don't want to 787 00:43:18,320 --> 00:43:22,120 Speaker 1: fall into the trap of thinking entirely of human cognition 788 00:43:22,160 --> 00:43:25,920 Speaker 1: in terms of a computer, um, even though technology is 789 00:43:25,960 --> 00:43:29,920 Speaker 1: always a handy way to try and 790 00:43:30,040 --> 00:43:33,520 Speaker 1: see our own cognitive processes. But on the other hand, 791 00:43:33,560 --> 00:43:35,480 Speaker 1: you can't dismiss the fact that 792 00:43:35,560 --> 00:43:39,200 Speaker 1: there are aspects of our cognition that 793 00:43:39,200 --> 00:43:42,080 Speaker 1: are very much in line with the functioning of a computer. Right. 794 00:43:42,080 --> 00:43:44,799 Speaker 1: It's the same thing as the bird is not an airplane, right? 795 00:43:45,040 --> 00:43:47,879 Speaker 1: But at the same time, they both fly, 796 00:43:47,920 --> 00:43:51,040 Speaker 1: they both carry out the same process. Yeah. Um, I 797 00:43:51,080 --> 00:43:53,319 Speaker 1: think, though, like, the tool-building thing is where it 798 00:43:53,360 --> 00:43:55,839 Speaker 1: really gets interesting to me in determining whether or not 799 00:43:55,920 --> 00:43:59,200 Speaker 1: computers are creative. Because, for creative purposes, when you actually 800 00:43:59,239 --> 00:44:01,080 Speaker 1: go out and build a whole new tool, for me, 801 00:44:01,200 --> 00:44:04,520 Speaker 1: that's a sign that a creator has actually, like, graduated 802 00:44:04,520 --> 00:44:06,640 Speaker 1: to this other level of making, right? I think I 803 00:44:06,719 --> 00:44:09,799 Speaker 1: referred to it as meta-creativity earlier. They're somehow not 804 00:44:10,000 --> 00:44:13,359 Speaker 1: satisfied with the available tools that are there, and they 805 00:44:13,400 --> 00:44:17,560 Speaker 1: need to experiment. And in terms of that novelty-quality divide, 806 00:44:17,920 --> 00:44:22,200 Speaker 1: this isn't always successful, obviously, right? And it's especially fascinating 807 00:44:22,200 --> 00:44:25,920 Speaker 1: when a creator balances the novelty of that experiment with 808 00:44:25,960 --> 00:44:28,879 Speaker 1: actual aesthetics and creates, like, a new tool that other 809 00:44:28,920 --> 00:44:30,960 Speaker 1: people end up going on to use.
You know, we 810 00:44:30,960 --> 00:44:35,080 Speaker 1: were talking about Burroughs, William Burroughs, and Naked Lunch here. 811 00:44:35,120 --> 00:44:39,239 Speaker 1: I think the cut-up machine is an example where 812 00:44:39,480 --> 00:44:43,560 Speaker 1: it is a novel approach: essentially taking 813 00:44:43,640 --> 00:44:46,759 Speaker 1: these paragraphs and phrases, and they're literally cut 814 00:44:46,840 --> 00:44:51,040 Speaker 1: up and then recombined. And yet, when 815 00:44:51,040 --> 00:44:55,200 Speaker 1: I've read cut-up literature, it's noise. It's mostly noise. Yeah. 816 00:44:55,320 --> 00:44:58,440 Speaker 1: And then sometimes I like it, but most of the 817 00:44:58,440 --> 00:45:01,680 Speaker 1: time it doesn't quite connect for me. And when 818 00:45:01,680 --> 00:45:04,879 Speaker 1: I've tried to do it myself, I've really not 819 00:45:05,000 --> 00:45:08,160 Speaker 1: liked it. So I experimented with it a while ago, 820 00:45:08,719 --> 00:45:10,799 Speaker 1: and what I ended up doing was, rather than just, 821 00:45:10,840 --> 00:45:13,160 Speaker 1: like, doing the cut-up method... I guess we should explain 822 00:45:13,239 --> 00:45:16,440 Speaker 1: what cut-up is. So you write something with intentionality 823 00:45:16,560 --> 00:45:19,200 Speaker 1: and then you cut all the words up into 824 00:45:19,239 --> 00:45:21,520 Speaker 1: separate little pieces of paper, at least that's how Burroughs 825 00:45:21,560 --> 00:45:23,120 Speaker 1: did it. I think there are programs that will do 826 00:45:23,160 --> 00:45:27,600 Speaker 1: it for you now, um, and you literally randomly piece 827 00:45:27,680 --> 00:45:32,799 Speaker 1: them together so they form just ungrammatical, oddly formed sentences, 828 00:45:32,880 --> 00:45:37,279 Speaker 1: and try to see what kind of apophenic connections are 829 00:45:37,320 --> 00:45:40,480 Speaker 1: formed by these words combined the way they are. Um, 830 00:45:40,520 --> 00:45:42,359 Speaker 1: the way I was doing it was, I would do 831 00:45:42,400 --> 00:45:45,080 Speaker 1: that, and then I would look to see where there 832 00:45:45,080 --> 00:45:47,480 Speaker 1: were interesting connections, and then I would pull those out 833 00:45:47,480 --> 00:45:50,359 Speaker 1: and put them in my writing, rather than actually use 834 00:45:50,480 --> 00:45:53,000 Speaker 1: the cut-up method to write anything. Well, and that 835 00:45:53,040 --> 00:45:56,080 Speaker 1: comes back around to what the panelists were talking 836 00:45:56,080 --> 00:45:58,480 Speaker 1: about here, the idea that it is a tool, 837 00:45:58,760 --> 00:46:01,480 Speaker 1: that there's still going to be this human 838 00:46:01,520 --> 00:46:04,560 Speaker 1: at the center of it that's walking in this garden 839 00:46:04,719 --> 00:46:08,759 Speaker 1: of robotic creativity, of machine creativity, and saying, that's a 840 00:46:08,760 --> 00:46:11,400 Speaker 1: good carrot, I'm going to pick that one. That carrot 841 00:46:11,719 --> 00:46:14,480 Speaker 1: looks like crap, we're gonna leave that; that carrot was 842 00:46:14,520 --> 00:46:17,400 Speaker 1: never meant to be.
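[Transcriber's note: the cut-up method described above is simple enough to sketch in a few lines of Python. This is a hypothetical toy version, not any program Burroughs or anyone else actually used, and the source sentence is just a placeholder.]

    import random

    random.seed(7)  # change the seed for a different cut-up

    # Placeholder text standing in for an intentionally written passage.
    source = ("the angel with a lion's head carries a flaming sword "
              "through the gray subway of an ordinary morning")

    # Burroughs-style cut-up: slice the passage into pieces
    # (here, words) and recombine them at random.
    pieces = source.split()
    random.shuffle(pieces)
    print(" ".join(pieces))

    # The human step comes after: scan the scrambled output for accidental,
    # apophenia-friendly phrases worth pulling back into deliberate writing.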
So this whole thing leads me 843 00:46:17,480 --> 00:46:19,560 Speaker 1: to ask a question that I don't think they addressed, 844 00:46:19,560 --> 00:46:23,080 Speaker 1: which is: if tool use and making tools is an 845 00:46:23,160 --> 00:46:26,480 Speaker 1: essential part of creativity, right? Like, somebody had to invent 846 00:46:26,800 --> 00:46:29,759 Speaker 1: the right kind of crow quill pen to illustrate with, 847 00:46:30,120 --> 00:46:34,320 Speaker 1: or somebody invented a particular style of guitar with pickups 848 00:46:34,360 --> 00:46:37,400 Speaker 1: and the six strings and the exact kind of materials 849 00:46:37,400 --> 00:46:39,440 Speaker 1: that are used in the body and the neck, right, 850 00:46:40,040 --> 00:46:43,719 Speaker 1: or narrative style, or even stoves, right, particular kinds of 851 00:46:43,760 --> 00:46:47,120 Speaker 1: stoves for cooking. So all of those things are actually 852 00:46:47,239 --> 00:46:51,959 Speaker 1: part of creativity. Are computers, then, capable of creating their 853 00:46:52,080 --> 00:46:56,200 Speaker 1: own tools for the sake of experimenting and being novel? 854 00:46:56,719 --> 00:46:59,000 Speaker 1: And I think the answer to this would be that 855 00:46:59,440 --> 00:47:01,880 Speaker 1: we would say, well, that's how they write algorithms, right? 856 00:47:01,920 --> 00:47:05,640 Speaker 1: The algorithms are their tools. But what about actual artifacts 857 00:47:05,680 --> 00:47:09,919 Speaker 1: that they're using to make other artifacts? Huh. Yeah, I don't... 858 00:47:10,120 --> 00:47:12,799 Speaker 1: I don't recall any examples of that, whether within this 859 00:47:12,840 --> 00:47:16,759 Speaker 1: discussion or outside of it. Um, that seems to 860 00:47:16,800 --> 00:47:20,360 Speaker 1: me like it might be a good place to draw 861 00:47:20,400 --> 00:47:24,400 Speaker 1: a line of, like, when AI starts being creative, at 862 00:47:24,440 --> 00:47:27,400 Speaker 1: least from my perspective. But then again, I mean, I 863 00:47:27,400 --> 00:47:29,960 Speaker 1: think of great creative minds, and so many of 864 00:47:30,000 --> 00:47:35,120 Speaker 1: them use prefab tools. Yeah, and they're working within the 865 00:47:35,280 --> 00:47:39,000 Speaker 1: confines of their genre, within, say, 866 00:47:39,040 --> 00:47:42,480 Speaker 1: literary norms in many cases. Well, in many cases, of course, 867 00:47:42,520 --> 00:47:45,560 Speaker 1: you see someone sort of mastering the norms and then 868 00:47:45,760 --> 00:47:47,480 Speaker 1: figuring out what new spins to put on it. But 869 00:47:47,520 --> 00:47:50,680 Speaker 1: they're not really... I would ask, are they 870 00:47:50,760 --> 00:47:53,600 Speaker 1: actually creating any new tools, though, or are they just 871 00:47:54,160 --> 00:47:57,920 Speaker 1: using the existing tools in slightly different ways, breaking the rules? Yeah, 872 00:47:58,000 --> 00:47:59,800 Speaker 1: I don't know. It's interesting, though. I wish we 873 00:47:59,840 --> 00:48:03,360 Speaker 1: could ask those panelists this question. Like, Cormac McCarthy 874 00:48:03,840 --> 00:48:05,560 Speaker 1: used a typewriter. I think he uses, like, an 875 00:48:05,560 --> 00:48:09,719 Speaker 1: old typewriter to knock out his fiction, but he 876 00:48:09,760 --> 00:48:14,000 Speaker 1: did not create his own. He didn't build his typewriter out of, like, uh, 877 00:48:14,400 --> 00:48:18,279 Speaker 1: you know, bull skulls and children's skulls.
This is kind 878 00:48:18,280 --> 00:48:22,520 Speaker 1: of what I'm getting at. Yeah, exactly. But 879 00:48:22,600 --> 00:48:25,520 Speaker 1: you could argue, maybe, that Cormac McCarthy had created a 880 00:48:25,600 --> 00:48:30,919 Speaker 1: certain kind of prose style, right, that is unique to him. 881 00:48:31,040 --> 00:48:34,280 Speaker 1: Um, and I haven't read enough McCarthy, 882 00:48:34,360 --> 00:48:37,680 Speaker 1: so I might be totally off here, but, like, maybe 883 00:48:37,760 --> 00:48:41,719 Speaker 1: he's, like, fiddled with narrative rules in such a way 884 00:48:41,800 --> 00:48:44,840 Speaker 1: that it's novel. True. Yeah, I mean, you can also 885 00:48:44,960 --> 00:48:46,759 Speaker 1: go back to Giger. We've talked about how he 886 00:48:46,800 --> 00:48:50,239 Speaker 1: was using the airbrush. He did not invent the airbrush, but 887 00:48:50,280 --> 00:48:55,279 Speaker 1: he was using the airbrush in ways that... well, I 888 00:48:55,640 --> 00:48:57,319 Speaker 1: don't want to speak to method too much, 889 00:48:57,320 --> 00:48:59,440 Speaker 1: because I don't really know much about the methodology of 890 00:48:59,480 --> 00:49:02,520 Speaker 1: airbrush art, but he was doing things with it that 891 00:49:02,640 --> 00:49:04,399 Speaker 1: no one else had done. From what I could tell 892 00:49:04,520 --> 00:49:07,520 Speaker 1: from when we did that H.R. Giger episode, he was 893 00:49:08,120 --> 00:49:11,120 Speaker 1: using ink and paint that hadn't been used in an 894 00:49:11,200 --> 00:49:13,920 Speaker 1: airbrush before. Yeah, so I think that might be part 895 00:49:13,960 --> 00:49:15,680 Speaker 1: of it. Yeah, I think that would 896 00:49:15,719 --> 00:49:17,960 Speaker 1: support the idea of using a new tool. I mean, 897 00:49:18,000 --> 00:49:20,919 Speaker 1: if you're changing the tool, you are creating a new tool. 898 00:49:21,000 --> 00:49:23,279 Speaker 1: That's something they touched on in the 899 00:49:23,480 --> 00:49:25,840 Speaker 1: panel discussion at the World Science Festival, that one of 900 00:49:25,840 --> 00:49:28,880 Speaker 1: the things about human tools is that they have evolved 901 00:49:28,880 --> 00:49:31,600 Speaker 1: and we have evolved with them. Our brain, our culture, 902 00:49:31,680 --> 00:49:35,200 Speaker 1: our language, this has all been part of the same 903 00:49:35,280 --> 00:49:38,680 Speaker 1: journey, out of the Stone Age, out of 904 00:49:38,680 --> 00:49:41,400 Speaker 1: the pre-Stone Age. As we adapt 905 00:49:41,440 --> 00:49:43,799 Speaker 1: these tools, we figure out new ways to communicate the 906 00:49:43,800 --> 00:49:49,160 Speaker 1: construction of these tools, and then create increasingly complex 907 00:49:50,000 --> 00:49:54,759 Speaker 1: bits and pieces of art, technologies, etcetera, with those tools. Yeah. 908 00:49:54,920 --> 00:49:57,719 Speaker 1: So then, getting back to limits, right, which is something 909 00:49:57,800 --> 00:50:01,200 Speaker 1: I brought up before we even got into these panels, 910 00:50:01,840 --> 00:50:05,000 Speaker 1: is that, okay, AI can use the tools, and 911 00:50:05,080 --> 00:50:08,239 Speaker 1: it can learn the compositional traits of art, and it 912 00:50:08,239 --> 00:50:11,320 Speaker 1: can learn those rules. What we're questioning is whether or 913 00:50:11,360 --> 00:50:14,680 Speaker 1: not it can learn how to break those rules.
But, uh, 914 00:50:14,719 --> 00:50:18,040 Speaker 1: when they're using a tool, they have to kind of 915 00:50:18,040 --> 00:50:21,480 Speaker 1: figure out the same constraints on the system that humans do. 916 00:50:21,800 --> 00:50:24,839 Speaker 1: And so I'm wondering: if these patterns are based on, 917 00:50:25,080 --> 00:50:28,839 Speaker 1: quote unquote, how the world works as we humans see it, 918 00:50:29,280 --> 00:50:34,160 Speaker 1: then that's human culture, and we're just teaching the 919 00:50:34,200 --> 00:50:38,000 Speaker 1: computers the sort of, like, mathematics of human culture and 920 00:50:38,040 --> 00:50:42,200 Speaker 1: then having them replicate that, rather than the machines looking 921 00:50:42,200 --> 00:50:44,400 Speaker 1: at the world and kind of coming up with their 922 00:50:44,400 --> 00:50:48,840 Speaker 1: own cultural interpretation of it. Maybe I'm leaning more towards Ulric 923 00:50:49,040 --> 00:50:53,000 Speaker 1: Tse than I thought I would. All right, well, let's 924 00:50:53,040 --> 00:50:55,760 Speaker 1: take another quick break, and when we come back, we'll 925 00:50:55,880 --> 00:50:58,399 Speaker 1: talk about what it means for the future, and, uh, 926 00:50:58,400 --> 00:51:02,040 Speaker 1: also some more comparisons between the creative process and 927 00:51:02,360 --> 00:51:10,520 Speaker 1: machine learning and airplanes. Alright, we're back. So, 928 00:51:10,640 --> 00:51:12,960 Speaker 1: you were really taken with Chung, as I was. I 929 00:51:13,000 --> 00:51:15,520 Speaker 1: think, like, she just stood out. I mean, everybody 930 00:51:15,520 --> 00:51:18,560 Speaker 1: on the panel was interesting, but Chung was really 931 00:51:18,600 --> 00:51:20,719 Speaker 1: fascinating, the way she talked about her process, and then 932 00:51:20,719 --> 00:51:23,520 Speaker 1: when you watched her actual process... Yeah, the second she 933 00:51:23,640 --> 00:51:26,320 Speaker 1: said that bit about, like, the narrative of machines and 934 00:51:26,400 --> 00:51:29,040 Speaker 1: humans cooperating, I totally agreed, because it lines up 935 00:51:29,120 --> 00:51:31,759 Speaker 1: with a lot of what I've been thinking about with 936 00:51:31,800 --> 00:51:35,959 Speaker 1: science fiction, which is that we continually see the dystopian 937 00:51:36,160 --> 00:51:41,200 Speaker 1: vision of robot overlords and evil robots, evil androids, and 938 00:51:41,239 --> 00:51:43,239 Speaker 1: I love all that stuff, don't get me wrong. But 939 00:51:43,960 --> 00:51:46,319 Speaker 1: when I read Iain Banks and his view of 940 00:51:46,360 --> 00:51:52,239 Speaker 1: the Culture, uh, this post-scarcity, far-future, um, utopian 941 00:51:52,280 --> 00:51:57,800 Speaker 1: society living with computers, basically having this sort of 942 00:51:57,920 --> 00:52:02,440 Speaker 1: hybrid cultural scenario, you know, with superintelligent machines, like, 943 00:52:02,480 --> 00:52:04,520 Speaker 1: it makes me think we should have more of that. 944 00:52:04,560 --> 00:52:06,600 Speaker 1: We should have more of this side of the 945 00:52:06,760 --> 00:52:11,360 Speaker 1: argument for a post-singularity world. And 946 00:52:11,360 --> 00:52:13,360 Speaker 1: not to say that we shouldn't be concerned 947 00:52:13,360 --> 00:52:17,600 Speaker 1: about the potential dire consequences.
I mean, obviously, 948 00:52:18,000 --> 00:52:21,239 Speaker 1: to whatever extent it's practical, we should try to avoid 949 00:52:22,280 --> 00:52:26,279 Speaker 1: Terminator scenarios. But there's this whole other side. There's 950 00:52:26,280 --> 00:52:28,680 Speaker 1: this view of the machine as a tool, 951 00:52:28,920 --> 00:52:33,080 Speaker 1: the machine as a collaborative process, and I think 952 00:52:33,080 --> 00:52:35,279 Speaker 1: that's very important to keep in mind. Yeah, and 953 00:52:35,400 --> 00:52:39,080 Speaker 1: I totally admire their intention, but I wonder if there 954 00:52:39,239 --> 00:52:41,680 Speaker 1: is a point where it's really down to, like, sort 955 00:52:41,680 --> 00:52:44,759 Speaker 1: of what our linguistic definitions are here. I think there 956 00:52:44,800 --> 00:52:48,840 Speaker 1: might be a little bit of confusing collaboration for tool building. 957 00:52:49,560 --> 00:52:53,960 Speaker 1: Is the machine actually collaborating with Chung, or was she 958 00:52:54,080 --> 00:52:56,640 Speaker 1: building a new tool that she could use to create 959 00:52:56,680 --> 00:52:59,160 Speaker 1: new art with? Yeah, because you could argue that, in 960 00:52:59,200 --> 00:53:02,440 Speaker 1: the same way that, with a French horn, you buzz 961 00:53:02,480 --> 00:53:05,880 Speaker 1: your air into one end, and the technology of the 962 00:53:05,960 --> 00:53:10,400 Speaker 1: horn allows a different sound to come out, so you 963 00:53:10,400 --> 00:53:12,480 Speaker 1: could say that that's what she's doing here. She's essentially 964 00:53:12,520 --> 00:53:17,520 Speaker 1: blowing into a horn, and we're appreciating the duality 965 00:53:17,560 --> 00:53:21,319 Speaker 1: of her organic sound and the manufactured sound of 966 00:53:21,320 --> 00:53:24,000 Speaker 1: the horn. Right, it's like she's developed a new instrument 967 00:53:24,080 --> 00:53:26,640 Speaker 1: and she's learning how to play it. Yeah, and I 968 00:53:26,680 --> 00:53:28,800 Speaker 1: think that's a valid reading of what's going on as well. 969 00:53:29,680 --> 00:53:32,640 Speaker 1: Now, in all of this, uh, Jesse Engel, the computer 970 00:53:32,680 --> 00:53:36,440 Speaker 1: scientist and musician, uh, he spoke a bit about the Google 971 00:53:36,480 --> 00:53:40,080 Speaker 1: Magenta project that he's involved with, and he pointed out 972 00:53:40,120 --> 00:53:43,239 Speaker 1: that with Magenta, there's always a human in the process, 973 00:53:43,880 --> 00:53:45,560 Speaker 1: because again, the human uses it as a tool. This is the 974 00:53:45,600 --> 00:53:50,719 Speaker 1: gardener walking, you know, amid the creative machines in 975 00:53:50,760 --> 00:53:52,759 Speaker 1: the garden. But this isn't Soylent Green. It's not 976 00:53:52,800 --> 00:53:56,120 Speaker 1: like Google is made of people. Right. But I found 977 00:53:56,120 --> 00:53:59,239 Speaker 1: this particularly interesting because it closely mirrors the inner workings 978 00:53:59,280 --> 00:54:02,480 Speaker 1: of military drones.
So a few years back, I spoke 979 00:54:02,560 --> 00:54:06,400 Speaker 1: with this guy, Noel Sharkey. He's a professor of Artificial 980 00:54:06,440 --> 00:54:09,560 Speaker 1: Intelligence and Robotics at the University of Sheffield, and 981 00:54:09,920 --> 00:54:13,080 Speaker 1: he had some interesting revelations about 982 00:54:13,080 --> 00:54:16,839 Speaker 1: the necessary human components of UAVs, 983 00:54:16,840 --> 00:54:20,960 Speaker 1: of unmanned aerial vehicles. He said, quote: these are all 984 00:54:21,080 --> 00:54:24,719 Speaker 1: man-in-the-loop systems, which means essentially 985 00:54:24,840 --> 00:54:28,200 Speaker 1: someone controls the applications of lethal force. They're not exactly 986 00:54:28,239 --> 00:54:31,040 Speaker 1: remote control. They're sort of a hybrid. They have 987 00:54:31,040 --> 00:54:34,120 Speaker 1: certain autonomous functions, meaning they can be 988 00:54:34,160 --> 00:54:37,520 Speaker 1: programmed to react to their GPS, so they 989 00:54:37,520 --> 00:54:40,200 Speaker 1: can go about on their own. They can navigate themselves, 990 00:54:40,360 --> 00:54:42,640 Speaker 1: though a pilot will control their height and that sort 991 00:54:42,640 --> 00:54:46,120 Speaker 1: of thing. It's the first step towards full autonomy. The 992 00:54:46,120 --> 00:54:49,480 Speaker 1: most recent US Air Force documents describe a swarm of planes. 993 00:54:49,480 --> 00:54:52,360 Speaker 1: The term swarm is kind of a technical term in robotics, 994 00:54:52,400 --> 00:54:54,960 Speaker 1: meaning a bunch of robots that interact with one another 995 00:54:55,040 --> 00:54:58,040 Speaker 1: on a local basis. The man on the loop would 996 00:54:58,040 --> 00:55:01,640 Speaker 1: be in executive control of the swarm. So rather 997 00:55:01,719 --> 00:55:03,799 Speaker 1: than having at least two pilots in charge of a 998 00:55:03,960 --> 00:55:06,719 Speaker 1: Predator drone, in his example, you'll have one person in 999 00:55:06,800 --> 00:55:10,239 Speaker 1: charge of a swarm of robots. So human technology can 1000 00:55:10,280 --> 00:55:13,680 Speaker 1: be used to kill people or create art as well. Yeah, yeah, 1001 00:55:13,760 --> 00:55:17,320 Speaker 1: this is interesting, especially in light of the other episode 1002 00:55:17,320 --> 00:55:19,640 Speaker 1: that we're working on right now, which is... it's gonna 1003 00:55:19,640 --> 00:55:23,239 Speaker 1: come out after this one, but it's about violence and 1004 00:55:23,560 --> 00:55:27,200 Speaker 1: the capacity for human violence. And drones in general, I 1005 00:55:27,200 --> 00:55:30,680 Speaker 1: think, are interesting in that respect, because they allow us 1006 00:55:30,760 --> 00:55:34,719 Speaker 1: to create violence from a distance, and there is sort 1007 00:55:34,760 --> 00:55:40,000 Speaker 1: of an apophenia... I don't know what the right application 1008 00:55:40,040 --> 00:55:43,040 Speaker 1: of apophenia is there, but there's something going on there 1009 00:55:43,080 --> 00:55:47,000 Speaker 1: with that, right, that allows that disconnection. Yeah. So in 1010 00:55:47,040 --> 00:55:48,880 Speaker 1: the case of the drones, it's, you know, very much 1011 00:55:48,880 --> 00:55:52,480 Speaker 1: a person, uh, you know, checking in and 1012 00:55:52,640 --> 00:55:56,840 Speaker 1: fine-tuning everything.
And in the case of, say, Chung's 1013 00:55:56,880 --> 00:55:59,080 Speaker 1: example of working with a machine 1014 00:55:59,080 --> 00:56:02,080 Speaker 1: collaborator, it's a similar case. So it's like a feedback 1015 00:56:02,239 --> 00:56:05,440 Speaker 1: loop process. The machine creates something, the human adds their spin, 1016 00:56:05,560 --> 00:56:07,480 Speaker 1: and the machine adds a spin on that, and the 1017 00:56:07,560 --> 00:56:11,719 Speaker 1: human adds an additional spin, and weeds out what's 1018 00:56:11,760 --> 00:56:14,800 Speaker 1: not working, and maybe adds some flourishes that tweak 1019 00:56:14,880 --> 00:56:17,560 Speaker 1: it for human consumption. And this is, again, what Jesse 1020 00:56:17,680 --> 00:56:21,640 Speaker 1: Engel compared to tending a garden, you know. And this 1021 00:56:21,800 --> 00:56:24,640 Speaker 1: also reminded me of another conversation I had, and 1022 00:56:24,680 --> 00:56:26,960 Speaker 1: this was back in two thousand and eleven. I spoke 1023 00:56:27,000 --> 00:56:31,560 Speaker 1: with Atlanta-area electronic musician Richard Devine, uh, who was 1024 00:56:31,600 --> 00:56:33,880 Speaker 1: a really cool guy. I'll include the link to the 1025 00:56:33,880 --> 00:56:37,279 Speaker 1: full interview on the landing page for this episode. But 1026 00:56:38,160 --> 00:56:41,120 Speaker 1: he said there were, at the time, again, in 2011, he 1027 00:56:41,120 --> 00:56:44,240 Speaker 1: said there were more electronic music tools on the market 1028 00:56:44,440 --> 00:56:46,600 Speaker 1: that didn't exist when he was starting out, and it 1029 00:56:46,640 --> 00:56:49,800 Speaker 1: was at the point where anyone could really create something 1030 00:56:49,840 --> 00:56:53,040 Speaker 1: and put it out on SoundCloud. And he told me, quote: 1031 00:56:53,040 --> 00:56:55,800 Speaker 1: there are so many people trying to emulate specific styles, 1032 00:56:56,040 --> 00:56:58,520 Speaker 1: so now you have hundreds and hundreds of people trying 1033 00:56:58,560 --> 00:57:00,840 Speaker 1: to sound a particular way. I find that there is 1034 00:57:00,960 --> 00:57:04,080 Speaker 1: less and less innovation in music, but more and more 1035 00:57:04,120 --> 00:57:07,680 Speaker 1: people creating it. And he also talked about the use 1036 00:57:07,719 --> 00:57:11,760 Speaker 1: of algorithmic music composition, uh, something that's been around for years. 1037 00:57:12,200 --> 00:57:17,120 Speaker 1: Brian Eno has engaged with it; Autechre, the famous 1038 00:57:17,160 --> 00:57:19,280 Speaker 1: IDM group, they've used it; various other 1039 00:57:19,280 --> 00:57:23,080 Speaker 1: artists as well. This is essentially where you 1040 00:57:23,120 --> 00:57:26,800 Speaker 1: have a program, usually an algorithm, for creative output, 1041 00:57:27,280 --> 00:57:30,720 Speaker 1: and you tend it, like the garden, that is, and 1042 00:57:30,960 --> 00:57:33,800 Speaker 1: see what it creates. But in those cases, right, 1043 00:57:33,840 --> 00:57:37,960 Speaker 1: at least with Eno and Autechre, they're creating the software themselves, 1044 00:57:38,120 --> 00:57:41,000 Speaker 1: right, with a team of people.
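[Transcriber's note: to give a sense of what "an algorithm for creative output" can look like at its very simplest, here is a hypothetical Python sketch, nothing like the actual software Eno or Autechre built. It defines a tiny rule environment, a scale, a leap limit, and a phrase length, then lets a random walk compose inside those rules.]

    import random

    random.seed(3)  # reproducible phrase; reseed to "tend the garden" again

    # The rules and parameters of the environment (all arbitrary choices).
    scale = ["A", "C", "D", "E", "G"]  # A minor pentatonic
    max_leap = 2                       # largest melodic jump allowed per step
    phrase_len = 16                    # notes per generated phrase

    def compose(start_index=0):
        # Random walk over the scale, constrained by the leap rule.
        idx, melody = start_index, []
        for _ in range(phrase_len):
            melody.append(scale[idx % len(scale)])
            idx += random.randint(-max_leap, max_leap)
        return melody

    print(" ".join(compose()))
    # Changing scale, max_leap, or phrase_len changes how the
    # "environment behaves" -- the tending Devine describes.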
I believe so, yes. 1045 00:57:41,720 --> 00:57:44,520 Speaker 1: So I asked Devine about this, and he said: 1046 00:57:44,760 --> 00:57:48,040 Speaker 1: it's really interesting because you're defining the rules and parameters 1047 00:57:48,040 --> 00:57:50,360 Speaker 1: of that environment, and then you can decide how that 1048 00:57:50,480 --> 00:57:54,360 Speaker 1: environment behaves. But then he closed with the following 1049 00:57:54,360 --> 00:57:56,760 Speaker 1: about the limits of your tools, and this 1050 00:57:56,800 --> 00:57:58,680 Speaker 1: is key because it gets into what we were talking about, 1051 00:57:58,680 --> 00:58:02,720 Speaker 1: imposing limits. He said: I don't necessarily think that always 1052 00:58:02,720 --> 00:58:05,760 Speaker 1: helps creatively or makes people more creative. I think it 1053 00:58:05,880 --> 00:58:09,400 Speaker 1: sometimes makes people lazy. When I have too many resources 1054 00:58:09,400 --> 00:58:12,200 Speaker 1: at my fingertips, I have a tendency to get really 1055 00:58:12,280 --> 00:58:14,480 Speaker 1: lazy with the creativity. So for me, I try to 1056 00:58:14,520 --> 00:58:17,479 Speaker 1: limit myself with how many tools I use. I try 1057 00:58:17,520 --> 00:58:20,320 Speaker 1: to keep it to just a couple of pieces of 1058 00:58:20,360 --> 00:58:23,800 Speaker 1: equipment and learn those pieces of equipment really, really well. Yeah, 1059 00:58:23,840 --> 00:58:25,320 Speaker 1: this gets back to sort of what I was talking 1060 00:58:25,360 --> 00:58:29,880 Speaker 1: about earlier, about, like, self-imposed limitations. Yeah. Um, so 1061 00:58:29,920 --> 00:58:32,400 Speaker 1: this is actually interesting to me because, as I brought 1062 00:58:32,480 --> 00:58:35,760 Speaker 1: up earlier, I did have a little foray in the 1063 00:58:35,800 --> 00:58:39,160 Speaker 1: early two thousands where I locked myself in my bedroom 1064 00:58:39,240 --> 00:58:42,480 Speaker 1: and made electronic music over a winter. Um, and the 1065 00:58:42,480 --> 00:58:46,120 Speaker 1: way I did it was with this video game 1066 00:58:46,360 --> 00:58:48,920 Speaker 1: for PlayStation. This shows you how far back 1067 00:58:48,960 --> 00:58:51,760 Speaker 1: it's going: it's the original PlayStation. It was called the 1068 00:58:51,920 --> 00:58:55,280 Speaker 1: MTV Music Generator. Have you ever heard of this? Yeah, 1069 00:58:55,520 --> 00:58:59,720 Speaker 1: it was branded MTV. Uh, and basically, it was, like, 1070 00:58:59,840 --> 00:59:03,000 Speaker 1: the most basic way that you could create your 1071 00:59:03,040 --> 00:59:05,520 Speaker 1: own electronic music. It had, like, pre-built-in beats 1072 00:59:05,760 --> 00:59:08,160 Speaker 1: and pre-built-in, like, melodies and things like that, 1073 00:59:08,480 --> 00:59:11,120 Speaker 1: and you could add layers of effects and things like 1074 00:59:11,160 --> 00:59:12,880 Speaker 1: that on top of it.
But what I thought was 1075 00:59:12,920 --> 00:59:15,760 Speaker 1: really cool was you could take CDs at 1076 00:59:15,760 --> 00:59:18,720 Speaker 1: the time, because MP3s were just getting 1077 00:59:18,720 --> 00:59:21,240 Speaker 1: off the ground, um, and you could put them in 1078 00:59:21,600 --> 00:59:24,160 Speaker 1: and you could rip segments off of your CDs 1079 00:59:24,400 --> 00:59:28,720 Speaker 1: and then insert those sound files into these compositions 1080 00:59:28,760 --> 00:59:32,320 Speaker 1: that you'd created using MTV's prefab generator. And so 1081 00:59:32,440 --> 00:59:35,880 Speaker 1: I wrote, like, five songs, I think, and, like, put 1082 00:59:35,880 --> 00:59:39,320 Speaker 1: them all up online. I don't know, maybe, if 1083 00:59:39,320 --> 00:59:41,160 Speaker 1: the audience is interested, maybe I'll put them on the 1084 00:59:41,240 --> 00:59:43,080 Speaker 1: landing page for this, or one or two of them. 1085 00:59:43,280 --> 00:59:47,920 Speaker 1: They're silly. I wrote under the name Invisible Maniac, 1086 00:59:48,120 --> 00:59:50,520 Speaker 1: and this literally lasted for, like, three months, and then 1087 00:59:50,560 --> 00:59:54,080 Speaker 1: nothing ever came of it. But the thing about it, though, 1088 00:59:54,160 --> 00:59:56,760 Speaker 1: was that that was all I did with the electronic music, right? 1089 00:59:57,040 --> 01:00:01,360 Speaker 1: I never went beyond my CD collection and the PlayStation 1090 01:00:01,520 --> 01:00:04,200 Speaker 1: and this video game that allowed me to do all 1091 01:00:04,240 --> 01:00:08,959 Speaker 1: these things. But, uh, the very idea of doing something 1092 01:00:09,000 --> 01:00:12,880 Speaker 1: like Autechre or, uh, Brian Eno, going out and writing 1093 01:00:12,880 --> 01:00:15,520 Speaker 1: my own software, or going out and recording my own 1094 01:00:15,640 --> 01:00:18,480 Speaker 1: drumbeats or other sounds to work with, it was, like, 1095 01:00:18,640 --> 01:00:21,960 Speaker 1: way past the point of my creative interest with electronic music. Right. 1096 01:00:22,200 --> 01:00:24,960 Speaker 1: But take something like writing or comics, the 1097 01:00:25,000 --> 01:00:27,360 Speaker 1: stuff that I work in now, and I spend a 1098 01:00:27,400 --> 01:00:29,920 Speaker 1: lot of time thinking about the tools that are available 1099 01:00:29,920 --> 01:00:32,480 Speaker 1: and the ways I can use them to tell stories differently. 1100 01:00:33,040 --> 01:00:37,520 Speaker 1: I think this all comes back to the systematization of art, right? 1101 01:00:37,600 --> 01:00:41,080 Speaker 1: So when you start getting to a point 1102 01:00:41,160 --> 01:00:44,760 Speaker 1: with the art form where you're so interested in 1103 01:00:44,800 --> 01:00:46,800 Speaker 1: it and you want to take it a step further, 1104 01:00:47,320 --> 01:00:50,200 Speaker 1: you start figuring out, like, okay, well, how does 1105 01:00:50,240 --> 01:00:53,920 Speaker 1: the actual form work? How does it tick? And then 1106 01:00:53,920 --> 01:00:56,480 Speaker 1: how can I take that and apply it in new ways? 1107 01:00:57,240 --> 01:00:59,160 Speaker 1: That's something really interesting to me. I mean, like, 1108 01:00:59,320 --> 01:01:01,120 Speaker 1: for instance, when I played in bands, I never 1109 01:01:01,120 --> 01:01:04,000 Speaker 1: built my own guitars. I never built my own pedals.
1110 01:01:04,040 --> 01:01:06,160 Speaker 1: I have friends who do that, though. Like, they have, 1111 01:01:06,720 --> 01:01:09,520 Speaker 1: you know, little electronics backgrounds, and they build their own 1112 01:01:09,520 --> 01:01:12,920 Speaker 1: pedals and make music that nobody else has ever made before. 1113 01:01:13,680 --> 01:01:16,160 Speaker 1: And I feel like there's a distinction between us. 1114 01:01:16,200 --> 01:01:18,400 Speaker 1: It feels to me like they're more 1115 01:01:18,440 --> 01:01:20,960 Speaker 1: masters of their craft than I am. They go out 1116 01:01:21,000 --> 01:01:24,720 Speaker 1: and they build something new. Or likewise, uh, for instance, 1117 01:01:24,720 --> 01:01:27,480 Speaker 1: like, in graphic design, I have friends who are graphic designers. 1118 01:01:27,800 --> 01:01:32,360 Speaker 1: They'll go and write algorithms for Adobe software to help 1119 01:01:32,400 --> 01:01:36,080 Speaker 1: them make graphic design in ways that they haven't been 1120 01:01:36,160 --> 01:01:39,919 Speaker 1: able to before. What's interesting, from what we're talking about 1121 01:01:39,920 --> 01:01:43,280 Speaker 1: here, is that it seems like in music, certainly, and 1122 01:01:43,320 --> 01:01:47,800 Speaker 1: then also in, uh, the use of Adobe Photoshop 1123 01:01:47,800 --> 01:01:53,840 Speaker 1: and digital, um, digital art creation tools, they're many 1124 01:01:53,880 --> 01:01:58,800 Speaker 1: steps ahead of the literary model. Right, so the 1125 01:01:58,800 --> 01:02:02,840 Speaker 1: typewriter analogy from earlier is exactly right. 1126 01:02:02,920 --> 01:02:05,320 Speaker 1: It doesn't exactly work the same way with that tool. Now, 1127 01:02:05,440 --> 01:02:09,840 Speaker 1: now granted, we have fabulous word processing options out there. 1128 01:02:09,960 --> 01:02:12,960 Speaker 1: We have spell check and grammar checks and all, 1129 01:02:13,000 --> 01:02:15,600 Speaker 1: you know, and, of course, Clippy popping up and 1130 01:02:15,680 --> 01:02:19,160 Speaker 1: giving us tips. But I can easily imagine reaching the point where 1131 01:02:19,320 --> 01:02:21,600 Speaker 1: Clippy pops up and says, hey, I see you're writing 1132 01:02:21,640 --> 01:02:24,720 Speaker 1: a short story in the style of Clark Ashton Smith, 1133 01:02:24,720 --> 01:02:27,400 Speaker 1: would you like to tweak it in this direction? And 1134 01:02:28,040 --> 01:02:30,640 Speaker 1: I wonder what it would be like to 1135 01:02:30,680 --> 01:02:32,600 Speaker 1: reach that point where you're essentially writing with this 1136 01:02:32,640 --> 01:02:36,920 Speaker 1: machine filter in place. And, you know, certainly you can 1137 01:02:36,960 --> 01:02:41,160 Speaker 1: be very purist about it and say, like, well, 1138 01:02:41,200 --> 01:02:43,480 Speaker 1: that's cheating; if you're writing through a machine 1139 01:02:43,480 --> 01:02:46,560 Speaker 1: and something else is coming out on the 1140 01:02:46,640 --> 01:02:48,320 Speaker 1: other side, that's not the authentic process of writing. 1141 01:02:48,360 --> 01:02:52,320 Speaker 1: But on the other hand, how is it any different 1142 01:02:52,320 --> 01:02:54,440 Speaker 1: than buzzing into a horn and getting that song on 1143 01:02:54,480 --> 01:02:56,600 Speaker 1: the other end?
You know, we don't say, ah, who 1145 01:03:00,680 --> 01:03:03,480 Speaker 1: is the great trumpeter, Dizzy Gillespie? Yeah, that sounds right. 1146 01:03:03,480 --> 01:03:05,880 Speaker 1: And nobody says, ah, if you heard the sound of 1147 01:03:06,120 --> 01:03:08,560 Speaker 1: him just buzzing his lips, it's awful; it's 1148 01:03:08,640 --> 01:03:10,880 Speaker 1: clearly the technology. Is 1149 01:03:10,960 --> 01:03:15,400 Speaker 1: Dizzy Gillespie's trumpet credited 1150 01:03:15,400 --> 01:03:18,280 Speaker 1: on the album? No, nobody's doing that. Like, you credit 1151 01:03:18,320 --> 01:03:21,040 Speaker 1: the lips buzzing into the machine. You know 1152 01:03:21,040 --> 01:03:24,280 Speaker 1: what's interesting, though, is I think of, like, the media 1153 01:03:24,320 --> 01:03:28,440 Speaker 1: world that we exist in today, wherein, you 1154 01:03:28,480 --> 01:03:30,640 Speaker 1: and I hear this all the time, content is king, 1155 01:03:31,320 --> 01:03:36,080 Speaker 1: and the idea that, just, like, the consumption levels 1156 01:03:36,320 --> 01:03:39,800 Speaker 1: for content are so high right now that, like, almost... 1157 01:03:39,880 --> 01:03:42,960 Speaker 1: the human beings creating the content can't keep up, right? 1158 01:03:43,040 --> 01:03:44,960 Speaker 1: And so there's all these, like, businesses that are trying 1159 01:03:45,000 --> 01:03:46,880 Speaker 1: to come up with a lot of, like, quick and 1160 01:03:46,920 --> 01:03:50,160 Speaker 1: easy ways to get more content out there. And 1161 01:03:50,360 --> 01:03:52,720 Speaker 1: this seems to me like a way to do that 1162 01:03:52,720 --> 01:03:58,520 Speaker 1: that might not necessarily be aesthetically displeasing to the audience 1163 01:03:58,600 --> 01:04:01,160 Speaker 1: that's consuming it. And that's whether or not you're 1164 01:04:01,160 --> 01:04:03,400 Speaker 1: reading something in an RSS feed or you're listening to 1165 01:04:03,440 --> 01:04:08,080 Speaker 1: an MP3 file or whatever you're consuming digitally. Uh, 1166 01:04:08,320 --> 01:04:11,320 Speaker 1: having machines that can create content for you... I can 1167 01:04:11,360 --> 01:04:14,240 Speaker 1: totally see, in, like, less than twenty-five years, that 1168 01:04:14,320 --> 01:04:17,120 Speaker 1: being the thing. I don't think it will replace us necessarily, 1169 01:04:17,600 --> 01:04:21,440 Speaker 1: but it could absolutely be a way for... what's the 1170 01:04:21,480 --> 01:04:24,720 Speaker 1: word... click farms, for click farms to sort of create 1171 01:04:24,800 --> 01:04:27,120 Speaker 1: more stuff. I'll tell you, another example that comes to 1172 01:04:27,160 --> 01:04:30,560 Speaker 1: mind is, uh... I enjoyed reading... for a 1173 01:04:30,560 --> 01:04:32,920 Speaker 1: brief time, towards the end of his life, Hunter S. 1174 01:04:32,960 --> 01:04:36,320 Speaker 1: Thompson wrote for one of the ESPN websites, and he 1175 01:04:36,360 --> 01:04:39,160 Speaker 1: was commenting... sports stuff is interesting, yeah. But in 1176 01:04:39,200 --> 01:04:41,439 Speaker 1: this one he just wrote about whatever, so he's often 1177 01:04:41,440 --> 01:04:44,280 Speaker 1: talking about politics and what have you. For someone who 1178 01:04:44,320 --> 01:04:48,240 Speaker 1: wanted Hunter S.
Thompson to essentially come back from the 1179 01:04:48,280 --> 01:04:52,520 Speaker 1: grave and comment on today's political news, you could conceivably, 1180 01:04:52,840 --> 01:04:55,000 Speaker 1: I mean, I can conceive of a future in 1181 01:04:55,040 --> 01:04:58,240 Speaker 1: which you have the Hunter S. Thompson AI, 1182 01:04:58,360 --> 01:05:00,600 Speaker 1: and you just drop the news feeds into it 1183 01:05:00,680 --> 01:05:03,600 Speaker 1: and then it creates commentary in the style of him, 1184 01:05:03,640 --> 01:05:05,520 Speaker 1: and maybe there's a human in the mix as well. 1185 01:05:05,920 --> 01:05:10,160 Speaker 1: Maybe not, right? But I mean, I can see people 1186 01:05:10,400 --> 01:05:13,320 Speaker 1: digging that even though they know that Hunter S. Thompson 1187 01:05:13,360 --> 01:05:15,480 Speaker 1: has been dead for a long time. I mean, 1188 01:05:15,520 --> 01:05:19,320 Speaker 1: what you just described, if that existed tomorrow, that would 1189 01:05:19,400 --> 01:05:22,880 Speaker 1: be the news of the week, other than, like, you know, 1190 01:05:23,280 --> 01:05:25,280 Speaker 1: big current events and things that are going on. But 1191 01:05:25,360 --> 01:05:28,800 Speaker 1: in terms of media, everybody would be sharing 1192 01:05:28,840 --> 01:05:30,880 Speaker 1: it, and everybody would be like, look what I plugged 1193 01:05:30,880 --> 01:05:33,160 Speaker 1: into the Hunter S. Thompson bot and it came out with this, 1194 01:05:33,240 --> 01:05:36,200 Speaker 1: you know? Remember that Twitter thing that was 1195 01:05:36,240 --> 01:05:39,360 Speaker 1: created? Jonathan Strickland would know this, because 1196 01:05:39,360 --> 01:05:41,360 Speaker 1: I think they talked about it on Tech Stuff. There's 1197 01:05:41,360 --> 01:05:44,240 Speaker 1: like a Twitter bot that was created, and the idea 1198 01:05:44,320 --> 01:05:47,960 Speaker 1: was that it would learn how to tweet from reading 1199 01:05:47,960 --> 01:05:50,800 Speaker 1: other people's tweets, and within like a day or something, 1200 01:05:50,920 --> 01:05:54,320 Speaker 1: it eventually was just swearing and saying horrible, 1201 01:05:54,480 --> 01:05:58,760 Speaker 1: racist things. Like, it was super 1202 01:05:58,840 --> 01:06:03,240 Speaker 1: quick that it just devolved into this monster. Now along 1203 01:06:03,280 --> 01:06:06,160 Speaker 1: these lines, an example that came up in the 1204 01:06:06,200 --> 01:06:10,840 Speaker 1: talk is this Sony CSL research laboratory project 1205 01:06:10,880 --> 01:06:14,480 Speaker 1: where they used their AI Flow Machines system to create 1206 01:06:14,480 --> 01:06:17,680 Speaker 1: a new Beatles song. So essentially they just loaded in, 1207 01:06:18,160 --> 01:06:22,760 Speaker 1: you know, all the parameters of the Beatles discography, 1208 01:06:22,800 --> 01:06:24,840 Speaker 1: and then it wrote this song, 1209 01:06:25,040 --> 01:06:28,400 Speaker 1: Daddy's Car, and then they got humans to perform it, 1210 01:06:28,680 --> 01:06:30,280 Speaker 1: because, you know, we're not at the point where 1211 01:06:30,400 --> 01:06:34,240 Speaker 1: the machine can perfectly replicate a recording that would sound like 1212 01:06:34,280 --> 01:06:37,720 Speaker 1: the Beatles, and this essentially sounds like a Beatles cover band.
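To make the kind of style mimicry described here concrete, the classic toy version is a Markov chain: feed it a plain-text body of one author's work and it generates new text that echoes that author's word patterns. This is only a minimal sketch, nothing like the actual Flow Machines system, and the corpus filename is hypothetical.

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each run of `order` consecutive words to the words seen following it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=80):
    """Walk the chain from a random starting state, emitting one word at a time."""
    state = random.choice(list(chain))
    out = list(state)
    for _ in range(length):
        followers = chain.get(state)
        if not followers:                      # dead end: hop to a random state
            state = random.choice(list(chain))
            followers = chain[state]
        out.append(random.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)

# Hypothetical corpus file; any plain-text collection of one author's writing works.
with open("thompson_columns.txt") as f:
    corpus = f.read()

print(generate(build_chain(corpus)))
```

Modern systems like the ones discussed on the panel use neural networks trained on far larger corpora, but the basic move is the same: learn the statistics of a body of work, then sample from them.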
1213 01:06:38,000 --> 01:06:40,720 Speaker 1: I listened to it, and it's very weird. Like, the 1214 01:06:40,800 --> 01:06:43,560 Speaker 1: lyrics are written by humans. That's another thing that's important 1215 01:06:43,560 --> 01:06:47,920 Speaker 1: to distinguish. But the machine itself decides, you know, what 1216 01:06:48,040 --> 01:06:51,480 Speaker 1: the composition is, and there is something just a little, 1217 01:06:53,120 --> 01:06:54,840 Speaker 1: I don't know how to describe it, like uncanny 1218 01:06:54,920 --> 01:06:58,800 Speaker 1: valley in music, which I've never thought of before. When 1219 01:06:58,800 --> 01:07:01,280 Speaker 1: you guys talked about the uncanny valley in that episode, did 1220 01:07:01,280 --> 01:07:04,400 Speaker 1: it come up in any other sense other than vision? 1221 01:07:05,160 --> 01:07:07,480 Speaker 1: It did not. I know it didn't come up in 1222 01:07:07,840 --> 01:07:10,240 Speaker 1: terms of music. So yeah, I think this is an 1223 01:07:10,240 --> 01:07:13,440 Speaker 1: interesting example, because when I listened to it, I definitely 1224 01:07:13,440 --> 01:07:16,440 Speaker 1: felt that uncanny effect. But how much of that is 1225 01:07:16,520 --> 01:07:19,880 Speaker 1: me knowing beforehand that this is not a Beatles song? Right? 1226 01:07:20,320 --> 01:07:22,440 Speaker 1: If it was presented to me as, hey, here's the 1227 01:07:22,480 --> 01:07:24,760 Speaker 1: lost Beatles song, would I dig it? Would I be 1228 01:07:24,920 --> 01:07:27,680 Speaker 1: all in on it? And then the additional question is, 1229 01:07:27,920 --> 01:07:31,720 Speaker 1: will we reach a point where audiences won't care? So, 1230 01:07:31,840 --> 01:07:35,800 Speaker 1: like, I mentioned Tool earlier. I've had to wait 1231 01:07:35,840 --> 01:07:37,920 Speaker 1: for the last three albums to come out, and that's 1232 01:07:37,920 --> 01:07:41,480 Speaker 1: like a twenty something year span there. But there 1233 01:07:41,480 --> 01:07:43,240 Speaker 1: are new fans who come online and they're like, all 1234 01:07:43,240 --> 01:07:45,040 Speaker 1: the albums are there, and they can start waiting on 1235 01:07:45,080 --> 01:07:47,600 Speaker 1: the next album with the rest of us. And 1236 01:07:47,640 --> 01:07:50,400 Speaker 1: if fake material comes out, we're not going to accept it. 1237 01:07:50,840 --> 01:07:53,680 Speaker 1: I mentioned Autechre earlier. I believe 1238 01:07:53,880 --> 01:07:56,360 Speaker 1: Autechre is an example where there have been fake 1239 01:07:56,520 --> 01:07:59,320 Speaker 1: leaks that have come out, where someone says, here's the 1240 01:07:59,360 --> 01:08:01,400 Speaker 1: new Autechre, and it's really some other 1241 01:08:02,160 --> 01:08:05,320 Speaker 1: album that somebody else created, and everyone just, you know, 1242 01:08:05,480 --> 01:08:07,919 Speaker 1: craps on it. They're like, get this out of here. I can see 1243 01:08:07,960 --> 01:08:12,120 Speaker 1: that being, not replicable exactly, but like you could 1244 01:08:12,120 --> 01:08:14,160 Speaker 1: get faked out on that pretty easily given their music.
1245 01:08:14,360 --> 01:08:16,920 Speaker 1: Exactly. Can you imagine a point in the future 1246 01:08:16,960 --> 01:08:19,160 Speaker 1: where the fakery, if you want to call it that, or 1247 01:08:19,200 --> 01:08:21,960 Speaker 1: at least the AI, the machine creativity involved, is so 1248 01:08:22,040 --> 01:08:24,680 Speaker 1: advanced that they can come up with something that scratches 1249 01:08:24,720 --> 01:08:28,400 Speaker 1: your itch, that itch for, say, another Tool album or 1250 01:08:28,400 --> 01:08:32,880 Speaker 1: another Mozart composition, whatever the need is, 1251 01:08:33,520 --> 01:08:35,720 Speaker 1: and maybe on top of that it even customizes it 1252 01:08:35,800 --> 01:08:38,360 Speaker 1: to your own particular taste within that group, your own 1253 01:08:39,040 --> 01:08:43,679 Speaker 1: real time emotional demands? Well, what this all boils down 1254 01:08:43,680 --> 01:08:46,000 Speaker 1: to, which I think is really interesting and didn't come 1255 01:08:46,040 --> 01:08:48,720 Speaker 1: up in their conversation at all, is, are you 1256 01:08:48,760 --> 01:08:53,519 Speaker 1: familiar with Walter Benjamin, or Benyamin as it's sometimes pronounced, and his nineteen 1257 01:08:54,240 --> 01:08:57,439 Speaker 1: thirty five essay The Work of Art in the Age of Mechanical Reproduction? No, 1258 01:08:57,520 --> 01:08:59,920 Speaker 1: I am not. It's this really interesting take 1259 01:09:00,280 --> 01:09:04,760 Speaker 1: on what the difference is between art and copies of 1260 01:09:04,960 --> 01:09:08,439 Speaker 1: art, and he throws out this term aura in it, 1261 01:09:08,520 --> 01:09:13,080 Speaker 1: where he essentially argues that mechanically reproduced art is missing 1262 01:09:13,640 --> 01:09:16,760 Speaker 1: this aesthetic uniqueness that the original has. Right, so if 1263 01:09:16,800 --> 01:09:19,120 Speaker 1: you take the Mona Lisa and you run off 1264 01:09:20,000 --> 01:09:23,160 Speaker 1: a really high definition quality print of 1265 01:09:23,160 --> 01:09:25,720 Speaker 1: it, that's not the Mona Lisa, though, right? Like, it 1266 01:09:25,760 --> 01:09:28,960 Speaker 1: doesn't have the aesthetic uniqueness there. And so when you 1267 01:09:29,000 --> 01:09:32,960 Speaker 1: apply that here, I'm wondering, is the art of the 1268 01:09:33,000 --> 01:09:38,000 Speaker 1: computer authentic then? Like, so, for instance, they were talking 1269 01:09:38,040 --> 01:09:41,160 Speaker 1: about the recipes that were being created by 1270 01:09:41,200 --> 01:09:44,559 Speaker 1: the computer. If that comes out and it tastes great 1271 01:09:44,640 --> 01:09:46,840 Speaker 1: and you're like, oh, this is really good, does it 1272 01:09:47,000 --> 01:09:52,000 Speaker 1: have that aesthetic uniqueness of somebody figuring it out, or 1273 01:09:52,160 --> 01:09:55,840 Speaker 1: is it a mechanical reproduction? Huh. But this brings to 1274 01:09:55,920 --> 01:09:59,320 Speaker 1: mind an example that also ties into my trip to 1275 01:09:59,320 --> 01:10:02,719 Speaker 1: New York. I kept thinking about Arnold Böcklin's painting 1276 01:10:02,800 --> 01:10:05,559 Speaker 1: Isle of the Dead, in part because I've just seen 1277 01:10:05,600 --> 01:10:09,400 Speaker 1: Alien: Covenant and they referenced the painting in the film.
Yeah, 1278 01:10:09,439 --> 01:10:11,680 Speaker 1: there's a scene where it's like one of the castles 1279 01:10:11,680 --> 01:10:15,000 Speaker 1: of the Engineers, and David and Walter are standing out there 1280 01:10:15,000 --> 01:10:18,160 Speaker 1: having a conversation, and it's clearly inspired by this painting. 1281 01:10:18,160 --> 01:10:21,759 Speaker 1: The Isle of the Dead, coincidentally, is a painting 1282 01:10:21,800 --> 01:10:25,280 Speaker 1: that Giger did versions of in his own style. 1283 01:10:25,400 --> 01:10:27,400 Speaker 1: Is that right? Well, that must be where they got 1284 01:10:27,400 --> 01:10:29,479 Speaker 1: the idea for it from. I would assume 1285 01:10:29,800 --> 01:10:31,880 Speaker 1: so. Like, when you look at it and you've 1286 01:10:31,880 --> 01:10:34,120 Speaker 1: seen that movie, you can see the influence. Yeah. But 1287 01:10:34,280 --> 01:10:36,360 Speaker 1: it's an image 1288 01:10:36,360 --> 01:10:39,240 Speaker 1: with a lot of darkness in it, like literal darkness. 1289 01:10:39,760 --> 01:10:42,360 Speaker 1: And there are some fabulous prints out there of it. 1290 01:10:42,400 --> 01:10:44,800 Speaker 1: There are some fabulous digital versions of it online, you go 1291 01:10:44,840 --> 01:10:48,040 Speaker 1: on Getty, etcetera. But seeing an actual version 1292 01:10:48,080 --> 01:10:50,920 Speaker 1: of it at the Met, it was actually a 1293 01:10:50,960 --> 01:10:54,960 Speaker 1: little lackluster, because there's this gleam on 1294 01:10:55,000 --> 01:10:57,720 Speaker 1: the black paint and it's aged a bit. So it 1295 01:10:57,800 --> 01:11:00,400 Speaker 1: was one of these rare cases where, seeing the original, 1296 01:11:00,960 --> 01:11:04,160 Speaker 1: it felt unique in many ways, like here's 1297 01:11:04,200 --> 01:11:07,920 Speaker 1: the thing itself, but on the other hand, it was 1298 01:11:08,000 --> 01:11:11,360 Speaker 1: less satisfying. Wow. So that's like a reverse of Walter 1299 01:11:11,439 --> 01:11:16,360 Speaker 1: Benjamin's aura. It's like the aura itself was diminished somehow. 1300 01:11:16,640 --> 01:11:18,960 Speaker 1: And I should add that usually I feel the other 1301 01:11:18,960 --> 01:11:21,559 Speaker 1: way around. Usually if I see an actual painting 1302 01:11:21,560 --> 01:11:24,000 Speaker 1: that I care about, it's amazing 1303 01:11:24,040 --> 01:11:25,800 Speaker 1: to get in there close to it, or as close as 1304 01:11:25,800 --> 01:11:29,560 Speaker 1: you're allowed, and see the details, see the brushstrokes, 1305 01:11:29,560 --> 01:11:32,599 Speaker 1: see the physical paint. Another example would be 1306 01:11:32,640 --> 01:11:35,360 Speaker 1: like the difference between seeing a band live and listening 1307 01:11:35,400 --> 01:11:38,960 Speaker 1: to them on a recording.
And this makes me think 1308 01:11:39,640 --> 01:11:43,479 Speaker 1: of synthesis and synthesizers, because in this talk they brought 1309 01:11:43,560 --> 01:11:46,560 Speaker 1: up synthesizers as being an early example, kind of 1310 01:11:46,600 --> 01:11:50,840 Speaker 1: like the drum machine, of this computer generated creativity. And 1311 01:11:51,040 --> 01:11:55,439 Speaker 1: synthesis makes me think of Bloom's taxonomy of learning, and 1312 01:11:55,680 --> 01:11:59,040 Speaker 1: within it, synthesis is one of the modes of learning 1313 01:11:59,080 --> 01:12:01,880 Speaker 1: that you're supposed to try to achieve, and it's 1314 01:12:01,920 --> 01:12:05,759 Speaker 1: basically this: one has to put together parts from diverse elements 1315 01:12:05,760 --> 01:12:09,200 Speaker 1: to form a whole. So the process of synthesis creates 1316 01:12:09,200 --> 01:12:14,920 Speaker 1: a unique form of ideas, communication, operations, relations, and sometimes art. 1317 01:12:15,439 --> 01:12:20,000 Speaker 1: So while machines are currently capable of being synthesizers, right, 1318 01:12:20,400 --> 01:12:23,120 Speaker 1: they have a combination of those elements, the humans 1319 01:12:23,120 --> 01:12:27,400 Speaker 1: are the ones doing the synthesizing. The machines themselves aren't 1320 01:12:27,400 --> 01:12:30,360 Speaker 1: doing that. So that role still seems to be held 1321 01:12:30,400 --> 01:12:34,879 Speaker 1: by humans in this relationship between us, and it seems inherent 1322 01:12:34,920 --> 01:12:38,080 Speaker 1: to the creative process to me. And they sort of 1323 01:12:38,120 --> 01:12:40,840 Speaker 1: get into this a little bit when they start talking 1324 01:12:40,840 --> 01:12:44,760 Speaker 1: about ownership, because they brought up the idea that computational 1325 01:12:44,880 --> 01:12:47,840 Speaker 1: artifacts are actually not owned by anyone under current law. 1326 01:12:47,840 --> 01:12:51,559 Speaker 1: They're defined as public domain. So if a company can't 1327 01:12:51,840 --> 01:12:56,880 Speaker 1: patent what comes out of a computer that generates art, 1328 01:12:57,360 --> 01:13:01,920 Speaker 1: what's their incentive to fund further development of AI creativity? 1329 01:13:01,920 --> 01:13:04,040 Speaker 1: So actually, this goes back to what we were saying earlier, 1330 01:13:04,120 --> 01:13:08,400 Speaker 1: right? Like, let's say 1331 01:13:08,400 --> 01:13:12,200 Speaker 1: How Stuff Works built an army of AI robots that would 1332 01:13:12,200 --> 01:13:14,840 Speaker 1: write all our articles for us, and they were able 1333 01:13:14,840 --> 01:13:17,120 Speaker 1: to write like a hundred articles a day based on 1334 01:13:17,240 --> 01:13:21,320 Speaker 1: whatever they saw coming up in the news. Technically, 1335 01:13:21,880 --> 01:13:26,400 Speaker 1: How Stuff Works wouldn't own those articles, because they'd be 1336 01:13:26,400 --> 01:13:32,519 Speaker 1: in the public domain, because they're created by computation.
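The "synthesis" idea here, combining parts from diverse elements into a new whole, can be boiled down to a toy sketch along the lines of the computer-generated recipes mentioned earlier. The sketch below scores ingredient pairings by how many flavor compounds they share; the ingredient data is entirely made up for illustration, and real culinary systems of the kind the panel discussed work from large food-chemistry databases.

```python
from itertools import combinations

# Toy, invented data: each ingredient mapped to a small set of flavor compounds.
FLAVORS = {
    "strawberry": {"furaneol", "linalool", "esters"},
    "tomato":     {"furaneol", "hexanal", "sulfur notes"},
    "basil":      {"linalool", "eugenol", "estragole"},
    "chocolate":  {"pyrazines", "esters", "vanillin"},
    "coffee":     {"pyrazines", "furaneol", "sulfur notes"},
}

def shared_compounds(a, b):
    """Score a pairing by how many flavor compounds the two ingredients share."""
    return len(FLAVORS[a] & FLAVORS[b])

# Synthesis in the Bloom sense, reduced to a toy: form wholes (pairings) out of
# diverse parts (ingredients) and rank the novel combinations by the heuristic.
pairs = sorted(combinations(FLAVORS, 2),
               key=lambda p: shared_compounds(*p), reverse=True)
for a, b in pairs[:3]:
    print(f"{a} + {b}: {shared_compounds(a, b)} shared compounds")
```

Even in this toy, the machine only recombines what it was given; deciding which elements to feed it, and which outputs count as good, stays with the human, which is the point being made above.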
Interesting. Yeah, 1337 01:13:32,600 --> 01:13:34,840 Speaker 1: I mean, this brings back to mind the idea 1338 01:13:34,880 --> 01:13:37,200 Speaker 1: of having a human in the loop. And so I 1339 01:13:37,200 --> 01:13:38,800 Speaker 1: could see the case where you'd have to have a 1340 01:13:38,880 --> 01:13:41,720 Speaker 1: human in the loop just for legal purposes, just 1341 01:13:41,760 --> 01:13:44,360 Speaker 1: so that there could be a definite owner, because then 1342 01:13:44,439 --> 01:13:47,000 Speaker 1: that human can have it in their contract, 1343 01:13:47,040 --> 01:13:49,479 Speaker 1: of course, that anything they create, you know, on 1344 01:13:49,479 --> 01:13:52,960 Speaker 1: the company clock is property of the company. That's pretty standard. 1345 01:13:53,040 --> 01:13:55,760 Speaker 1: And then I assume the machine's contributions go along with that. The 1346 01:13:55,800 --> 01:13:58,479 Speaker 1: machine would be more of a pure tool in that scenario, 1347 01:13:59,080 --> 01:14:03,360 Speaker 1: but a self aware tool, if you will, 1348 01:14:03,720 --> 01:14:06,600 Speaker 1: is just going to be outside of the confines of 1349 01:14:06,640 --> 01:14:11,120 Speaker 1: existing law. Yeah, yeah, for now at least. So wow, 1350 01:14:11,200 --> 01:14:15,240 Speaker 1: we've had a pretty extended and deep discussion about this. 1351 01:14:15,400 --> 01:14:17,640 Speaker 1: I knew that this was going to be interesting, but 1352 01:14:18,000 --> 01:14:21,320 Speaker 1: we really dove deep. And, 1353 01:14:21,560 --> 01:14:24,280 Speaker 1: you know, to close out: watching the panel, they sort 1354 01:14:24,280 --> 01:14:26,000 Speaker 1: of said, well, why are we asking this now? Why 1355 01:14:26,040 --> 01:14:28,880 Speaker 1: are we asking these questions about machines and creativity? And 1356 01:14:28,880 --> 01:14:32,400 Speaker 1: they said, well, because, and this was actually Chung, I think, 1357 01:14:32,439 --> 01:14:35,799 Speaker 1: who said this, she said, we're currently generating these huge amounts 1358 01:14:35,800 --> 01:14:39,440 Speaker 1: of data, right? Like, just think of our information systems, 1359 01:14:39,479 --> 01:14:41,639 Speaker 1: just the Internet in general, all the data that's 1360 01:14:41,640 --> 01:14:44,880 Speaker 1: being generated, and we're trying to wrap our heads around it. 1361 01:14:45,320 --> 01:14:49,439 Speaker 1: So we're using models that we're already familiar with, so 1362 01:14:49,479 --> 01:14:53,439 Speaker 1: in this case, the model of creativity, right, again getting 1363 01:14:53,439 --> 01:14:57,240 Speaker 1: back to systematizing it. And so human culture is being 1364 01:14:57,320 --> 01:15:02,120 Speaker 1: applied on top of technology. And that's 1365 01:15:02,120 --> 01:15:05,040 Speaker 1: what's interesting, I think, is like we've gotten to this 1366 01:15:05,080 --> 01:15:07,559 Speaker 1: point now where we're like, oh my god, there's so 1367 01:15:07,640 --> 01:15:12,439 Speaker 1: much information that even we cannot process it and 1368 01:15:12,479 --> 01:15:14,879 Speaker 1: figure it out. We need to turn to these machines 1369 01:15:14,920 --> 01:15:16,479 Speaker 1: to try to help us do that, but we need 1370 01:15:16,520 --> 01:15:19,840 Speaker 1: to layer our cultural understanding of the world on 1371 01:15:19,920 --> 01:15:22,680 Speaker 1: top of that.
Yeah. So creativity, it's not... it 1372 01:15:22,960 --> 01:15:25,639 Speaker 1: goes beyond just merely using these machines to make 1373 01:15:25,640 --> 01:15:30,360 Speaker 1: our art; it's using these machines to make sense of ourselves. Yeah, yeah, exactly. 1374 01:15:30,680 --> 01:15:34,280 Speaker 1: I mean, I guess ultimately that's the point. Well, that 1375 01:15:34,439 --> 01:15:39,320 Speaker 1: and, like, cat videos. All right, well, you know, this 1376 01:15:39,360 --> 01:15:40,920 Speaker 1: is gonna be a great one to get some feedback 1377 01:15:40,960 --> 01:15:43,040 Speaker 1: on, because I know for a fact that we 1378 01:15:43,160 --> 01:15:47,599 Speaker 1: have creators out there who create their art while listening 1379 01:15:47,600 --> 01:15:50,800 Speaker 1: to episodes of Stuff to Blow Your Mind. So you 1380 01:15:50,840 --> 01:15:54,960 Speaker 1: guys and gals in particular probably have some insightful commentary 1381 01:15:55,200 --> 01:15:58,160 Speaker 1: on the material discussed here today. Yeah, and we 1382 01:15:58,200 --> 01:16:00,400 Speaker 1: would love to hear from you about it. There are 1383 01:16:00,439 --> 01:16:02,000 Speaker 1: a number of ways that you can get in touch 1384 01:16:02,040 --> 01:16:04,559 Speaker 1: with us. We're all over social media. If you want 1385 01:16:04,600 --> 01:16:07,599 Speaker 1: to write us about your creative experiments or your thoughts 1386 01:16:07,840 --> 01:16:12,680 Speaker 1: on computational creativity, you can find us on Facebook, Twitter, Tumblr, 1387 01:16:12,720 --> 01:16:15,960 Speaker 1: and Instagram. And as already mentioned, we are a craft 1388 01:16:16,000 --> 01:16:19,599 Speaker 1: podcast, and we require your support 1389 01:16:19,920 --> 01:16:23,320 Speaker 1: for our art. So we're not asking for money, 1390 01:16:23,360 --> 01:16:25,320 Speaker 1: but we are asking for iTunes reviews. So if you 1391 01:16:25,360 --> 01:16:27,120 Speaker 1: listen to us on iTunes, or even if you don't, 1392 01:16:27,160 --> 01:16:29,120 Speaker 1: you just have an iTunes account, why don't you go 1393 01:16:29,120 --> 01:16:31,360 Speaker 1: on there and leave us a nice review? Throw us 1394 01:16:31,720 --> 01:16:34,760 Speaker 1: five stars, six stars, seven stars, however many stars they 1395 01:16:34,880 --> 01:16:38,000 Speaker 1: let you give us. Just give us the maximum 1396 01:16:38,040 --> 01:16:39,920 Speaker 1: and leave a nice review, and that helps us 1397 01:16:39,960 --> 01:16:41,639 Speaker 1: out and helps us to continue to do this show. 1398 01:16:41,680 --> 01:16:45,000 Speaker 1: And as for our episodes, all 1399 01:16:45,000 --> 01:16:48,200 Speaker 1: of our episodes have landing pages on our website, Stuff 1400 01:16:48,200 --> 01:16:50,639 Speaker 1: to Blow Your Mind dot com, so we always recommend 1401 01:16:50,680 --> 01:16:52,280 Speaker 1: that you go check those out, because they have 1402 01:16:52,400 --> 01:16:55,040 Speaker 1: interesting links to other stuff that we've worked on that's related. 1403 01:16:55,200 --> 01:16:57,519 Speaker 1: But this one, as we've discussed in the episode, is 1404 01:16:57,520 --> 01:17:00,320 Speaker 1: going to have all kinds of cool stuff, including that 1405 01:17:00,400 --> 01:17:04,000 Speaker 1: playlist that Robert put together of all the Giger album 1406 01:17:04,080 --> 01:17:08,160 Speaker 1: cover songs.
Uh, so check that out for sure. And 1407 01:17:08,479 --> 01:17:10,400 Speaker 1: as always, you can get in touch with us directly 1408 01:17:10,439 --> 01:17:13,080 Speaker 1: by emailing us at blow the mind at how stuff 1409 01:17:13,080 --> 01:17:25,840 Speaker 1: works dot com. For more on this and thousands of 1410 01:17:25,840 --> 01:17:51,360 Speaker 1: other topics, visit how stuff works dot com.