Joshua Topolsky: Hey, and welcome to What Future. I'm your host, Joshua Topolsky, and I thought I'd be coming to you this week to talk about the end of American democracy, which seemed, I don't know, very likely, based on the way the media depicted what was about to happen in the midterm elections, and based on how the Republicans were talking about crime in America, which of course is rampant and out of control and coming to a town near you at any moment. But now we are a couple of days past the midterms, and it did not go nearly as well for the Republicans as I think everybody thought it would, and it did not go as badly for the Democrats as everybody thought it would. It turns out it's kind of a split decision, really. It seems like the Republicans may get control of the House, and it seems like the Democrats may hold on to a narrow, very narrow control of the Senate, and we'll continue to get nothing done in Washington or in any other political arena in America. So, you know, that's good, I think, for the most part. I feel like we didn't lose the right to vote in America this week, which sounds like bottom-of-the-barrel, low-stakes stuff, but that's kind of where we are anyhow. All that said, it means we can engage in conversation about something else, which I think for all of us is going to feel really great. And that's why I'm excited to talk about a topic that has, I would say, almost nothing to do with American politics, or politics generally. And that topic is, of course, artificial intelligence that creates art. There are several bots that do this. One has recently become widely available to the public: a bot called Midjourney.
From a user's perspective, what you can do is sit down and type something, and it will generate, based on all of the images it's ever looked at, which are billions of images from the Internet and whatever other databases they feed it, what it thinks you want from a prompt, from a sentence. You actually use it through Discord, which is a kind of chat network that was popularized by gamers. On Discord you can basically go and talk to the Midjourney bot, and you can say, for instance, "Dracula explaining his vampirism to a crowd of onlookers," and it will generate four different images that it thinks capture your idea. And they're insanely accurate. They look like somebody painted a picture of something that you wanted. It is essentially the closest thing to being able to visualize a dream. I don't know that I've ever had my mind more blown by anything a computer has ever done than by this piece of software. It is hard to articulate what it feels like when you write a sentence describing something that seems completely impossible and then, in a matter of thirty seconds, forty-five seconds, maybe a minute, see a pretty good representation of it, your first four versions of something. Let's just say you're not an artist, you're not a designer, you're not going to make your living doing paintings for magazines or whatever. You're just a person. On a pure thrill level, as a person, I think this is fucking amazing. It is maybe the most fascinating, craziest thing I've ever done on a computer, and I've done a lot of stuff on a computer, you know what I mean. But this is like tripping. It's fucking insane. Like I said, it's about as close as you might get to having a dream and then being able to see the dream. I would also say, what's interesting is that it is like a computer dreaming.
I mean, what it is is you giving what I consider to be fairly abstract input to a computer, and the computer making all of these really creative decisions about what that thing should look like. Anyhow, I'm not an artist. I'm not a painter. I have been working with Midjourney to create art for this podcast, and you can see some of the prompts for these. Some of the ones that I did: "science fiction paperback book cover about society in the future." One of them is a phrase from Blade Runner, "attack ships on fire off the shoulder of Orion," from the monologue Rutger Hauer has at the end of the film, and it created imagery based on that sentence. For "Dracula explaining his vampirism to a crowd of onlookers," you can see several variations. They are fucking beautiful pieces of art, in my opinion, like legitimately beautiful pieces of art. There is some art in figuring out how to get this thing to do what you want, or to at least create a result that is pleasing. Anyhow, what's interesting about this? There are many interesting things about it, and I'll just go down the list of some of the ones I'm thinking about. First off, there's obviously this question about art: what is art, is this art, and what kind of art is it? Meaning, as a guy who's run a lot of newsrooms and a lot of publications, I could see this being very functionally important in an organization that needs original art for things but maybe doesn't have the budget or the time to generate original art for everything it would like to. So there's an implication there that, to me, is really interesting, really exciting. I follow a bunch of designers and artists on Instagram, and they have been talking about this for a while.
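A quick technical aside: the prompt-to-image loop described above, a sentence in, a batch of candidate images out, can be approximated with open-source diffusion models. The sketch below uses the Hugging Face diffusers library as a stand-in; the checkpoint name and settings are assumptions for illustration, and this is not Midjourney's own code, which has never been published.

```python
# Minimal text-to-image sketch using a public diffusion model.
# Illustrates the general prompt -> images flow only; Midjourney's
# actual pipeline is proprietary and almost certainly differs.
import torch
from diffusers import StableDiffusionPipeline

# Assumed checkpoint; any compatible text-to-image model would do.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

prompt = "Dracula explaining his vampirism to a crowd of onlookers"

# Midjourney shows four candidates per prompt; we mimic that here.
images = pipe(prompt, num_images_per_prompt=4, num_inference_steps=30).images
for i, image in enumerate(images):
    image.save(f"dracula_{i}.png")
```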
I mean, this opens up an enormous number of serious questions. For instance, the bots are obviously taking content and material, analyzing it, learning from it, and in some cases replicating it in some way, from what real artists have made, from historic pieces of art up to modern pieces of art. As far as I know, in essence, these AIs can go and look at that work and then learn from it. But there's a little bit of controversy, or not a little bit, maybe a lot, from some artists who say: this is theft of our work; they're using things we've created, without any license to do so, and creating new works based on them. That argument, to me, runs into the fact that every artist uses somebody else's work to create what they do. As we know, remixing in music has become one of the baseline ways you make music now, no pun intended on "bassline." So the idea of sampling somebody else's art to create something new is not new. I think what's sort of insane, and threatening on a bunch of different levels, is that this is creating real art, really interesting pieces of art and imagery that have real applications, whether that's hanging in a gallery or being used for an illustration in a magazine or whatever, and it is just removing the person completely. The image that I created for the podcast is a perfect example. I could sit with a designer, tell them what I wanted, show them examples, ask, can you make something like this? And we could work through it over and over again until we got to something that felt right. Instead, I could just say it, and this could be it, this could be the art for the show. That's one job an artist is not going to get now, for sure. So there's an implication for people who work in these fields that is way different from what we've been talking about.
Like I said, at a pure human level, this is thrilling. But on the flip side of that, there are entire industries that are potentially wiped out by this. What does this open up? That, I think, is a question I don't know the answer to. In five or ten years, this is going to be so much more capable of creating things like this, capable to a point where I think it's likely you can simply tell it to do something, whatever it is, and it will create a perfectly photorealistic version of it. There are already versions of this where you can say, show me this thing, and show it to me in these different styles, and it'll show you an image in the style of a particular painter, or as if it were a photo taken in a particular era, or as if it were shot on a certain kind of film. What that means going forward is almost frightening. People have talked a lot about deepfakes: they're going to fake a voice, or they're going to fake a person's face or whatever. We're getting to the point where you can just fake any situation, where you can visually create any situation you can think of. And I think the logical next step is that eventually, pretty soon, I would imagine, it will be able to do this with video, and with moving images, sound is not too far behind. You start to think of how this could be applied to all sorts of other things. Presumably, if it can do this with visual art, it can do it with other forms of art, right? Will we discover, when any art, any type of content, is available to you,
that perhaps what you actually want is somebody else's brain and mind? Like, I want to understand or see something or hear something from somebody else's brain. But I don't know what happens if the other brain can just create any of those things that I would be intrigued by. This art is a great example. Clearly this non-brain entity can create things that surprise and delight me, that feel as authentic and original as any art that I've looked at. It's obvious that the systems creating this are very advanced, and they are only going to get better. They're not going to get worse. There is no going back to a state where this is not possible. And so when you think about what that looks like down the road... maybe not everybody feels this way, but I get this very upsetting feeling in the middle of my brain when I think about what space actually is, which is this endless nothing, actual nothing, and what is that like? It's very upsetting for me to think about. When I think about the future of this stuff, there's a similar kind of weight in the middle of my brain: where does this go? It feels like all of reality is almost called into question by the technology. And maybe I'm overstating it, maybe I sound crazy. I'm not saying the computer is sentient, or alive, or that it's got a soul now or anything. But there's something in between the lines of all this that leaps beyond even my understanding of what is happening. It leaps to a place that's almost, I don't want to say spiritual, but a kind of almost religious place, where it's like: how can this be? You kind of feel it when you use it. How can this be? How is it possible?

My guest today is David Holz, the founder and CEO of Midjourney. David, thank you for being here.

David Holz: Thank you.
Joshua Topolsky: Just before this, I asked, can I say CEO? And you didn't want me to, but I've done it anyway, and we're all going to have to live with the repercussions. Okay. Let's say you and I met at a party. Let's pretend we're at a cool party. You don't know where I'm coming from, and I ask, what do you do? You say, I'm the founder and CEO of Midjourney. And I go, what's that? How would you describe it to somebody, just randomly, at a party?

David Holz: I try not to. I'm pretty low-key. But if they ask, what does Midjourney do? Yeah, I don't know. I never really wanted a company; I just kind of wanted a home. Midjourney is sort of meant to be my new home for the next ten years, a place to work on a lot of cool projects that I care about, with cool people, that hopefully are good for everybody else too. We have themes I want to work on, and the themes, if I had to put them into words, are reflection, imagination, and coordination. I feel like, in order to flourish as a civilization, we're going to have to make a lot of new things, and making new things involves those three words. We need a lot more around them, like infrastructure, really new fundamental forms of infrastructure around each of them. We were actually originally working more on the reflection tools and coordination tools, and we were doing some imagination stuff, but then there were certain breakthroughs happening on the AI side. That was about a year and a half ago. Now it looks like everything's blowing up, but a year and a half ago in San Francisco, we all went to the same Christmas parties and stuff. All the AI people are kind of out here, and we were all together, and I'm like, these diffusion models, they seem different. They seem different than the other stuff.
And they're like, yeah, I know, this is different. And, well, what are you going to do? What are you going to do? We're all kind of talking, and eventually I'm like, I think there's going to be a human side of this. It's not just about making pictures; there's a sort of back and forth. There's a lot more to this that's going to be hard to figure out from just optimizing a single number in a computer program, and there may be some taste involved, and no one knows what that is. And I'm like, I think there's something I can contribute there.

Joshua Topolsky: Right. Can you imagine, though? I'm a guy you just met at a party. I've got no context whatsoever about Midjourney, and you just told me that, which is, by the way, all very interesting; I have many questions related to what you just said. I'm going to dumb it down a little bit, only because maybe not every single person will know. Midjourney is known right now, the company has risen to a place in the spotlight, because it is what I think we're all sort of talking about now: an AI art tool, a tool to create art based on artificial intelligence and machine learning and all of these other very complex technologies that are fusing together to make something that is relatively new. So I think most people would say you've built a tool that can take human language, text, basic English prompts or whatever, and maybe you do it in different languages, I don't know, and convert a prompt, a description of something, into a piece of art that is created basically wholly by a machine. Is that correct?

David Holz: Yeah. Although I try to avoid the word art, almost, to be honest, because I think it's not really about art. It's about imagination.
Sometimes people use their imaginations for art, but usually not, and so I usually think of it as: we're trying to create these machine-augmented imaginative powers. Sometimes I almost call it a vehicle. If you really ask, what are we doing, is it like the invention of photography and how it changed painting? I tend to say no, it's much more like the invention of the combustion engine and the car. When we invented cars, they were faster than us, but we didn't chop our legs off. When we really have to move somewhere, we move through vehicles. So it's kind of like a vehicle for the imagination: when you really have to go somewhere, you use these vehicles, like jets and boats and cars. We never have a little robot as our icon; it's a sailboat, you know. We're very much trying to help people explore and imagine these seeds of aesthetic possibility.

Joshua Topolsky: It's interesting that there's a little bit of a defensive stance you have to take now, because the art aspect of it gets under the skin of a certain part of the audience that's like, wait a second, what is this thing doing? What does it mean? What does it mean for all these different industries? I think a lot of people feel, and maybe you guys have had to play some new round of defense because of it, that this has been engineered to upend industries. But you're saying you don't really view it that way.

David Holz: No. To me, that's actually very uninteresting. The idea of making fake art is really uninteresting. Who cares about making fake photos? To me, what's interesting is making stuff that never could have existed before. I don't like it when somebody makes a deepfake photo of a dog; we make it really hard to do that, and other tools do it well.
To me, the most interesting images are the ones that don't look like anything we've ever seen before. They don't look human-made, they don't look AI-made. They look like something new, and all we know is that it's this new thing, this new frontier.

Joshua Topolsky: Right. I should tell you that the art for this podcast is generated by Midjourney, and it ended up producing results that are at once very familiar to me, stylistically there's something very familiar about them, but there's also something about them that is totally original, I think, to your point. I'll give you my stance a little bit, because one of the reasons I wanted to talk to you, one of the reasons I want to talk about this at all, is that, as you know, I'm a huge nerd, and I've spent my entire life being mesmerized by and interested in emerging technology in all sorts of different forms. And when I started using Midjourney, it was producing something that, to me, feels, I'll try to avoid using the term art, like it's creating something very original. I could say, okay, I know where some of this stuff is coming from; I can kind of understand that certain styles are present, or that if you give it a prompt asking for a certain style, you can get that. But to me, and I still feel this having processed it now for weeks and months, it may be the most amazing thing I've ever seen a machine do. I totally understand the idea that you're not trying to build a tool that is like a new Photoshop, although I think there are obvious applications in that realm. When I first asked what it was, you used three words. What were the three words?

David Holz: Reflection, imagination, and coordination.

Joshua Topolsky: Okay, so coordination and reflection.
I want to talk about what that means, because I understand the imagination part, and I think I understand how you're thinking about what Midjourney does now in that department. But tell me about those roots of reflection and coordination. What was this before it was what it is?

David Holz: We were working on a lot of things trying to understand human minds, individually, to help people reflect, and then also to help people come together and work on things better. And so we were doing a lot of quantitative psychology and structured thinking, to kind of bootstrap a beehive mind as fast as you can.

Joshua Topolsky: This show is going to say lots of weird things; it's good. Are you saying that the roots of this are kind of, can we get this thing to think on a collective level for us, to solve problems?

David Holz: Yeah. I think there are two areas. There's how do you help somebody think about who they are and what they want, and just kind of deal with their things. And then there's how do you help them find the right people, because anything big means meeting other people. So how do you find the people? When I was twenty, I would have said, you have to have your goals, and then you align with people who share the goals. And I've done that, and it turns out that the second the goals change, the group blows apart, because it's really about values or something. And then if you align around values, over five or ten years it blows apart again, because it turns out that our values change as our lives and our experiences change. So then maybe the idea is that we need something higher than values, and maybe it's aesthetics. It's not about what's right or wrong, or what's important.
Really deep down, it's about what we feel is beautiful and what we feel is ugly. That really leads to the things we value, the things we actually try to build. And so there's this idea that maybe aesthetics themselves are some of the highest things, and maybe aesthetics can be a foundational layer of a social world, in a way that goes beyond where things are now. Because right now, the Internet, what is it about? Facebook is who your mom is and who you went to school with, and on Twitter it's almost like you say one thing a day that pisses people off and then half of them will follow you. Those are both shitty foundations for a better social world; I would never want to build a team that way. So there's something really interesting on Midjourney, where people come together and they're like, man, you love Egyptian space pyramids too? That's like me. And you have nothing else in common, but you both love Egyptian space pyramids, and it actually touches something really deep. I think aesthetics have the potential to be the foundation of a better social and coordination layer, in a way that's really hard to understand, but that is actually really interesting.

Joshua Topolsky: That's fascinating, and frankly, I have so many questions around just the basic concept there. I would agree with you that aesthetics do tend to bring people together. But aesthetics, conceptually, the idea of having a taste or a preference for something, there's a limit, I would imagine, to people who identify around an aesthetic position. Meaning, my mother, who's a wonderful, wonderful, and extremely insane person, could talk about things she finds visually beautiful or whatever, but I would not say it's a central part of her personality, or something she has an enormous amount of interest in. Right?
The thing about Facebook is that a raw opinion, or sharing something, like, oh, I found this article interesting, is very straightforward, in the sense that we all know what an idea is, or an interesting article, or an opinion. But I don't know that everybody thinks on an aesthetic level. Maybe I'm not giving everybody enough credit. It's possible.

David Holz: I think people don't think about it, but it's there. I've tried this: what are your aesthetics, that lead to your values, that lead to your goals? You can ask the question, and almost nobody can answer it. It's a really hard question. But all of a sudden you give them something like Midjourney, and it's like, you can make a picture of anything, what do you want? And everything just spills out, and they go through this whole hero's journey, and in the process of walking them through that journey, it's all there, and it's very clear. A lot of stuff comes out, actually.

Joshua Topolsky: I'm going to give you a really extreme example, so forgive me if this feels like a gotcha. Say I'm a neo-Nazi, for instance. I might love Star Wars, let's say. Although I always find it fascinating when people who are really into fascism are like, I like Star Wars, I'm into the Rebels. But okay, let's say I like Star Wars, and you like Star Wars, but one of us is a white supremacist and one of us isn't. We may share some aesthetic interests, right? Or we may both love a certain artist; say we're both Lichtenstein fans. But at the end of the day, deep down, I don't know that that aesthetic preference has any deeper resonance with who we are.
David Holz: There's a limit, right. So, for example, you're a rebel, obviously, and a neo-Nazi nowadays is also kind of a rebel in their own way, so you do have something in common. But there are probably also other things.

Joshua Topolsky: That is a leap, I would say. I mean, I get what you're saying. They're definitely going against the grain, right? I got it, they're going against the grain.

David Holz: Yeah, definitely. A lot of us are rebels, and there are different types of rebels, but we are rebels. But, like I said, there are other things too, so you don't want to just lock onto "rebels." You want something a little bit broader and more interesting. And so the question is, after you make a bunch of pictures of rebels, what's the next thing you do, and how does that all come together?

Joshua Topolsky: You know, now we're very far afield from "I've got a Midjourney bot that I can talk to and it can make images for me." How would you describe it? Do you describe it as AI?

David Holz: Yeah, I mean, it is AI, but I kind of avoid the words AI and art, actually, both, weirdly. The problem with a word like AI is that people give the thing a lot of agency and will and purpose and meaning, whereas this thing doesn't have a story or a narrative or any will or consciousness. It doesn't have a soul. It does learn from lots of people, and it changes, and there's a coevolution.
It's almost like Midjourney is a flower and the users are bees: the flower is trying to be beautiful for the bees, but the bees pick which flowers get to survive, and so there's this coevolution between the flowers and the bees. There's not a lot of will; there's some will, a will to be beautiful. And the only weird thing about flowers being beautiful is that we find them beautiful too. What does that mean? They're not really for us specifically. Why do both we and the bees find the same thing beautiful? It's sort of speaking to some weird objective thing.

Joshua Topolsky: No, and I can understand that on a philosophical level. I mean, what is it doing? It's a program.

David Holz: Yeah, it's a program. It's a program with a lot of models in it. There's a model that models language, and there's something that models the connection between language and images. There's another thing that tries to model what images look like. There are actually also models that try to understand beauty, what is actually beautiful. And then there are other models that try to understand trade-offs: diversity versus creativity, how literal should you be, how metaphorical should you be, how do you read things. So it's kind of a structure, and there's a lot of duct tape. And it's weird, because people will ask, is it alive? How does it understand things? If I say something like sadness or happiness, how is it able to make an image of an emotion that it's never had? They're asking these questions, like, what is this? And it's a piece of software, you know. But is it not an AI because it's never had those experiences? What does that mean? There are a lot of really interesting questions.
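For technical readers, the structure Holz sketches, several cooperating models glued together rather than one monolithic "AI," can be written down schematically. Everything in the sketch below is a hypothetical stand-in (dummy stubs for a language model, a CLIP-style text-image alignment model, an image generator, and an aesthetic scorer), not Midjourney's actual architecture, which has never been published.

```python
# Schematic, runnable sketch of a "program with a lot of models in it."
# Every component is a dummy stand-in, NOT Midjourney's real system.
import random
from dataclasses import dataclass

def text_encoder(prompt: str) -> list[float]:
    """Stand-in for a model of language: prompt -> embedding."""
    rng = random.Random(prompt)
    return [rng.random() for _ in range(8)]

def image_generator(embedding: list[float]) -> list[list[float]]:
    """Stand-in for a model of what images look like: embedding -> tiny 'image'."""
    return [[random.random() for _ in range(4)] for _ in range(4)]

def clip_score(prompt: str, image: list[list[float]]) -> float:
    """Stand-in for the language <-> image connection model."""
    return random.random()

def aesthetic_score(image: list[list[float]]) -> float:
    """Stand-in for a learned model of what people find beautiful."""
    return random.random()

@dataclass
class Candidate:
    image: list[list[float]]
    alignment: float   # how literally the image matches the prompt
    beauty: float      # predicted aesthetic appeal

def generate(prompt: str, n: int = 4) -> list[Candidate]:
    """Glue the models together and rank the results."""
    emb = text_encoder(prompt)
    candidates = [
        Candidate(img, clip_score(prompt, img), aesthetic_score(img))
        for img in (image_generator(emb) for _ in range(n))
    ]
    # Trade literal prompt-following off against predicted beauty.
    # The equal weighting is an arbitrary illustrative choice.
    candidates.sort(key=lambda c: 0.5 * c.alignment + 0.5 * c.beauty, reverse=True)
    return candidates

best = generate("Dracula explaining his vampirism to a crowd of onlookers")[0]
print(f"alignment={best.alignment:.2f}, beauty={best.beauty:.2f}")
```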
Joshua Topolsky: I think a lot of people, when they hear "AI," think there's a machine somewhere with a glowing red orb in the middle of it, pulsing.

David Holz: Exactly, and there's some neural net, and you've built some custom hardware where the net lives, like a digital brain. It's software, right? These programs do share things with our brains, the way an airplane shares something with a bird: they both share aerodynamics and physics and the sky. These things share some physics of thought.

Joshua Topolsky: Right. But I'm just saying, it's software you built, and the software does some pretty sophisticated things, and it's hosted on an AWS rack somewhere, essentially. Maybe you don't use AWS, whatever. But so what is the product? You've got investors, right?

David Holz: No.

Joshua Topolsky: You don't? You're bootstrapped?

David Holz: Yeah.

Joshua Topolsky: Okay. Now listen, I've paid for a subscription. I'm a Midjourney subscriber now. So is that the product? People pay for subscriptions to use it?

David Holz: Yeah. I try to have a very honest business. It's: you're not going to run this on your computer, we run it in the cloud, that takes money, and we'll take some margin on that, and that's the business.

Joshua Topolsky: And you feel like that's a good foundation for whatever this thing is going to be? You can build off of that?

David Holz: Yeah.

Joshua Topolsky: You don't have Marc Andreessen coming to you and saying, I'll give you X billions of dollars if you let me turn this into whatever Marc Andreessen wants?

David Holz: We do have a lot of investors coming to us, offering us lots of money.

Joshua Topolsky: And you're not taking the money?

David Holz: We haven't taken anything so far.

Joshua Topolsky: That's pretty amazing. Can the business be profitable like this?

David Holz: We're profitable already.

Joshua Topolsky: You are?

David Holz: Yeah. That's one reason not to take money: we're already profitable.

Joshua Topolsky: Well, if you're making money, that's definitely a good reason not to take it, right?

David Holz: Yeah.
People come to us and they offer us money, and I'm like, what am I going to spend it on? And they're like, it's good to have it, you should have it. And I'm like, we have money; we're trying to spend it already. And they're like, well, you should take us, take our advice, it's not about the money, it's about the advice. They try to make those arguments, and so far I haven't heard a very compelling one.

Joshua Topolsky: So you're happy to iterate on this product where it's at now, and let the users maybe dictate some of the direction by the way they're using it?

David Holz: Yeah. It's kind of beautiful. We make something, and if people like it, they pay us money, and if they don't like it, we don't make money. So we're trying to make something people like, because that supports our stuff. It's very honest and straightforward, and it's an easy business. If I could keep it this way, I would keep it this way.

Joshua Topolsky: Presumably there are commercial applications for this, right? I think about this because I'm a guy who runs media businesses, and I think, oh wow, all the time I want art for something. And I'm actually going to get into a bunch of questions about that side of it. But all the time: I'm running a newsroom, I'm publishing twenty stories a day or fifty stories a day or whatever, and every one of those pieces has some art attached to it. Presumably you're already doing more enterprise-level stuff, where, if I just want some design for a story or for a blog post I'm writing, you could generate that, infinite iterations of original pieces of art. Is that a part of the business?

David Holz: I would say we're a consumer business that also has some professionals; it's probably something like seventy percent consumers and thirty percent professionals.
The professionals are mostly using it for 580 00:27:57,160 --> 00:28:00,440 Speaker 1: like brainstorming and concepting. Then the consumers are having 581 00:28:00,440 --> 00:28:05,720 Speaker 1: fun and sort of having these reflective, spiritual, personal experiences. Um, 582 00:28:05,760 --> 00:28:08,560 Speaker 1: I'm not that excited by professional use, even though, like, 583 00:28:08,600 --> 00:28:10,199 Speaker 1: I'm happy when I see people are finding it to 584 00:28:10,200 --> 00:28:13,000 Speaker 1: be useful. The regular people have definitely been a lot 585 00:28:13,080 --> 00:28:15,560 Speaker 1: more motivating and inspiring to me than the professional uses. 586 00:28:26,480 --> 00:28:28,919 Speaker 1: I have very little interest in the world as it is. 587 00:28:29,119 --> 00:28:31,680 Speaker 1: I want to like make it really different, and it's 588 00:28:31,720 --> 00:28:34,160 Speaker 1: much easier to do something really different for consumers than 589 00:28:34,200 --> 00:28:36,760 Speaker 1: it is to like have that immediately impact the sort 590 00:28:36,760 --> 00:28:40,000 Speaker 1: of professional world. And so like, video game people 591 00:28:40,000 --> 00:28:41,680 Speaker 1: come to me, and like, literally 592 00:28:41,800 --> 00:28:44,120 Speaker 1: they have to file us under their Photoshop budget, because 593 00:28:44,120 --> 00:28:45,840 Speaker 1: like the video game is already budgeted out and it 594 00:28:45,840 --> 00:28:47,320 Speaker 1: takes sixteen months, and I have to wait for them 595 00:28:47,320 --> 00:28:49,080 Speaker 1: to make their next video game. And I'm like, this 596 00:28:49,120 --> 00:28:51,560 Speaker 1: is bullshit. I'm so happy that my business isn't reliant 597 00:28:51,600 --> 00:28:54,200 Speaker 1: on somebody finishing their video game in sixteen months, you know. 598 00:28:55,040 --> 00:28:58,120 Speaker 1: And that's what that world is like. Listen, I've thought 599 00:28:58,120 --> 00:29:00,400 Speaker 1: a lot about this. Like, if I'm making a video game, 600 00:29:00,480 --> 00:29:03,440 Speaker 1: especially if I'm like an independent developer, like an indie 601 00:29:03,480 --> 00:29:05,640 Speaker 1: dev, I need art, I need assets. Like, I want 602 00:29:05,640 --> 00:29:07,959 Speaker 1: to make, like, I want to make this world that 603 00:29:08,000 --> 00:29:10,800 Speaker 1: hasn't been made before. And normally, and this actually 604 00:29:10,800 --> 00:29:12,480 Speaker 1: gets into this part of the conversation I want to 605 00:29:12,520 --> 00:29:15,080 Speaker 1: have about art and about the sort of implications of it, 606 00:29:15,840 --> 00:29:17,960 Speaker 1: you know, I might go and hire an artist or 607 00:29:18,000 --> 00:29:20,680 Speaker 1: whatever to do that. But now, like, Mid Journey potentially, 608 00:29:20,680 --> 00:29:22,040 Speaker 1: like, if I'm using it in that way, I can 609 00:29:22,040 --> 00:29:25,760 Speaker 1: create assets and backgrounds and scenery, or even brainstorm off 610 00:29:25,800 --> 00:29:28,320 Speaker 1: of that to build something from. Like, that's 611 00:29:28,360 --> 00:29:31,240 Speaker 1: not the exact thing, but a kind of iteration of it.
612 00:29:31,440 --> 00:29:33,920 Speaker 1: But there is a certain, very vocal segment of people 613 00:29:34,080 --> 00:29:37,280 Speaker 1: out there in the world, and there are people who 614 00:29:37,280 --> 00:29:39,680 Speaker 1: are artists, who are, you know, digital artists, or who 615 00:29:39,720 --> 00:29:42,760 Speaker 1: are working artists today, or even people who are doing 616 00:29:42,800 --> 00:29:45,480 Speaker 1: fine art that's like hanging in galleries, and they're like, one, 617 00:29:45,600 --> 00:29:48,600 Speaker 1: this is theft, because it's using our work. It's using 618 00:29:48,640 --> 00:29:51,040 Speaker 1: work that is out there, that is available to see, 619 00:29:51,520 --> 00:29:54,800 Speaker 1: as inspiration for these works. And two, it's like they're 620 00:29:54,800 --> 00:29:57,080 Speaker 1: not getting anything when it does create new work. Not 621 00:29:57,120 --> 00:30:02,240 Speaker 1: only is it making their jobs sort of obsolete, 622 00:30:02,520 --> 00:30:04,800 Speaker 1: but it's also like doing it on the backs of 623 00:30:04,840 --> 00:30:07,959 Speaker 1: all of their work. It's not a non-compelling argument. 624 00:30:08,000 --> 00:30:10,760 Speaker 1: There is some reason to think that all of those 625 00:30:10,800 --> 00:30:12,920 Speaker 1: sort of notions are in some way true. Like, what's 626 00:30:12,920 --> 00:30:15,440 Speaker 1: your take on that? There's a lot of misunderstandings around 627 00:30:15,440 --> 00:30:19,040 Speaker 1: the technology, and it makes sense that like artists really 628 00:30:19,040 --> 00:30:21,560 Speaker 1: aren't going to understand what this is doing. Some of 629 00:30:21,600 --> 00:30:23,960 Speaker 1: my favorite images I've made with any of these models, 630 00:30:24,000 --> 00:30:27,480 Speaker 1: that looked artistic, were trained only on photos. And so 631 00:30:27,600 --> 00:30:30,120 Speaker 1: what this is is it's a system that understands what 632 00:30:30,200 --> 00:30:32,520 Speaker 1: images look like, and if you've seen enough photos 633 00:30:32,560 --> 00:30:33,960 Speaker 1: in your life and then you see a painting, you 634 00:30:33,960 --> 00:30:37,200 Speaker 1: can describe the painting without ever having been trained 635 00:30:37,200 --> 00:30:39,960 Speaker 1: on paintings. And so, like, what this is is this thing 636 00:30:39,960 --> 00:30:42,720 Speaker 1: that understands images, and then it understands language in the 637 00:30:42,760 --> 00:30:44,440 Speaker 1: sense of the connection between language and images. And there's 638 00:30:44,440 --> 00:30:47,280 Speaker 1: some element of, like, knowing what a style looks like 639 00:30:47,320 --> 00:30:49,880 Speaker 1: requires having seen the word and the style before. So 640 00:30:49,960 --> 00:30:53,000 Speaker 1: there's like some connections to it, but, largely speaking, 641 00:30:53,280 --> 00:30:56,040 Speaker 1: it's not, I think, working like the way they think 642 00:30:56,040 --> 00:30:58,680 Speaker 1: it is. And so the problem is, the artists are 643 00:30:58,680 --> 00:31:01,480 Speaker 1: scared about being in the data set, but literally you 644 00:31:01,480 --> 00:31:03,000 Speaker 1: can just take one of their pictures and feed it into 645 00:31:03,000 --> 00:31:04,640 Speaker 1: one of these models that would never have seen it before, 646 00:31:04,640 --> 00:31:06,400 Speaker 1: and it can make pictures like that. So it's not 647 00:31:06,400 --> 00:31:09,240 Speaker 1: about the training data.
First off, if it understands images, 648 00:31:09,320 --> 00:31:12,120 Speaker 1: it's game over for that battle. If it's seen enough 649 00:31:12,160 --> 00:31:14,360 Speaker 1: general images, and it knows what textures are and 650 00:31:14,400 --> 00:31:16,160 Speaker 1: knows what colors are, you can show it a picture 651 00:31:16,160 --> 00:31:18,040 Speaker 1: and it can make pictures like that, never having seen 652 00:31:18,040 --> 00:31:22,000 Speaker 1: that specific artist before. Right. So, I mean, you know, 653 00:31:22,120 --> 00:31:25,040 Speaker 1: that obviously raises like all kinds of weird questions about, 654 00:31:25,080 --> 00:31:28,000 Speaker 1: like, you know, how fine-tuned does that get? Can 655 00:31:28,040 --> 00:31:29,920 Speaker 1: I pick any artist, like a photographer I like on 656 00:31:30,000 --> 00:31:32,680 Speaker 1: Instagram, and say, in the style of this Instagram photographer, 657 00:31:32,760 --> 00:31:35,320 Speaker 1: and, like, it will do something? Well, I mean, you 658 00:31:35,320 --> 00:31:37,880 Speaker 1: could certainly put a photo of theirs into another service 659 00:31:37,880 --> 00:31:39,200 Speaker 1: and it'll give you a photo that looks like it. 660 00:31:39,800 --> 00:31:41,840 Speaker 1: So, you know, I think that's really, 661 00:31:42,000 --> 00:31:44,719 Speaker 1: that's kind of the technical thing. And so basically, if 662 00:31:44,720 --> 00:31:46,920 Speaker 1: these systems understand images, they'll be able to copy anything 663 00:31:46,920 --> 00:31:49,240 Speaker 1: you show them, regardless of whether or not they're trained on them. 664 00:31:49,280 --> 00:31:51,560 Speaker 1: So I think the training data is the wrong battle 665 00:31:51,600 --> 00:31:54,080 Speaker 1: to fight. But there is potentially a battle to fight 666 00:31:54,120 --> 00:31:56,360 Speaker 1: over, like, use of these tools, like what is good 667 00:31:56,360 --> 00:31:59,360 Speaker 1: and what is bad use, um, and certainly like the 668 00:31:59,440 --> 00:32:01,520 Speaker 1: law covers that already. If you make something that's really 669 00:32:01,560 --> 00:32:04,640 Speaker 1: derivative of another artist, like too derivative, it's 670 00:32:04,640 --> 00:32:07,360 Speaker 1: not okay, even legally, right? Like, it is 671 00:32:07,400 --> 00:32:09,400 Speaker 1: covered a little bit by law already. Maybe there should 672 00:32:09,440 --> 00:32:13,840 Speaker 1: be something more strict, because, like, it's getting easier. Uh, 673 00:32:13,840 --> 00:32:16,120 Speaker 1: but that's the battle to fight, I think. It's 674 00:32:16,120 --> 00:32:19,840 Speaker 1: like, what's too similar? Not, like, this training data thing. 675 00:32:20,720 --> 00:32:22,480 Speaker 1: You know, I think about, like, CGI in 676 00:32:22,480 --> 00:32:24,440 Speaker 1: a way. If you're building an environment for like a 677 00:32:24,440 --> 00:32:26,240 Speaker 1: film or something, right, and you're like, I want 678 00:32:26,240 --> 00:32:27,720 Speaker 1: to make a mountain or whatever, you're not going to 679 00:32:27,840 --> 00:32:31,520 Speaker 1: hand-draw every polygon that builds the mountain, right? The 680 00:32:31,520 --> 00:32:33,440 Speaker 1: computer is going to figure it out. And even now, like, 681 00:32:33,480 --> 00:32:37,600 Speaker 1: it'll just basically terraform a mountain right in Unreal or whatever.
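What he's describing here, a system that learns a shared space connecting images and language so it can recognize a style it was never explicitly trained on, is the general idea behind publicly documented image-text models such as OpenAI's CLIP. Mid Journey's own models aren't public, so the following is only an illustrative sketch of that general technique using the open CLIP weights, not Mid Journey's actual code; the image file name is a placeholder:

    # Minimal sketch: score an image against style descriptions in a shared
    # image-text embedding space (CLIP). Illustrative only, not Mid Journey's code.
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    image = Image.open("reference_painting.jpg")  # placeholder file name
    styles = ["an oil painting", "a photograph", "a pencil sketch"]

    # Embed the picture and the words in the same space, then compare.
    inputs = processor(text=styles, images=image, return_tensors="pt", padding=True)
    probs = model(**inputs).logits_per_image.softmax(dim=-1)  # one row per image
    for style, p in zip(styles, probs[0].tolist()):
        print(f"{style}: {p:.2f}")

This is roughly how a model that has "seen enough general images" can match a painting style without ever having trained on that specific artist: the picture and the description land near each other in the same embedding space.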
Yeah, 682 00:32:38,000 --> 00:32:40,120 Speaker 1: once upon a time most people couldn't read and write, 683 00:32:40,280 --> 00:32:43,240 Speaker 1: and now everybody can, and there are more writers now 684 00:32:43,320 --> 00:32:46,440 Speaker 1: and more readers now, professionally, than there ever were before. Right. Well, 685 00:32:46,440 --> 00:32:49,400 Speaker 1: it's kind of like photography, yeah, right? Like, everybody has 686 00:32:49,440 --> 00:32:51,440 Speaker 1: like a kind of pro-grade camera in their pocket 687 00:32:51,440 --> 00:32:52,920 Speaker 1: all the time now, and so like we're just 688 00:32:52,960 --> 00:32:56,240 Speaker 1: awash in really high quality photos. Whereas like, if you 689 00:32:56,280 --> 00:33:00,520 Speaker 1: go back fifty years, not even twenty-five years, the best 690 00:33:00,600 --> 00:33:03,920 Speaker 1: phone camera you could have was really shitty and was 691 00:33:03,960 --> 00:33:08,920 Speaker 1: obviously low quality; we weren't awash in just photos everywhere, right? 692 00:33:08,960 --> 00:33:11,920 Speaker 1: And like, in the last twenty-five years, pretty much 693 00:33:11,920 --> 00:33:16,240 Speaker 1: everybody has become like somewhat of a pro-am photographer. Yeah. 694 00:33:16,440 --> 00:33:17,920 Speaker 1: Maybe this is a strawman, I don't want to throw 695 00:33:17,920 --> 00:33:19,560 Speaker 1: a strawman at you, but like, is there a question 696 00:33:19,560 --> 00:33:22,680 Speaker 1: about like deepfakes and sort of like creating reality 697 00:33:22,680 --> 00:33:25,160 Speaker 1: that didn't exist? Is that something real you guys grapple 698 00:33:25,200 --> 00:33:28,800 Speaker 1: with? You know, it's a real risk. For us specifically, 699 00:33:28,840 --> 00:33:32,040 Speaker 1: we did some special algorithms. It's very hard to make 700 00:33:32,040 --> 00:33:34,120 Speaker 1: it make a deepfake. Usually what it does, if 701 00:33:34,120 --> 00:33:36,160 Speaker 1: you ask it to make a photo, it'll look realistic, 702 00:33:36,520 --> 00:33:38,520 Speaker 1: but there's like something to it in the lighting and 703 00:33:38,560 --> 00:33:41,320 Speaker 1: the shading and the hues where it's like just far 704 00:33:41,400 --> 00:33:43,880 Speaker 1: enough away from a photo that it looks very realistic, 705 00:33:43,920 --> 00:33:46,480 Speaker 1: but your body like knows it's not a real image immediately. 706 00:33:46,880 --> 00:33:48,680 Speaker 1: What if I'm imagining something that looks exactly like a 707 00:33:48,720 --> 00:33:51,280 Speaker 1: real image? We're not doing that right now. My imagination 708 00:33:51,320 --> 00:33:53,760 Speaker 1: has a limit. Yeah, right now it does. Yeah. Yeah. 709 00:33:53,920 --> 00:33:56,080 Speaker 1: Do you think that limit will be lifted? I don't 710 00:33:56,080 --> 00:33:58,280 Speaker 1: know. For certain users, maybe. For this guy, I guess, 711 00:33:58,480 --> 00:34:00,840 Speaker 1: very creative ideas, maybe, let me check it. There's lots 712 00:34:00,880 --> 00:34:02,840 Speaker 1: of pros and cons of doing that. So we found 713 00:34:02,840 --> 00:34:04,920 Speaker 1: that when we flipped it over that boundary, sometimes it 714 00:34:04,960 --> 00:34:07,760 Speaker 1: looks perfect, and then sometimes it looks really, like, Uncanny 715 00:34:07,840 --> 00:34:11,880 Speaker 1: Valley zombie-like. So right now, if we flip it, 716 00:34:11,880 --> 00:34:15,440 Speaker 1: it sometimes looks perfect.
Sometimes it looks 717 00:34:15,440 --> 00:34:18,239 Speaker 1: like uncanny, and the uncanny is so, like, upsetting to me, 718 00:34:18,360 --> 00:34:20,480 Speaker 1: as like a visual, aesthetic person, I don't want to 719 00:34:20,480 --> 00:34:22,880 Speaker 1: make anything that looks like that. And so, like, I 720 00:34:22,880 --> 00:34:24,839 Speaker 1: just, it's better it is not allowed at all. 721 00:34:25,000 --> 00:34:26,759 Speaker 1: Maybe in the future it'll be so good that it 722 00:34:26,800 --> 00:34:30,319 Speaker 1: never looks uncanny, and I'd say the technology is not there yet. Yeah. 723 00:34:30,360 --> 00:34:32,120 Speaker 1: I mean, there's no chance, just to be clear, there's 724 00:34:32,160 --> 00:34:34,759 Speaker 1: no chance that in like five years from now that 725 00:34:34,840 --> 00:34:37,160 Speaker 1: we won't be at a point where Mid Journey or 726 00:34:37,200 --> 00:34:41,160 Speaker 1: other programs like it will be able to create completely 727 00:34:41,160 --> 00:34:45,040 Speaker 1: photorealistic, if not full moving images. For sure, stills, 728 00:34:45,160 --> 00:34:47,799 Speaker 1: like, in five years' time, right? Yeah, yeah. There's gonna 729 00:34:47,840 --> 00:34:50,200 Speaker 1: be multiple directions here. I think one will be trying 730 00:34:50,239 --> 00:34:52,920 Speaker 1: to, like, make photorealistic duplications of reality, and I 731 00:34:52,920 --> 00:34:54,160 Speaker 1: think the other one will be, like, making things sort 732 00:34:54,160 --> 00:34:56,719 Speaker 1: of super real, like beyond real. And I 733 00:34:56,800 --> 00:34:59,040 Speaker 1: think the beyond-real stuff is where it's both interesting 734 00:34:59,080 --> 00:35:01,520 Speaker 1: as a human and where all, like, consumer and commercial 735 00:35:01,520 --> 00:35:04,879 Speaker 1: stuff is. I will say, I'm unabashedly, like, a fan 736 00:35:05,120 --> 00:35:07,480 Speaker 1: of this thing, but, like, I also can understand people's 737 00:35:07,480 --> 00:35:08,839 Speaker 1: fear about it. But people are afraid of a lot 738 00:35:08,840 --> 00:35:10,760 Speaker 1: of things that computers do, and for very good reason. 739 00:35:10,800 --> 00:35:13,680 Speaker 1: I would also say, and this is kind of your problem: 740 00:35:13,840 --> 00:35:16,200 Speaker 1: people are afraid of people like you. I don't mean 741 00:35:16,239 --> 00:35:18,719 Speaker 1: you personally, you're a lovely person, um, as far as 742 00:35:18,760 --> 00:35:22,000 Speaker 1: I know, uh, but, like, you are like, hey, I 743 00:35:22,040 --> 00:35:24,279 Speaker 1: am interested in imagination and all these things. And, like, if 744 00:35:24,320 --> 00:35:27,920 Speaker 1: you asked, like, Mark Zuckerberg in the early stages of Facebook, 745 00:35:27,920 --> 00:35:29,200 Speaker 1: you know, he would be like, I just want to 746 00:35:29,239 --> 00:35:31,440 Speaker 1: connect people, you know, I just want people to 747 00:35:31,560 --> 00:35:34,880 Speaker 1: like get together in this social environment or whatever and connect. 748 00:35:35,080 --> 00:35:38,080 Speaker 1: But, like, actually, down the road, as that thing developed, 749 00:35:38,120 --> 00:35:40,640 Speaker 1: Mark Zuckerberg made a lot of, like, really crazy, weird, 750 00:35:40,680 --> 00:35:42,560 Speaker 1: bad decisions. You don't have to go on record 751 00:35:42,560 --> 00:35:44,160 Speaker 1: agreeing with me, but I think in your heart you 752 00:35:44,160 --> 00:35:47,040 Speaker 1: know it's true.
And so what do you do to 753 00:35:47,160 --> 00:35:51,759 Speaker 1: protect against, like, these things that feel like creative decisions now? Right? Like, 754 00:35:51,800 --> 00:35:54,360 Speaker 1: we couldn't have seen the misinformation machine that Facebook was 755 00:35:54,400 --> 00:35:57,240 Speaker 1: going to become, with, like, all of these bad actors 756 00:35:57,320 --> 00:35:58,920 Speaker 1: and all, you know, sort of the ways that you 757 00:35:58,920 --> 00:36:01,000 Speaker 1: could abuse the systems. Like, we didn't know that 758 00:36:01,000 --> 00:36:02,600 Speaker 1: that was going to be a thing until, like, we 759 00:36:02,600 --> 00:36:05,680 Speaker 1: started to see the actual abuse. How do you protect 760 00:36:05,680 --> 00:36:07,880 Speaker 1: against the things where you've got to take in, like, 761 00:36:08,560 --> 00:36:10,960 Speaker 1: the worst of humanity? Like, are you doing that on 762 00:36:11,000 --> 00:36:12,600 Speaker 1: an active basis? Right? Because, like, the thing with a 763 00:36:12,640 --> 00:36:15,560 Speaker 1: tool like this is that the best parts of humanity 764 00:36:15,600 --> 00:36:17,279 Speaker 1: will find, like, amazing things to do with it, but 765 00:36:17,320 --> 00:36:21,200 Speaker 1: there is an equal and opposite actor there, right, who will 766 00:36:21,200 --> 00:36:22,640 Speaker 1: do the worst things with it. So tell me, like, 767 00:36:22,680 --> 00:36:24,359 Speaker 1: how you build a product like this and don't 768 00:36:24,400 --> 00:36:31,080 Speaker 1: let it become destructive. Yeah. So my philosophy is that 769 00:36:31,960 --> 00:36:36,279 Speaker 1: creators imbue their values in the things they create, whether 770 00:36:36,320 --> 00:36:38,279 Speaker 1: they know it or not, and that those things have 771 00:36:38,320 --> 00:36:40,319 Speaker 1: a way of spreading those values even when they're no 772 00:36:40,360 --> 00:36:44,480 Speaker 1: longer around. That does actually put a lot of blame 773 00:36:44,520 --> 00:36:46,960 Speaker 1: on people like Zuckerberg. It implies that he made Facebook 774 00:36:46,960 --> 00:36:49,680 Speaker 1: with the wrong values. I don't know Mark. But an 775 00:36:49,719 --> 00:36:52,319 Speaker 1: interesting example that I like to think about is, um, 776 00:36:52,360 --> 00:36:55,319 Speaker 1: the default of something like Facebook versus MySpace. Like, 777 00:36:55,360 --> 00:36:57,319 Speaker 1: obviously he was aware of MySpace, we know that, right, 778 00:36:57,360 --> 00:37:00,359 Speaker 1: definitely the main competitor. And I remember going 779 00:37:00,360 --> 00:37:02,200 Speaker 1: on to MySpace for the first time, and my 780 00:37:02,239 --> 00:37:04,680 Speaker 1: page is blank, and it said I had one friend. I 781 00:37:04,760 --> 00:37:06,560 Speaker 1: was like, who's my friend? Oh my god, it's Tom! 782 00:37:07,520 --> 00:37:09,759 Speaker 1: Who's Tom? You know, it's this nice guy. He's the 783 00:37:09,760 --> 00:37:12,120 Speaker 1: maker of MySpace. This is cool. Like, Tom's 784 00:37:12,160 --> 00:37:13,920 Speaker 1: my friend, he must care about me. I bet I 785 00:37:13,920 --> 00:37:15,920 Speaker 1: could make other friends that I don't know. Like, MySpace 786 00:37:15,960 --> 00:37:18,440 Speaker 1: is a place where I can make friends, and 787 00:37:18,560 --> 00:37:21,480 Speaker 1: Tom cares. And when you sign on to Facebook, you 788 00:37:21,520 --> 00:37:26,160 Speaker 1: have no friends. And Mark is certainly not your friend.
But 789 00:37:26,320 --> 00:37:28,440 Speaker 1: he's not your first friend on Facebook. That's definite. And, 790 00:37:28,480 --> 00:37:30,520 Speaker 1: like, what the fuck does that mean? What the fuck 791 00:37:30,560 --> 00:37:32,839 Speaker 1: does that mean? Not only is he not your first friend, 792 00:37:32,880 --> 00:37:35,640 Speaker 1: but you have no friends when you join, right? When 793 00:37:35,680 --> 00:37:40,200 Speaker 1: you join Facebook, you are this friendless non-person, 794 00:37:40,280 --> 00:37:41,759 Speaker 1: and then you have to try to reach out to 795 00:37:41,760 --> 00:37:44,520 Speaker 1: anybody who you already know, like, please, somebody who already 796 00:37:44,520 --> 00:37:48,680 Speaker 1: knows me, be my friend on Facebook. And, like, 797 00:37:48,719 --> 00:37:53,279 Speaker 1: there's, like, these really deep details that are made by 798 00:37:53,400 --> 00:37:55,600 Speaker 1: real people who have values. Like, he had to think 799 00:37:55,600 --> 00:37:58,040 Speaker 1: about this. Obviously he thought about it. Like, he's not dumb, 800 00:37:58,280 --> 00:38:00,560 Speaker 1: like, he must have thought about it. I mean, maybe 801 00:38:00,560 --> 00:38:02,399 Speaker 1: he wanted to be your first friend, but they were 802 00:38:02,440 --> 00:38:05,319 Speaker 1: like, actually, like, MySpace Tom could sue us for, 803 00:38:05,520 --> 00:38:08,719 Speaker 1: like, IP stuff, like infringement, if, um, we do 804 00:38:08,760 --> 00:38:10,600 Speaker 1: the same thing that he did. You know, I think 805 00:38:10,640 --> 00:38:12,520 Speaker 1: we know he wasn't that cautious about being sued, because 806 00:38:12,520 --> 00:38:27,960 Speaker 1: it happened, right? That's true. I mean, there's a lot 807 00:38:28,000 --> 00:38:30,080 Speaker 1: of interesting things like that. I think that actually maybe 808 00:38:30,120 --> 00:38:33,000 Speaker 1: everything is that way. The goal is, like, not to 809 00:38:33,120 --> 00:38:35,920 Speaker 1: not make things, but to make things with, like, really 810 00:38:35,920 --> 00:38:39,560 Speaker 1: good values, and to have people with good values making things, 811 00:38:40,000 --> 00:38:43,359 Speaker 1: and, like, that making things is not equivalent between any people. 812 00:38:43,880 --> 00:38:45,880 Speaker 1: I agree with you, but, like, what is the expression, 813 00:38:45,960 --> 00:38:48,200 Speaker 1: like, the road to hell is paved with good intentions? Whatever. 814 00:38:48,239 --> 00:38:51,759 Speaker 1: I mean, I agree that you can avoid some 815 00:38:51,880 --> 00:38:54,200 Speaker 1: of these mistakes, like, if you have a different set 816 00:38:54,239 --> 00:38:57,160 Speaker 1: of, like, goals or values. But, like, you already do 817 00:38:57,280 --> 00:39:00,040 Speaker 1: things with Mid Journey where you're trying to sort of 818 00:39:00,160 --> 00:39:04,120 Speaker 1: protect against, like, misuse, right? Like, obviously, like, hate speech 819 00:39:04,239 --> 00:39:06,960 Speaker 1: or images of violence. I mean, I definitely, like, tried 820 00:39:07,000 --> 00:39:08,520 Speaker 1: some stuff that I didn't think was, like, going to 821 00:39:08,520 --> 00:39:10,439 Speaker 1: produce a violent result, and it was like, we don't 822 00:39:10,480 --> 00:39:12,760 Speaker 1: do, like, this kind of image, or whatever. I actually 823 00:39:12,760 --> 00:39:15,160 Speaker 1: have a question about porn, which is a big one.
824 00:39:15,480 --> 00:39:18,040 Speaker 1: I mean, my guess is if you wanted Mid Journey 825 00:39:18,040 --> 00:39:21,600 Speaker 1: to create, like, incredible original, like, porn scenes, it could, because there's 826 00:39:21,600 --> 00:39:23,960 Speaker 1: a lot of pornography on the internet, right, would you say? 827 00:39:23,960 --> 00:39:25,480 Speaker 1: There's quite a bit of it, and it's all a 828 00:39:25,560 --> 00:39:29,200 Speaker 1: visual medium, basically. I mean, there's obviously some erotica out there. 829 00:39:29,520 --> 00:39:32,160 Speaker 1: Somewhere you've got the X-rated Mid Journey instance 830 00:39:32,239 --> 00:39:35,000 Speaker 1: running, right, where I can create, like, full-on porn scenes, right? 831 00:39:36,200 --> 00:39:38,440 Speaker 1: Don't lie to me. I know the truth. Somebody there is 832 00:39:38,480 --> 00:39:40,560 Speaker 1: doing it. Yeah. You know, when I first thought about 833 00:39:40,640 --> 00:39:44,720 Speaker 1: this problem, I was like, who wants an AI-generated booty? 834 00:39:44,920 --> 00:39:47,880 Speaker 1: Who doesn't? And then, like, honestly, as the algorithms get better 835 00:39:47,920 --> 00:39:50,040 Speaker 1: over time, like, I see some booties and I'm like, 836 00:39:50,480 --> 00:39:53,160 Speaker 1: it's a pretty nice booty, like, it's pretty good. Pretty good, yeah. 837 00:39:53,239 --> 00:39:55,319 Speaker 1: Like, it obviously can do really good, like, it just knows how 838 00:39:55,440 --> 00:39:57,959 Speaker 1: to make that beautiful like anything else. I mean, that's a huge 839 00:39:57,960 --> 00:39:59,960 Speaker 1: deal, though. Like, I can't even do, like, a Renaissance 840 00:40:00,000 --> 00:40:04,680 Speaker 1: painting of nudes, like, tasteful, artistic nudes, with 841 00:40:04,760 --> 00:40:08,239 Speaker 1: Mid Journey. Correct? No. Right. Like, is there a tier 842 00:40:08,239 --> 00:40:10,120 Speaker 1: where I can do nudes? This is really just, I'm 843 00:40:10,160 --> 00:40:12,759 Speaker 1: asking for myself. But, like, you know, no, you're not 844 00:40:12,800 --> 00:40:16,160 Speaker 1: gonna let anybody ever do a nude, you know. I 845 00:40:16,200 --> 00:40:18,359 Speaker 1: think it's about, like, what is a thing that, like, 846 00:40:18,560 --> 00:40:21,719 Speaker 1: helps the world? Like, for example, there 847 00:40:21,719 --> 00:40:23,120 Speaker 1: are two things we have tried. I can give you 848 00:40:23,160 --> 00:40:26,120 Speaker 1: two stories. Well, one is, when the system wasn't filtering 849 00:40:26,120 --> 00:40:28,279 Speaker 1: well enough, you'd have people trying to basically create, like, 850 00:40:28,280 --> 00:40:32,239 Speaker 1: their fantasy person, basically, and they become super fixated 851 00:40:32,280 --> 00:40:35,440 Speaker 1: on, like, this redhead, whatever, like, it becomes this very 852 00:40:35,440 --> 00:40:38,680 Speaker 1: specific thing over time. I don't know if that feels healthy. 853 00:40:38,960 --> 00:40:42,759 Speaker 1: It's certainly a market, right? I mean, by the way, 854 00:40:42,840 --> 00:40:44,879 Speaker 1: that phrase, I don't know if it's healthy, but it's 855 00:40:44,880 --> 00:40:48,520 Speaker 1: certainly a market, is, like, the things that are available online. 856 00:40:48,840 --> 00:40:51,359 Speaker 1: Like, literally, social media is, like, I don't know if 857 00:40:51,360 --> 00:40:53,880 Speaker 1: it's healthy, but there's certainly, you know, someone's going to 858 00:40:53,920 --> 00:40:55,840 Speaker 1: do it, and I think it's not going to be healthy.
859 00:40:56,239 --> 00:40:58,880 Speaker 1: Um, right now. There are other things that we tried. 860 00:40:59,239 --> 00:41:01,920 Speaker 1: So, for example, we did this thing where we created 861 00:41:01,920 --> 00:41:04,880 Speaker 1: this chat room. We called it Not Safe, don't judge, 862 00:41:05,280 --> 00:41:06,960 Speaker 1: and we threw like a hundred people into it and 863 00:41:06,960 --> 00:41:09,000 Speaker 1: we turned off all the filters. Oh my god. Just 864 00:41:09,040 --> 00:41:11,440 Speaker 1: to see what would happen. And it was really interesting. 865 00:41:11,440 --> 00:41:13,440 Speaker 1: We put them all in there. We go, there's no filters, everybody, 866 00:41:13,440 --> 00:41:14,600 Speaker 1: you can do whatever you want, but everyone else is 867 00:41:14,640 --> 00:41:16,120 Speaker 1: gonna see what you see. There's gotta be some people 868 00:41:16,120 --> 00:41:18,839 Speaker 1: who would be shameless in that scenario. It was very 869 00:41:18,960 --> 00:41:22,680 Speaker 1: quiet at first, and then someone goes, boob, and then 870 00:41:22,840 --> 00:41:25,840 Speaker 1: there's some boob pictures, and someone goes, like, ass, and 871 00:41:25,840 --> 00:41:27,480 Speaker 1: there was, like, an ass picture, and everyone's, like, kind 872 00:41:27,480 --> 00:41:29,000 Speaker 1: of startled at first, like they didn't know what to do. 873 00:41:29,440 --> 00:41:33,080 Speaker 1: And then somebody goes, uh, fifty-person orgy in a Walmart, 874 00:41:33,560 --> 00:41:35,360 Speaker 1: and it's just, like, these piles of naked bodies in 875 00:41:35,400 --> 00:41:38,359 Speaker 1: a Walmart, and then all of a sudden everyone else 876 00:41:38,400 --> 00:41:42,279 Speaker 1: goes, uh, orgy in space, alien orgies, and then 877 00:41:42,280 --> 00:41:43,960 Speaker 1: all of a sudden everyone starts losing their minds and 878 00:41:44,000 --> 00:41:46,719 Speaker 1: it gets really strange. Eventually it went to, like, Bill 879 00:41:46,800 --> 00:41:49,799 Speaker 1: Cosby eating out Hitler. Like, it got pretty intense. Oh 880 00:41:49,840 --> 00:41:52,880 Speaker 1: my god. I mean, that's a very, 881 00:41:53,000 --> 00:41:55,480 Speaker 1: that's a full cancel on that image, I would say. Yeah, 882 00:41:56,000 --> 00:41:58,960 Speaker 1: everything about it was. But what was happening was, like, it 883 00:41:59,080 --> 00:42:03,560 Speaker 1: became so absurd that everyone just started to kind of, 884 00:42:04,040 --> 00:42:07,960 Speaker 1: like, let go of all of the bullshit that they 885 00:42:08,040 --> 00:42:10,440 Speaker 1: knew, that, like, they would normally be outraged by. 886 00:42:10,560 --> 00:42:13,120 Speaker 1: And when somebody finally did Bill Cosby eating out Hitler, 887 00:42:13,600 --> 00:42:16,520 Speaker 1: like, that was, like, an hour in. Okay, yeah, and 888 00:42:16,640 --> 00:42:18,120 Speaker 1: is that when you shut it down? Was that when 889 00:42:18,280 --> 00:42:20,919 Speaker 1: you closed it? We shut it down shortly after, yeah. 890 00:42:20,960 --> 00:42:25,360 Speaker 1: But so, because, imagine... but that's, like, such a small sample, 891 00:42:25,480 --> 00:42:27,840 Speaker 1: and, like, it went immediately to a place that 892 00:42:27,880 --> 00:42:33,040 Speaker 1: would offend, like, probably the, like, normal users of the internet.
Well, 893 00:42:33,080 --> 00:42:35,400 Speaker 1: I think what's interesting is the psychological experience 894 00:42:35,440 --> 00:42:37,399 Speaker 1: all the people had in this room, because it went from, 895 00:42:37,440 --> 00:42:40,200 Speaker 1: like, whoops, to, like, you know, it kind of escalated 896 00:42:40,200 --> 00:42:43,360 Speaker 1: to, like, the Walmart thing. Isn't that what 897 00:42:43,400 --> 00:42:45,719 Speaker 1: always happens, though? Like, you're testing the limits. But, like, 898 00:42:45,760 --> 00:42:48,160 Speaker 1: you know, what happened was, it's like, at some 899 00:42:48,280 --> 00:42:52,280 Speaker 1: point they kind of, like, let go during this process, 900 00:42:52,320 --> 00:42:53,960 Speaker 1: and they were like, it doesn't matter anymore. Yeah, Bill 901 00:42:53,960 --> 00:42:56,440 Speaker 1: Cosby Hitler, that's really funny. Or someone else did, like, 902 00:42:56,480 --> 00:42:58,759 Speaker 1: Michael Jackson's asshole, and it's, like, a butthole 903 00:42:58,800 --> 00:43:01,040 Speaker 1: where the hole was Michael Jackson's face. It 904 00:43:01,120 --> 00:43:04,120 Speaker 1: was funny, it was weird. You know, those people thought 905 00:43:04,120 --> 00:43:06,480 Speaker 1: it was funny, but, like, a very large audience would 906 00:43:06,480 --> 00:43:08,440 Speaker 1: not think that. So this is the thing. So, like, 907 00:43:08,520 --> 00:43:10,320 Speaker 1: I mean, it's not funny, like, at a 908 00:43:10,400 --> 00:43:12,680 Speaker 1: kind of basic level. Like, you know, the Cosby stuff 909 00:43:12,719 --> 00:43:15,279 Speaker 1: is really fucked up, and Hitler's Hitler. So, like, at 910 00:43:15,320 --> 00:43:17,680 Speaker 1: a really kind of basic level, like, if you're, like, 911 00:43:17,719 --> 00:43:20,239 Speaker 1: in good taste, that's very, very not in good taste. And 912 00:43:20,280 --> 00:43:22,080 Speaker 1: there was no taste anymore. It was like everyone just, 913 00:43:22,120 --> 00:43:24,759 Speaker 1: like, lost it. They're like, look, nothing matters, like, this 914 00:43:24,840 --> 00:43:27,160 Speaker 1: is all bullshit, like, it doesn't really... Like, everyone kind 915 00:43:27,160 --> 00:43:29,560 Speaker 1: of let go. It felt very cathartic, because at first 916 00:43:29,560 --> 00:43:31,640 Speaker 1: they were really shy, and by the end they had 917 00:43:31,680 --> 00:43:34,680 Speaker 1: all let go. It was kind of a beautiful process. 918 00:43:35,360 --> 00:43:36,719 Speaker 1: I don't know, though, but, like, it went to a 919 00:43:36,719 --> 00:43:39,080 Speaker 1: place that was pretty offensive, right? I mean, I'm glad 920 00:43:39,120 --> 00:43:42,600 Speaker 1: that you don't allow that particular type of use. I 921 00:43:42,640 --> 00:43:44,680 Speaker 1: think it was really interesting, and I would say everybody 922 00:43:44,680 --> 00:43:46,239 Speaker 1: who was involved in the experience felt it was, like, 923 00:43:46,280 --> 00:43:50,120 Speaker 1: cathartic and a positive, like, spiritual experience, because they realized 924 00:43:50,160 --> 00:43:53,120 Speaker 1: how pent up they were in stupid ways. Like, maybe 925 00:43:53,120 --> 00:43:55,000 Speaker 1: the last thing was bad, like, we could say that 926 00:43:55,040 --> 00:43:56,879 Speaker 1: was bad, but there was something... No, the last thing 927 00:43:56,960 --> 00:43:58,520 Speaker 1: was bad. The last thing was bad.
I don't, I 928 00:43:58,520 --> 00:44:00,680 Speaker 1: don't want to be, like, uh, you know, like, uh, 929 00:44:01,280 --> 00:44:04,400 Speaker 1: policing culture or whatever. But, I mean, you know, 930 00:44:04,520 --> 00:44:06,600 Speaker 1: the reality is, like, actually, like, I think that 931 00:44:06,680 --> 00:44:10,000 Speaker 1: raises an interesting sort of scenario, and it's, like, what 932 00:44:10,040 --> 00:44:12,760 Speaker 1: do people do when given this kind of unbridled power 933 00:44:12,800 --> 00:44:14,799 Speaker 1: to create whatever is in their mind? Like, I like 934 00:44:14,840 --> 00:44:16,440 Speaker 1: to think people will come up with really cool 935 00:44:16,480 --> 00:44:19,520 Speaker 1: stuff that's, like, awesome, but definitely, for sure, there's a 936 00:44:19,560 --> 00:44:21,440 Speaker 1: segment of the audience... And this actually gets back 937 00:44:21,480 --> 00:44:23,239 Speaker 1: to what I was asking, which is, like, so 938 00:44:23,320 --> 00:44:25,239 Speaker 1: you ran an experiment with a room full of people. 939 00:44:25,239 --> 00:44:27,360 Speaker 1: They were just, like, users, like test, like beta users 940 00:44:27,440 --> 00:44:28,799 Speaker 1: or something? It was a bunch 941 00:44:28,800 --> 00:44:30,919 Speaker 1: of users. We did it for one hour, and I said 942 00:44:30,920 --> 00:44:33,440 Speaker 1: if anybody leaked an image, I would ban them for life. Right. 943 00:44:33,520 --> 00:44:35,919 Speaker 1: So that's your little kind of window into it. You're like, okay, 944 00:44:35,960 --> 00:44:38,840 Speaker 1: this could get pretty crazy. Obviously, the way you've built 945 00:44:38,840 --> 00:44:41,439 Speaker 1: the system is that you cannot do those things. I guess 946 00:44:41,480 --> 00:44:43,759 Speaker 1: the question is, like, do you have to be 947 00:44:43,880 --> 00:44:47,080 Speaker 1: constantly vigilant about, like, the ways that the thing might 948 00:44:47,080 --> 00:44:50,120 Speaker 1: be abused? Like, how do you counter, like, abuse you 949 00:44:50,160 --> 00:44:53,000 Speaker 1: haven't even thought of yet? We have, like, forty moderators 950 00:44:53,000 --> 00:44:55,279 Speaker 1: who kind of watch things, and then they just have, 951 00:44:55,440 --> 00:44:57,760 Speaker 1: they have a little slash ban command. They say, slash 952 00:44:57,760 --> 00:45:00,920 Speaker 1: ban titties, and also now nobody can use that word anymore, right? 953 00:45:01,280 --> 00:45:03,719 Speaker 1: Are you actively, like, yesterday, was there something that Mid 954 00:45:03,760 --> 00:45:07,239 Speaker 1: Journey produced that was, like, a surprise to the moderators? 955 00:45:07,440 --> 00:45:09,480 Speaker 1: I know that there are words that were banned today. 956 00:45:09,800 --> 00:45:12,360 Speaker 1: Like, what, what was banned? I'm super curious. But 957 00:45:12,400 --> 00:45:14,399 Speaker 1: today, you're way far into it. There's, like, how many 958 00:45:14,400 --> 00:45:16,800 Speaker 1: people have used Mid Journey? Do you know the numbers? Millions? 959 00:45:16,840 --> 00:45:20,040 Speaker 1: Millions of people. Yeah. So millions of people have been 960 00:45:20,040 --> 00:45:23,600 Speaker 1: in there, but you're still, today, as of October fifth 961 00:45:23,760 --> 00:45:26,799 Speaker 1: or whatever, you've banned words. I'd love to know what 962 00:45:26,840 --> 00:45:30,000 Speaker 1: the last banned word was.
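Mechanically, the moderation flow described here, a running banned-word list that a moderator's slash command grows and that every incoming prompt is checked against, is simple to picture. Mid Journey's real tooling isn't public, so the sketch below is only a guess at the shape of it, and every name in it is invented for illustration:

    # Hypothetical sketch of a moderator-maintained banned-word prompt filter.
    BANNED_WORDS = {"titties"}  # seeded with the example from the conversation

    def slash_ban(word: str) -> None:
        """Roughly what a moderator's /ban command might do: grow the list."""
        BANNED_WORDS.add(word.lower())

    def prompt_allowed(prompt: str) -> bool:
        """Reject any prompt that contains a banned word."""
        tokens = (t.strip(".,!?") for t in prompt.lower().split())
        return all(t not in BANNED_WORDS for t in tokens)

    slash_ban("gut")
    print(prompt_allowed("cyberpunk cats surfing"))       # True
    print(prompt_allowed("huge pools of blood and gut"))  # False

A word-list filter like this is blunt, which is exactly why, as he describes next, the list gets revisited: some banned words turn out to have plenty of harmless uses.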
The moderators came back recently 963 00:45:30,040 --> 00:45:32,240 Speaker 1: and they're like, David, we want to unban the following 964 00:45:32,239 --> 00:45:37,400 Speaker 1: words: blood, bloody, sexy, kill, killing, cutting, disturbing, and gut. Wow, 965 00:45:38,200 --> 00:45:40,200 Speaker 1: what an image. They're like, what do you think, David, 966 00:45:40,719 --> 00:45:42,879 Speaker 1: we could probably unban those things. And I was like, okay, 967 00:45:42,960 --> 00:45:45,560 Speaker 1: let me think about this: uh, child with guts spilled 968 00:45:45,560 --> 00:45:48,440 Speaker 1: across the ground, disturbing, huge pools of blood. And, like, 969 00:45:48,520 --> 00:45:51,000 Speaker 1: oh yeah, we probably don't want that, right? Or I was 970 00:45:51,040 --> 00:45:53,640 Speaker 1: like, a little girl cuts herself. Like, oh yeah, that 971 00:45:53,680 --> 00:45:56,640 Speaker 1: seems bad, right? Well, okay, but so here's my question 972 00:45:56,680 --> 00:45:57,920 Speaker 1: for you. And I think you've got kind of a 973 00:45:57,960 --> 00:45:59,920 Speaker 1: crazy responsibility. And I'm not saying this to be a 974 00:46:00,080 --> 00:46:03,320 Speaker 1: jerk at all, but, like, you're just, like, a guy 975 00:46:03,560 --> 00:46:06,279 Speaker 1: who's interested in creating this product, and creating these 976 00:46:06,360 --> 00:46:09,720 Speaker 1: kind of beautiful and imaginative and exciting images and beyond. 977 00:46:10,200 --> 00:46:13,480 Speaker 1: But you're not, like, a linguist. I don't know 978 00:46:13,520 --> 00:46:15,200 Speaker 1: all of your background, but, I mean, like, you're not, 979 00:46:15,320 --> 00:46:18,520 Speaker 1: like, an ethicist. Do you employ an ethicist at the company? 980 00:46:18,560 --> 00:46:22,359 Speaker 1: Do you employ, like, linguistic experts? How diverse is the team? 981 00:46:22,440 --> 00:46:24,279 Speaker 1: I think these are, like, things that people are gonna 982 00:46:24,280 --> 00:46:25,799 Speaker 1: want to know. Which is, like, you mentioned the Bill 983 00:46:25,840 --> 00:46:27,680 Speaker 1: Cosby Hitler thing, and I can think of, like, a 984 00:46:27,719 --> 00:46:30,879 Speaker 1: bunch of people who are not, like, a white Jewish guy, 985 00:46:30,880 --> 00:46:32,200 Speaker 1: and I say this as a white Jewish guy, who 986 00:46:32,239 --> 00:46:35,120 Speaker 1: would be much more offended about some of that stuff, 987 00:46:35,120 --> 00:46:39,880 Speaker 1: people with different experiences. That's, as an example, a pretty outrageous thing. 988 00:46:39,920 --> 00:46:41,480 Speaker 1: And I get that. And no, no, I understand it. 989 00:46:41,520 --> 00:46:43,239 Speaker 1: Like, you were in this experiment, somebody took you to 990 00:46:43,280 --> 00:46:44,920 Speaker 1: this crazy place, and then you're like, all right, we 991 00:46:44,920 --> 00:46:46,680 Speaker 1: gotta shut it down. This is sort of what I 992 00:46:46,719 --> 00:46:48,279 Speaker 1: was trying to get to, is, like, how do you 993 00:46:48,960 --> 00:46:51,800 Speaker 1: make a company that has all the lofty and interesting 994 00:46:51,800 --> 00:46:55,440 Speaker 1: and exciting ideals I think you have, but also protect 995 00:46:55,480 --> 00:46:58,840 Speaker 1: against building a product that ultimately ends up repeating the 996 00:46:58,880 --> 00:47:01,000 Speaker 1: mistakes of the Facebooks and the Twitters of the world?
997 00:47:01,480 --> 00:47:03,680 Speaker 1: And the question does come down to, like, when you're 998 00:47:03,719 --> 00:47:07,440 Speaker 1: having those conversations, who's in the room? Who's having that 999 00:47:07,480 --> 00:47:10,239 Speaker 1: conversation with you? Like, what are you gonna do? This 1000 00:47:10,280 --> 00:47:12,560 Speaker 1: is me putting my hardcore journalist hat on: 1001 00:47:12,840 --> 00:47:14,080 Speaker 1: like, what are you gonna do to make sure that 1002 00:47:14,080 --> 00:47:16,040 Speaker 1: you have conversations with a big enough set of people, 1003 00:47:16,320 --> 00:47:18,120 Speaker 1: and with a smart enough set of people, who are 1004 00:47:18,160 --> 00:47:20,680 Speaker 1: experts in these fields, like in the fields of, like, 1005 00:47:20,719 --> 00:47:24,120 Speaker 1: ethics and linguistics and, like, you know, history, and that 1006 00:47:24,160 --> 00:47:26,360 Speaker 1: it's a diverse group, like, to actually make a product 1007 00:47:26,360 --> 00:47:28,520 Speaker 1: that serves everybody, and not just one that feels, like, 1008 00:47:28,560 --> 00:47:30,839 Speaker 1: cool to, like, a couple of, you know, Jewish guys 1009 00:47:30,840 --> 00:47:33,200 Speaker 1: like us, but may not work for a million other 1010 00:47:33,239 --> 00:47:36,359 Speaker 1: people in the world? Yeah, I mean, there's a lot 1011 00:47:36,360 --> 00:47:39,800 Speaker 1: of questions there. First off, I'm okay not serving everybody. 1012 00:47:39,800 --> 00:47:41,439 Speaker 1: Like, if this is a two million person 1013 00:47:41,440 --> 00:47:42,840 Speaker 1: thing and it's never bigger than that, I'm happy with that. 1014 00:47:43,200 --> 00:47:45,399 Speaker 1: But you want to make it inclusive, I would assume? Yeah, 1015 00:47:45,440 --> 00:47:47,120 Speaker 1: I want it to be inclusive. But also, if it's only two 1016 00:47:47,120 --> 00:47:49,239 Speaker 1: million people, I'll be okay with that. Like, I don't, like, 1017 00:47:49,280 --> 00:47:51,200 Speaker 1: I'm not, I don't have this, like... You don't want 1018 00:47:51,200 --> 00:47:52,759 Speaker 1: two million of the same people, though. You don't want 1019 00:47:52,760 --> 00:47:55,720 Speaker 1: two million of the same people, two million white Jewish guys. 1020 00:47:55,960 --> 00:47:58,360 Speaker 1: If it makes two million white Jewish guys really happy 1021 00:47:58,400 --> 00:48:01,279 Speaker 1: and improves their lives in a significant way, like, we've 1022 00:48:01,320 --> 00:48:03,440 Speaker 1: made the world better. Now, obviously, I'd like to make 1023 00:48:03,480 --> 00:48:06,400 Speaker 1: it diverse, like, and we try really hard there. But, 1024 00:48:06,520 --> 00:48:08,239 Speaker 1: like, I mean, at the end of the day, it's 1025 00:48:08,239 --> 00:48:10,440 Speaker 1: more important that it's good for the people who 1026 00:48:10,480 --> 00:48:12,239 Speaker 1: interact with it than that it has as many people 1027 00:48:12,280 --> 00:48:14,759 Speaker 1: as possible. And that's the first trade-off. That's, 1028 00:48:14,760 --> 00:48:17,160 Speaker 1: that's a huge trade-off, because most people decided 1029 00:48:17,200 --> 00:48:18,880 Speaker 1: to not make that trade-off.
No, I agree with 1030 00:48:18,880 --> 00:48:21,120 Speaker 1: you that, like, if you're thinking of, like, the infinite audience, 1031 00:48:21,239 --> 00:48:23,640 Speaker 1: obviously you don't want to be, like, every person should 1032 00:48:23,640 --> 00:48:25,920 Speaker 1: be in this thing or using this thing or whatever. 1033 00:48:26,000 --> 00:48:28,799 Speaker 1: But, like, I guess it's such a sensitive space, where, 1034 00:48:28,840 --> 00:48:31,319 Speaker 1: like, you've built a tool that can create something out 1035 00:48:31,360 --> 00:48:33,359 Speaker 1: of nothing. Like, you've built a tool that can make 1036 00:48:33,920 --> 00:48:37,719 Speaker 1: a dream look real, basically. And so, you know, 1037 00:48:37,800 --> 00:48:39,640 Speaker 1: how do you do it the right way? I feel 1038 00:48:39,640 --> 00:48:41,759 Speaker 1: like here's a chance to bring a bunch of people 1039 00:48:41,760 --> 00:48:44,160 Speaker 1: into the conversation that were never there at Google on 1040 00:48:44,280 --> 00:48:47,200 Speaker 1: day one. When I think about any new technology like this, 1041 00:48:47,360 --> 00:48:49,920 Speaker 1: I always think now, and perhaps because I've been so 1042 00:48:49,960 --> 00:48:52,799 Speaker 1: abused by the technology companies that have existed before us: 1043 00:48:53,760 --> 00:48:56,440 Speaker 1: you know, what could go wrong? Right? And how do 1044 00:48:56,480 --> 00:48:59,719 Speaker 1: you prevent that? Yeah, there are a lot of things 1045 00:48:59,719 --> 00:49:02,759 Speaker 1: we do. So, like, I do office hours every week, 1046 00:49:02,800 --> 00:49:04,799 Speaker 1: for four hours, where I just talk to as many 1047 00:49:04,800 --> 00:49:07,160 Speaker 1: people as I can. We'll do a themed thing, like, 1048 00:49:07,160 --> 00:49:09,239 Speaker 1: I brought up, like, twelve women once, and I said, like, 1049 00:49:09,320 --> 00:49:11,640 Speaker 1: let's have a women's panel, and I want to ask 1050 00:49:11,680 --> 00:49:15,239 Speaker 1: everybody how you feel about bikini photos. Should I ban 1051 00:49:15,320 --> 00:49:17,920 Speaker 1: bikinis? And that's one way of getting the women's side 1052 00:49:17,920 --> 00:49:20,560 Speaker 1: of things, because every single day I hear some 1053 00:49:20,640 --> 00:49:23,960 Speaker 1: asshole dude who's like, tits are natural, I like bikini photos, 1054 00:49:23,960 --> 00:49:25,640 Speaker 1: have as many as you can, and then, like, women 1055 00:49:25,640 --> 00:49:27,400 Speaker 1: who are uncomfortable. And I was like, you know what, 1056 00:49:27,440 --> 00:49:29,120 Speaker 1: I just want to hear a bunch of women talk 1057 00:49:29,200 --> 00:49:30,719 Speaker 1: about this issue of how do you feel about the 1058 00:49:30,719 --> 00:49:33,040 Speaker 1: bikini photos, and, like, I will do whatever you say. 1059 00:49:33,239 --> 00:49:34,840 Speaker 1: I was like, should I ban bikinis? That was, like, 1060 00:49:34,880 --> 00:49:37,480 Speaker 1: the simplest question. Did you ban bikinis? They decided as 1061 00:49:37,480 --> 00:49:40,760 Speaker 1: a group, like, we do not want you to ban bikinis, 1062 00:49:40,800 --> 00:49:43,799 Speaker 1: like, it was, like, pretty unanimous, but we want you 1063 00:49:43,840 --> 00:49:45,440 Speaker 1: to hide them so that none of us ever have 1064 00:49:45,560 --> 00:49:48,359 Speaker 1: to see some dude making a bikini photo. And so that's 1065 00:49:48,360 --> 00:49:51,200 Speaker 1: what we did. It's a good middle ground.
To me, 1066 00:49:51,320 --> 00:49:53,719 Speaker 1: this is so weird, because, like, the reality is, like, 1067 00:49:53,760 --> 00:49:56,120 Speaker 1: the naked human body is not, like, on its 1068 00:49:56,200 --> 00:49:58,120 Speaker 1: face, offensive to me in any way. Like, it's, like, 1069 00:49:58,280 --> 00:50:01,319 Speaker 1: very normal. I agree. And it's, like, funny to think 1070 00:50:01,360 --> 00:50:04,040 Speaker 1: that you've got a buffer against, like, people abusing the 1071 00:50:04,080 --> 00:50:06,800 Speaker 1: system who are making weird, like, you know, sexual bikini 1072 00:50:06,840 --> 00:50:09,520 Speaker 1: photos or whatever. Yeah. I mean, what the women basically 1073 00:50:09,560 --> 00:50:11,160 Speaker 1: said, on the whole, is that they're, like, 1074 00:50:11,239 --> 00:50:13,360 Speaker 1: they're basically, even we like a little cleavage, but, like, 1075 00:50:13,360 --> 00:50:15,439 Speaker 1: what an average guy thinks is sexy is really easy 1076 00:50:15,480 --> 00:50:17,799 Speaker 1: for most of us to find creepy and unwelcoming, and 1077 00:50:17,840 --> 00:50:19,719 Speaker 1: so basically we don't feel like we 1078 00:50:19,719 --> 00:50:22,080 Speaker 1: should have to see that, like, against our will. That's 1079 00:50:22,080 --> 00:50:25,480 Speaker 1: so true, both in AI and in life. There's a 1080 00:50:25,480 --> 00:50:28,360 Speaker 1: lot of these sort of nuanced things. Like, technically, it 1081 00:50:28,400 --> 00:50:30,160 Speaker 1: probably should be able to do a tasteful nude, but 1082 00:50:30,200 --> 00:50:32,200 Speaker 1: it shouldn't be able to do, like, a hyper-sexualized nude. 1083 00:50:32,239 --> 00:50:35,359 Speaker 1: Like, technically, like, that seems right, you know, but it's, 1084 00:50:35,360 --> 00:50:37,400 Speaker 1: it's hard. There's a really hard boundary. You know, 1085 00:50:37,520 --> 00:50:39,160 Speaker 1: taste is, I mean, it's a question of art, right? 1086 00:50:39,200 --> 00:50:40,960 Speaker 1: Like, what's porn? It's, like, well, you know it when you 1087 00:50:41,000 --> 00:50:43,759 Speaker 1: see it, and it's, like, but there's different levels of that, right? 1088 00:50:44,560 --> 00:50:47,360 Speaker 1: We've been trying to teach the system, actually, lately, some 1089 00:50:47,440 --> 00:50:49,520 Speaker 1: of these nuances. We have certain users who go in 1090 00:50:49,520 --> 00:50:52,160 Speaker 1: and they rate images randomly. We find that, on the whole, 1091 00:50:52,160 --> 00:50:55,359 Speaker 1: people very rarely say anything is offensive, like, very rare. 1092 00:50:55,600 --> 00:50:57,520 Speaker 1: So when they say it, it's interesting, and then we, 1093 00:50:57,640 --> 00:50:59,640 Speaker 1: we aggregate all those together, and then we teach 1094 00:50:59,680 --> 00:51:02,200 Speaker 1: the AI. We're like, hey, regardless of whether or not 1095 00:51:02,280 --> 00:51:04,400 Speaker 1: something is offensive, this is how people are responding to 1096 00:51:04,480 --> 00:51:07,319 Speaker 1: your images. And then what? It actually changes 1097 00:51:07,360 --> 00:51:10,560 Speaker 1: its behavior. Do you worry you're creating a kind 1098 00:51:10,560 --> 00:51:12,839 Speaker 1: of prudish AI? Like, do you worry that, like, you're 1099 00:51:12,840 --> 00:51:15,719 Speaker 1: actually making a sexually repressed AI that, like, is going 1100 00:51:15,800 --> 00:51:18,759 Speaker 1: to be weird about sex and human bodies?
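The feedback loop he sketches, sparse user ratings aggregated per image and fed back to the model as a training signal, is a standard preference-learning pattern. Mid Journey's actual pipeline isn't public, so this is only a guessed-at sketch of the aggregation step, with every name invented for illustration:

    # Hypothetical sketch: turn sparse user ratings into a per-image signal.
    from collections import defaultdict
    from statistics import mean

    ratings = defaultdict(list)  # image_id -> list of scores from users

    def record_rating(image_id: str, score: float) -> None:
        """A user rates an image; say 1.0 = liked it, 0.0 = flagged as offensive."""
        ratings[image_id].append(score)

    def feedback_signal(min_votes: int = 3) -> dict:
        """Average score per image, kept only where enough users voted.
        "Offensive" flags are rare, so each one moves the average a lot."""
        return {img: mean(s) for img, s in ratings.items() if len(s) >= min_votes}

    record_rating("img_001", 1.0)
    record_rating("img_001", 1.0)
    record_rating("img_001", 0.0)  # the rare "offensive" flag
    print(feedback_signal())       # {'img_001': 0.6666666666666666}

A trainer could then reweight examples by these averages, which matches the description of the model actually changing its behavior in response to how people rate its images.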
I think 1101 00:51:18,760 --> 00:51:21,880 Speaker 1: the question is, like, when we build these technologies, like, 1102 00:51:22,000 --> 00:51:23,520 Speaker 1: what battles do you want to fight? And where do 1103 00:51:23,560 --> 00:51:25,960 Speaker 1: we want to push the world forward? And, like, me, 1104 00:51:26,239 --> 00:51:28,880 Speaker 1: I want the world to be more imaginative, like, and 1105 00:51:28,920 --> 00:51:32,400 Speaker 1: I want to push the boundaries of, like, aesthetics and creation. 1106 00:51:33,040 --> 00:51:35,560 Speaker 1: I think that's really interesting and it's really worthwhile. But 1107 00:51:35,600 --> 00:51:37,200 Speaker 1: I can be a little picky. I'm not as interested 1108 00:51:37,200 --> 00:51:40,239 Speaker 1: in doing that for violence or sexuality. Like, there is 1109 00:51:40,239 --> 00:51:42,239 Speaker 1: an argument that's, like, push the boundaries of sexuality, let's 1110 00:51:42,239 --> 00:51:44,799 Speaker 1: make the world way more sexual. Someone else can do that. 1111 00:51:45,400 --> 00:51:48,000 Speaker 1: I just don't feel spiritually compelled for that. And so, 1112 00:51:48,040 --> 00:51:49,759 Speaker 1: but I think that, like, there's this broader thing, which 1113 00:51:49,800 --> 00:51:52,399 Speaker 1: is, like, letting people reflect. The average person comes in 1114 00:51:52,440 --> 00:51:54,720 Speaker 1: here and they say something like, Maltese dog in heaven, 1115 00:51:55,400 --> 00:51:57,160 Speaker 1: and I reach out, like, hey, why did you do that? 1116 00:51:57,160 --> 00:51:59,160 Speaker 1: That's interesting. And they go, because my dog just died. 1117 00:51:59,600 --> 00:52:01,680 Speaker 1: And I'm like, oh shit, are you okay? And they're like, yeah, 1118 00:52:01,680 --> 00:52:03,480 Speaker 1: this is making me feel better. Or there was, like, 1119 00:52:03,520 --> 00:52:05,480 Speaker 1: another woman, and she was putting in 1120 00:52:05,520 --> 00:52:07,040 Speaker 1: these weird lyrics, and I'm like, what are you doing? 1121 00:52:07,080 --> 00:52:09,080 Speaker 1: Like, these don't show up on Google. And she goes, 1122 00:52:09,239 --> 00:52:11,080 Speaker 1: when I was very young, my older brother died, and 1123 00:52:11,120 --> 00:52:13,000 Speaker 1: all he left me was this, like, this cassette tape 1124 00:52:13,000 --> 00:52:15,320 Speaker 1: of these songs, and I'm literally putting lyrics in, and 1125 00:52:15,320 --> 00:52:17,000 Speaker 1: I'm feeling close to this person who never got to be part of 1126 00:52:17,080 --> 00:52:19,160 Speaker 1: my life. It's not always sad. Yeah, there was one person 1127 00:52:19,200 --> 00:52:20,880 Speaker 1: who was, like, temple of donuts. And, like, why are 1128 00:52:20,880 --> 00:52:23,400 Speaker 1: you doing temple of donuts? And, like, well, I'm an atheist, 1129 00:52:23,560 --> 00:52:25,960 Speaker 1: but I don't really understand worship or religion, but I 1130 00:52:25,960 --> 00:52:28,560 Speaker 1: do understand, like, donuts and sweets. It's, like, combining all 1131 00:52:28,560 --> 00:52:30,520 Speaker 1: the things I don't understand with the things I do understand, 1132 00:52:30,600 --> 00:52:32,760 Speaker 1: and I'm, like, trying to understand, like, what is worship.
1133 00:52:33,000 --> 00:52:35,839 Speaker 1: The Hong Kong girl, she said, I'm a woman, 1134 00:52:35,880 --> 00:52:37,520 Speaker 1: I'm in Hong Kong, and the one thing your parents 1135 00:52:37,560 --> 00:52:38,600 Speaker 1: in Hong Kong never want you to be is 1136 00:52:38,600 --> 00:52:40,480 Speaker 1: an artist. And so I'm a banker, and I'm a 1137 00:52:40,520 --> 00:52:43,080 Speaker 1: good banker. But now, as I'm starting to get to 1138 00:52:43,160 --> 00:52:45,359 Speaker 1: use Mid Journey, I'm starting to get to feel like 1139 00:52:45,480 --> 00:52:47,200 Speaker 1: I'm getting to be the person I never got to be, 1140 00:52:47,360 --> 00:52:49,680 Speaker 1: and I'm having to think about that. And so, like, 1141 00:52:49,840 --> 00:52:51,719 Speaker 1: these are, like, the good stories. Just, like, those are 1142 00:52:51,719 --> 00:52:54,520 Speaker 1: great stories. Somebody else is just, like, huge tits covered 1143 00:52:54,560 --> 00:52:57,440 Speaker 1: in blood, and it's, like, I don't care about that person. 1144 00:52:57,760 --> 00:53:00,600 Speaker 1: That's not a real human story, and, like, maybe there's 1145 00:53:00,640 --> 00:53:02,839 Speaker 1: something going on there, but it's not interesting. Like, there's 1146 00:53:02,880 --> 00:53:04,640 Speaker 1: so many interesting things going on, and I want to 1147 00:53:04,640 --> 00:53:07,640 Speaker 1: create a space for that, and I'm doing that. 1148 00:53:08,120 --> 00:53:09,719 Speaker 1: There's a path that we see over and over again 1149 00:53:09,760 --> 00:53:11,560 Speaker 1: with people in Mid Journey, I almost call it, like, the hero's 1150 00:53:11,600 --> 00:53:13,600 Speaker 1: journey in Mid Journey. And what happens is, they come in 1151 00:53:13,760 --> 00:53:15,600 Speaker 1: and they realize they can make pictures of something they like. 1152 00:53:16,080 --> 00:53:18,600 Speaker 1: For me, it was cats and cyberpunk. Right? I'm like, okay, 1153 00:53:18,680 --> 00:53:20,720 Speaker 1: cyberpunk cats. I'm like, okay, I'll make cyberpunk 1154 00:53:21,160 --> 00:53:24,040 Speaker 1: priests, cyberpunk ninjas, and I'm making cyberpunk 1155 00:53:24,040 --> 00:53:25,239 Speaker 1: everything. And then all of a sudden, like, you, 1156 00:53:25,320 --> 00:53:26,960 Speaker 1: you combine all the things you like, and then you 1157 00:53:27,040 --> 00:53:28,759 Speaker 1: just burn out, and you're like, oh my god, I 1158 00:53:28,840 --> 00:53:30,239 Speaker 1: never liked cyberpunk. I never want to see cyberpunk 1159 00:53:30,320 --> 00:53:33,000 Speaker 1: again. Cyberpunk isn't me. And then, and then 1160 00:53:33,120 --> 00:53:35,359 Speaker 1: it's, like, month one. Month one. And then month two 1161 00:53:35,440 --> 00:53:37,839 Speaker 1: is, you're like, but then who am I? And then 1162 00:53:37,840 --> 00:53:39,920 Speaker 1: you start looking at everybody else's pictures. You're like, art deco, 1163 00:53:40,160 --> 00:53:44,120 Speaker 1: am I art deco? Or, like, vaporwave? Am I vaporwave? 1164 00:53:44,239 --> 00:53:46,040 Speaker 1: And then you start, like, looking at everything, and you're, 1165 00:53:46,080 --> 00:53:47,359 Speaker 1: and you're kind of saying, like, you know, and you're 1166 00:53:47,400 --> 00:53:49,400 Speaker 1: trying to do this path of, like, who am I? 1167 00:53:49,440 --> 00:53:51,520 Speaker 1: What is my real aesthetic? And then you learn a lot.
1168 00:53:51,560 --> 00:53:53,600 Speaker 1: People learn like all this art history and all these movements, 1169 00:53:53,680 --> 00:53:55,879 Speaker 1: and they start putting things together into like the sense 1170 00:53:55,880 --> 00:53:58,200 Speaker 1: of who they are. And then like month three is 1171 00:53:58,239 --> 00:54:00,400 Speaker 1: like you have this like aesthetic universe. You're starting to 1172 00:54:00,560 --> 00:54:03,040 Speaker 1: apply it to everything, and it's like, you know, 1173 00:54:03,040 --> 00:54:04,239 Speaker 1: it's a little bit of this, a little bit of that. 1174 00:54:04,239 --> 00:54:06,080 Speaker 1: It's all these things together, and you're like creating all 1175 00:54:06,120 --> 00:54:09,200 Speaker 1: this stuff, and it's like, 1176 00:54:09,400 --> 00:54:11,640 Speaker 1: people are paying off aesthetic debt. They're like exploring 1177 00:54:11,640 --> 00:54:13,839 Speaker 1: the nature of their identities and then they're like expressing it. 1178 00:54:13,960 --> 00:54:16,640 Speaker 1: They're like working out all this shit, and like 1179 00:54:16,680 --> 00:54:20,080 Speaker 1: it's really really healthy, and it's literally just regular people, 1180 00:54:20,560 --> 00:54:23,840 Speaker 1: and almost nobody shares anything. It's crazy, like almost no 1181 00:54:23,920 --> 00:54:27,160 Speaker 1: pictures are ever shared and almost no pictures are ever sold, and 1182 00:54:27,680 --> 00:54:30,040 Speaker 1: it's just like, it's mostly just regular people having this 1183 00:54:30,120 --> 00:54:34,279 Speaker 1: like really healthy experience. So to be clear, basically you 1184 00:54:34,320 --> 00:54:36,560 Speaker 1: see this as a form of therapy. Is that correct? 1185 00:54:36,600 --> 00:54:40,480 Speaker 1: There are at least a few of the users for whom it's literally art therapy. 1186 00:54:39,920 --> 00:54:44,640 Speaker 1: Wow, mental health through AI. That was entirely unexpected, 1187 00:54:44,760 --> 00:54:47,280 Speaker 1: but it's really important. It's clearly this tool for reflection. 1188 00:54:47,280 --> 00:54:48,960 Speaker 1: And then people are starting to meet each other, 1189 00:54:49,000 --> 00:54:51,080 Speaker 1: and they're starting to like form these groups, and they're 1190 00:54:51,120 --> 00:54:53,520 Speaker 1: like pushing these aesthetic boundaries and discovering new things, and 1191 00:54:53,560 --> 00:54:56,359 Speaker 1: like that's really beautiful, and it's obviously part of an 1192 00:54:56,360 --> 00:54:59,920 Speaker 1: honest and positive future, and like that's what I care about. Okay, 1193 00:55:00,000 --> 00:55:01,360 Speaker 1: really quickly and then we gotta wrap up. But do 1194 00:55:01,400 --> 00:55:04,040 Speaker 1: you think that like there's a future state where it's 1195 00:55:04,080 --> 00:55:06,640 Speaker 1: like Midjourney is its own Instagram? There's gonna be 1196 00:55:06,680 --> 00:55:09,239 Speaker 1: like that, but it's crazier?
I think the future is 1197 00:55:09,280 --> 00:55:12,080 Speaker 1: more of like, um, it's more like liquid 1198 00:55:12,120 --> 00:55:15,120 Speaker 1: imagination swirling around the room and like forming mountains and 1199 00:55:15,160 --> 00:55:17,759 Speaker 1: little trees and animals and little rivers, and you're trying to 1200 00:55:17,800 --> 00:55:19,799 Speaker 1: figure out how to give people surfboards or boats to like 1201 00:55:19,880 --> 00:55:24,400 Speaker 1: surf like oceans of liquid imagination, like discover entirely new lands. 1202 00:55:24,400 --> 00:55:26,840 Speaker 1: But it's like a very different thing. And it's like 1203 00:55:26,880 --> 00:55:29,239 Speaker 1: it forms like a new substance that you kind of 1204 00:55:29,280 --> 00:55:33,120 Speaker 1: can create the world with and manifest through and 1205 00:55:33,200 --> 00:55:35,680 Speaker 1: like reflect through, and like that's what it's about. It's 1206 00:55:35,680 --> 00:55:38,239 Speaker 1: like creating a new substance. It's really not about like 1207 00:55:38,480 --> 00:55:42,040 Speaker 1: making an Instagram or making porn or huge hits. It's 1208 00:55:42,040 --> 00:55:44,319 Speaker 1: obvious that all that stuff will happen, but it 1209 00:55:44,360 --> 00:55:46,840 Speaker 1: doesn't matter. Like it's not the real thing. It's like 1210 00:55:46,880 --> 00:55:50,200 Speaker 1: there was a civilization before engines and after engines, and 1211 00:55:50,200 --> 00:55:52,520 Speaker 1: now the fun thing is moving to a civilization 1212 00:55:52,719 --> 00:55:55,680 Speaker 1: that has these engines of imagination, and how does that 1213 00:55:55,719 --> 00:55:57,279 Speaker 1: transform things? Like, how did engines change things? We 1214 00:55:57,280 --> 00:56:00,719 Speaker 1: have highways, we have boats, we have like huge international trade, 1215 00:56:00,760 --> 00:56:04,440 Speaker 1: like there's like an Amazon and stuff. Yeah, Amazon. A 1216 00:56:04,440 --> 00:56:06,520 Speaker 1: lot of people in technology feel like we have no past. 1217 00:56:06,640 --> 00:56:08,400 Speaker 1: A lot of regular people literally feel like we have 1218 00:56:08,480 --> 00:56:11,040 Speaker 1: no future. But like I feel like we're really mid 1219 00:56:11,120 --> 00:56:13,160 Speaker 1: journey in this, like we have this rich and beautiful 1220 00:56:13,200 --> 00:56:16,160 Speaker 1: past behind us and this like wondrous and unimaginable future 1221 00:56:16,160 --> 00:56:18,680 Speaker 1: ahead of us, and like the whole goal of making 1222 00:56:18,719 --> 00:56:20,799 Speaker 1: anything is to figure out what we can be and 1223 00:56:20,840 --> 00:56:23,560 Speaker 1: what that can be in like a positive and explorative 1224 00:56:23,560 --> 00:56:26,399 Speaker 1: and wonderful, humane way. And like, I don't know, that's 1225 00:56:26,400 --> 00:56:28,000 Speaker 1: what I'm trying to do, and hopefully it shows up 1226 00:56:28,000 --> 00:56:31,520 Speaker 1: a little bit in this stuff. But like, I agree, 1227 00:56:31,560 --> 00:56:35,080 Speaker 1: I'm so on board with that sentiment. Like, agreed, 1228 00:56:35,480 --> 00:56:37,279 Speaker 1: we don't know yet what all of this is going 1229 00:56:37,320 --> 00:56:39,000 Speaker 1: to be. It's like we have to figure that out.
1230 00:56:39,080 --> 00:56:40,520 Speaker 1: And that's why like people are like, we're done, and 1231 00:56:40,560 --> 00:56:42,759 Speaker 1: it's like, no, we just really got started, like with 1232 00:56:42,800 --> 00:56:46,440 Speaker 1: technology I think, and what it can do. I agree, 1233 00:56:46,480 --> 00:56:48,319 Speaker 1: like, you are echoing a sentiment that I 1234 00:56:48,320 --> 00:56:51,359 Speaker 1: have definitely spoken on more than one occasion. David, um, 1235 00:56:51,400 --> 00:56:54,600 Speaker 1: this is, first off, super fucking interesting shit that you're building, 1236 00:56:54,760 --> 00:56:57,920 Speaker 1: extremely fascinating conversation. We should do like a check in 1237 00:56:57,960 --> 00:56:59,680 Speaker 1: like a year from now or something to see all 1238 00:56:59,680 --> 00:57:01,560 Speaker 1: of the new Midjourney things that have been created. 1239 00:57:01,600 --> 00:57:03,560 Speaker 1: So it's gonna get really scary. Even the next 1240 00:57:03,600 --> 00:57:05,680 Speaker 1: six months, six months is going to be really intense. 1241 00:57:05,960 --> 00:57:08,400 Speaker 1: Like six months is the farthest I can see. Twelve 1242 00:57:08,400 --> 00:57:11,040 Speaker 1: months, actually? Actually, okay, we'll do a six month 1243 00:57:11,120 --> 00:57:13,200 Speaker 1: check in. We'll see if Midjourney's, um, 1244 00:57:13,239 --> 00:57:17,280 Speaker 1: like three-quarter journey by then. It's gonna be, it's gonna be, 1245 00:57:17,320 --> 00:57:19,520 Speaker 1: it's gonna be moving really fast. It's gonna seem 1246 00:57:19,520 --> 00:57:22,520 Speaker 1: frightening to a lot of people, but it's like, it's, um, 1247 00:57:22,560 --> 00:57:25,440 Speaker 1: it's like an honest shot at the future. You know, 1248 00:57:25,960 --> 00:57:35,480 Speaker 1: I'm ready. David, thank you so much. Well, that is 1249 00:57:35,480 --> 00:57:37,560 Speaker 1: our show for this week. We'll be back next week 1250 00:57:37,760 --> 00:57:41,160 Speaker 1: with more What Future. And as always, I wish you 1251 00:57:41,160 --> 00:57:46,080 Speaker 1: and your family the very best, even the Discantist family.