1 00:00:05,160 --> 00:00:07,800 Speaker 1: Hey, this is Annie and Samantha and welcome to Stuff 2 00:00:07,800 --> 00:00:19,319 Speaker 1: Mom Never Told You, a production of iHeartRadio, and today we 3 00:00:19,400 --> 00:00:22,400 Speaker 1: are thrilled to be joined by Katrina, who I'm just 4 00:00:22,440 --> 00:00:24,760 Speaker 1: gonna go ahead and say friend of the show, hopefully 5 00:00:24,960 --> 00:00:28,960 Speaker 1: by the end of this, um, who has been involved 6 00:00:29,080 --> 00:00:31,960 Speaker 1: in the AI world, the world of artificial intelligence, as 7 00:00:32,000 --> 00:00:35,120 Speaker 1: both a user and professionally, for a while now. Thank 8 00:00:35,159 --> 00:00:37,400 Speaker 1: you so much for joining us, Katrina. Thank you for 9 00:00:37,440 --> 00:00:41,640 Speaker 1: having me. Yes, yes, yes, yes. Can you tell the 10 00:00:41,680 --> 00:00:45,840 Speaker 1: audience a little bit about yourself? Yes, I can. So, 11 00:00:46,000 --> 00:00:50,880 Speaker 1: my name is Katrina Slapsoff. Like Annie said, 12 00:00:50,920 --> 00:00:56,200 Speaker 1: I've been in the AI art world for, well, not forever, 13 00:00:56,520 --> 00:01:01,040 Speaker 1: but, like, considering how quickly the AI art world has 14 00:01:01,080 --> 00:01:03,320 Speaker 1: kind of popped up, I would say I've been there 15 00:01:03,360 --> 00:01:07,959 Speaker 1: for, like, a good chunk now. I have been working 16 00:01:08,000 --> 00:01:13,319 Speaker 1: with Midjourney, um, as a volunteer since the 17 00:01:13,360 --> 00:01:17,720 Speaker 1: beginning of July, um, and I started working with them 18 00:01:17,760 --> 00:01:21,800 Speaker 1: as a community liaison, I think I'm saying that right, 19 00:01:22,440 --> 00:01:25,480 Speaker 1: since September. That's not my only job. I also work 20 00:01:25,520 --> 00:01:28,560 Speaker 1: in the creative industry. I have for about six years now, 21 00:01:28,600 --> 00:01:32,080 Speaker 1: maybe seven.
For my day job, I am a senior designer. 22 00:01:32,400 --> 00:01:38,600 Speaker 1: I work at a Democratic slash left-leaning political agency. 23 00:01:38,840 --> 00:01:44,039 Speaker 1: We do a lot of work for, um, campaigns, political campaigns, 24 00:01:44,080 --> 00:01:47,920 Speaker 1: for nonprofits, um, for some super PACs and whatnot, um, 25 00:01:47,960 --> 00:01:51,520 Speaker 1: and we just got out of the election cycle, which 26 00:01:51,600 --> 00:01:54,280 Speaker 1: was intense and fun. Yeah. And also, in 27 00:01:54,320 --> 00:01:57,280 Speaker 1: the past, I've worked with a ton of big brands 28 00:01:57,320 --> 00:02:01,840 Speaker 1: like Amazon Alexa, a bunch of brands under P&G, um, and 29 00:02:01,920 --> 00:02:07,000 Speaker 1: also in an immersive space, a few spaces actually, weirdly. So yeah, 30 00:02:07,080 --> 00:02:09,280 Speaker 1: so I've kind of been all over the place, but 31 00:02:09,400 --> 00:02:13,480 Speaker 1: it's really fun. So yeah. Yeah, you sent us kind 32 00:02:13,480 --> 00:02:15,720 Speaker 1: of a, like, breakdown of all the stuff you've done, 33 00:02:15,720 --> 00:02:19,840 Speaker 1: and it's quite a bit, quite a bit. Um, we 34 00:02:19,880 --> 00:02:22,200 Speaker 1: did want to put a disclaimer in here, because you're 35 00:02:22,240 --> 00:02:27,080 Speaker 1: here speaking as an individual and not for any particular company. Yes. 36 00:02:27,160 --> 00:02:30,960 Speaker 1: And also, we, Samantha and I, have an episode on 37 00:02:31,200 --> 00:02:34,560 Speaker 1: artificial intelligence coming out. Keep an ear out for that, 38 00:02:34,600 --> 00:02:37,000 Speaker 1: because there is a lot to talk about. But, um, 39 00:02:37,160 --> 00:02:42,800 Speaker 1: for people, Katrina, who do not know what we're talking about, 40 00:02:42,840 --> 00:02:44,560 Speaker 1: can you explain it?
It can be the 41 00:02:45,440 --> 00:02:48,880 Speaker 1: barest explanation, but can you give us kind of a 42 00:02:48,880 --> 00:02:53,480 Speaker 1: working definition? The way that I always explain it to people, uh, 43 00:02:53,520 --> 00:02:55,280 Speaker 1: you know, when I'm talking to them in real life 44 00:02:55,320 --> 00:02:58,400 Speaker 1: and they have no idea what it is, I just 45 00:02:58,680 --> 00:03:02,080 Speaker 1: always say, in the simplest way: you just put in 46 00:03:02,120 --> 00:03:06,440 Speaker 1: words and out comes pretty art. That's really what it is 47 00:03:06,440 --> 00:03:10,200 Speaker 1: at its simplest, you know. Like, at 48 00:03:10,240 --> 00:03:13,880 Speaker 1: its foundation, you just put in words, right. 49 00:03:14,280 --> 00:03:17,919 Speaker 1: So I know, because when we're talking about AI specifically 50 00:03:17,960 --> 00:03:21,240 Speaker 1: with you, we are talking about the prompts. And, uh, well, 51 00:03:21,360 --> 00:03:23,280 Speaker 1: I would have thought of AI a long time ago 52 00:03:23,320 --> 00:03:25,360 Speaker 1: as the artificial intelligence that we're like, it's 53 00:03:25,360 --> 00:03:27,560 Speaker 1: going to take over the world, it's robots and it's 54 00:03:27,560 --> 00:03:31,799 Speaker 1: gonna take our personalities, which is always a sci-fi moment.
55 00:03:32,080 --> 00:03:35,240 Speaker 1: But here we're talking with you specifically about the art world, 56 00:03:35,240 --> 00:03:37,000 Speaker 1: such as you were talking about Midjourney, and I 57 00:03:37,040 --> 00:03:38,760 Speaker 1: know DALL-E was one of the first ones that I'd 58 00:03:38,800 --> 00:03:41,640 Speaker 1: ever heard of, uh, coming out, which has some really 59 00:03:41,680 --> 00:03:45,480 Speaker 1: interesting and fascinating background. And even though it feels like, 60 00:03:45,600 --> 00:03:48,880 Speaker 1: to me as a newbie, it's really just come 61 00:03:48,920 --> 00:03:52,400 Speaker 1: at us fast, I know it's been around, but 62 00:03:52,520 --> 00:03:55,920 Speaker 1: to the point that it is growing in leaps and bounds. Um, 63 00:03:55,960 --> 00:03:57,800 Speaker 1: and you talked about how you got involved. You went 64 00:03:57,840 --> 00:04:01,000 Speaker 1: from volunteer to now you're actually a part of that 65 00:04:01,080 --> 00:04:03,640 Speaker 1: community for real. How did you get into this type 66 00:04:03,640 --> 00:04:06,880 Speaker 1: of AI? You know, it's really funny that you say that. Yeah, 67 00:04:06,920 --> 00:04:09,560 Speaker 1: it has actually been around before DALL-E and Midjourney. 68 00:04:09,640 --> 00:04:12,600 Speaker 1: I'd say, like, in the capacity that we know Midjourney, 69 00:04:12,680 --> 00:04:17,640 Speaker 1: there have been private versions of that, 70 00:04:17,680 --> 00:04:21,840 Speaker 1: with, like, Google's Imagen, and NVIDIA has, like, their 71 00:04:21,880 --> 00:04:23,359 Speaker 1: own version of that, and they've had it for a 72 00:04:23,400 --> 00:04:25,520 Speaker 1: couple of years now, I think. So this is actually 73 00:04:25,520 --> 00:04:29,679 Speaker 1: like the first time that regular people have had access 74 00:04:29,720 --> 00:04:33,360 Speaker 1: to, um, these services.
And before I ever knew about 75 00:04:33,400 --> 00:04:37,880 Speaker 1: Midjourney, I was experimenting with another service which I 76 00:04:37,920 --> 00:04:40,640 Speaker 1: cannot remember the name of. It was like Play, play 77 00:04:40,680 --> 00:04:44,560 Speaker 1: something, and that used AI, but you had to input 78 00:04:44,640 --> 00:04:47,320 Speaker 1: your own images for it, and it was, um, a 79 00:04:47,320 --> 00:04:51,400 Speaker 1: lot more basic and not as cool-looking as Midjourney, right? 80 00:04:51,720 --> 00:04:55,640 Speaker 1: But I had a lot of fun just, like, putting 81 00:04:55,640 --> 00:04:59,039 Speaker 1: in my own art with that stuff. And, um, I 82 00:04:59,080 --> 00:05:02,760 Speaker 1: actually do a lot of experimental video work, uh, 83 00:05:03,120 --> 00:05:06,320 Speaker 1: just, just weird, glitchy stuff. And I've been doing that 84 00:05:06,400 --> 00:05:10,039 Speaker 1: for many years now, and so I have gained quite 85 00:05:10,040 --> 00:05:14,360 Speaker 1: a few friends within that world. And so, um, one 86 00:05:14,440 --> 00:05:16,520 Speaker 1: day, one of my friends messages me on Instagram and 87 00:05:16,560 --> 00:05:18,080 Speaker 1: they're like, yo, you need to check out Midjourney. 88 00:05:18,120 --> 00:05:21,200 Speaker 1: And I was like, what is that? What is that? 89 00:05:22,520 --> 00:05:24,359 Speaker 1: And that was when you still needed an invite to 90 00:05:24,400 --> 00:05:27,479 Speaker 1: get in. It was really hard, because people were 91 00:05:27,520 --> 00:05:30,120 Speaker 1: always begging for invites and stuff, and I didn't really 92 00:05:30,120 --> 00:05:32,200 Speaker 1: know anybody that could give me one. So all I 93 00:05:32,240 --> 00:05:35,760 Speaker 1: did was just post on my Instagram story like, hey, 94 00:05:35,800 --> 00:05:38,080 Speaker 1: anybody got one of those?
And by the end of 95 00:05:38,120 --> 00:05:40,400 Speaker 1: the day I was in, because I just already knew 96 00:05:40,400 --> 00:05:43,440 Speaker 1: a lot of people, you know, in the art world 97 00:05:43,600 --> 00:05:46,360 Speaker 1: that were already using it. So that's how I got 98 00:05:46,400 --> 00:05:51,520 Speaker 1: into Midjourney. There were still only like forty thousand people then. 99 00:05:52,000 --> 00:05:55,240 Speaker 1: We actually, the Discord just hit ten million 100 00:05:55,480 --> 00:05:59,720 Speaker 1: people about an hour ago. That is, yes, um, we 101 00:05:59,720 --> 00:06:04,080 Speaker 1: were all celebrating. I was like, wow, I remember. It's 102 00:06:04,120 --> 00:06:05,919 Speaker 1: so funny too, because I remember when I got in 103 00:06:05,960 --> 00:06:08,480 Speaker 1: at 400K, I was kind of disappointed. I 104 00:06:08,520 --> 00:06:10,279 Speaker 1: was like, man, I could have been here when it 105 00:06:10,360 --> 00:06:16,680 Speaker 1: was smaller. And look at us now. In terms of 106 00:06:16,720 --> 00:06:22,840 Speaker 1: how I, like, became a guide, I don't know. I 107 00:06:22,880 --> 00:06:25,360 Speaker 1: think the story of how that happened is 108 00:06:25,400 --> 00:06:28,760 Speaker 1: actually kind of funny, because they had just started, like, a 109 00:06:28,800 --> 00:06:32,159 Speaker 1: new feedback channel at the time, um, and they didn't 110 00:06:32,240 --> 00:06:35,360 Speaker 1: have any slow mode or any restrictions on people posting in there. 111 00:06:36,000 --> 00:06:39,600 Speaker 1: And one night, and it was so empty, right, 112 00:06:39,640 --> 00:06:41,400 Speaker 1: like, there were not as many people there 113 00:06:41,400 --> 00:06:44,039 Speaker 1: as there are now, and so one night, this one 114 00:06:44,120 --> 00:06:46,880 Speaker 1: guy just goes on a long ramble, and I'm like, 115 00:06:48,160 --> 00:06:51,000 Speaker 1: I was —
it was, like, pages' worth of, like, 116 00:06:51,080 --> 00:06:56,479 Speaker 1: this wasn't even feedback. Eventually, I was like, yo, dude, 117 00:06:56,480 --> 00:07:00,279 Speaker 1: go somewhere else with that. And the next day I 118 00:07:00,279 --> 00:07:02,360 Speaker 1: woke up to one of the developers being like, yo, 119 00:07:02,440 --> 00:07:04,200 Speaker 1: you want to be a guide? And I was like, oh, 120 00:07:04,960 --> 00:07:06,560 Speaker 1: I thought I was going to get in trouble for that, 121 00:07:06,680 --> 00:07:10,680 Speaker 1: but okay, I like it. You spoke your mind. Yeah. 122 00:07:10,720 --> 00:07:13,200 Speaker 1: And I remember it was early July, because 123 00:07:13,240 --> 00:07:16,040 Speaker 1: the very next day I had off for July Fourth, 124 00:07:16,120 --> 00:07:20,160 Speaker 1: and I was manning the support channels and stuff. And yeah, 125 00:07:20,240 --> 00:07:24,640 Speaker 1: it's just been all up from there since then. 126 00:07:24,920 --> 00:07:28,200 Speaker 1: That's cool. Yeah. Um, for people who don't know, can 127 00:07:28,240 --> 00:07:31,520 Speaker 1: you kind of describe what being a guide is? Yeah. 128 00:07:31,600 --> 00:07:35,800 Speaker 1: So there are actually two tiers of volunteers that help 129 00:07:35,880 --> 00:07:39,280 Speaker 1: run the Discord now. And I say volunteers, um, 130 00:07:39,320 --> 00:07:43,440 Speaker 1: there are benefits to working as a guide or a moderator, 131 00:07:43,560 --> 00:07:47,680 Speaker 1: but we don't really talk about them much, because we 132 00:07:47,760 --> 00:07:51,920 Speaker 1: have a very specific recruitment process, where it's like, we 133 00:07:52,160 --> 00:07:55,720 Speaker 1: don't want people trying to become a guide or a 134 00:07:55,760 --> 00:07:59,520 Speaker 1: mod for the wrong reasons. And it's worked really well 135 00:07:59,520 --> 00:08:01,760 Speaker 1: so far. We have a really great group of people.
136 00:08:02,240 --> 00:08:06,040 Speaker 1: But basically you have the guides and you have the moderators. 137 00:08:06,520 --> 00:08:10,000 Speaker 1: They're both kind of the same group. The guides are 138 00:08:10,040 --> 00:08:14,400 Speaker 1: meant more as, like, people to help in the support 139 00:08:14,480 --> 00:08:17,520 Speaker 1: channels we have, but we also have guides that work 140 00:08:17,600 --> 00:08:20,400 Speaker 1: in other capacities too, um. We have, like, a good 141 00:08:20,400 --> 00:08:24,080 Speaker 1: group of people that work in our prompt chat channel, 142 00:08:24,160 --> 00:08:28,240 Speaker 1: which is basically, like, people helping other people with, like, 143 00:08:28,280 --> 00:08:30,920 Speaker 1: getting the images they want, you know, getting the prompt 144 00:08:31,000 --> 00:08:33,480 Speaker 1: right so they can get the results that they want. Um, 145 00:08:33,480 --> 00:08:36,679 Speaker 1: I could list more examples. Um, the moderators, on the 146 00:08:36,679 --> 00:08:40,920 Speaker 1: other hand, they have a couple more abilities. Um, they 147 00:08:40,920 --> 00:08:46,720 Speaker 1: can ban people, um, they can get rid of images, uh, 148 00:08:47,040 --> 00:08:49,400 Speaker 1: those are the big ones. Um, we can ban words 149 00:08:49,440 --> 00:08:51,839 Speaker 1: and phrases in the prompts, which we can talk about 150 00:08:52,240 --> 00:08:55,360 Speaker 1: later if you want, and a couple of other things, 151 00:08:55,640 --> 00:09:10,880 Speaker 1: but those are the main ones. I would really 152 00:09:10,880 --> 00:09:14,160 Speaker 1: be curious about the conversation that goes into how 153 00:09:14,160 --> 00:09:17,600 Speaker 1: you ban something. How do you decide what gets banned?
154 00:09:17,600 --> 00:09:19,800 Speaker 1: Because there is a lot of conversation, and 155 00:09:19,840 --> 00:09:23,720 Speaker 1: there are some lawsuits right now around artificial intelligence, and 156 00:09:23,920 --> 00:09:28,679 Speaker 1: it's just a lot of conversation happening. And, um, because 157 00:09:28,679 --> 00:09:32,440 Speaker 1: we're an intersectional feminist show, we're really interested in 158 00:09:33,240 --> 00:09:38,560 Speaker 1: how artificial intelligence interacts with and impacts marginalized communities. And 159 00:09:38,640 --> 00:09:42,920 Speaker 1: you've been, yeah, involved in moderating and guiding, um, 160 00:09:43,000 --> 00:09:46,640 Speaker 1: artificial intelligence communities, AI communities, and setting standards and training 161 00:09:46,720 --> 00:09:49,240 Speaker 1: others on them. So yeah, we'd like to break that 162 00:09:49,520 --> 00:09:54,880 Speaker 1: down a little bit more, um, starting with, like, why 163 00:09:54,920 --> 00:09:59,240 Speaker 1: do you think it's important to bring an intersectional feminist 164 00:09:59,280 --> 00:10:04,120 Speaker 1: perspective to things like this? I think it's very important. 165 00:10:04,640 --> 00:10:07,559 Speaker 1: And also, I just realized I didn't hit on that: 166 00:10:07,679 --> 00:10:12,040 Speaker 1: I also run the Facebook group for Midjourney, and 167 00:10:12,080 --> 00:10:15,600 Speaker 1: I also run, well, I'm one of the main 168 00:10:15,679 --> 00:10:19,840 Speaker 1: mods that helps run the subreddit.
Um, so that's what 169 00:10:19,880 --> 00:10:22,040 Speaker 1: you were hitting on when you said, like I said, 170 00:10:22,080 --> 00:10:25,120 Speaker 1: the moderation standards. I'm actually one 171 00:10:25,240 --> 00:10:28,480 Speaker 1: of two of what we're calling super mods, 172 00:10:29,320 --> 00:10:34,720 Speaker 1: who basically have access to internal communications with the team, 173 00:10:34,800 --> 00:10:37,720 Speaker 1: and so we do a lot of back and forth there. Um, 174 00:10:37,760 --> 00:10:41,359 Speaker 1: but also, like, we are kind of like the managers 175 00:10:41,400 --> 00:10:45,680 Speaker 1: slash bosses of the rest of the mods. So even 176 00:10:45,760 --> 00:10:48,040 Speaker 1: though they do a lot of work and a lot 177 00:10:48,040 --> 00:10:50,280 Speaker 1: of decision making on their own, like, me and one 178 00:10:50,320 --> 00:10:54,840 Speaker 1: other person are basically, like, the last word. Like, if we 179 00:10:54,920 --> 00:10:57,240 Speaker 1: say you can't, or you gotta do this, you gotta do it, 180 00:10:57,360 --> 00:11:02,240 Speaker 1: you know what I mean. So, yeah, that's some 181 00:11:02,240 --> 00:11:04,920 Speaker 1: more context there for, like, where I'm at 182 00:11:04,960 --> 00:11:07,640 Speaker 1: with the other mods. Um.
In terms of, like, 183 00:11:07,679 --> 00:11:11,280 Speaker 1: the intersectional part, I think it's super important. We have 184 00:11:11,320 --> 00:11:14,200 Speaker 1: a really, really good group of mods and guides that 185 00:11:14,240 --> 00:11:17,720 Speaker 1: are super diverse, because once you realize, 186 00:11:17,720 --> 00:11:21,160 Speaker 1: like, we're doing this on a global scale, there 187 00:11:21,160 --> 00:11:24,600 Speaker 1: are a lot of different cultures and communities that are 188 00:11:24,640 --> 00:11:29,040 Speaker 1: basically coming in and forming one, right? And so there's 189 00:11:29,040 --> 00:11:31,560 Speaker 1: gonna be different standards of, like, what's okay and what's 190 00:11:31,559 --> 00:11:34,440 Speaker 1: not okay in a lot of different places. Um, and 191 00:11:34,640 --> 00:11:39,080 Speaker 1: what I really love about Midjourney and our moderation 192 00:11:39,160 --> 00:11:46,760 Speaker 1: team is we have always made decisions in the sense 193 00:11:46,800 --> 00:11:49,040 Speaker 1: of, like, we are always trying to put real people 194 00:11:49,320 --> 00:11:53,959 Speaker 1: and the community first. So we think it's more important, 195 00:11:55,200 --> 00:11:59,840 Speaker 1: um, to reduce harm to real people than it is 196 00:12:00,120 --> 00:12:03,280 Speaker 1: to let other people make art that could be harmful 197 00:12:03,320 --> 00:12:08,920 Speaker 1: to them, right? So I'll give you one example. The 198 00:12:09,080 --> 00:12:11,960 Speaker 1: very first word that I banned, on my very 199 00:12:12,040 --> 00:12:14,559 Speaker 1: first day as a moderator. I'm not going to say 200 00:12:14,559 --> 00:12:17,439 Speaker 1: the word, because it is a slur to a large 201 00:12:17,440 --> 00:12:21,080 Speaker 1: group of people, but it starts with G and it 202 00:12:21,160 --> 00:12:25,800 Speaker 1: describes people of Romani descent, um.
A lot of people 203 00:12:25,880 --> 00:12:29,200 Speaker 1: in the US don't know that that's a very harmful word. 204 00:12:29,320 --> 00:12:31,040 Speaker 1: I kind of knew a little bit, 205 00:12:31,080 --> 00:12:33,439 Speaker 1: but I actually didn't realize the full extent of it. 206 00:12:34,000 --> 00:12:38,120 Speaker 1: And somebody had brought it up to me, and I 207 00:12:38,200 --> 00:12:41,000 Speaker 1: was still a guide at this point, and so I 208 00:12:41,240 --> 00:12:44,600 Speaker 1: brought it up to the moderators, and I actually talked 209 00:12:44,640 --> 00:12:47,160 Speaker 1: to a team member, a dev team member, about it 210 00:12:47,200 --> 00:12:49,680 Speaker 1: for a little while, and we were going back and forth. 211 00:12:50,000 --> 00:12:53,400 Speaker 1: I was like, look, most people in the USA don't 212 00:12:53,520 --> 00:12:58,240 Speaker 1: know this is bad. They use it innocently, even, you know, 213 00:12:58,320 --> 00:13:01,040 Speaker 1: even if the term itself isn't innocent, you know, 214 00:13:01,120 --> 00:13:04,959 Speaker 1: they don't know. But on the other hand, this term 215 00:13:05,280 --> 00:13:12,400 Speaker 1: was literally used, like, in genocide. It's violent, you know, 216 00:13:12,600 --> 00:13:15,640 Speaker 1: so maybe we shouldn't allow people to use that, because 217 00:13:16,120 --> 00:13:20,600 Speaker 1: it's really hurtful and harmful to the very real, alive 218 00:13:20,720 --> 00:13:24,600 Speaker 1: people today, you know, to see that term getting used. 219 00:13:24,640 --> 00:13:29,160 Speaker 1: And actually, um, people get really upset about it. They're 220 00:13:29,200 --> 00:13:32,040 Speaker 1: like, seriously, you're banning that, you're 221 00:13:32,040 --> 00:13:36,880 Speaker 1: censoring that. And we're just like, yeah, find other words, 222 00:13:37,960 --> 00:13:43,880 Speaker 1: you know. Um.
Another really good example actually came up recently. 223 00:13:44,000 --> 00:13:46,720 Speaker 1: It's been brought up a couple of times, but somebody 224 00:13:46,760 --> 00:13:51,319 Speaker 1: in our feedback channel, um, was pretty upset, understandably so, 225 00:13:51,400 --> 00:13:54,720 Speaker 1: I think, um, because the term Down syndrome is banned. 226 00:13:55,679 --> 00:14:00,200 Speaker 1: And when we banned that, we actually went back and 227 00:14:00,280 --> 00:14:02,960 Speaker 1: forth on it, because we do not like banning 228 00:14:03,120 --> 00:14:07,080 Speaker 1: things that are identities. Um, you know, you don't like 229 00:14:07,120 --> 00:14:10,400 Speaker 1: banning things that are real people, unless it's, like, very obviously, 230 00:14:10,440 --> 00:14:13,199 Speaker 1: you know, a hurtful way to refer to real people. 231 00:14:13,760 --> 00:14:16,280 Speaker 1: Down syndrome, the term, isn't a hurtful way 232 00:14:16,600 --> 00:14:21,120 Speaker 1: to refer to people. However, ninety-nine percent of people who were 233 00:14:21,200 --> 00:14:24,280 Speaker 1: using that term were using it in a really bad 234 00:14:24,320 --> 00:14:28,840 Speaker 1: way and in a really hurtful way, and so it 235 00:14:28,880 --> 00:14:30,600 Speaker 1: got to the point where we were just so 236 00:14:30,680 --> 00:14:34,720 Speaker 1: flooded with that that it started to become unmanageable, um, 237 00:14:35,080 --> 00:14:38,080 Speaker 1: moderating it, and so we just went ahead and 238 00:14:38,120 --> 00:14:43,960 Speaker 1: banned the whole term.
And it is upsetting, 239 00:14:44,000 --> 00:14:48,480 Speaker 1: I think, because, you know, it's like, the fact that 240 00:14:48,600 --> 00:14:54,240 Speaker 1: people are misusing certain phrases, certain words, in a way 241 00:14:54,280 --> 00:14:59,560 Speaker 1: that is harmful or crappy enough, you know, that 242 00:14:59,600 --> 00:15:04,359 Speaker 1: we have to, like, ban it, like, that sucks. Um, 243 00:15:04,440 --> 00:15:06,760 Speaker 1: and we have, like, people get upset about it, so 244 00:15:06,800 --> 00:15:09,600 Speaker 1: we have suggestions, like, use this word instead, or use 245 00:15:09,640 --> 00:15:13,040 Speaker 1: that instead. The guy in feedback, for example, he was like, oh, 246 00:15:13,080 --> 00:15:15,560 Speaker 1: my wife works with Down syndrome children and, like, wants 247 00:15:15,560 --> 00:15:17,800 Speaker 1: to be able to create with this. And so we're like, okay, 248 00:15:18,080 --> 00:15:21,320 Speaker 1: use, I'm gonna say this wrong, it's like trisomy, 249 00:15:21,600 --> 00:15:26,000 Speaker 1: or trisomy 21. It's like the specific 250 00:15:26,640 --> 00:15:29,760 Speaker 1: mutation or something like that, that will create what you want.
251 00:15:29,920 --> 00:15:35,800 Speaker 1: But anybody who's trying to create imagery about Down syndrome 252 00:15:36,000 --> 00:15:38,560 Speaker 1: people in a bad way isn't going to know that, 253 00:15:39,040 --> 00:15:41,320 Speaker 1: or they're not going to take the time to try 254 00:15:41,360 --> 00:15:44,720 Speaker 1: and figure that out, you know. So a lot of 255 00:15:44,720 --> 00:15:48,640 Speaker 1: our decisions, you know, it's really just trying to balance 256 00:15:49,920 --> 00:15:53,680 Speaker 1: between all the people who are abusing a term and, 257 00:15:55,000 --> 00:15:58,040 Speaker 1: you know, the rest of everyone else, like, trying to 258 00:15:58,040 --> 00:16:02,280 Speaker 1: give people the most freedom to create things that 259 00:16:03,000 --> 00:16:08,280 Speaker 1: are important. And that's absolutely a big conversation about, um, 260 00:16:08,440 --> 00:16:11,800 Speaker 1: ableism, um, and how disability 261 00:16:11,840 --> 00:16:15,240 Speaker 1: is seen in general. And I know many disabled 262 00:16:15,320 --> 00:16:18,200 Speaker 1: activists would say disability is not a negative thing, so 263 00:16:18,520 --> 00:16:20,600 Speaker 1: saying it is, I think that is a complicated issue, 264 00:16:20,640 --> 00:16:23,920 Speaker 1: because you do have the bad players who come in 265 00:16:24,080 --> 00:16:28,440 Speaker 1: to do something with an obvious evil or bad intent. 266 00:16:28,840 --> 00:16:31,200 Speaker 1: So that's an interesting perspective they have to look at, 267 00:16:31,640 --> 00:16:33,520 Speaker 1: you know.
And on top of this, um, one 268 00:16:33,600 --> 00:16:35,520 Speaker 1: of the things I find interesting with all of the 269 00:16:35,680 --> 00:16:39,280 Speaker 1: AI conversations, and this includes ChatGPT, I know it 270 00:16:39,320 --> 00:16:42,080 Speaker 1: is getting bigger and bigger, and there's so much conversation 271 00:16:42,240 --> 00:16:46,720 Speaker 1: about the information that is being fed into AI and 272 00:16:46,880 --> 00:16:50,120 Speaker 1: what happens and what the biases are and what's 273 00:16:50,400 --> 00:16:54,040 Speaker 1: even available. And it's pretty interesting, because I know, for 274 00:16:54,160 --> 00:16:57,360 Speaker 1: a chunk of content, let's say, when people are 275 00:16:57,360 --> 00:17:00,400 Speaker 1: talking about specific imagery of women, a lot of it 276 00:17:00,480 --> 00:17:03,960 Speaker 1: comes from an incel kind of group of men 277 00:17:04,040 --> 00:17:06,320 Speaker 1: who think this is the perfect woman, and for some 278 00:17:06,400 --> 00:17:10,000 Speaker 1: reason that overtakes an image. So you may not want this. 279 00:17:10,600 --> 00:17:13,080 Speaker 1: Like, I think I've seen a few where I've seen 280 00:17:13,119 --> 00:17:15,439 Speaker 1: profiles where it's just, like, women, and, in 281 00:17:15,480 --> 00:17:17,680 Speaker 1: my mind, I'm like, oh, women are creating this, let's 282 00:17:17,720 --> 00:17:19,760 Speaker 1: see what this is. But it turns out these are men 283 00:17:19,880 --> 00:17:22,639 Speaker 1: creating what they think is a beautiful woman, and you're like, 284 00:17:22,640 --> 00:17:25,320 Speaker 1: oh my god, what is this? Is this the way? 285 00:17:25,560 --> 00:17:27,640 Speaker 1: And there's nothing wrong with porn, but is this one 286 00:17:27,680 --> 00:17:30,200 Speaker 1: more way for those bad players that we talk about 287 00:17:30,200 --> 00:17:34,280 Speaker 1: to have another misogynistic hold on something that is so new?
288 00:17:34,680 --> 00:17:36,680 Speaker 1: So, with all of that, when it comes to women 289 00:17:36,920 --> 00:17:40,200 Speaker 1: and other marginalized communities, do you have a concern about 290 00:17:40,240 --> 00:17:42,879 Speaker 1: that in the AI world, or what do you see, 291 00:17:42,920 --> 00:17:45,800 Speaker 1: as it's coming out and it's so brand new? Yeah. 292 00:17:45,840 --> 00:17:48,520 Speaker 1: So the first thing I want to say about that, 293 00:17:49,040 --> 00:17:52,399 Speaker 1: and I think the next sentence applies to many subjects within 294 00:17:53,040 --> 00:17:56,720 Speaker 1: the AI world, not just this one, but what I 295 00:17:56,720 --> 00:18:04,280 Speaker 1: will say is, AI art isn't necessarily creating problems as 296 00:18:04,359 --> 00:18:07,720 Speaker 1: much as it's highlighting issues that, 297 00:18:09,480 --> 00:18:13,400 Speaker 1: both within the art world and within society, have been 298 00:18:13,440 --> 00:18:16,160 Speaker 1: issues for a long time, and it's really just, like, 299 00:18:17,280 --> 00:18:21,160 Speaker 1: highlighting a lot of those problems. Let me take a step back, 300 00:18:21,320 --> 00:18:23,040 Speaker 1: and I want to explain it this way, because I 301 00:18:23,040 --> 00:18:26,000 Speaker 1: think it's really important to understand this part 302 00:18:26,119 --> 00:18:29,320 Speaker 1: of it, which is that actually, when it comes to 303 00:18:29,359 --> 00:18:36,359 Speaker 1: AI art, everything's biased. Everything is. So let's 304 00:18:36,359 --> 00:18:41,200 Speaker 1: say you feed in, um, a bunch of pictures of apples, right? 305 00:18:41,560 --> 00:18:45,320 Speaker 1: And let's say all of those apples, they're round, they're nice, 306 00:18:45,320 --> 00:18:49,080 Speaker 1: they're pretty, um, they're also, like, all mainly green, they're 307 00:18:49,080 --> 00:18:52,480 Speaker 1: all mainly green apples, right?
The more images of green 308 00:18:52,520 --> 00:18:54,120 Speaker 1: apples you put in there, you may have a couple 309 00:18:54,119 --> 00:18:57,600 Speaker 1: of pictures of red apples, but the more pictures of 310 00:18:57,600 --> 00:18:59,399 Speaker 1: green apples that you put in there, then when you 311 00:18:59,760 --> 00:19:03,520 Speaker 1: put it into a service like Midjourney or any 312 00:19:03,520 --> 00:19:07,760 Speaker 1: other text-to-image service and you put in the word apple, 313 00:19:08,560 --> 00:19:12,720 Speaker 1: it'll give you a green apple. But we 314 00:19:12,800 --> 00:19:16,119 Speaker 1: as humans don't care about that, right? That's not 315 00:19:16,280 --> 00:19:19,520 Speaker 1: something that we think of, oh, you know, Midjourney's 316 00:19:19,560 --> 00:19:22,359 Speaker 1: biased to make green apples only, blah blah. But we 317 00:19:22,600 --> 00:19:26,800 Speaker 1: notice it when it comes to other, more important things, 318 00:19:26,840 --> 00:19:29,440 Speaker 1: like if you put in doctor and it comes out 319 00:19:29,480 --> 00:19:32,119 Speaker 1: with only men, or if you put in woman and 320 00:19:32,200 --> 00:19:37,760 Speaker 1: it comes out with a pretty, white, young female, you know. 321 00:19:38,320 --> 00:19:44,000 Speaker 1: I know the reason that happens is because society in 322 00:19:44,080 --> 00:19:46,960 Speaker 1: general is just already so biased. And I can talk 323 00:19:47,000 --> 00:19:50,159 Speaker 1: on this as a designer.
You know, when I was working, 324 00:19:50,240 --> 00:19:52,320 Speaker 1: I worked for Clorox, I did some work 325 00:19:52,359 --> 00:19:56,240 Speaker 1: for Clorox, and I was very specifically like, I'm gonna 326 00:19:56,240 --> 00:19:59,520 Speaker 1: put some men-cleaning imagery in here, because men can 327 00:19:59,560 --> 00:20:03,800 Speaker 1: clean too. And Clorox came back and went, nope, you 328 00:20:03,840 --> 00:20:07,720 Speaker 1: need to put women. And even, like, their whole style guide, 329 00:20:08,920 --> 00:20:11,520 Speaker 1: whenever they talked about their consumer, it was always she, 330 00:20:12,600 --> 00:20:16,000 Speaker 1: and it made me so upset, you know what I mean. 331 00:20:16,160 --> 00:20:19,600 Speaker 1: So in a lot of ways, like, you know, there 332 00:20:19,880 --> 00:20:26,960 Speaker 1: are biases. Everything that any AI image generator makes is biased, right? 333 00:20:28,080 --> 00:20:30,359 Speaker 1: But not biased in the sense of, like, how you 334 00:20:30,400 --> 00:20:32,920 Speaker 1: and I understand it, because the way that a computer 335 00:20:33,280 --> 00:20:35,479 Speaker 1: understands bias is going to be a lot different than 336 00:20:35,520 --> 00:20:39,879 Speaker 1: a human. To fix that, it needs to be actively combated. 337 00:20:40,200 --> 00:20:43,000 Speaker 1: But that is really, really, really hard, because you're talking 338 00:20:43,040 --> 00:20:48,040 Speaker 1: about images being fed in on, like, a million scale, 339 00:20:48,520 --> 00:20:51,040 Speaker 1: maybe even billions, I'm actually not sure the amount. But 340 00:20:52,040 --> 00:20:55,280 Speaker 1: it's a really, really hard problem to solve, because, you know, 341 00:20:56,440 --> 00:21:01,000 Speaker 1: there are so many different ways these biases can present themselves. Um.
342 00:21:01,040 --> 00:21:03,760 Speaker 1: And you also have the fact of like the users 343 00:21:03,760 --> 00:21:08,240 Speaker 1: who are making these images themselves, you know, when we're 344 00:21:08,240 --> 00:21:10,320 Speaker 1: talking about like men making these images and you're like, 345 00:21:10,359 --> 00:21:13,880 Speaker 1: oh cool. Like you know, it's very different to me. 346 00:21:14,040 --> 00:21:20,560 Speaker 1: For example, when like a woman makes bikini images versus 347 00:21:20,560 --> 00:21:24,560 Speaker 1: a straight man, it feels different, right. The thing is, 348 00:21:24,600 --> 00:21:28,480 Speaker 1: we can't really moderate or verify intent either, which is the 349 00:21:28,520 --> 00:21:33,120 Speaker 1: hard part, and it's 350 00:21:33,160 --> 00:21:36,480 Speaker 1: such a hard line to balance because you know, we 351 00:21:36,600 --> 00:21:42,000 Speaker 1: do moderate female bodies, and that can feel really 352 00:21:42,040 --> 00:21:45,399 Speaker 1: bad to some women too. But it can also 353 00:21:45,440 --> 00:21:49,480 Speaker 1: similarly feel just as bad to, you know, go 354 00:21:49,600 --> 00:21:51,679 Speaker 1: on the top of the community feed and you just 355 00:21:51,760 --> 00:21:55,520 Speaker 1: see a bunch of like young, pretty, hot, you know, 356 00:21:56,080 --> 00:22:00,240 Speaker 1: fake women. And that actually happened, by the way, 357 00:22:00,320 --> 00:22:06,119 Speaker 1: months and months ago.
After like a 358 00:22:06,119 --> 00:22:08,040 Speaker 1: week or two, or a couple of weeks, of this happening, 359 00:22:08,080 --> 00:22:13,080 Speaker 1: I finally went into the correct channels on Discord 360 00:22:13,119 --> 00:22:16,639 Speaker 1: and was like, hey, guys, have any of you 361 00:22:16,720 --> 00:22:19,520 Speaker 1: noticed that there's just like a bunch of, you know, 362 00:22:19,720 --> 00:22:25,119 Speaker 1: pretty women with like lots of cleavage, and they're always busty. 363 00:22:25,280 --> 00:22:27,520 Speaker 1: I'm like, wow, redheaded and busty is what I've seen. 364 00:22:29,760 --> 00:22:32,479 Speaker 1: It's uncomfortable. It was really, and it still is, 365 00:22:32,520 --> 00:22:36,520 Speaker 1: really uncomfortable. And it got to the point where it 366 00:22:36,600 --> 00:22:39,760 Speaker 1: was starting to make me feel bad, 367 00:22:40,359 --> 00:22:42,600 Speaker 1: you know, I was like, Wow, this is really reminding 368 00:22:42,640 --> 00:22:46,880 Speaker 1: me that my worth as a person is like just 369 00:22:46,920 --> 00:22:51,520 Speaker 1: based on how pretty I am and like how portable 370 00:22:51,600 --> 00:22:55,800 Speaker 1: I am. And you know, what I will say 371 00:22:55,920 --> 00:22:58,080 Speaker 1: is that when I did bring that up and I 372 00:22:58,119 --> 00:23:03,280 Speaker 1: had that conversation, the response was immediate and it was overwhelming. 373 00:23:03,320 --> 00:23:05,919 Speaker 1: It was like, I'm so glad you said this, like 374 00:23:05,960 --> 00:23:09,560 Speaker 1: we've been talking about this before.
I was in internal channels, 375 00:23:09,600 --> 00:23:12,240 Speaker 1: by the way, but the response even from the team 376 00:23:12,280 --> 00:23:14,800 Speaker 1: was like, we've been talking about this for like the 377 00:23:14,840 --> 00:23:17,560 Speaker 1: past couple of weeks, like it bothers us too, we've noticed, 378 00:23:17,680 --> 00:23:19,960 Speaker 1: thank you for bringing it up. We want to try 379 00:23:20,040 --> 00:23:23,159 Speaker 1: and fix this. And then, every week there's an 380 00:23:23,160 --> 00:23:27,440 Speaker 1: office hours, which means that David, the founder, goes 381 00:23:27,480 --> 00:23:31,480 Speaker 1: onto the Discord voice chat stage for hours 382 00:23:31,600 --> 00:23:35,439 Speaker 1: each week and people in the community ask him questions 383 00:23:35,480 --> 00:23:38,840 Speaker 1: and talk about certain topics, blah blah. And the next 384 00:23:38,840 --> 00:23:43,560 Speaker 1: two weeks were literally just bringing this issue to 385 00:23:43,600 --> 00:23:46,199 Speaker 1: the community, and we did this thing where we just 386 00:23:46,240 --> 00:23:49,320 Speaker 1: brought up as many women as possible to the 387 00:23:49,400 --> 00:23:53,280 Speaker 1: stage and just had like a huge conversation between like 388 00:23:53,600 --> 00:23:56,120 Speaker 1: all different kinds of women, like how do you feel 389 00:23:56,119 --> 00:23:58,480 Speaker 1: about this? Like is this okay? Is it not okay? 390 00:23:58,480 --> 00:24:01,439 Speaker 1: Like how should we handle the moderation here? You know. 391 00:24:01,520 --> 00:24:04,560 Speaker 1: And so there's always been an effort to try and 392 00:24:04,600 --> 00:24:08,199 Speaker 1: include the real people who it affects in these conversations. And 393 00:24:08,240 --> 00:24:11,720 Speaker 1: I've always really, really appreciated that, and I've always 394 00:24:11,760 --> 00:24:15,680 Speaker 1: felt very safe bringing these issues up.
And I've 395 00:24:15,680 --> 00:24:18,840 Speaker 1: always felt very heard. A lot of what came out 396 00:24:18,840 --> 00:24:23,359 Speaker 1: of that was, we do soft-ban 397 00:24:23,400 --> 00:24:28,159 Speaker 1: stuff now, so like you can still make, you know, 398 00:24:28,280 --> 00:24:32,000 Speaker 1: imagery at a beach or whatever. And you know, 399 00:24:32,040 --> 00:24:35,160 Speaker 1: that doesn't mean our moderation 400 00:24:35,840 --> 00:24:40,280 Speaker 1: around it has changed. Like if you're being weird about it, 401 00:24:40,359 --> 00:24:43,160 Speaker 1: we're still gonna go, hey, stop doing that. It's gross. 402 00:24:43,480 --> 00:24:47,000 Speaker 1: But it also allows, you know, people like you and me, 403 00:24:47,480 --> 00:24:49,959 Speaker 1: you know, to maybe make stuff based on 404 00:24:50,000 --> 00:24:53,719 Speaker 1: how we view our own selves, without necessarily having 405 00:24:53,920 --> 00:24:58,240 Speaker 1: to see it on the main feed. You know. So 406 00:25:00,200 --> 00:25:04,959 Speaker 1: it's not a perfect system, it really isn't, but it 407 00:25:05,000 --> 00:25:08,159 Speaker 1: works for what we needed it to do. You know, it 408 00:25:08,240 --> 00:25:10,600 Speaker 1: stops a lot of bad stuff. And that's good to 409 00:25:10,640 --> 00:25:12,880 Speaker 1: hear, because I know it's a lot of conversation because 410 00:25:12,920 --> 00:25:16,119 Speaker 1: y'all have grown really quickly. We're specifically talking about Midjourney, 411 00:25:16,200 --> 00:25:20,080 Speaker 1: but DALL-E has been growing exponentially as well.
Because I'm 412 00:25:20,200 --> 00:25:22,320 Speaker 1: wondering, and this may not be something that you 413 00:25:22,359 --> 00:25:25,240 Speaker 1: can answer right off the bat, but like, with as many 414 00:25:26,040 --> 00:25:29,320 Speaker 1: people as are coming in, how easy is it to 415 00:25:29,400 --> 00:25:32,400 Speaker 1: moderate ten million people? Though I guess that's just Discord. 416 00:25:32,440 --> 00:25:34,240 Speaker 1: I'm guessing that's not the people who actually 417 00:25:34,280 --> 00:25:37,239 Speaker 1: subscribe to Midjourney and use it, because I'm 418 00:25:37,280 --> 00:25:40,280 Speaker 1: guessing not everyone comes onto the Discord. But like, how 419 00:25:40,320 --> 00:25:44,520 Speaker 1: do you actually monitor ten million people? When you literally 420 00:25:44,520 --> 00:25:47,360 Speaker 1: grew from, as you were saying, forty thousand to ten 421 00:25:47,359 --> 00:25:49,560 Speaker 1: million in less than a year? Is that correct? Like 422 00:25:49,600 --> 00:25:53,080 Speaker 1: six months? But with that growth happening that quickly, 423 00:25:53,240 --> 00:25:57,080 Speaker 1: how do you actually moderate? How do you actually try 424 00:25:57,160 --> 00:26:00,359 Speaker 1: to keep those standards in line? Because I'm guessing 425 00:26:00,400 --> 00:26:03,640 Speaker 1: with that exponential growth, it would be hard to control 426 00:26:04,119 --> 00:26:07,760 Speaker 1: the bad players that come into that type of space. Yeah. 427 00:26:07,840 --> 00:26:10,280 Speaker 1: So we have a couple of different systems set up. 428 00:26:10,600 --> 00:26:14,440 Speaker 1: For all the banned words and phrases, we 429 00:26:14,520 --> 00:26:17,600 Speaker 1: actually have different levels to them. So 430 00:26:17,720 --> 00:26:20,600 Speaker 1: like if you use the N word, that's like the 431 00:26:20,640 --> 00:26:24,920 Speaker 1: worst level, you will get automatically timed out.
We have 432 00:26:25,080 --> 00:26:28,720 Speaker 1: like a whole channel where we can see what 433 00:26:28,720 --> 00:26:31,960 Speaker 1: banned words and phrases people are using, we can 434 00:26:32,000 --> 00:26:36,159 Speaker 1: see their entire prompt. So we have specific 435 00:26:36,240 --> 00:26:41,320 Speaker 1: things where, if we see somebody using 436 00:26:41,359 --> 00:26:45,240 Speaker 1: specific words, instant ban, goodbye. Like, we don't want you 437 00:26:45,280 --> 00:26:47,320 Speaker 1: as a part of our community, even if you're just 438 00:26:47,359 --> 00:26:51,280 Speaker 1: like testing it. You know, 439 00:26:51,359 --> 00:26:53,879 Speaker 1: it's just not worth it 440 00:26:53,960 --> 00:26:58,400 Speaker 1: to have certain people in our community. Our moderators 441 00:26:58,480 --> 00:27:01,720 Speaker 1: are really active, which is great. It's kind of 442 00:27:01,760 --> 00:27:04,359 Speaker 1: a mix. So we have like a mod queue on 443 00:27:04,560 --> 00:27:10,000 Speaker 1: the website itself. People report images through 444 00:27:10,000 --> 00:27:12,600 Speaker 1: the site. It'll go to a feed that only the 445 00:27:12,640 --> 00:27:16,680 Speaker 1: mods can see, and we can 446 00:27:16,680 --> 00:27:20,720 Speaker 1: delete photos, we can even unpublish them, which means they're 447 00:27:20,720 --> 00:27:23,720 Speaker 1: not actually deleted. 448 00:27:23,960 --> 00:27:25,960 Speaker 1: We do that very rarely. We do that for stuff 449 00:27:25,960 --> 00:27:29,439 Speaker 1: that's not actually breaking rules, but may 450 00:27:29,440 --> 00:27:35,200 Speaker 1: be really uncomfortable. It really depends. But yeah, 451 00:27:36,359 --> 00:27:39,680 Speaker 1: it's definitely grown a lot.
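The tiered banned-word system described here can be sketched as a tiny filter. To be clear, the tier names, placeholder words, and actions below are all invented for illustration; they are not Midjourney's actual word lists or moderation rules, which are not public.

```python
# Hypothetical severity tiers. The words and tier names are placeholders
# invented for this sketch, not any real service's banned-word list.
TIERS = {
    "instant_ban": {"slur_a", "slur_b"},    # worst level: gone, goodbye
    "timeout":     {"gore", "nsfw_term"},   # automatic temporary timeout
    "flag":        {"borderline_word"},     # logged to a mod review channel
}

def moderate(prompt: str) -> str:
    """Return the action for a prompt, checking the harshest tier first
    so the most severe matching action always wins."""
    words = set(prompt.lower().split())
    for action in ("instant_ban", "timeout", "flag"):
        if words & TIERS[action]:
            return action
    return "allow"

print(moderate("a calm landscape"))           # allowed through
print(moderate("something with gore in it"))  # hits the timeout tier
```

Real systems also log the full prompt alongside the match, as described above, so moderators can judge context rather than just the single word.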
But I will 452 00:27:39,720 --> 00:27:47,000 Speaker 1: say our community is pretty good, actually. Yeah, it sometimes 453 00:27:47,040 --> 00:27:52,119 Speaker 1: seems very overwhelming, like how many bad people are using 454 00:27:52,119 --> 00:27:55,160 Speaker 1: it for bad things, but it's actually not, like if 455 00:27:55,160 --> 00:27:59,600 Speaker 1: you compare the actual amount of stuff we're moderating 456 00:27:59,720 --> 00:28:01,640 Speaker 1: versus the amount of people who are using it, it's 457 00:28:01,720 --> 00:28:05,359 Speaker 1: tiny. And usually the people who are doing 458 00:28:05,920 --> 00:28:10,639 Speaker 1: bad stuff with it, they're like only doing that, and 459 00:28:10,720 --> 00:28:12,840 Speaker 1: so once you clean that up, it's like we just 460 00:28:12,880 --> 00:28:15,760 Speaker 1: got rid of a big chunk, you know. But yeah, 461 00:28:15,800 --> 00:28:20,480 Speaker 1: our moderator team is great. They care very strongly 462 00:28:20,520 --> 00:28:23,040 Speaker 1: about it, and the way we kind 463 00:28:23,040 --> 00:28:24,760 Speaker 1: of see it, or at least the way I see us, 464 00:28:25,040 --> 00:28:29,240 Speaker 1: is that we have to wade through 465 00:28:29,480 --> 00:28:34,080 Speaker 1: a bunch of bad stuff to protect other people from having 466 00:28:34,080 --> 00:28:37,560 Speaker 1: to do so, you know. Right, which 467 00:28:37,560 --> 00:28:54,440 Speaker 1: should be the case.
I've been very fascinated as the new medium 468 00:28:54,480 --> 00:28:56,960 Speaker 1: has kind of come about, and not even talking 469 00:28:56,960 --> 00:29:00,240 Speaker 1: about any of the controversy from outside of, you know, the 470 00:29:00,360 --> 00:29:03,160 Speaker 1: art world, but within it. Because one 471 00:29:03,160 --> 00:29:04,880 Speaker 1: of the things that we talked about, especially when we 472 00:29:05,160 --> 00:29:08,000 Speaker 1: talk about new mediums, new places to maybe be able 473 00:29:08,040 --> 00:29:11,000 Speaker 1: to monetize, or not even monetize, but at least express 474 00:29:11,040 --> 00:29:14,760 Speaker 1: yourself or do things to create. When we see 475 00:29:14,920 --> 00:29:17,640 Speaker 1: this is kind of a social media platform in its 476 00:29:17,680 --> 00:29:20,160 Speaker 1: own right, obviously with a Discord and being able to 477 00:29:20,160 --> 00:29:24,440 Speaker 1: publish, being a community. How do you think 478 00:29:24,800 --> 00:29:28,200 Speaker 1: companies like yours, because you're at the cusp, you're 479 00:29:28,200 --> 00:29:30,880 Speaker 1: at the beginning essentially, which, we're saying it's not new, 480 00:29:30,960 --> 00:29:34,080 Speaker 1: but as the popularity is growing, you're at the beginning, 481 00:29:34,560 --> 00:29:37,360 Speaker 1: can be an open and welcoming space for women? When 482 00:29:37,440 --> 00:29:41,600 Speaker 1: oftentimes platforms like this, or popular platforms like this, can 483 00:29:41,640 --> 00:29:46,680 Speaker 1: be overrun by your patriarchal cis white men who want 484 00:29:46,760 --> 00:29:50,760 Speaker 1: to capitalize. How does the platform, rather, make sure that 485 00:29:50,800 --> 00:29:54,200 Speaker 1: they are, or do you think they are, welcoming to 486 00:29:54,240 --> 00:29:56,920 Speaker 1: marginalized communities like women and people of color? You know, 487 00:29:57,000 --> 00:30:00,520 Speaker 1: I hope we are.
I have been approached by numerous 488 00:30:00,640 --> 00:30:04,320 Speaker 1: people about my presence, for example, because 489 00:30:04,920 --> 00:30:08,400 Speaker 1: I present very obviously as she/her, like 490 00:30:08,480 --> 00:30:12,320 Speaker 1: I have it in my bio, my 491 00:30:12,400 --> 00:30:14,920 Speaker 1: profile picture is like a pink eye. You know, 492 00:30:14,960 --> 00:30:18,440 Speaker 1: I've been approached by many people that said, oh, 493 00:30:18,520 --> 00:30:21,960 Speaker 1: you know, your presence here has made me feel safer. 494 00:30:22,120 --> 00:30:26,520 Speaker 1: And I know my fellow moderators, I know the fellow guides, 495 00:30:27,280 --> 00:30:29,960 Speaker 1: they're all really good people. And I think that's 496 00:30:29,960 --> 00:30:35,720 Speaker 1: how we manage that, is we have a 497 00:30:35,880 --> 00:30:39,800 Speaker 1: really, really, really solid moderator team who 498 00:30:40,080 --> 00:30:43,200 Speaker 1: are not afraid to be like, stop it. We 499 00:30:43,280 --> 00:30:46,080 Speaker 1: do not want that in our community. Get 500 00:30:46,080 --> 00:30:51,400 Speaker 1: out of here. I will say, we used 501 00:30:51,400 --> 00:30:54,560 Speaker 1: to have a lot more of that nonsense. A lot 502 00:30:54,600 --> 00:30:58,440 Speaker 1: of people get really upset that they can't do anything 503 00:30:58,440 --> 00:31:01,520 Speaker 1: they want to do, or make any image they 504 00:31:01,520 --> 00:31:05,560 Speaker 1: want to make, and they cry about censorship, and 505 00:31:06,760 --> 00:31:11,480 Speaker 1: they have migrated to other services.
And I think 506 00:31:12,040 --> 00:31:14,080 Speaker 1: I can speak for a lot of other moderators at 507 00:31:14,120 --> 00:31:18,360 Speaker 1: least that we're glad. You know, if you want 508 00:31:18,360 --> 00:31:20,600 Speaker 1: to go do that stuff, just don't do it 509 00:31:21,240 --> 00:31:23,280 Speaker 1: where we have to deal with it. Please. 510 00:31:23,840 --> 00:31:28,120 Speaker 1: Not that I think it's okay. But you know, I 511 00:31:28,120 --> 00:31:30,200 Speaker 1: don't have control over the other spaces, but I have 512 00:31:30,280 --> 00:31:33,479 Speaker 1: control over this space, and so I feel very strongly. 513 00:31:33,520 --> 00:31:36,800 Speaker 1: The moderators feel very strongly. And I mentioned 514 00:31:36,800 --> 00:31:39,760 Speaker 1: that one channel, we have a similar channel so 515 00:31:39,800 --> 00:31:41,600 Speaker 1: that we can keep an eye on things happening 516 00:31:41,600 --> 00:31:45,440 Speaker 1: within Discord. On Facebook, which I mentioned, I 517 00:31:45,520 --> 00:31:48,320 Speaker 1: was the main admin of that. I launched it, I set 518 00:31:48,840 --> 00:31:51,360 Speaker 1: the standards for it, I made decisions on how it 519 00:31:51,440 --> 00:31:54,040 Speaker 1: was going to be run, and I also created 520 00:31:54,160 --> 00:31:57,360 Speaker 1: the team for it. And one of the decisions that 521 00:31:57,400 --> 00:32:00,560 Speaker 1: I made ended up being such a good decision. 522 00:32:00,600 --> 00:32:02,640 Speaker 1: I had to fight for it too.
But one 523 00:32:02,640 --> 00:32:05,240 Speaker 1: of those decisions I made in the Facebook 524 00:32:05,240 --> 00:32:09,520 Speaker 1: group was that everything has to have post approval, which means 525 00:32:09,560 --> 00:32:13,160 Speaker 1: you can submit something, but then before it actually 526 00:32:13,240 --> 00:32:16,920 Speaker 1: gets posted to the group, a Facebook moderator 527 00:32:16,960 --> 00:32:20,000 Speaker 1: has to approve it. And so there have been so 528 00:32:20,040 --> 00:32:25,440 Speaker 1: many things that we've caught that just feel bad, 529 00:32:25,680 --> 00:32:27,920 Speaker 1: or you know, aren't appropriate, or aren't even really 530 00:32:28,040 --> 00:32:31,280 Speaker 1: Midjourney sometimes. Fun side story, we had one 531 00:32:31,280 --> 00:32:33,960 Speaker 1: guy try to post his entire camera roll on accident. 532 00:32:34,080 --> 00:32:39,959 Speaker 1: It was over a thousand pictures. But you know, 533 00:32:40,160 --> 00:32:42,960 Speaker 1: we have to stay vigilant, and 534 00:32:44,680 --> 00:32:47,080 Speaker 1: you know, we can't always catch everything. So I'm always 535 00:32:47,120 --> 00:32:51,200 Speaker 1: trying to remind people like, report it, report it. Message 536 00:32:51,240 --> 00:32:53,000 Speaker 1: me if you need to. If I don't answer right 537 00:32:53,040 --> 00:32:56,480 Speaker 1: away because I'm sleeping, you know, go to another mod. 538 00:32:56,520 --> 00:32:58,479 Speaker 1: There are ways you can report through Facebook, there are 539 00:32:58,480 --> 00:33:00,920 Speaker 1: ways you can report through the Reddit, there are ways 540 00:33:00,960 --> 00:33:04,240 Speaker 1: you can report through Discord, and there are ways you 541 00:33:04,240 --> 00:33:07,200 Speaker 1: can report on our website.
And if it's 542 00:33:07,240 --> 00:33:10,320 Speaker 1: like really bad, message us and we will get it 543 00:33:10,400 --> 00:33:14,640 Speaker 1: right away. But yeah, we have 544 00:33:14,760 --> 00:33:16,360 Speaker 1: to coordinate a lot because, you know, we have to 545 00:33:16,360 --> 00:33:18,720 Speaker 1: find people that can do the night shifts, 546 00:33:18,840 --> 00:33:22,000 Speaker 1: when the USA people are asleep, and it's 547 00:33:22,040 --> 00:33:25,640 Speaker 1: a lot, but I think we all try really hard, 548 00:33:25,680 --> 00:33:28,600 Speaker 1: and we all really, really care a lot about creating 549 00:33:28,600 --> 00:33:32,680 Speaker 1: safe spaces for people. So I hope, I 550 00:33:32,760 --> 00:33:36,240 Speaker 1: really hope, they feel safe. I really hope so. And 551 00:33:37,200 --> 00:33:40,640 Speaker 1: if they don't, I hope that people feel like they 552 00:33:40,640 --> 00:33:43,760 Speaker 1: can message me about it and I will try and 553 00:33:43,840 --> 00:33:48,240 Speaker 1: help. For anybody who's listening, on Discord I'm Fnuckle, 554 00:33:48,520 --> 00:33:51,560 Speaker 1: it's F N U C K L E, and then 555 00:33:51,560 --> 00:33:55,120 Speaker 1: I have in parentheses "community." I always say this word 556 00:33:55,120 --> 00:33:58,560 Speaker 1: wrongly, it's a hard one. My profile picture 557 00:33:58,600 --> 00:34:02,959 Speaker 1: is a giant pink eye. Please message me, and 558 00:34:03,000 --> 00:34:05,120 Speaker 1: I will try and help as much as I can. 559 00:34:05,280 --> 00:34:09,040 Speaker 1: I feel very strongly about it. I want the Midjourney 560 00:34:09,239 --> 00:34:13,200 Speaker 1: community to be safe and open to everyone.
Well, that's awesome, 561 00:34:13,239 --> 00:34:15,600 Speaker 1: and I think that might be the answer in itself. 562 00:34:15,640 --> 00:34:20,319 Speaker 1: A lot of the platforms don't actually look 563 00:34:20,440 --> 00:34:23,799 Speaker 1: out for the safety, or have guidelines for the safety, of 564 00:34:23,840 --> 00:34:28,040 Speaker 1: their users, and allow misinformation, I know that's not 565 00:34:28,080 --> 00:34:30,680 Speaker 1: the same thing here, but like misinformation or harassment 566 00:34:30,719 --> 00:34:34,719 Speaker 1: in general, to run amok without any kind of 567 00:34:34,760 --> 00:34:38,400 Speaker 1: checks and balances, and it definitely makes it unsafe for 568 00:34:38,480 --> 00:34:41,480 Speaker 1: many of the users. So if you are helping, 569 00:34:41,560 --> 00:34:43,600 Speaker 1: which it sounds like, I love that, to make it 570 00:34:43,680 --> 00:34:47,719 Speaker 1: safe for marginalized people and queer communities. So 571 00:34:47,760 --> 00:34:49,719 Speaker 1: that's awesome to hear, because that is one of the 572 00:34:49,719 --> 00:34:52,520 Speaker 1: things I think is missing in general when it comes 573 00:34:52,560 --> 00:34:55,719 Speaker 1: to social media platforms or different types of mediums that 574 00:34:55,760 --> 00:34:58,480 Speaker 1: can be for the public. Yeah, and I think it's 575 00:34:58,520 --> 00:35:02,919 Speaker 1: important to note too, as we're trying 576 00:35:02,920 --> 00:35:05,080 Speaker 1: to build Midjourney up, and this 577 00:35:05,160 --> 00:35:08,200 Speaker 1: is more from the actual team itself, but I 578 00:35:08,280 --> 00:35:10,560 Speaker 1: know David talks about this all the time in 579 00:35:10,600 --> 00:35:13,160 Speaker 1: the office hours, like he does not want to repeat 580 00:35:13,280 --> 00:35:17,879 Speaker 1: the same mistakes as regular social media.
And that's why, 581 00:35:17,920 --> 00:35:21,080 Speaker 1: for example, you can't see who follows you. 582 00:35:21,080 --> 00:35:22,680 Speaker 1: You can't see the amount of 583 00:35:22,719 --> 00:35:26,760 Speaker 1: people who follow you, because we just want to avoid 584 00:35:26,960 --> 00:35:32,080 Speaker 1: a lot of the likes-validation kind 585 00:35:32,120 --> 00:35:35,320 Speaker 1: of thing that happens, and it's still happening with 586 00:35:35,400 --> 00:35:37,799 Speaker 1: the community feed right now, and we talk about it 587 00:35:38,400 --> 00:35:42,600 Speaker 1: and talk about how we want to change it. So 588 00:35:42,640 --> 00:35:46,200 Speaker 1: whenever that gets released, that will be really interesting and fun, 589 00:35:46,239 --> 00:35:49,920 Speaker 1: I think. So, I guess we should have said this 590 00:35:49,960 --> 00:35:51,800 Speaker 1: at the top, and I didn't. We are not currently 591 00:35:51,840 --> 00:35:54,640 Speaker 1: sponsored by anything we are mentioning, including Clorox, which you 592 00:35:54,719 --> 00:35:59,200 Speaker 1: really came for. Actually, I like that. I'm fine with it. 593 00:35:59,320 --> 00:36:05,759 Speaker 1: That said, one of the things you brought up 594 00:36:05,840 --> 00:36:08,880 Speaker 1: when we were first communicating about this was building 595 00:36:08,920 --> 00:36:12,040 Speaker 1: community with women, and you've kind of been talking about 596 00:36:12,040 --> 00:36:14,839 Speaker 1: that throughout, but I'm curious about that, and also just 597 00:36:14,920 --> 00:36:17,799 Speaker 1: in terms of users and people who work on it. 598 00:36:17,880 --> 00:36:24,480 Speaker 1: What have you seen demographically? So right now, it's 599 00:36:24,520 --> 00:36:28,080 Speaker 1: been acknowledged in like office hours and whatnot.
600 00:36:28,120 --> 00:36:32,359 Speaker 1: But there are fewer women that use Midjourney than 601 00:36:33,719 --> 00:36:39,360 Speaker 1: we wish there were. And it's been really interesting, because 602 00:36:41,160 --> 00:36:44,759 Speaker 1: I will say this, our moderator and guide team 603 00:36:44,880 --> 00:36:48,080 Speaker 1: is very diverse, very, very, very diverse. We have all ages, 604 00:36:48,360 --> 00:36:53,040 Speaker 1: we have people who are neurodivergent, including me. 605 00:36:53,080 --> 00:36:55,520 Speaker 1: We have people who are parents, people who aren't parents, 606 00:36:55,560 --> 00:37:02,160 Speaker 1: people of all ethnicities, religions, everything. But the one thing 607 00:37:02,200 --> 00:37:05,400 Speaker 1: that I noticed a while ago was, after I was 608 00:37:05,440 --> 00:37:11,719 Speaker 1: made moderator, I was like, I'm one of three outwardly 609 00:37:11,840 --> 00:37:15,360 Speaker 1: she/her moderators, and I'm the only one who interacts 610 00:37:15,440 --> 00:37:19,520 Speaker 1: with the community, because one of the other ones, 611 00:37:19,600 --> 00:37:25,680 Speaker 1: she was more in the Chinese community, and the third one, 612 00:37:25,800 --> 00:37:28,840 Speaker 1: she does a lot more behind-the-scenes work. She 613 00:37:29,000 --> 00:37:31,799 Speaker 1: actually just did our new docs. If 614 00:37:31,880 --> 00:37:35,520 Speaker 1: you ever want to look, it's really fancy, it's cute. 615 00:37:35,880 --> 00:37:38,200 Speaker 1: Sorry, I had to say something. We just released 616 00:37:38,239 --> 00:37:40,239 Speaker 1: it like yesterday, and I'm very proud of it. But 617 00:37:40,600 --> 00:37:43,640 Speaker 1: the point is, I realized I'm the 618 00:37:43,719 --> 00:37:48,680 Speaker 1: only female moderator that, like, people know.
And so 619 00:37:49,719 --> 00:37:51,720 Speaker 1: it was like the day before I was going on vacation, 620 00:37:51,800 --> 00:37:56,000 Speaker 1: I was like, hey, guys, I just realized this thing. 621 00:37:56,120 --> 00:38:01,640 Speaker 1: And actually, I realized this also because so many 622 00:38:01,840 --> 00:38:06,600 Speaker 1: she/hers reach out to me, and not to 623 00:38:06,680 --> 00:38:09,759 Speaker 1: other people, because I'm the only obvious one that, like, 624 00:38:09,840 --> 00:38:16,040 Speaker 1: gets it, you know, and that's really cool. But also, 625 00:38:16,200 --> 00:38:20,480 Speaker 1: you know, women aren't a monolith. I don't want 626 00:38:20,520 --> 00:38:23,239 Speaker 1: to speak for all of the women everywhere, 627 00:38:23,320 --> 00:38:25,359 Speaker 1: because, you know, my opinions are not gonna 628 00:38:25,360 --> 00:38:28,839 Speaker 1: be the same as everyone else's. I don't know what 629 00:38:28,960 --> 00:38:31,520 Speaker 1: we can do about it, but like, this is the thing, 630 00:38:31,520 --> 00:38:33,080 Speaker 1: you should be aware of it. And I went on 631 00:38:33,160 --> 00:38:35,880 Speaker 1: vacation the next day, and when I got back, I 632 00:38:36,040 --> 00:38:41,279 Speaker 1: found that my counterpart, who I love, he and 633 00:38:41,320 --> 00:38:43,759 Speaker 1: I have the same brain, he's great, he's the 634 00:38:43,760 --> 00:38:48,920 Speaker 1: other boss of mods that works alongside me.
He 635 00:38:49,000 --> 00:38:52,880 Speaker 1: had added a bunch of new guides that were women, 636 00:38:53,280 --> 00:38:57,120 Speaker 1: she/her coded, and I was like, that's really, 637 00:38:57,120 --> 00:39:01,359 Speaker 1: really nice, because you just heard me and you just 638 00:39:02,120 --> 00:39:04,680 Speaker 1: made the steps to solve this and, like, bring more 639 00:39:04,760 --> 00:39:08,040 Speaker 1: women in, all on your own, without me having 640 00:39:08,080 --> 00:39:09,680 Speaker 1: to do much more than be like, hey, this is 641 00:39:09,719 --> 00:39:13,120 Speaker 1: a thing. So that was the thing, and we're 642 00:39:13,160 --> 00:39:15,920 Speaker 1: working on it. But yeah, because I 643 00:39:15,960 --> 00:39:17,960 Speaker 1: was one of the only mods who was a woman, 644 00:39:18,320 --> 00:39:20,960 Speaker 1: like I said, people would reach out to me. 645 00:39:21,040 --> 00:39:25,200 Speaker 1: And then I just ended up somehow accidentally just actually 646 00:39:25,200 --> 00:39:28,200 Speaker 1: knowing a lot of women in the community. And I 647 00:39:28,280 --> 00:39:31,640 Speaker 1: think that's been really interesting and really cool. Yeah, I'm 648 00:39:31,640 --> 00:39:33,399 Speaker 1: trying to think what more I can say on that. 649 00:39:33,440 --> 00:39:36,080 Speaker 1: I was actually just reached out to 650 00:39:36,160 --> 00:39:39,400 Speaker 1: yesterday by somebody who has their own server for 651 00:39:39,480 --> 00:39:45,919 Speaker 1: women and marginalized people in different communities, which 652 00:39:45,960 --> 00:39:47,839 Speaker 1: I just joined today, and I was poking around 653 00:39:47,880 --> 00:39:50,439 Speaker 1: and I was like, oh, this is so cool, you know.
654 00:39:50,480 --> 00:39:52,720 Speaker 1: But yeah, I definitely think there can be more done 655 00:39:53,080 --> 00:39:57,560 Speaker 1: in that area to invite not just women, but 656 00:39:57,719 --> 00:40:03,120 Speaker 1: just anybody who isn't a cis white man, and invite 657 00:40:03,160 --> 00:40:05,280 Speaker 1: them into our space and just show them how cool 658 00:40:05,680 --> 00:40:08,360 Speaker 1: AI is and how empowering it can be too. Yeah, it 659 00:40:08,400 --> 00:40:11,160 Speaker 1: seems really neat, you know. And I'm thinking, because 660 00:40:11,160 --> 00:40:13,440 Speaker 1: I kind of alluded to it earlier, 661 00:40:13,480 --> 00:40:15,440 Speaker 1: you know, I do hear a lot about the controversy 662 00:40:15,480 --> 00:40:18,960 Speaker 1: when it comes to the art world and AI, 663 00:40:19,080 --> 00:40:21,920 Speaker 1: and interestingly, you are in the art world, 664 00:40:22,160 --> 00:40:24,719 Speaker 1: specifically actually doing this as a job as part of 665 00:40:24,760 --> 00:40:28,239 Speaker 1: your creative jobs, which is interesting in itself. But 666 00:40:28,239 --> 00:40:34,560 Speaker 1: you're also really, really adamant about the benefits of AI. 667 00:40:34,600 --> 00:40:36,719 Speaker 1: How do you see that? Do you see that at all 668 00:40:36,719 --> 00:40:40,000 Speaker 1: as conflicting? Because again, we're hearing 669 00:40:40,040 --> 00:40:42,799 Speaker 1: a lot of controversy back and forth about, you know, 670 00:40:42,920 --> 00:40:46,000 Speaker 1: is this going to disrupt individual artists? Is it going 671 00:40:46,040 --> 00:40:49,280 Speaker 1: to disrupt the art world as we see it?
Um 672 00:40:49,320 --> 00:40:51,640 Speaker 1: And then coming back to, you know, but you do 673 00:40:51,680 --> 00:40:54,200 Speaker 1: use this as a part of your job as well. 674 00:40:54,239 --> 00:40:57,080 Speaker 1: Like, how does that all come together for you as an 675 00:40:57,160 --> 00:41:01,840 Speaker 1: artist slash AI expert? I'm gonna just call you an expert. 676 00:41:02,520 --> 00:41:06,280 Speaker 1: You know, I think that's one of the biggest things 677 00:41:06,320 --> 00:41:10,440 Speaker 1: that, I don't want to say hurtful, but it's kind 678 00:41:10,440 --> 00:41:15,600 Speaker 1: of hurtful, this idea that it's AI versus artists, because 679 00:41:17,320 --> 00:41:20,520 Speaker 1: I mean, I'm an artist, I work in the creative industry. 680 00:41:21,040 --> 00:41:23,719 Speaker 1: A lot of my friends who have been working in the 681 00:41:23,719 --> 00:41:27,480 Speaker 1: creative industry for many years use AI. My friend who 682 00:41:27,520 --> 00:41:31,879 Speaker 1: invited me to mid Journey is an artist. I don't 683 00:41:31,880 --> 00:41:35,479 Speaker 1: think it conflicts at all. Maybe a little bit, 684 00:41:35,680 --> 00:41:41,719 Speaker 1: but, like, not really. Most people who 685 00:41:41,800 --> 00:41:44,400 Speaker 1: I've talked to, and I say most, not all, but 686 00:41:44,520 --> 00:41:46,759 Speaker 1: most people I've talked to who have been working in 687 00:41:46,800 --> 00:41:52,480 Speaker 1: the creative industry, once they try it and really get 688 00:41:52,480 --> 00:41:56,120 Speaker 1: a feel for it, they start to understand, oh, this 689 00:41:56,200 --> 00:42:00,080 Speaker 1: is like really cool and really awesome. Um. I 690 00:42:00,280 --> 00:42:02,360 Speaker 1: do use it. I use it as a tool. It's 691 00:42:02,440 --> 00:42:06,040 Speaker 1: really great for brainstorming, it's really great for trying to 692 00:42:06,080 --> 00:42:10,040 Speaker 1: figure out, like, interesting new ways to take a concept.
693 00:42:11,080 --> 00:42:14,239 Speaker 1: You know, I have this thing when I'm working my 694 00:42:14,320 --> 00:42:16,480 Speaker 1: day job as a designer where it's like I have 695 00:42:16,600 --> 00:42:20,160 Speaker 1: a specific style in mind, but then when I 696 00:42:20,280 --> 00:42:24,160 Speaker 1: try to find examples of it, I can't find it, 697 00:42:24,640 --> 00:42:27,480 Speaker 1: and so that's really helpful. I think it's a really, 698 00:42:27,480 --> 00:42:30,640 Speaker 1: really powerful creative tool. UM. And the other thing I 699 00:42:30,680 --> 00:42:32,759 Speaker 1: want to say is, and I said this before, but 700 00:42:32,960 --> 00:42:35,839 Speaker 1: the big thing of AI art is it is highlighting 701 00:42:35,880 --> 00:42:38,959 Speaker 1: a lot of the issues already present, um, within 702 00:42:39,000 --> 00:42:42,560 Speaker 1: the art world. So one of the big issues that 703 00:42:42,560 --> 00:42:47,160 Speaker 1: has already been present is that, um, there's a certain 704 00:42:47,760 --> 00:42:52,560 Speaker 1: group of artists and designers who are valued 705 00:42:52,680 --> 00:42:57,680 Speaker 1: very lowly and who don't get paid enough. And some 706 00:42:57,719 --> 00:42:59,920 Speaker 1: of the problems around that are, you know, one, there are 707 00:43:00,360 --> 00:43:02,000 Speaker 1: a lot of people out there that are doing it, 708 00:43:02,280 --> 00:43:05,520 Speaker 1: so you have a lot of competition. Um, you also 709 00:43:05,560 --> 00:43:09,480 Speaker 1: have competition from overseas. You know, there are companies 710 00:43:09,480 --> 00:43:13,719 Speaker 1: where they outsource their design, you know, to another country 711 00:43:13,800 --> 00:43:16,680 Speaker 1: where the cost of living may be a lot lower, and 712 00:43:16,760 --> 00:43:21,560 Speaker 1: so they can pay them a lot lower wage.
That's 713 00:43:21,600 --> 00:43:27,800 Speaker 1: not right, right? And so what AI has done 714 00:43:27,920 --> 00:43:31,160 Speaker 1: is it's basically taken a lot of these issues that 715 00:43:31,200 --> 00:43:34,560 Speaker 1: we've already had for a long time, and it's just 716 00:43:35,239 --> 00:43:38,680 Speaker 1: making them more obvious and making it harder. And it 717 00:43:38,800 --> 00:43:41,160 Speaker 1: is a little bit scary, I think, for people who 718 00:43:41,640 --> 00:43:44,520 Speaker 1: are already struggling. I think you could say the same 719 00:43:44,560 --> 00:43:49,680 Speaker 1: thing for people whose jobs have already been overtaken 720 00:43:49,680 --> 00:43:53,640 Speaker 1: by machines in a way, you know, like cashiers, for example. 721 00:43:54,200 --> 00:43:57,600 Speaker 1: And personally, I think a lot of this stems from 722 00:43:57,600 --> 00:44:02,440 Speaker 1: the fact that we just live in a capitalistic world, um, 723 00:44:03,080 --> 00:44:07,239 Speaker 1: where you have to have some 724 00:44:07,280 --> 00:44:10,520 Speaker 1: sort of skill or value or, you know, something to 725 00:44:10,760 --> 00:44:15,719 Speaker 1: contribute to be able to live. And that's really unfortunate. 726 00:44:16,200 --> 00:44:21,239 Speaker 1: And so, as somebody who is on a very democratic, 727 00:44:21,280 --> 00:44:25,400 Speaker 1: like, leftist side of the world,
I'm like, hey, maybe 728 00:44:25,440 --> 00:44:28,040 Speaker 1: this is a great way to start talking about things 729 00:44:28,080 --> 00:44:30,840 Speaker 1: like UBI or, like, other ways that we 730 00:44:30,960 --> 00:44:37,640 Speaker 1: can help people live without necessarily having to 731 00:44:37,719 --> 00:44:42,920 Speaker 1: rely on employment, because I think that's an issue not 732 00:44:43,000 --> 00:44:47,080 Speaker 1: just in the art world but in many different areas of 733 00:44:47,080 --> 00:44:50,800 Speaker 1: our society right now. So, um, I think also 734 00:44:50,920 --> 00:44:53,839 Speaker 1: that, in the fine arts world, people are scared of that. 735 00:44:54,680 --> 00:44:56,279 Speaker 1: Like, people are like, oh, is this the end of art? 736 00:44:56,320 --> 00:44:58,480 Speaker 1: This is not the end of art. This is not 737 00:44:58,520 --> 00:45:00,360 Speaker 1: the end of art at all. The fine arts world, 738 00:45:01,160 --> 00:45:03,399 Speaker 1: they're going to be interested in it, but it's 739 00:45:03,440 --> 00:45:06,440 Speaker 1: not really going to change it, because fine arts people 740 00:45:06,480 --> 00:45:10,440 Speaker 1: are way more concerned not just with the output, 741 00:45:10,960 --> 00:45:13,160 Speaker 1: but, like, who made it, why did they make it, 742 00:45:13,400 --> 00:45:15,799 Speaker 1: how did they make it? And so, yeah, there's gonna 743 00:45:15,840 --> 00:45:18,920 Speaker 1: be some galleries and whatnot that show off AI art 744 00:45:18,920 --> 00:45:21,799 Speaker 1: because it's interesting, but it's not going to change the 745 00:45:21,880 --> 00:45:27,600 Speaker 1: value of people who are painting or illustrating 746 00:45:27,800 --> 00:45:29,520 Speaker 1: or sculpting or whatever it is. It's not going to 747 00:45:29,640 --> 00:45:32,319 Speaker 1: change their value as an artist. Um.
In the fine 748 00:45:32,400 --> 00:45:39,640 Speaker 1: arts world specifically, I think a lot of what it 749 00:45:39,760 --> 00:45:45,000 Speaker 1: will disrupt right now is what I personally, as a 750 00:45:45,040 --> 00:45:49,279 Speaker 1: designer, was already just using regular stock imagery for, and 751 00:45:49,360 --> 00:45:53,160 Speaker 1: things will change, things will change, UM. But it's also 752 00:45:54,400 --> 00:45:58,280 Speaker 1: one of those things where it's like things have already 753 00:45:58,280 --> 00:46:02,840 Speaker 1: been changing and things will change anyway. And it's actually 754 00:46:02,880 --> 00:46:06,880 Speaker 1: really interesting talking to, UM, a lot of the people 755 00:46:06,880 --> 00:46:08,560 Speaker 1: in the creative industry who have been in it for 756 00:46:08,600 --> 00:46:11,360 Speaker 1: many years, who have a lot more experience than me, 757 00:46:11,440 --> 00:46:16,200 Speaker 1: because regardless of which part of the industry they've been in, 758 00:46:16,800 --> 00:46:20,200 Speaker 1: they'll have some story for you where they're like, oh yeah, 759 00:46:20,360 --> 00:46:23,880 Speaker 1: I remember when Photoshop came onto the scene and people 760 00:46:23,880 --> 00:46:26,560 Speaker 1: were freaking out about it because of this and this 761 00:46:26,640 --> 00:46:31,319 Speaker 1: and this. Or, for example, have 762 00:46:31,400 --> 00:46:34,719 Speaker 1: you ever read about matte painters for movies? Let's take 763 00:46:34,760 --> 00:46:40,799 Speaker 1: Star Wars as an example. That's how they, UM, probably did 764 00:46:40,800 --> 00:46:44,480 Speaker 1: their special effects. They would basically take a 765 00:46:44,600 --> 00:46:48,640 Speaker 1: giant glass pane and put it over top of, like, 766 00:46:49,320 --> 00:46:53,080 Speaker 1: the movie film frame. UM, I don't know if they 767 00:46:53,080 --> 00:46:55,680 Speaker 1: projected it or not.
They'd have to, because it was huge. 768 00:46:56,280 --> 00:46:58,440 Speaker 1: And then they'd take this giant glass frame, they'd put 769 00:46:58,480 --> 00:47:01,400 Speaker 1: it over top of it, and they would literally paint 770 00:47:02,440 --> 00:47:04,839 Speaker 1: the thing they're trying to depict. So in Star Wars, you know, 771 00:47:04,880 --> 00:47:09,040 Speaker 1: the iconic scene where R2-D2 and, uh, 772 00:47:09,280 --> 00:47:11,279 Speaker 1: what's... oh my god, don't come for me... C-3PO, 773 00:47:11,320 --> 00:47:15,000 Speaker 1: you know, they're like in the desert 774 00:47:15,040 --> 00:47:21,160 Speaker 1: and they see, UM, a machine. The one you see over 775 00:47:21,200 --> 00:47:24,480 Speaker 1: the hill in the desert, right? That was literally painted 776 00:47:24,520 --> 00:47:28,840 Speaker 1: onto a piece of glass. You know, now that 777 00:47:28,880 --> 00:47:31,120 Speaker 1: doesn't happen at all, you know what I mean? Like, 778 00:47:31,560 --> 00:47:37,200 Speaker 1: that entire industry got obliterated because of digital effects and 779 00:47:37,280 --> 00:47:39,120 Speaker 1: CGI that came onto the scene. And so 780 00:47:39,280 --> 00:47:43,400 Speaker 1: this is gonna be like that. It's going to change 781 00:47:43,480 --> 00:47:48,360 Speaker 1: some things, in the sense that the jobs some people 782 00:47:48,440 --> 00:47:52,360 Speaker 1: are doing right now might be less relevant. But 783 00:47:53,080 --> 00:48:00,400 Speaker 1: that's okay in the longer, bigger picture of things, because 784 00:48:00,440 --> 00:48:04,759 Speaker 1: that's just how things happen. And from 785 00:48:04,800 --> 00:48:06,680 Speaker 1: how I see it, I think it's going to open 786 00:48:06,760 --> 00:48:13,840 Speaker 1: up an entirely new world of creativity and just 787 00:48:14,480 --> 00:48:17,400 Speaker 1: entirely new things to make, um.
And it's going to 788 00:48:17,480 --> 00:48:20,520 Speaker 1: be guided by humans. You know, even though it's AI 789 00:48:20,600 --> 00:48:23,920 Speaker 1: and you're like, oh, the machine's making it. No, at 790 00:48:23,920 --> 00:48:25,520 Speaker 1: the end of the day, it's still humans that are 791 00:48:25,520 --> 00:48:27,640 Speaker 1: making it and guiding it, you know. And 792 00:48:27,680 --> 00:48:31,279 Speaker 1: it's taking our visions and making them real. It's just 793 00:48:31,320 --> 00:48:33,560 Speaker 1: a different way that it's doing it. If you can 794 00:48:33,600 --> 00:48:36,000 Speaker 1: prompt right. I've seen the nightmares where it's not prompted 795 00:48:36,080 --> 00:48:40,200 Speaker 1: correctly. Exactly, and it takes some practice to 796 00:48:40,239 --> 00:48:54,960 Speaker 1: get it right. I mean, it is interesting. We've been 797 00:48:55,000 --> 00:48:59,000 Speaker 1: doing a lot of technology episodes lately, and essentially the 798 00:48:59,080 --> 00:49:02,840 Speaker 1: takeaway is, like, there are pros and cons. There's bias involved 799 00:49:02,840 --> 00:49:07,360 Speaker 1: because humans are involved, and you can't remove technology from 800 00:49:07,480 --> 00:49:09,480 Speaker 1: the society that we are in and the people who created 801 00:49:09,520 --> 00:49:12,400 Speaker 1: it and the people who use it. Um. But 802 00:49:13,280 --> 00:49:17,960 Speaker 1: it is also true that, yeah, I've been working kind 803 00:49:18,000 --> 00:49:21,120 Speaker 1: of in this industry for ten years, and 804 00:49:21,160 --> 00:49:24,279 Speaker 1: there have been panics, like if Google changed the algorithm, 805 00:49:24,280 --> 00:49:26,279 Speaker 1: our whole office would shut down, and it's like, oh 806 00:49:26,360 --> 00:49:34,279 Speaker 1: my god, we're ruined.
Yeah. Like, I 807 00:49:34,320 --> 00:49:36,319 Speaker 1: get it, because I remember sitting in the office and 808 00:49:36,360 --> 00:49:39,600 Speaker 1: thinking, like, oh, I'm fired. Google changed this 809 00:49:39,680 --> 00:49:42,879 Speaker 1: one thing and my whole job's gone. What will I do? 810 00:49:43,480 --> 00:49:45,440 Speaker 1: So I get the fear of it. But it is 811 00:49:45,520 --> 00:49:49,400 Speaker 1: true that, you know, we adapted and we changed, and 812 00:49:49,440 --> 00:49:51,200 Speaker 1: I still have a job. It's not the same job, 813 00:49:51,360 --> 00:49:56,640 Speaker 1: but I do still have a related job. I mean, 814 00:49:56,760 --> 00:49:59,320 Speaker 1: and you already see this in a lot of different areas, 815 00:50:00,239 --> 00:50:02,799 Speaker 1: like in the fashion industry. Right now, we're starting to 816 00:50:02,880 --> 00:50:07,239 Speaker 1: see a rise in people valuing handmade stuff, you know, 817 00:50:07,280 --> 00:50:12,120 Speaker 1: handmade pottery and whatnot, um, sustainable stuff, you know. And 818 00:50:12,400 --> 00:50:17,000 Speaker 1: I think, in that sense, actually, really nice handmade stuff 819 00:50:17,040 --> 00:50:21,480 Speaker 1: will be valued even more, because we'll be so bombarded 820 00:50:22,239 --> 00:50:26,480 Speaker 1: by everything else. You know, we're never 821 00:50:26,520 --> 00:50:29,680 Speaker 1: gonna not value the stuff that people make with their 822 00:50:29,719 --> 00:50:34,200 Speaker 1: own hands. You know, that's not going to change. So 823 00:50:34,280 --> 00:50:38,080 Speaker 1: I'm not too worried. I'm actually really excited. Um. I 824 00:50:38,120 --> 00:50:40,839 Speaker 1: think a lot about... do you guys know Stardew Valley? 825 00:50:41,600 --> 00:50:43,520 Speaker 1: All these things that I don't know, I'm so old, 826 00:50:43,680 --> 00:50:48,320 Speaker 1: keep coming up.
It's a really cute and very popular 827 00:50:48,360 --> 00:50:50,839 Speaker 1: game based off of the Harvest Moon games, which are, 828 00:50:51,120 --> 00:50:57,560 Speaker 1: um, the OG farming simulators. Um, uh, but the 829 00:50:57,880 --> 00:51:00,960 Speaker 1: guy who made Stardew Valley, he was 830 00:51:01,000 --> 00:51:04,000 Speaker 1: the only person that made it. He spent ten years 831 00:51:04,040 --> 00:51:08,160 Speaker 1: making it, you know, writing the code, um, making all 832 00:51:08,239 --> 00:51:10,040 Speaker 1: the art. And I remember reading about it 833 00:51:10,080 --> 00:51:12,839 Speaker 1: once, and it was like he changed all the art 834 00:51:12,880 --> 00:51:15,040 Speaker 1: like two or three times at a certain point. Like 835 00:51:15,080 --> 00:51:18,000 Speaker 1: I said, it grew and evolved, and it's 836 00:51:18,239 --> 00:51:20,319 Speaker 1: won so many awards and it's such a 837 00:51:20,440 --> 00:51:24,480 Speaker 1: loved piece of media. And, you know, it just really 838 00:51:24,520 --> 00:51:27,799 Speaker 1: makes me think of that, because I'm like, okay, not 839 00:51:27,840 --> 00:51:33,040 Speaker 1: everybody has ten years to dedicate to making, like, a 840 00:51:33,080 --> 00:51:37,319 Speaker 1: big project like this. But now with AI, you know, 841 00:51:37,600 --> 00:51:41,440 Speaker 1: and especially once it evolves into something a lot, um, 842 00:51:41,440 --> 00:51:45,600 Speaker 1: stronger and more usable, imagine if, like, anybody, you know, 843 00:51:45,800 --> 00:51:48,200 Speaker 1: could do what this guy did in ten years, 844 00:51:48,239 --> 00:51:51,080 Speaker 1: they could do it in a couple of months, you know. 845 00:51:52,040 --> 00:51:57,600 Speaker 1: And imagine the amount of new stories, the amount of 846 00:51:57,640 --> 00:52:02,120 Speaker 1: new media, the amount of niche stuff too.
Because if 847 00:52:02,120 --> 00:52:06,279 Speaker 1: you're a big movie production place, you know, 848 00:52:07,040 --> 00:52:09,480 Speaker 1: you're spending like millions of dollars on making a movie. 849 00:52:09,920 --> 00:52:12,600 Speaker 1: You have to make, you know, something kind of generic, 850 00:52:13,320 --> 00:52:15,520 Speaker 1: um, that everybody will like, because, you know, 851 00:52:15,640 --> 00:52:18,319 Speaker 1: you have to, like, stick to certain formulas, you know, 852 00:52:18,440 --> 00:52:22,280 Speaker 1: because you don't want to lose money, blah blah. Imagine 853 00:52:22,280 --> 00:52:24,920 Speaker 1: a world where it won't take millions of dollars to 854 00:52:25,000 --> 00:52:31,200 Speaker 1: make movies. And so imagine, like, you or me, you know, 855 00:52:31,280 --> 00:52:35,680 Speaker 1: or your kids or whoever, right? Imagine 856 00:52:37,400 --> 00:52:39,480 Speaker 1: you or me being able to, like, make our 857 00:52:39,520 --> 00:52:44,440 Speaker 1: own movies about whatever we want, and how powerful and 858 00:52:44,480 --> 00:52:48,680 Speaker 1: empowering that can be, you know. And also awful, actually, 859 00:52:48,719 --> 00:52:56,680 Speaker 1: now that I think about it. But art is magical. UM. Being 860 00:52:56,719 --> 00:53:00,879 Speaker 1: able to create is magical. UM. It's gotten me through 861 00:53:01,000 --> 00:53:04,640 Speaker 1: some of the hardest parts of my life. UM, trying 862 00:53:04,680 --> 00:53:07,440 Speaker 1: not to actually cry. But when my dad died... I 863 00:53:07,520 --> 00:53:11,480 Speaker 1: was twenty three. My sister 864 00:53:11,600 --> 00:53:13,880 Speaker 1: was fifteen, my brother was fourteen, so it 865 00:53:13,960 --> 00:53:18,480 Speaker 1: was just a whole thing.
And a week after he 866 00:53:18,560 --> 00:53:22,480 Speaker 1: had died, I sat myself down and I forced myself 867 00:53:22,560 --> 00:53:25,120 Speaker 1: to create, because I knew he'd want me to do that. 868 00:53:25,320 --> 00:53:29,759 Speaker 1: And that was so healing, because it was something that 869 00:53:29,880 --> 00:53:33,280 Speaker 1: I could hold on to, you know, through my grief. 870 00:53:34,440 --> 00:53:38,960 Speaker 1: And I have talked to so many people who have 871 00:53:39,080 --> 00:53:42,120 Speaker 1: had similar stories. There was one 872 00:53:42,160 --> 00:53:45,439 Speaker 1: woman who lost her son, and she was like, 873 00:53:45,719 --> 00:53:50,000 Speaker 1: I was barely living. I was just surviving. And somebody 874 00:53:50,040 --> 00:53:52,120 Speaker 1: introduced me to mid Journey and I've been able to 875 00:53:53,440 --> 00:53:56,719 Speaker 1: work through this grief, and not only do that but 876 00:53:56,800 --> 00:54:00,200 Speaker 1: also just make fun stuff, you know. Or 877 00:54:00,239 --> 00:54:04,319 Speaker 1: people who, UM... like, one person, he was telling me 878 00:54:04,360 --> 00:54:07,880 Speaker 1: how he used to be able to play guitar, and 879 00:54:08,800 --> 00:54:11,719 Speaker 1: he, uh, he got an injury in his hands. He 880 00:54:11,760 --> 00:54:14,120 Speaker 1: can't move his hands right. He can type, but he 881 00:54:14,120 --> 00:54:16,080 Speaker 1: can't even learn to draw if he 882 00:54:16,120 --> 00:54:21,480 Speaker 1: wanted to, and so he was just cut off from 883 00:54:21,520 --> 00:54:24,480 Speaker 1: being able to create in these capacities. And now 884 00:54:24,560 --> 00:54:27,719 Speaker 1: he has the ability, and I think that's 885 00:54:27,760 --> 00:54:33,239 Speaker 1: so powerful. Um.
I know a lot of people who, um, 886 00:54:33,239 --> 00:54:36,160 Speaker 1: are disabled and, like, they're stuck in bed 887 00:54:36,200 --> 00:54:38,480 Speaker 1: all day, or they just aren't able to, 888 00:54:38,640 --> 00:54:41,799 Speaker 1: like, go out and do the things they want to do. 889 00:54:42,080 --> 00:54:45,400 Speaker 1: And they've told me personally, you know, being able to 890 00:54:45,400 --> 00:54:50,239 Speaker 1: do this has brought so much meaning and, 891 00:54:50,440 --> 00:54:54,319 Speaker 1: you know, so much magic to their lives. And 892 00:54:54,360 --> 00:54:58,200 Speaker 1: that to me is so impactful and so meaningful, 893 00:54:58,360 --> 00:55:02,520 Speaker 1: because I understand that magic. I've been 894 00:55:02,520 --> 00:55:08,240 Speaker 1: there myself, and I'm just so happy 895 00:55:08,520 --> 00:55:12,640 Speaker 1: and so, um, so honored that I get to be 896 00:55:12,719 --> 00:55:17,480 Speaker 1: a part of something that is opening up that magic 897 00:55:17,719 --> 00:55:21,160 Speaker 1: to people who have never experienced it before. Um, I 898 00:55:21,239 --> 00:55:24,360 Speaker 1: think that's really wonderful. So, yeah, you make a great point. 899 00:55:24,640 --> 00:55:28,280 Speaker 1: I think this is definitely a conversation about accessibility as well, 900 00:55:28,640 --> 00:55:31,279 Speaker 1: for people who were not able to create and 901 00:55:31,360 --> 00:55:33,880 Speaker 1: had images or thoughts or things that they wanted to 902 00:55:33,920 --> 00:55:36,359 Speaker 1: do that they were never able to do, or thought 903 00:55:36,400 --> 00:55:38,720 Speaker 1: they couldn't do.
So this could be a conversation about 904 00:55:38,800 --> 00:55:42,440 Speaker 1: how it did open up, for those people, an accessible 905 00:55:42,480 --> 00:55:45,200 Speaker 1: outlet to do those things. So it's an interesting 906 00:55:45,600 --> 00:55:48,120 Speaker 1: point of view too. And not just people who are disabled. 907 00:55:48,120 --> 00:55:51,160 Speaker 1: Like, I mean, you know, with all the nepotism 908 00:55:51,200 --> 00:55:55,000 Speaker 1: stuff being talked about recently, you know, it's like even 909 00:55:55,040 --> 00:55:58,400 Speaker 1: just getting into the art world can be very daunting 910 00:55:58,440 --> 00:56:02,200 Speaker 1: and expensive. You know, art colleges are super expensive. I 911 00:56:02,200 --> 00:56:04,640 Speaker 1: would know, I went to one. But I also knew 912 00:56:04,640 --> 00:56:07,440 Speaker 1: people who weren't able to go because they would have 913 00:56:07,520 --> 00:56:11,160 Speaker 1: never been able to afford it. And so I think 914 00:56:11,200 --> 00:56:15,160 Speaker 1: about it in that sense too, like just the amount 915 00:56:15,200 --> 00:56:18,080 Speaker 1: of people who may never have had access to this 916 00:56:18,160 --> 00:56:22,279 Speaker 1: world... it's now become even easier, and I think 917 00:56:22,320 --> 00:56:24,160 Speaker 1: that's a good thing. There are so many things we can 918 00:56:24,200 --> 00:56:26,080 Speaker 1: talk about, so many intersections here, and I feel 919 00:56:26,080 --> 00:56:28,439 Speaker 1: like we've touched on a lot of them. Um, so 920 00:56:29,200 --> 00:56:33,400 Speaker 1: thank you for coming on and discussing it 921 00:56:33,440 --> 00:56:36,719 Speaker 1: with us, because I'm a newbie, so I learned 922 00:56:36,719 --> 00:56:42,080 Speaker 1: a lot.
Is there anything, any resources you want to 923 00:56:42,080 --> 00:56:44,320 Speaker 1: shout out, anywhere people can go to find you or 924 00:56:44,440 --> 00:56:48,680 Speaker 1: learn more? So, UM, you can find me on Discord 925 00:56:48,800 --> 00:56:51,640 Speaker 1: if you want. I set myself as offline a 926 00:56:51,640 --> 00:56:53,799 Speaker 1: lot because I get a lot of spam otherwise. But 927 00:56:54,440 --> 00:56:58,000 Speaker 1: sometimes you might catch me on the sidebar with 928 00:56:58,040 --> 00:57:00,239 Speaker 1: the list of people. UM, I should be in 929 00:57:00,239 --> 00:57:05,520 Speaker 1: the moderators' column. I'm Knuckle, community liaison, on there. UM, 930 00:57:05,560 --> 00:57:09,719 Speaker 1: I'm also Knuckle on Instagram. UM, you can message me 931 00:57:09,760 --> 00:57:13,480 Speaker 1: through there. I will see it. And I guess you 932 00:57:13,520 --> 00:57:15,480 Speaker 1: can connect with me on LinkedIn if you want, but 933 00:57:15,680 --> 00:57:19,240 Speaker 1: I might not see it as quick, so I don't know. 934 00:57:19,400 --> 00:57:22,160 Speaker 1: I think you're the first person to use LinkedIn. 935 00:57:22,200 --> 00:57:28,280 Speaker 1: That's awesome. There's actually, like, a really solid AI 936 00:57:28,400 --> 00:57:31,880 Speaker 1: community on LinkedIn. And the other thing is, UM, I 937 00:57:31,960 --> 00:57:34,120 Speaker 1: might not be able to help everybody. Like, if you 938 00:57:34,160 --> 00:57:37,960 Speaker 1: message me, I might not be online, or I 939 00:57:38,040 --> 00:57:39,760 Speaker 1: might not, because I do a lot of stuff, I'm 940 00:57:39,880 --> 00:57:42,040 Speaker 1: very busy. But, like, I might not be able to 941 00:57:42,080 --> 00:57:45,480 Speaker 1: help you personally. But we have, UM, the trial support 942 00:57:45,520 --> 00:57:49,240 Speaker 1: and the member support channels, and the guides are there 943 00:57:49,720 --> 00:57:53,120 Speaker 1: to help you.
UM, and we also have the Prompt 944 00:57:53,200 --> 00:57:57,440 Speaker 1: Chat channel, which is a lot more specific to helping 945 00:57:57,480 --> 00:58:00,760 Speaker 1: with prompt stuff. They're great. And then also the Facebook 946 00:58:00,800 --> 00:58:03,120 Speaker 1: group, I think, is a really great place to ask questions too, 947 00:58:03,160 --> 00:58:05,560 Speaker 1: because you have a lot of really cool community members there. 948 00:58:06,480 --> 00:58:13,640 Speaker 1: So yeah, options galore. All right, you'll be hearing from us, 949 00:58:13,760 --> 00:58:18,320 Speaker 1: I guess. Thank you. Thank you, thank you again for 950 00:58:18,480 --> 00:58:21,440 Speaker 1: joining us. Listeners, if you'd like to contact us, you 951 00:58:21,480 --> 00:58:23,960 Speaker 1: can. Our email is Stephania mom stuff at iHeartMedia 952 00:58:24,040 --> 00:58:26,560 Speaker 1: dot com. You can find us on Twitter at mom stuff podcast, 953 00:58:26,800 --> 00:58:29,439 Speaker 1: or on Instagram and TikTok at stuff mom never told you. Thanks, 954 00:58:29,480 --> 00:58:32,640 Speaker 1: as always, to our super producer Christina. Thank you, Christina, and 955 00:58:32,680 --> 00:58:34,480 Speaker 1: thanks to you for listening. Stuff Mom Never Told You 956 00:58:34,480 --> 00:58:36,280 Speaker 1: is a production of iHeartRadio. For more podcasts from iHeart 957 00:58:36,280 --> 00:58:37,800 Speaker 1: Radio, you can check out the iHeartRadio app, Apple 958 00:58:37,800 --> 00:58:39,960 Speaker 1: Podcasts, or wherever you listen to your favorite shows.